I have a suspicion. AMD has always focused more on memory bandwidth than NVIDIA, while NVIDIA tends to favor raw shader performance. NVIDIA's approach works well when their cards launch, but as they age and new games naturally use more and more VRAM, that weaker memory performance causes NVIDIA's cards to age less gracefully than AMD's.
If I'm right, we should see NVIDIA's performance (at a given settings level) track with how much VRAM is on their cards, up to a point; beyond that point, the difference in shader performance starts to matter more.
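One rough way to test this, if you had benchmark numbers in hand, would be to fit a piecewise-linear ("hinge") model of FPS against VRAM capacity and look for the knee. Here's a minimal sketch of that idea; the card data, the hinge function, and the initial guesses are all hypothetical, just to show the shape of the analysis:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical data: VRAM (GB) and average FPS at a fixed settings level
# for a set of aging cards. Real numbers would come from actual benchmarks.
vram_gb = np.array([4, 6, 8, 8, 10, 12, 16, 24], dtype=float)
avg_fps = np.array([31, 45, 58, 60, 66, 69, 71, 72], dtype=float)

def hinge(v, v_knee, slope, plateau):
    """Piecewise-linear model: FPS rises with VRAM until v_knee,
    then flattens out (where shader throughput takes over)."""
    return np.where(v < v_knee, plateau - slope * (v_knee - v), plateau)

# Fit the breakpoint; p0 is a rough initial guess for (knee, slope, plateau).
params, _ = curve_fit(hinge, vram_gb, avg_fps, p0=[10.0, 5.0, 70.0])
v_knee, slope, plateau = params
print(f"Estimated knee: {v_knee:.1f} GB "
      f"(below it, each extra GB is worth ~{slope:.1f} FPS)")
```

If the hypothesis holds, the fitted knee should sit near the VRAM footprint of the games being tested, and cards above it should sort by shader performance instead of memory size.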