Six Generations of GeForce GPUs Compared: From GTX 480 to GTX 1080

To what end? There's already a reason stated in the article: "Whether you are looking to upgrade from an older GPU or simply appreciate the statistical significance of the data presented, ..."

I haven't seen it done anywhere. But wouldn't it help us appreciate the advances GPUs have made over all these years? Besides, it's simply interesting. Why else was this done in the first place?

If you are rocking a 16-year-old GPU, you probably aren't interested in gaming, or modern computing for that matter. You are talking about a DX7 part here that uses an interface I haven't seen in a decade. How would we even go about making a meaningful comparison?
 
All I see are the curves on the line graph at the end; it's exponential, and I like how things are evolving. Looks like I'll really experience mind-blowing things in my life with all that technology.
 
Nvidia cripples old graphics card performance with new drivers. So if the latest driver version was used with every card, including the Keplers, Nvidia's progression looks way too rosy.
 
Those line graphs tell a big story. The steep upturn for the 1080 is very apparent. I wonder if GPUs can keep that momentum in the next generation of cards, or will they flatten out into minor evolutionary upgrades as the 780 through 980 did? Only time will tell, I suppose. But it's good to see the chips are in a much better place.
Well, if you look at it from a different angle, it makes more sense. The 780 was derived from the big Kepler GPU, while the 980 is not. You could actually compare the 680 to the 980, and the 780 Ti to the 980 Ti. I don't know exactly where the 780 is supposed to sit, but if they had released it earlier, I'd say Nvidia would have called it the 680 Ti.
 
> If you are rocking a 16-year-old GPU, you probably aren't interested in gaming, or modern computing for that matter. You are talking about a DX7 part here that uses an interface I haven't seen in a decade. How would we even go about making a meaningful comparison?

Outside of people building "classic" computers, nobody is. But that would not be the point. There's value in charting progress in itself. Or, as you put it, "appreciate the statistical significance of the data presented".

You can't compare directly via fps, but you can do it relatively. Take the lowest couple of cards here, like the 480 and 580, or even the 680, and retest them along with older cards like the 280, 9800 GTX, etc. (whatever makes sense in terms of Direct3D or interface support) on a different set of games from 2007-08, like Company of Heroes, BioShock and so on, going down until performance drops off. Then pick the few lowest generations of that set and retest them again on another, still older set of games. Those couple of shared cards serve as a bridge to stitch a graph of performance along the X axis, letting you compare all the cards all the way back to the 256, like the chart on page 6 does now. Except that the Y axis can't show unbroken fps; it would need a separate fps scale for each set of games (there's a rough sketch of the stitching after the next paragraph).

It's basically the GeForce 256, 2, 3, 4, 5, 6, 7, 8, 9 and 200 series, plus however many equivalent releases there are within each series. It shouldn't take more than a few different game sets, unless I'm horrifically wrong about AGP/Direct3D support in those cards and games. I'm sure it's a pain to source all those old cards, motherboards, CPUs and drivers, but if successful, the comparison would be epic.
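To make the bridging idea concrete, here is a minimal Python sketch of the stitching math. Every card name, fps value and game-set split below is a made-up placeholder rather than a real benchmark result; the point is only how the cards shared between two game sets chain one set's scale onto the other.

```python
# Hypothetical average fps per card on two game sets.
# Adjacent sets deliberately share "bridge" cards (here, the GTX 480).
modern_games = {"GTX 680": 60.0, "GTX 580": 40.0, "GTX 480": 32.0}
legacy_games = {"GTX 480": 55.0, "GTX 280": 30.0, "9800 GTX": 22.0}  # 2007-08 titles

def stitch(reference: dict, older: dict) -> dict:
    """Rescale an older game set's scores onto the reference set's scale,
    using the cards that appear in both sets as the bridge."""
    bridges = set(reference) & set(older)
    if not bridges:
        raise ValueError("no bridge cards shared between the two sets")
    # Average the per-bridge conversion factors to smooth out noise.
    scale = sum(reference[c] / older[c] for c in bridges) / len(bridges)
    merged = dict(reference)
    for card, fps in older.items():
        merged.setdefault(card, fps * scale)  # keep reference-set scores as-is
    return merged

combined = stitch(modern_games, legacy_games)
for card, score in sorted(combined.items(), key=lambda kv: -kv[1]):
    print(f"{card}: {score:.1f} (relative units, not raw fps)")
```

Repeating stitch() with each still-older set would extend the chart back to the 256, with the Y axis read in relative units rather than raw fps, exactly as described above.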
 
No, that is incorrect. The 980, 780 Ti, 780 and so on were re-tested, mate; the results were only updated where we found changes. There are a number of changes in most of the games, so I am not sure why you are seeing the exact same results. There were a few instances where the 780 Ti and 980 were being bottlenecked on the old system, and you will see slightly better performance now, particularly at the lower resolutions.

Ah, thanks. Good to know you retested. I just looked at the Crysis 3 and BioShock Infinite benchmarks on the front pages of the articles and jumped to conclusions from that. It may be that those tests are totally GPU-bound, but it was still strange to see the exact same scores on a system that's different in most respects and running different drivers (even just given the variability between runs of the same game on the same system). Looking further now, it does look like Tomb Raider gets a higher score with the 980 on the new system. Metro Redux and Battlefield 4 have the exact same scores, but Sleeping Dogs shows higher scores for the 980 and 780 Ti, and a slightly different score for the 780, which does suggest a retest.
 
TechSpot, please, this isn't a real comparison. Nobody uses a high-end card with no AA; a real comparison needs AA settings, because higher settings show smaller differences in performance. Low settings are just a marketing tool.

Have a great day.
 
Waiting for the _70 series review for us common gamers (budget enthusiasts? :-D). Like the 470, 570, 670, 770, 970 and 1070.
 
I now have the Gigabyte GTX 1080 G1 Gaming GPU and have to say, it's pretty amazing. I upgraded from the GTX 780 Classy and was quite shocked by how the 1080 chews up and spits out whatever you throw at it. I've had quite a few of the cards in this article: the 480, 580, 780, and now the 1080. It's been a fun ride so far! :p
 
That's a good article. Excellent. It must take a while to get all the cards and do the testing. I am still using a GTX 460 and would love to see the same style of article comparing all the budget models instead: 460/560/660, etc. But only because I want to see where my card ranks compared to today's budget cards.
Agreed! Personally, I'd like to see the various top 128-bit cards (enough to play with, but not blasting a hole in the energy budget: 'green' cards). The GTS 450, GTX 650 Ti, GTX 750 Ti, GTX 960 and the appropriate 10-series card if it ever arrives.
 
> I'm sure it's a pain to source all those old cards, motherboards, CPUs and drivers, but if successful, the comparison would be epic.
Sounds interesting... why don't you do it?
You won't, or can't? Then why are you complaining? We want people to get onto DX12, and here you are complaining that nobody cares about DX7 cards...
 
Very informative and interesting article showing how much progress has been made with GPUs.
Can you please also do one for AMD one day?

Would love to see their progression throughout the years.
 
> GTX 1080 price at release: $600.
> Even a few months after release, it's still not $600...
 
If you ask me, the progression seems slow.
It probably could have come in much greater leaps if they didn't try to milk every generation.
They artificially slow progress to maximize profit.
 
> They artificially slow progress to maximize profit.

I for one don't care how long they milk each generation. They need to create job security somehow. There is such a thing as putting your cart before your horse.

It is the needless creation of bugs and the prolonged fixes in each generation that irritate me. You may think they are artificially slowing progress; I don't think so. If they sped up progress, you would see a massive increase in the number of bugs each generation.
 
I'm not sure how you can look at the 1080 and say that isn't major progress in performance and power efficiency. The jump to 4K is some major pixel pushing whose demands can't be overstated; that resolution needs all the power you can throw at it.
 