Six Generations of GeForce GPUs Compared: From GTX 480 to GTX 1080

Love this article, Steve, thanks for doing it.

My SLI Gigabyte 2 GB GTX 670 Windforce setup (further OC'd by me and pretty much offering GTX 690 performance) has performed admirably at 1440p, and even today I've barely had to make concessions on the latest games. But times are a-changing, and I'm waiting on the 1080 Ti. I have about $750 USD in my four-year-old GPUs, so I'll be relieved if the 1080 Ti is close to that price, and offering wayyyy more performance.

Funny how a lot of folks are complaining about Nvidia's pricing; I guess they don't get that the price-to-performance ratio is even better this generation! A $250 USD 1060 hangs with the (retail) $550 USD GTX 980 from the last generation?? I'm planning on an eventual upgrade to 4K, so that's the only reason I didn't pounce on the regular 1080. I've always had good luck with SLI personally, so if I really get crazy- another one is still an option!

Anyway, cheers and Happy Holidays good sir!

Dave
Minnesota, USA
 
This is a great article.
Is it possible for you to do a similar comparison, but between the last 5 or 6 generations of Intel CPUs? I bet the difference won't be very big. That's why I still don't really find a reason to upgrade my five-year-old 2500K overclocked to 4.4 GHz. AMD and their lack of true competition are to blame, of course.

Cheers
 
I think this article is lacking, as VERY little attention was paid to the pricing of the cards. Probably not so simple to do, but that one sentence and one row in a table really doesn't cut it. I find this strange, as reviewing the evolution of Nvidia cards should also take into account the price increases for the high-end cards...
 
Should the article also take into account the price differences world wide to accommodate your wish? There has to be a stopping point somewhere, why not stop where they did?
 

Worldwide? No need for hyperbole. I'm not specifying exactly how much they 'should've' done; I'm no tech editor...
Isn't price usually (excepting a small group of enthusiasts) one of the deciding factors for a purchase, one of the most important aspects of a product? I'd say that definitely warrants more attention than it got here.
 
Great article, thanks!
Nice to see some benchmarks at 1600p rather than the inferior, but more prevalent, 1440p.
Inferior? My arse. Nothing like a snob. 1440p is really good, and of course there are better resolutions; for me that would be 4K, over the inferior 1600p.
1440p, or 16:9, actually displays more of the scene than 1600p, or 16:10, in games. Every game I have come across is coded this way.
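That 16:9-vs-16:10 claim can be checked with a quick field-of-view calculation. Most modern games are "Hor+": they lock the vertical FOV and widen the horizontal view with the aspect ratio. The baseline vertical FOV below is a common default I'm assuming for illustration, not a figure from the article:

```python
import math

def horizontal_fov(vertical_fov_deg, aspect):
    """Horizontal FOV for a Hor+ game, which locks the vertical FOV
    and scales the horizontal view with the display's aspect ratio."""
    v = math.radians(vertical_fov_deg)
    return math.degrees(2 * math.atan(math.tan(v / 2) * aspect))

vfov = 73.74  # roughly 90 degrees horizontal at 4:3, a common default
print(f"16:9  -> {horizontal_fov(vfov, 16 / 9):.1f} deg")   # ~106.3
print(f"16:10 -> {horizontal_fov(vfov, 16 / 10):.1f} deg")  # ~100.4
```

So at the same vertical FOV, a 1440p (16:9) screen does render a wider slice of the scene than a 1600p (16:10) one.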
 
Would it be possible to do the testing from the earliest of cards, like the GeForce 256 and GeForce 2/3? I know it's impossible to test using the same set of games, but by using multiple generations of games that overlap across a few GPU generations, it should be possible to establish a reasonable conclusion on the relative power of the GPUs.

I know it would be a lot of work, but it would be just so interesting. Pretty please?

To what end? I am not sure digging up an old AGP system to show how badly these 16-year-old GPUs perform in any game made in the last decade would be a good use of my time ;)

To what end? There's already a reason stated in the article. "Whether you are looking to upgrade from an older GPU or simply appreciate the statistical significance of the data presented, ..."

I haven't seen it done anywhere. But wouldn't it help us appreciate the advances GPUs have made over all these years? Besides, it's simply interesting. Why else was this done in the first place?


If you want a more overarching set of articles showing how GPUs have advanced over the years, you'll want to start here -- https://www.techspot.com/article/650-history-of-the-gpu/ -- & read all four parts.

As for benchmark comparisons... the problem is there would be little to no meaningful comparison between a 4-pipeline/4-texture-mapper/4-ROP card with 32 or 64 MB of VRAM (GeForce 256) & even the 640-core/40-texture-mapper/32-ROP card with 2 GB of VRAM (GTX 1050), let alone the 2,560-core/160-texture-unit/64-ROP & 8 GB VRAM monster (GTX 1080).

I'm not saying it wouldn't be possible to do comparisons... but you'd have to figure out cut-off points (i.e. what games were the last ones that could be played with 32 MB of VRAM), then dig up the old hardware needed to run a PCI or AGP graphics card (including the necessary RAM & CPU for that)... & dig up working copies of Windows 98 & XP just to run the software.

And, to be honest, I'm not sure how useful such a comparison would really be. A GTX 1080 easily has the power of at least 50 GeForce 256s (at least in terms of VRAM bandwidth -- in terms of fillrates, it's more like 200 to 500 times).
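Those back-of-the-envelope ratios can be sanity-checked with a few lines of arithmetic. The spec figures below are approximate launch numbers I'm assuming for illustration, not data from the article:

```python
# Approximate launch specs (my assumptions, not figures from the article):
#   GeForce 256 DDR: ~4.8 GB/s memory bandwidth, ~0.48 Gpixels/s fillrate
#   GTX 1080:        ~320 GB/s memory bandwidth, ~110 Gpixels/s fillrate
gf256_bw, gf256_fill = 4.8, 0.48
gtx1080_bw, gtx1080_fill = 320.0, 110.0

bw_ratio = gtx1080_bw / gf256_bw        # ~67x the memory bandwidth
fill_ratio = gtx1080_fill / gf256_fill  # ~229x the pixel fillrate
print(f"bandwidth: {bw_ratio:.0f}x, fillrate: {fill_ratio:.0f}x")
```

On those assumed specs the results land inside the "at least 50x bandwidth, 200 to 500x fillrate" range quoted above.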
 
Those line graphs tell a big story. The steep upturn for the 1080 is very apparent. I wonder if GPUs can keep that momentum in the next generation of cards, or will they flatten out into minor evolutionary upgrades as the 780 through 980 did? Only time will tell, I suppose. But it's good to see the chips are in a much better place.

We shouldn't expect much more graphics horsepower until Nvidia is able to produce GPUs using a 10 nm process. They're working on it, I'm sure, but there will be a wait. Until then, expect tweaks but nothing very substantial.

The 1080 Ti is a tweak: it's a cut-down Titan that will slot between the 1080 and the Titan in both price and performance. After it's out, it'll probably be a long while before anything truly new arrives from Nvidia in terms of either power consumption or graphics performance.

Until 10 nm GPUs get here, what's interesting is pricing. Lately, Nvidia has been happily pushing GPU prices upward. If AMD can produce competitive GPUs at lower prices next year, it should drag Nvidia's pricing downward somewhat. Unfortunately, that isn't a sure bet. AMD is lagging badly at this point, and their Radeon unit seems troubled.

One thing this article demonstrates is that for lower-resolution gaming, those older Nvidia cards aren't all that bad in most titles. You don't need a Pascal GPU to drive older monitors, because you really can't get a better gaming experience out of 190 FPS.

Where you run into problems with older GPUs is trying to drive higher-resolution displays, displays which most gamers don't own: 4k monitors and high-end ultra-wides and the like. If you get a truly high-resolution monitor, you'll need a high-end Pascal card, maybe even in SLI, to drive it.

If you do make that jump, I don't think obsolescence is going to hit you smack between the eyes very quickly. Getting to 10 nm is a tougher engineering challenge than any prior GPU process jump. Pascal is going to be around for a while.
 
Why not test these cards using games like Dirt Rally, GRID, and Game Stock Car Extreme, to name a few? They are very demanding on frame rates, as much as all the other games. Last year racing gamers spent $265 million on games, never mind how much they spent on computer gear. So they are worthy of being tested.
Gee, how many other people will make the same request with THEIR favorite games? Do you expect TechSpot to test 200 games? If the games you mentioned are as demanding as Crysis 3 or Metro Redux, use yer noggin and figure that the frame rates would be similar. It's not that complicated...
 
What's the point? That's like asking Car and Driver to test a '74 Dodge Dart. It's meaningless in 2016.
 
If you are rocking a 16-year-old GPU, you probably aren't interested in gaming, or modern computing for that matter. You are talking about a DX7 part here that uses an interface I haven't seen in a decade. How would we even go about making a meaningful comparison?

Outside of people making "classic" computers, nobody is. But that would not be the point. There's value in charting progress in itself. Or, as you put it, to "appreciate the statistical significance of the data presented."

You can't compare directly via fps, but you can do it relatively. Take the lowest couple of cards, like the 480 and 580 (or even the 680), and retest them along with older cards like the 280, 9800 GTX, etc., or whatever makes sense in terms of Direct3D support or interface, with a different set of games from 2007-08, like Company of Heroes, BioShock, etc., down until performance drops off. Then pick the few lowest generations of that set and retest them again with another, still older set of games. Those couple of shared cards will serve as a bridge to stitch together a graph of performance along the X axis, comparing all the cards all the way back to the 256, like the chart on page 6 does now. Except that on the Y axis it will not be possible to show unbroken fps; it would need a separate fps counter for each set of games.

It's basically the GeForce 256, 2, 3, 4, 5, 6, 7, 8, 9 and 200 series, plus however many equivalent releases there are within each series. It shouldn't be more than a few different sets, unless I'm horrifically wrong about AGP/Direct3D/etc. support in cards and games. I'm sure it's a pain to source all those old cards, motherboards, CPUs and drivers, but if successful, the comparison would be just epic.
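The "bridge" stitching described above could be sketched in a few lines: each era's game set yields fps for a group of cards, and the cards shared between consecutive eras provide a scale factor that chains everything into one relative index. The card names and fps numbers below are made up purely for illustration:

```python
from statistics import geometric_mean

def stitch(sets):
    """Chain per-era fps results into one relative performance index.

    `sets` is ordered from oldest game era to newest; each entry maps
    card name -> average fps in that era's games. Consecutive sets must
    share at least one "bridge" card.
    """
    index = dict(sets[0])  # the first era's fps doubles as the baseline index
    for prev, cur in zip(sets, sets[1:]):
        bridges = set(prev) & set(cur)
        # average ratio between the existing index and the new era's fps
        # on the shared cards, used to rescale the new era's results
        scale = geometric_mean([index[c] / cur[c] for c in bridges])
        for card, fps in cur.items():
            index.setdefault(card, fps * scale)  # bridge cards keep old value
    return index

# Hypothetical overlapping test sets (fake numbers):
era1 = {"GeForce 256": 30, "GeForce 4 Ti": 90}           # ~2000-era games
era2 = {"GeForce 4 Ti": 25, "8800 GTX": 100}             # ~2005-era games
era3 = {"8800 GTX": 20, "GTX 480": 60, "GTX 1080": 240}  # ~2010-era games
print(stitch([era1, era2, era3]))
# On this fake data the GTX 1080 comes out ~144x the GeForce 256
```

The geometric mean over the bridge cards smooths out per-game quirks; with real data you'd want several bridge cards per era so one outlier result can't skew the whole chain.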
Wow, do you think TechSpot is your butler, fulfilling your every whim no matter how stupid or unpopular? Use your imagination: they sucked then and they really suck now. Or better yet, start your own website and do these utterly meaningless (and NON-interesting) tests yourself. Your arrogance is shocking.
 
Nvidia cripples old graphics card performance with new drivers. So if the latest driver version was used with every card, including Keplers, Nvidia's development looks way too rosy.
FALSE. That's a myth, and you bought it hook, line and sinker, like the typical AMD fanboy. Why haven't my four-year-old SLI 670s slowed down then? They haven't, and unlike yourself, I own Nvidia cards and have been monitoring their frame rates since day one, and I revisit games like Crysis now and then.

Don't believe everything you're told, kiddo.
 
TechSpot, please, this isn't a real comparison. Nobody uses a high-end card with no AA settings; a real comparison needs AA settings, because higher settings show smaller differences in performance. Low settings are just a marketing tool.

Have a great day.
I hardly ever use AA; it's usually not necessary on a high-res monitor. So yeah, it's VERY real. Take care now.
 
Where's the GTX 1080 Ti?

OTOH, the GTX 400/500s are still relevant because of DX11 and, more recently, DX12 support.
So how about treating all hardware that is still relevant (i.e. usable with the latest software) in every feature article, like the list of the worst GPUs?
 
This article was published before the Ti came out (I think before it was even confirmed?).
 
I feel many enthusiasts make the mistake I caught myself making: considering a card obsolete when it stops being total overkill in the newest games on ultra. Once you remember that there are also high settings, the useful life of the cards is extended by quite a lot, especially for the flagship products; I'd say easily three years. Plus, it gives us something to tinker with, which is always a plus in my book! Great work as usual, Steve!
 
Great article, it's great to see how far graphical power has jumped since the 400 series to the current 1000 series.

If I may make a suggestion, perhaps you could do an article featuring all the Titan cards, from the first one to the latest Pascal Titan XP :D
 