Fermi through Maxwell: 5 Generations of GeForce Graphics Compared

It was probably just a case of "don't buy an 800w PSU for $40" but I've been paranoid ever since.

That was the problem. I have seen dirt cheap PSUs take out entire systems when they blow.
 
Aye. Most of the catastrophic system failures I've ever dealt with arose from no-brand PSUs failing. The second biggest cause is probably CCFL transformer failures. The PSU should be the keystone of the system, yet people are driven to channel their budget into flashy components and modding supplies, thinking they can get away with a cheapo PSU with more ripple than an over-70's aerobics class.
Even if the system appears to work, stability and longevity issues affecting add-in cards, I/O hubs, and hard drives usually accumulate as the PSU ages.

Oh, and primo article!
 
Well, unless you have a 4,000w PSU it's pretty hard to run much of anything at only 10% load. I'm going to assume that I use between 400-500w under load, and that puts my 1050w PSU at around 50% load. I want a PSU that far exceeds my needed capacity because I don't want to stress it, not because I'm looking for efficiency. The extra efficiency is just a bonus. My goal is to baby the PSU to protect the other components.
Don't get me wrong, I'm not saying anyone should run their PSU at 100% even for a minute. But a new Haswell system with a new graphics card can idle at 50 watts, and people's computers are idling most of the time when browsing the internet and such. A GTX 970 with a stock Haswell i5 draws below 300w even under game load.

Sure, buy a good 800w PSU if you have your i7/FX overclocked alongside something like a heavily overclocked 290X, or even 970s in SLI. Most of us are not running anything like that.
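For what it's worth, the load figures being thrown around here can be sanity-checked with a two-line calculation (the wattages are the ones mentioned in this thread; real efficiency curves depend on the specific unit):

```python
# Back-of-envelope PSU load check using the figures quoted in this thread.
def load_percent(draw_watts, rating_watts):
    """What fraction of the PSU's rated capacity a given draw represents."""
    return draw_watts / rating_watts * 100

print(round(load_percent(450, 1050)))  # ~450w gaming draw on a 1050w unit -> ~43% load
print(round(load_percent(300, 800)))   # ~300w (stock i5 + GTX 970) on an 800w unit -> ~38% load
```

Both land near the middle of the rated capacity, which is also roughly where typical 80 Plus units hit their efficiency sweet spot.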
 
I have seen them catch on fire and burn furniture ;)
My older Corsair TX 650 burnt one of its SATA connectors to a crisp (yes, smoke and fire; I always thought the plastic used there wasn't supposed to catch fire so easily). It seems the connector was damaged and shorted. Being single rail, it did not shut down my computer; I pulled the power connector out myself, so in a way I was lucky I was sitting in front of my opened computer, having just finished installing something. Sure, the computer and PSU work fine after removing that SATA connector, but still... makes me think.
 
So the tip is true: if you own a certain generation, you can skip the following one and wait one more round. Owners of a GTX 680 are in for a treat if they go for a GTX 980; owners of a GTX 780 Ti, not so much.
 
My older Corsair TX 650 burnt one of its SATA connectors to a crisp (yes, smoke and fire; I always thought the plastic used there wasn't supposed to catch fire so easily). It seems the connector was damaged and shorted.
If I understand you correctly, that is not the fault of the PSU. I have also witnessed a SATA connector catch fire. As luck would have it, the fire was put out quickly enough that the connected hard drive was not damaged. And the damaged connector could actually be replaced, as it was a 4-pin/SATA adapter. It wasn't a short, it was a bad connection. Bad connections cause extreme heat, enough heat to start fires. This is one reason why I don't want a modular power supply.
 
I also loved this article. I believe there are plenty of people that want to see the older generation cards tested along with the newest, especially nowadays. This is a tremendous help to people looking to gauge the performance gain they'll see from upgrading their older GPU, plus it's just interesting to see how the performance has scaled over the years.

It does hurt a bit when I think about how I dropped $500+ on the 480 and 580 back when they launched... They definitely haven't aged very well, lol. This is the reason I really don't like to spend major $ on GPUs anymore. They lose their value so fast that it's not even funny. Definitely not a good investment over the long term.

I have a GTX 580 (bought around the end of 2010 according to my Newegg information) and I feel I've gotten my money's worth. I spent $533 on it and I bought it shortly after release. I definitely don't feel like it was a bad investment. I'll probably use it another year, maybe two. At five years that's about $106 a year. That's less than buying two full price games for a console. The thing is I can still play whatever I want with the card with settings maxed out on most games (Borderlands 2, Titanfall, World of Warcraft, for example). I guess it also depends on the rest of your system configuration too though.

I love this article by the way, it also helps me determine it's about time for an upgrade judging by the performance of the GTX 980 versus my GTX 580. I'll probably wait one more generation though.

Borderlands 2, Titanfall, or WoW aren't games that require a lot of GPU power; that's why you are still fine.

Metro, Battlefield 4, and many others would be unplayable at max settings on that GPU.

That aside, it's a great article, and I'm waiting to see the Red team's version in the future.

The thing is, I have no desire to play those games. I have Metro 2033 and Battlefield 4 and don't care for either of them; I played them for a few hours and dropped them. I'm also not saying I need to play all my games at max eye candy. Just because I can't play a modern game at the maximum quality settings anymore doesn't mean I need to upgrade.
 
Your 980 would actually do just fine with 1440p. It's way overkill for 1080p, at least.

I wouldn't call it way overkill. If you play at 120 or 144 fps you still can't max many new games. Or if you want the minimum frame rate to be 60 instead of the average, you need to average much higher. Honestly, I tried Crysis 3 with full AA the day my card came in and I still had dips to 30 fps, and with new games being a broken mess you need so much power just to keep them running, unfortunately. I say this as I go back to running 2D indie games like FTL...
 
If I understand you correctly, that is not the fault of the PSU. I have also witnessed a SATA connector catch fire. As luck would have it, the fire was put out quickly enough that the connected hard drive was not damaged. And the damaged connector could actually be replaced, as it was a 4-pin/SATA adapter. It wasn't a short, it was a bad connection. Bad connections cause extreme heat, enough heat to start fires. This is one reason why I don't want a modular power supply.
No, I must have damaged it somehow while changing my disk configuration. But nothing was connected to that SATA connector when I turned on the power; actually, that whole cable had nothing connected to it. All the connected drives were on the other 4x SATA power cable. But the connector was instantly on fire after some sparks, and the plastic started to melt between two pins.
 
Extremely awesome article. We need more stuff like this.

Can't wait for the AMD equivalent article on the Omega Catalyst driver.
Yeah, I was about to say the same thing; this type of article is unique and fantastic. Can't wait for the AMD version.

It is very interesting to see the progression over the years. The difference between the 480 and 680 accentuates how important more VRAM and better architecture is.
 
It is very interesting to see the progression over the years. The difference between the 480 and 580 accentuates how important more VRAM and better architecture is.
I presume you mean 980? The 580 (GF110) was basically the same GF100 as the 480, but fully enabled and with a metal-layer respin.

The importance of VRAM in high-end graphics is overstated for the most part (3GB GTX 580s made little if any impact at the time). The only real difference is when you deliberately overwhelm a standard framebuffer with ultra-high-res textures; then a larger framebuffer looks like a better deal, as evidenced here.

The single largest difference between the architectures was Nvidia moving away from hot-clocking shaders. Running a smaller number of shaders (i.e. 512 in the GTX 580) at double the core frequency imposed a huge power penalty on the architecture. Even now, with the prodigious (by comparison) speed increases and process node shrink for Maxwell, you don't see the 1544MHz stock shader clock of the 580.
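To put rough numbers on that tradeoff, here is a quick sketch comparing peak shader throughput (the shader counts and clocks are the published specs for the GTX 580 and GTX 980; boost clocks and per-clock efficiency gains are ignored, so treat it as illustrative only):

```python
# Peak single-precision throughput: shaders * clock (GHz) * 2, since an FMA
# counts as two floating-point ops. Shows the Fermi -> Maxwell shift from
# few hot-clocked shaders to many lower-clocked ones.
def peak_gflops(shader_count, clock_ghz):
    return shader_count * clock_ghz * 2

fermi   = peak_gflops(512, 1.544)   # GTX 580: shaders hot-clocked at 2x the 772MHz core
maxwell = peak_gflops(2048, 1.126)  # GTX 980: base clock, no shader hot-clock

print(round(fermi), round(maxwell))  # ~1581 vs ~4612 GFLOPS
```

Four times the shaders at under three-quarters of the clock ends up well ahead on raw throughput, without the power cost of driving a small shader array at 1.5GHz+.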
 
I presume you mean 980? The 580 (GF110) was basically the same GF100 as the 480, but fully enabled and with a metal-layer respin.
Oh yeah, typo. I corrected it. I meant to say 480-680 (Fermi -> Kepler).

Very interesting point about the hot-clocking. I never noticed that.
 