Top 10 Most Significant Nvidia GPUs of All Time

The 1080 has an abysmal TDP. The GTX 1060 is the stuff of legends; glaring omission.

Not sure where you get that info. If you look at TechPowerUp's performance-per-watt metrics at 1440p (the 1080's target gaming resolution), the 1080 comes out on top for efficiency among Pascal GPUs. The 1060 is notably worse at all resolutions. See here:

https://www.techpowerup.com/review/nvida-geforce-rtx-2070-super/28.html
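
If it helps to see what those charts are actually computing: performance per watt is just average FPS divided by board power, so the rankings are easy to sanity-check by hand. A quick Python sketch (the FPS and wattage figures below are round illustrative placeholders, not TPU's measured data):

# Performance-per-watt sanity check. All numbers here are illustrative
# placeholders, NOT TechPowerUp's measurements.
cards = {
    "GTX 1080": {"avg_fps_1440p": 70, "board_power_w": 180},
    "GTX 1060": {"avg_fps_1440p": 42, "board_power_w": 120},
}

for name, c in cards.items():
    ppw = c["avg_fps_1440p"] / c["board_power_w"]
    print(f"{name}: {ppw:.3f} fps per watt")

# With these placeholders the 1080 lands around 0.39 fps/W versus about
# 0.35 fps/W for the 1060, i.e. the 1080 is the more efficient card.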

Here are some updated numbers from a more recent review, just before they stopped testing Pascal GPUs. The 1080 still tops the Pascal lineup, well above the 1060.

 
Those FX or MX versions are the worst of the worst: the lowest-end GPUs and memory (usually older chips renamed), very narrow buses, and very low performance. Garbage.

In general terms I would agree, but there are exceptions.
The GeForce 2 MX was brilliant for budget gaming, as was the GeForce 4 MX 440, if I recall correctly.
 
Performance/watt charts are mostly pure, undiluted BS; they even manage to make Ampere (a massive flop) look good.
 
Oh, you just don't like higher power GPUs. Fair enough.

I used to game exclusively on iGPUs, so the 1060 is a total power pig compared to those, using 8x the wattage.

Whatever line in the sand anyone chooses as "too much" is completely arbitrary.
 
Why is lower better?

And if lower actually *were* better, then the 1650 would be the best Nvidia card, not the power-pig 1060. No wait, it's the 1030, which uses 30W. And Intel integrated graphics would be best, as they only use about 15W.
 
Dumb. The 1060, unlike those cards, is COMPETENT: only 15-20 fps behind the 3060 in Valhalla at 1080p, and the 3060 is a turd that consumes 3x more power. Ravaged.
 
Which just means you've drawn an arbitrary line in the sand at 120W. The odd thing is you didn't draw it at the 1650 Super, which is faster than the 1060 yet consumes 20W less.

And then there's the problem that the 1060 is too underpowered to run lots of current games at 1440p. So you could choose the 1660 Super, which takes all of 5W more yet is 35% faster at 1440p, getting you 60fps on average.

The 1060 is old news and not particularly efficient 4 years later. It's not bad, just old and getting slow.
 
And on average the 3060 gets just under 2x the FPS of the 1060 at only 1.5x the power usage. I don't know where you're getting your 3x number from, but go back to the graph you posted previously and look again. It's hard to take someone seriously who exaggerates their claims by 2x.
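
To put numbers on why roughly 2x the FPS at 1.5x the power is an efficiency win, here's the same arithmetic as a tiny Python script (round illustrative figures again, not measurements):

# Efficiency-ratio check with round illustrative numbers, not measured data.
fps_1060, watts_1060 = 60, 120
fps_3060, watts_3060 = 115, 175   # roughly 2x the fps at ~1.5x the power

fps_ratio = fps_3060 / fps_1060        # ~1.92x
power_ratio = watts_3060 / watts_1060  # ~1.46x
efficiency_gain = fps_ratio / power_ratio

print(f"fps ratio: {fps_ratio:.2f}x, power ratio: {power_ratio:.2f}x")
print(f"3060 perf/watt advantage: {efficiency_gain:.2f}x")
# About 1.3x better perf/watt for the 3060. A "3x more power for the
# same work" claim doesn't survive this arithmetic.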
 
I'm seeing up to 160W GPU power in RivaTuner in Assassin's Creed (on the 3060). My old Pascal does the same at 50-60W! So my claims are based on reality, not fairy-dust BS. And look for 3060 reviews, not the 6700; the numbers are there.
 
Those results are relevant for one game under one set of gaming conditions. Which is fine if that's all you do. However, TS, TPU, and every other testing site uses a suite of games, as not everyone plays under those conditions. For instance, if I were using a 1060 to play Horizon Zero Dawn at 1440p (which I'm doing), I'd be getting less than 40fps and using 115-120W, which is not good enough. Instead I'm using a 1080 to get 60-65fps under the same conditions, and I undervolt it to 0.9V so it uses 135W max.

Huge win for the 1080 there, but again, that's a single condition.
 
IGP, PCI, AGP, PCIe: future VR needs more bandwidth.


If S3's cards could have arrived on the market in better shape, we would not have needed ATI, Nvidia, and Matrox GPUs so much back then. Voodoo was great but too expensive.
3dfx cards like the Voodoo 1, 2, 3, 4, and 5 were great. The nGlide tool can now be used to emulate 3dfx Glide gaming.
Playing on that GPU was like playing in D3D, and some games looked perfect. The PS1 GPU was low-end, but getting 3dfx on a PC would run games like Croc 1 and 2 better.
Pixelation would go away with better rendering. Some AGP 8x cards could only run low-res games; PCIe could run double that. We missed out on AGP 16x, as it was never released.
If AGP 16x/32x/64x/128x had still been out and competing with PCIe, it might have had a leading role.

When Nvidia bought up 3dfx, they could then use D3D, OpenGL, and PCIe to make gaming GPUs look better. A 4 MB S3 ViRGE 375/385 could run many 3dfx titles in low OpenGL/D3D modes. When 3dfx was taken away, the competition was left to Nvidia and later ATI; 3dfx cards were fast in their time. I was playing Carmageddon 1 and 2 on a Pentium 60 with 128 MB RAM and an SB 128 sound card. Trying that with a 4 MB card, it would simply crash, yet even on a Pentium 233 the game ran well. Shadow Man could run in all modes. Resident Evil had many GPU selection modes too: RE2 could switch between low and high and turn off some in-game zombies for less rendering time, and RE3 was more future-proofed.


RE4 was no longer like RE1, 2, and 3; that change made many games picture-perfect, no more low-res.
The nice thing is that even Doom (plus expansion packs) can now run as low as 320x200 (PSX-era resolution) and as high as 4K, or on dual 2K/4K screens, with 5K/6K/8K supported.
If Win 9x/SE could have been brought back in time, we could play many old games today. But most of those games are in 16-bit FAT mode and can't even run; FAT32 was taking over. Games made for FAT16 were not taken into the warmth; they were left out in the cold. Try getting Doom 95 running.
PowerVR GPUs were also out, so talk about those too.



Blood and Shadow Warrior have gotten remakes, while Blood 2 stands still:

https://www.gog.com/game/blood_2_the_chosen_expansion

I'm not promoting or selling these games, just showing what and where they are today: better rendering, faster CPUs, GPUs, and chipsets, PCIe 1-7, the 20xx series released.
 
I have no fondness for any video card, CPU, etc. I ever owned. Now dogs and cats, that's different.
 
This is clearly a biased fanboy post, and the OP doesn't have a long history with PC hardware, since they skipped over significant AMD cards and 3dfx, which basically ushered in high-end GPUs and whose technology Nvidia now uses in most of their cards.

I joined just to say this. I hate bad or biased info like this article.


I respect this site and Hardware Unboxed a lot, especially compared to the sell-out LTT, but this repost on their part is really distasteful.

I agree; I hate the bias pushed in here.

Nvidia needs to be called out on all the anti-consumer cr@p they keep doing to their customers and the industry, not given this @ss-kissing fluff article.
 
Then I will ask seriously: would you or any other staff member be willing to write an article describing all the anti-competitive and anti-consumer BS that Nvidia has done to the industry and even their own customers throughout the years?

First, no. We were not paid to write this article, or any other article about Nvidia, or AMD for that matter. For this Nvidia-related article, we have an equivalent one for AMD/ATI. These are articles highlighting the best GPUs each company has developed over the years, many of which we have reviewed, owned, and enjoyed as gamers and enthusiasts. That has nothing to do with either getting paid or your disgust with a given company or its practices.

About your question -- which has nothing to do with the article, nor with the insinuation about getting paid, which is borderline trolling -- you will have to be more specific.

We have covered the chip shortage. We have called them out for BS products when we've found that to be the case. And we've tried to shed some light on the reasons for price gouging. Is that it? We're also ready to give a good product a positive review. TechSpot has been around for over 20 years now; we have a long-standing track record.
 
Phhhff! I started with a Cirrus Logic 5465! Beat that, paupers!


But seriously, it was rubbish. With the exception of the rotating Direct3D cube in DxDiag.exe, any other Direct3D task would cause a system freeze. That cube was the most exciting 3D action I saw running on the PC. Got over an hour of play...
 
My first GeForce was the 256 SDR. Next: a 2 MX400, a 2 Pro, a 4 MX440, an FX 5900 XT, a 670, a 980, a Titan X (Maxwell), and two 1080s. The rest of the time I used a Monster 2 and Radeon 4870 and 6870 cards. And now a Radeon 6900 XT. That's all my graphics card history :)
 