Top 10 Most Significant Nvidia GPUs of All Time

Fond memories of playing hours & hours of Call Of Duty 1 & 2 multiplayer with a 6800GT.

M1 Garand, Lee Enfield, Mosin Nagant & Kar 98 'Rifle Only' servers were amazing, creeping around the maps looking for those skilled single shot kills.
 
All seems about right; my only issue is that the 6800 may have had SM3 support, but it grossly lacked the performance to use it by the time SM3-equipped games hit the market en masse. Those who bought an X800 or X850 ended up having more performance to run the newer games, albeit limited to SM2. The 6800 fell flat fast and was quickly replaced by the 7800.
 
Fond memories of playing hours & hours of Call Of Duty 1 & 2 multiplayer with a 6800GT.

M1 Garand, Lee Enfield, Mosin Nagant & Kar 98 'Rifle Only' servers were amazing, creeping around the maps looking for those skilled single shot kills.
I still remember the sound of the MP44 on the United Offensive MP servers, or the Springfield shots on the outskirts of Foy. Memorable times... played on a 7300GT though...
 
The worst graphics card I ever had was an Nvidia: it was an FX5200 and it was a mess! Before that I remember my dad upgrading the TNT with a Voodoo. I never had much love for Nvidia growing up.

Things have changed, though, because now they absolutely smash Radeon.
 
The MX 400 was my first Geforce.

Upgraded from a Voodoo 3 3000.

The MX400 had more video memory, and I noticed that details in my games, such as gauges in flight simulators (Jane's USAF), looked better and more animated.

From then on, I've preferred to get as much VRAM as possible.

The most important Nvidia GPU is always the next one.

Right now my 3090 FTW3 and the Kingpin 3090 are at the absolute top of the pile.

What Nvidia really needs is a 3080 Ti for $1099 that isn't scalpable, i.e., one sold directly to consumers.
 
The worst graphics card I ever had was an Nvidia: it was an FX5200 and it was a mess! Before that I remember my dad upgrading the TNT with a Voodoo. I never had much love for Nvidia growing up.

Things have changed, though, because now they absolutely smash Radeon.
The card was an absolute piece!
My first card was a cheap MSI FX 5200 because, when I first got into computers, I thought gaming with a kb/m was stupid. Then I tried playing a game at low detail.

20 GPUs later.....
 
"It seemed future proof from the get-go, with 4 MB of SGRAM and AGP 2X support."
- That sentence has me in stitches! :laughing:

I've had four desktop nVidia cards and one mobile nVidia GPU. The desktop cards were the TNT2, the FX-5400, the 6200 and the 8500GT, back when games were still mostly 2D side-scrollers or fake/rudimentary 3D like Wolfenstein and Doom.

My newest craptop (an ASUS) has a mobile GTX 1050 along with the ATi Vega GPU that's part of the R5-3500U APU. I only have the GTX 1050 because it was essentially a freebie. Other craptops of the same price and otherwise identical configurations didn't have it so I said "What the hell" and got it. I don't use it for the display because the Vega IGP uses less power and I don't game on craptops. I do find it useful for its ability to transcode video with CUDA (not great but pretty quick).
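
For anyone curious about that CUDA transcoding bit, here's a minimal sketch of driving ffmpeg's NVENC encoder from Python. It assumes an ffmpeg build with NVENC/CUDA support on the PATH; the file names, bitrate, and helper name are just placeholders, not anything from the article.

```python
# Minimal sketch: GPU-accelerated transcode via ffmpeg's NVENC encoder.
# Assumes ffmpeg was built with NVENC/CUDA support and is on the PATH.
import subprocess

def transcode_nvenc(src: str, dst: str, bitrate: str = "6M") -> None:
    """Re-encode src to H.264 on the GPU (h264_nvenc), copying audio untouched."""
    cmd = [
        "ffmpeg",
        "-hwaccel", "cuda",    # decode on the GPU where possible
        "-i", src,
        "-c:v", "h264_nvenc",  # NVENC H.264 encoder
        "-b:v", bitrate,       # target video bitrate
        "-c:a", "copy",        # pass the audio stream through unchanged
        dst,
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    transcode_nvenc("input.mp4", "output.mp4")
```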
 
This is clearly a biased fanboy post, and the OP doesn't have a long history of PC hardware, since they skipped over significant AMD cards and 3DFX, which basically ushered in high-end GPUs and whose technology Nvidia now uses in most of their cards.

I joined just to say this. I hate bad or biased info like this article.
 
This is clearly a biased fanboy post, and the OP doesn't have a long history of PC hardware, since they skipped over significant AMD cards and 3DFX, which basically ushered in high-end GPUs and whose technology Nvidia now uses in most of their cards.

I joined just to say this. I hate bad or biased info like this article.
Ummm, you're clearly blind as a bat. Last week there was an article that covered the most significant ATi cards and it's still on the front page just down and to the right of this article. Here's the link to it:
https://www.techspot.com/article/2172-top-10-amd-graphics/
And you accuse Sami of being a fanboy? You're the clueless fanboy here and I say that as someone who hates nVidia.
 
GeForce 2 GTS, 4 MX400, 7800 GT, 7800 GT SLI, 8800 GT, 8800 GT SLI, GTX 480, GTX 970, GTX 1080, GTX 1080 Ti, and an RTX 3070 (whenever my order is shipped).

The first time I ever had a graphics card fail on me was the GTX 480. The only other time was with another GTX 480 😅.

All other cards I've had lived long enough to be sold off in time for their replacement upgrades (3dfx, ATI, AMD, and Nvidia cards alike).
 
GeForce 2 GTS, 4 MX400, 7800 GT, 7800 GT SLI, 8800 GT, 8800 GT SLI, GTX 480, GTX 970, GTX 1080, GTX 1080 Ti, and an RTX 3070 (whenever my order is shipped).

The first time I ever had a graphics card fail on me was the GTX 480. The only other time was with another GTX 480 😅.

All other cards I've had lived long enough to be sold off in time for their replacement upgrades (3dfx, ATI, AMD, and Nvidia cards alike).
Kinda funny that Fermi cards ran so hot when Fermi the scientist was the inventor of the nuclear reactor. :laughing:
 

The worst graphics card I ever had was an Nvidia: it was an FX5200 and it was a mess! Before that I remember my dad upgrading the TNT with a Voodoo. I never had much love for Nvidia growing up.

Things have changed, though, because now they absolutely smash Radeon.

I had an FX5200 once; it didn't last long (a couple of hours) because my PC back then was built with very cheap parts and it killed the card. But in the short time I got to test it, it didn't impress me one bit, not even coming from a GeForce 4 MX420.
 
Tegra is not significant at all. Nvidia had an excess of Tegra chips and sold them cheaply to Nintendo. There is no custom logic in the chip Nintendo uses; it's just an off-the-shelf part.

It tells us two things:

1. It was never meant to be used in the Switch
2. Nvidia had no interest in developing a chip for a console like the Switch

It's basically the same as saying AMD made a significant release with the Ryzen 5 4500U because the upcoming handheld console Aya Neo Founder Edition is going to use it. Again, nothing custom there, just an off-the-shelf chip.

Nothing compared to the PlayStation and Xbox chips, which are custom silicon.
 
I think my first Nvidia card was a GeForce 2 MX in a Mac Cube. Yeah, the see-through thing that was an overpriced flop. Of course I got it for free post-flop and upgraded it with some stuff. Never gamed on it though, as, well, it was a Mac with a 2 MX.

My first real Nvidia card was a 1060 6GB in summer 2017, just as the crypto boom started. I got it at list price (lucky) for an eGPU setup, and it's done service in 4 computers so far; it's now back in the eGPU connected to this NUC8i5 I'm typing on.

Also have a 1080, 1660 Super, and 1050Ti in those various computers.
 
The worst graphics card I ever had was an Nvidia: it was an FX5200 and it was a mess! Before that I remember my dad upgrading the TNT with a Voodoo. I never had much love for Nvidia growing up.

Things have changed, though, because now they absolutely smash Radeon.
Oh wow, I thought I was the only one with an FX5200; it definitely could have been better.
 
The MX 400 was my first Geforce.

Upgraded from a Voodoo 3 3000.

The MX400 had more video memory, and I noticed that details in my games, such as gauges in flight simulators (Jane's USAF), looked better and more animated.

From then on, I've preferred to get as much VRAM as possible.

The most important Nvidia GPU is always the next one.

Right now my 3090 FTW3 and the Kingpin 3090 are at the absolute top of the pile.

What Nvidia really needs is a 3080 Ti for $1099 that isn't scalpable, i.e., one sold directly to consumers.

I had an MX400 too and was pretty happy with it at the time.
 
As a kid I remember when the 8800 GTX and Crysis came out. That was the most exciting GPU era for me.
I remember budgeting for an 8800 and by the time I could afford it, the 9-series came out and I opted for the 9600 instead. Excellent price for the performance!
 
This is clearly a biased fanboy post, and the OP doesn't have a long history of PC hardware, since they skipped over significant AMD cards and 3DFX, which basically ushered in high-end GPUs and whose technology Nvidia now uses in most of their cards.

I joined just to say this. I hate bad or biased info like this article.

Not sure if this is an attempt at trolling or being funny, but I covered the AMD/ATI cards last week. I mention it at the end of this article, but if you want a link to that story, here you go:

https://www.techspot.com/article/2172-top-10-amd-graphics/

Thanks for reading and commenting though!
 
"It seemed future proof from the get-go, with 4 MB of SGRAM and AGP 2X support."
- That sentence has me in stitches! :laughing:

I've had four desktop nVidia cards and one mobile nVidia GPU. The desktop cards were the TNT2, the FX-5400, the 6200 and the 8500GT, back when games were still mostly 2D side-scrollers or fake/rudimentary 3D like Wolfenstein and Doom.

My newest craptop (an ASUS) has a mobile GTX 1050 along with the ATi Vega GPU that's part of the R5-3500U APU. I only have the GTX 1050 because it was essentially a freebie. Other craptops of the same price and otherwise identical configurations didn't have it so I said "What the hell" and got it. I don't use it for the display because the Vega IGP uses less power and I don't game on craptops. I do find it useful for its ability to transcode video with CUDA (not great but pretty quick).

It's funny to look back and see how worked up we got over things like four whole megabytes! Today we have GPUs with 24 GB!
 
Would have thought a brilliant bang-for-buck card like the 2 MX would be on here.
They pretty much skipped over cards like the GF3 Ti200 and the GF4 4200 64/128.
 
My first Nvidia card was a TNT2; I upgraded from an ATI 3D Rage Pro and was blown away by it. I've stuck with Nvidia for my main gaming cards ever since.

I've loved pretty much all my Nvidia cards, even the FX5900 (despite it being slower than the 9700 Pro); in fact, that's one of my favorites.
 
BTW, picking just the 10 best is an easy task with Nvidia. Now picking the 10 worst, heck even the 5 worst cards, that would be a challenge :):laughing:
 