Nvidia's GPU Classes Through the Years: What to Expect from the RTX 5080

No... your post just compared one high-end card to another for each generation... for an article of this "depth", I'd expect the average performance increase of each generation based on TechSpot's benchmarks over the years...

Not just 780 vs 980... but 760 vs 960, 3060 vs 4060, etc...

It's probably something ChatGPT could whip up in a minute...

That is not possible to do without benchmarking the same games on every GPU. The games tested when the 980 was compared to the 780 are different from the ones comparing the 3080 to the 4080. That gives you a good comparison between two or three successive generations, but nothing can be claimed between the 4080 and the 780.

There is an excellent example of this at TechPowerUp. When they tested the 3050 6GB, the 1060 6GB was one of the GPUs tested alongside it. In that test of 25 current games, the 3050 6GB is 25% faster than the 1060. However, in their GPU database, they list the 3050 6GB as only 2% faster, because the old data listed there is never updated. The direct comparison is clearly giving the right answer, and the GPU database, with its loads of old data, can easily be dismissed.
 
You can still extrapolate… and game averages can be used to give a pretty good idea - even if the exact same games aren’t used…

TechSpot's reviews tend to use a LOT of games, so the info shouldn't be impossible to get.
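For what it's worth, here is a minimal sketch of the kind of extrapolation being suggested: chain each generation's average uplift to estimate the cumulative gap between distant cards. The uplift numbers below are placeholders, not real benchmark data; substitute measured averages from the reviews.

```python
# Sketch: chaining per-generation average uplifts to estimate a
# cross-generation gap. The uplift percentages below are placeholders,
# NOT real benchmark data -- substitute TechSpot's measured averages.

from math import prod

# Hypothetical average uplift of each xx80 card over its predecessor,
# taken from same-suite reviews (e.g., 980 vs 780, 1080 vs 980, ...).
gen_uplifts = {
    "780 -> 980":   1.25,   # placeholder
    "980 -> 1080":  1.60,   # placeholder
    "1080 -> 2080": 1.30,   # placeholder
    "2080 -> 3080": 1.55,   # placeholder
    "3080 -> 4080": 1.50,   # placeholder
}

# Multiplying the ratios chains the comparisons: if each review used a
# comparable game suite, this approximates the cumulative gap.
cumulative = prod(gen_uplifts.values())
print(f"Estimated 4080 vs 780: {cumulative:.1f}x")

# The caveat raised above: each ratio was measured on a different game
# suite, so errors compound with every hop in the chain.
```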
 
Titan X Pascal and the 1080 Ti were the last truly impressive GPUs in terms of performance increase; closest to the 8800 GTX leap, in my opinion. My 2080 Ti/3080 Ti/4080 Super feel like far less impressive jumps.
 
Very interesting article, thank you.

Have the flagships improved at a consistent pace from series to series? My gut sense is that the 4090 moved the ball more than the others. If that's true, it may be a little unfair to say the lower-level cards were cut down more, when what really happened may be that the flagship got pushed up higher (in both power and price).

I bet pricing on the next series is going to feel too high across the board, but at least part of that will legitimately be inflation. It would be interesting to see the inflation-adjusted version of the historical pricing chart once we're able to put the new cards on it as well.
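The adjustment itself is simple arithmetic. A minimal sketch follows; the launch MSRPs are the familiar published prices, but the CPI multipliers are rough illustrative values, so pull real factors from a CPI calculator before relying on the output.

```python
# Sketch: converting historical flagship-class MSRPs into today's
# dollars. The CPI multipliers are rough placeholders, not exact.

launch_msrp = {
    "GTX 780 (2013)":  649,
    "GTX 980 (2014)":  549,
    "GTX 1080 (2016)": 599,
    "RTX 2080 (2018)": 699,
    "RTX 3080 (2020)": 699,
    "RTX 4080 (2022)": 1199,
}

# Approximate cumulative inflation to 2024 (illustrative only).
cpi_multiplier = {
    "GTX 780 (2013)":  1.35,
    "GTX 980 (2014)":  1.33,
    "GTX 1080 (2016)": 1.31,
    "RTX 2080 (2018)": 1.25,
    "RTX 3080 (2020)": 1.21,
    "RTX 4080 (2022)": 1.06,
}

for card, msrp in launch_msrp.items():
    adjusted = msrp * cpi_multiplier[card]
    print(f"{card}: ${msrp} launch -> ~${adjusted:.0f} in 2024 dollars")
```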
 
IMO, NVIDIA is basically forced into their terrible pricing scheme. At this point, if they were to actually engage in a competitive price war with AMD and Intel, NVIDIA would get to 97+ percent market share within a generation or two, if not completely forcing AMD and Intel out of the market. This is strongly evidenced by the fact that they have successfully demonstrated with the 40 series that they can legitimately compete in terms of sales and product placement despite shifting their product stack down a die size.

If they won the GPU market though, they would almost certainly face the wrath of antitrust lawsuits, regulators and the like and probably lose tens or even hundreds of billions against those anyways.

So NVIDIA instead wisely (unfortunately) chooses to continue increasing their margins. Their overpriced lineup is their way of inviting the illusion of competition to keep regulators off their backs.

This is an unintended consequence of antitrust laws, by the way, but it's clearly easier for most to stick their heads in the sand and continue demonizing large corporations…
Very well-thought-out post. TBH, that hadn't occurred to me, but yes, due to the regulations you mention, they could achieve what would be viewed as a monopoly of sorts, at the high end at least. It's an excellent point.

Their troubles would probably begin in Europe, due to a raft of regulations (some are good, others debatable).

I can't say much more without giving it proper thought. But your post is one of the best I've seen. Thank you.
 
Titan X Pascal and the 1080 Ti were the last truly impressive GPUs in terms of performance increase; closest to the 8800 GTX leap, in my opinion. My 2080 Ti/3080 Ti/4080 Super feel like far less impressive jumps.

Titan X Pascal was $1,200 almost 10 years ago, and RTX made GTX obsolete.

The 4090 is the biggest leap in many generations, and $1,600 on release is the same or less than the Titan X Pascal in today's dollars. That is what I paid, and the 5080 probably won't beat my 4090 for $1,200 here more than two years later. By the time the 5080 releases, it will be 2½ years since I got my 4090.

I bet I can sell my 4090 for $1,000 when the 5090 and 5080 hit anyway. That's $600 lost over 2½ years while having the best consumer gaming GPU on the planet. Nvidia products simply retain their value much better than AMD's.

Some paid $1,100 for their 7900 XTX and $950 for their 7900 XT; look at prices today. AMD constantly lowers prices to stay competitive, which ruins the resale value for actual users.
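The resale arithmetic in this post is easy to generalize. A minimal sketch using the poster's own figures ($1,600 at launch, an expected $1,000 resale after about 2.5 years):

```python
# Sketch: annualized-depreciation arithmetic from the post above.

def depreciation(purchase: float, resale: float, years: float) -> tuple[float, float]:
    """Return (total loss, loss per year of ownership)."""
    loss = purchase - resale
    return loss, loss / years

# The poster's numbers: $1,600 at launch, ~$1,000 resale after 2.5 years.
loss, per_year = depreciation(1600, 1000, 2.5)
print(f"Total loss: ${loss:.0f}, or ${per_year:.0f} per year")  # $600, $240/yr
```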
 
Does Nvidia even care about its gaming division now? 80% of its earnings came from data centers/AI chips/enterprise in 2024. It should just spin off its gaming division into a separate company so it can excel on its own.
 
They literally print billions a quarter off of gaming. If there is one thing Nvidia should do, it's take advice from TechSpot commenters...
 
Everyone complaining about the company charging $1,200 makes me wonder if anyone even lived through Cryptocovid. Nvidia and AMD BOTH released 80-series cards in the $600-700 range. They were GONE on day one, more like hour one. Want one? That'll be $1,000-1,500, and to a scalper, no less. Demand went through the roof, and both companies missed out on a ton of potential profits.

Also not sure how the 5080 will reach 4090 performance with specs closer to the 4080, but time will tell. If it does, it will once again fly off the shelves at $1,200, as people are currently paying $1,600+. If it's drawing 400 W with higher efficiency, it should deliver. If it matches the 4090, Jensen will, and should, aim higher: $1,400 is $200 off the 4090, the same thing they did with the 4080 Super. The 5090 will be $1,999.

Sad that the AMD 8-series will be delayed. I just want a decent upgrade for my 6800 XT, now valued at about $350 new. I'd have to go to the 7900 XTX for $900 just to get a decent performance bump. Not worth it. The 6800 XT is such a perfect sweet spot; I'm still getting 80s regularly in FF16 with FSR3/Quality at 4K. A $600 RX 8800 that performs between the 7900 XT and XTX (and outperforms the 4070 Supers) would be awesome, and needed to keep AMD relevant and Nvidia in check.
 
Tbf they weren't the same at all.
Getting into 2021, after the initial storm of interest (and panic) for both, the lines were drawn that extend to now, and maybe tomorrow too.
My 6800 XT, a best-of-stack Sapphire Nitro+ SE, cost £1,200 in May 2021 and was immediately available. Prices and stock had been like that for months by then and continued so.
Against that, any 3080 (with that wtf 10 GB cap) was £1,800-2,400 with stock TBA; you were lucky to get anything at all, never mind your choice of model, that side of Christmas. To wit: you pay more for a card you can't have or use yet, which by the time you get it (if not before) trades blows in raster with the other, and has an edge in RT that requires DLSS to back it up and isn't sustainable anyway over or past that period with such a low VRAM cap. I'd say the choice there was effectively made for me, even if it wasn't already a no-brainer and even if Nvidia hadn't played their own pivotal part in prices and stock reaching that level... which they went on to double down on into Ada, despite the issues they themselves cited as the reason for it. Btw, I've put prices here in my currency, but it was the same in USD and most others too.

Btw, on top of the above, the 3070 at that same time cost £1,400 and up, more than the 6800 XT. The 6900 XT was around £1,600 tops, the 3090 £3-4 grand. Yet Nvidia and partners say they had absolutely zero control over where their products ended up, other than in the hands of the long-term loyal fanbase that made them so wealthy and popular. They could've done better, starting with maybe not taking bulk consumer cards out of circulation to sell directly to crypto farms...

And then it happened again this last gen, long after crypto and scalping went away. 4080s traded blows with 7900 XTXs, plus a slight edge in RT... at up to twice the price. Same for the 4090: a 20%/30 fps overhead at 4K... but did it match the 7900 XTX's £8-10 per fps, before whatever might be a fair added cost for RT? Try 2.5-3x the price. Two grand could get you a premium 4080/4090, that's it. Or a premium 7900 XTX and the rest of the PC build you need to make it do stuff to much the same standard. Hell, there are plenty of 4080/4090 users that won't use RT because it's still too costly. On a flagship card. At the better part of three grand. Several gens after RT became the big deal for Nvidia.

I'd even say the worst part hasn't been the top-end price rise alone, but the price range within a single tier. For example, 4080s went from £1,350-2,100 last year, the difference being that between a reference card with a basic shroud and little or no extras... and a premium all-bells-and-whistles card. Only a couple of gens back, that range far more closely covered the pricing from xx60 to xx80 Ti, premiums included.

Inflation? Nothing of the sort... an excuse for a planned strategy, sure. Just as talking about MSRP is a waste of time; that kind of measure is dead now.

Nvidia are laughing here. They've all but declared the gaming market is no great shakes to them beyond the money they can make from entrenchment, hype, and outright lies told about their competition (at the forum level, which is a good part of AMD's loss and current strategy), yet people still pay whatever's asked... and Nvidia keep collecting with a minimum of effort, just rolling on what's already laid down. How much do they need to offend before people have second thoughts?
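The £-per-fps figure used above is just price divided by average frame rate. A minimal sketch, with illustrative prices and frame rates rather than measured figures:

```python
# Sketch: the cost-per-frame value metric. Prices and average fps are
# illustrative placeholders, not measured benchmark figures.

def cost_per_frame(price: float, avg_fps: float) -> float:
    """Price paid per average frame per second at a given resolution."""
    return price / avg_fps

# Ballpark from the post: a 7900 XTX near £950 at ~100 fps avg at 4K
# lands in the quoted £8-10/fps range; a card at ~2.5x the price would
# need ~2.5x the frame rate to match that value.
for name, price, fps in [("7900 XTX", 950, 100), ("4090", 2400, 130)]:
    print(f"{name}: £{cost_per_frame(price, fps):.1f} per fps")
```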
 
If Nvidia can rip off the consumer, they will.

Of course, it's enormously helpful to Nvidia's cause that PC gamers are such a bunch of FOMO suckers when it comes to GPUs.
Yeah, they will. Only this year did I replace my 2080 Ti with a used 3080 Ti off eBay. I refuse to pay the ludicrous prices for GPUs now, even though I can afford them. Nvidia doesn't care about its customers anymore, just what it can suck out of them.
 
Yeah, this just reminds me how horrible the 4080 is, and how cut-down and expensive. I mean, lol, you can still find many 4080 Supers for 1,200-1,300 euros. My god. It seems the 5080 will be even worse. We shall see, of course. I hope they learned something from the unsellable $1,300 RTX 4080L edition (L for lame). Nobody wanted those; they were rotting in the stores for years.

However, if they pull the "it's a new generation, 'tis moar powerful, blah blah, it has to cost more" line... it's over, lol. Greedia won't sell any 5080s. It will go down in history as the worst card ever. It will be massively cut down, if the rumors are true. All the cards under the 5080 won't deliver real next-gen fps either; it will be similar performance to the 40 series. If you want next gen, the 5080/5090 are the cards. Which is a shame too. People waiting for the 50 series won't be happy. A slightly faster 5060/70? Who cares, you could get a 4070/Ti Super today, or two years ago! You might not even save any money either. It's Ngreedia, after all. The new 5060 might be overpriced, just the same way the 4070 was when it came out. Here, it was selling for 800, then it got down to 740.

It's fine, I can wait for the RTX 6080 if the 5080 is a horrible card from a horribly greedy company. My card is fine today, and I've got a huge backlog. They can keep their 5070-in-disguise cards.

P.S. Lord knows the 5090 won't be cheap in Europe. That's an automatic skip. As far as gaming and fun go, no card should be more than a thousand, IMHO.
 
They literally print billions a quarter off of gaming. If there is one thing Nvidia should do, it's take advice from TechSpot commenters...

From Nvidia's last quarterly report in August directly from their site:

Data Center
Second-quarter revenue was a record $26.3 billion, up 16% from the previous quarter and up 154% from a year ago.

Gaming and AI PC
Second-quarter Gaming revenue was $2.9 billion, up 9% from the previous quarter and up 16% from a year ago.

Professional Visualization
Second-quarter revenue was $454 million, up 6% from the previous quarter and up 20% from a year ago.

Automotive and Robotics
Second-quarter Automotive revenue was $346 million, up 5% from the previous quarter and up 37% from a year ago.

Do the math: gaming was merely 9.7% of their revenue last quarter, and 9% the quarter before that. Over 90% of Nvidia's revenue is no longer from gaming. It's peanuts to them.
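The share arithmetic is easy to reproduce from the quarterly figures quoted above (in billions of USD; any smaller segments Nvidia reports outside these four are omitted here):

```python
# Sketch: the segment-share arithmetic behind the "9.7%" figure, using
# the quarterly numbers quoted above (in billions of USD).

segments = {
    "Data Center":               26.3,
    "Gaming and AI PC":           2.9,
    "Professional Visualization": 0.454,
    "Automotive and Robotics":    0.346,
}

total = sum(segments.values())
for name, revenue in segments.items():
    print(f"{name}: {revenue / total:.1%} of ${total:.1f}B")
# Gaming works out to roughly 9.7% of the quoted segments.
```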
 
There comes a point where percentages, while important, can still mislead...
$3 billion a quarter, or $12 billion a year, is still nothing to sneeze at!!

To abandon the gaming market and say goodbye to $12 billion a year would be insanity... yes, they make more elsewhere, but diversification is still important. If the AI bubble pops, Nvidia will be happy to still have that gaming revenue...
 
Dude, Nvidia is not leaving the gaming market. They are dominating the gaming market with absolute ease, without spending much R&D money to do so.

You are talking about billions of dollars here; that is a crazy amount of money for any company. Just because gaming is "only" 10% of Nvidia does not mean they don't want it. No successful company says no to free money.

Nvidia's GPU focus is midrange to high end.
AMD's GPU focus is low end to midrange.
Intel's is low end, but Xe2 looks like it can compete in the midrange as well.

There are way higher margins at the midrange and high end.

AMD always competes on price and lowers prices all the time, cutting their margins. This is the reason AMD left the high end as well: it's a waste of R&D funds when the cards aren't selling anyway.
 
Hmm, perhaps, but it would be foolish to pay $1,200 for a 16 GB GPU at this point. 16 GB isn't enough for 4K RT gaming in quite a few games.

I'll wait for the 24 GB version (if there is one).
Which games can't run 4K/UHD with RT on a 4080/4080S due to memory size?

Besides, most people will use DLSS for 4K RT, meaning you won't be rendering at 4K internally, and that lowers VRAM usage too.

The only GPU capable of 4K native with RT is the 4090, mostly due to a lot more GPU power, and you will still see drops below 60 fps in the demanding titles, meaning most will enable DLSS anyway to get 100+ fps, or even smack Frame Gen on top to hit 150-200 fps instead.

Path Tracing, which is RT on drugs, won't run at 4K native on any card at more than 20-30 fps avg. DLSS is needed here, and the 5090 won't change this.

The 7900 XTX does like 2-4 fps avg. with Path Tracing, for comparison.
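On the "DLSS lowers VRAM usage" point: each DLSS quality mode renders internally at a fraction of the output resolution. A minimal sketch using the commonly published per-axis scale factors (Quality ≈ 0.667, Balanced ≈ 0.58, Performance = 0.5):

```python
# Sketch: why DLSS at "4K" reduces VRAM pressure -- the internal render
# resolution is a fraction of the output resolution.

modes = {
    "Quality":     2 / 3,   # 3840x2160 -> 2560x1440 internal
    "Balanced":    0.58,
    "Performance": 0.50,    # 3840x2160 -> 1920x1080 internal
}

out_w, out_h = 3840, 2160
for mode, scale in modes.items():
    w, h = int(out_w * scale), int(out_h * scale)
    pixels = w * h / (out_w * out_h)
    print(f"{mode}: renders {w}x{h} ({pixels:.0%} of the 4K pixel count)")
```

DLSS Quality at 4K, for instance, renders fewer than half the pixels of native 4K, which is why frame buffers and some render targets shrink accordingly.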
 
The 5080 will be a garbage-heap 12-16 GB GPU focused entirely on using upscaling to obtain performance. In non-RTX modes it will be 5-10 FPS better than a 4080 and cost $1,200 at minimum. The 5060 will be the bread and butter, using a much higher-speed 128-bit memory bus and 8-12 GB of RAM, and will force the user to run RTX or not be any better than a low-end GDDR6 standard 4080.

Welcome to the future... thanks, AMD, for abandoning everyone; thanks, Intel, for going bankrupt and selling off parts of yourself like a streetwalker... now we're only left with NV... awful future.
 
The 4080's biggest fault was its price; it's about 50% faster than the 3080, which is a phenomenal upgrade considering how big of an upgrade the 3080 already was over the 2080. Jumping from $699 to $1,199, though, was ridiculous! It priced longtime 80-series buyers, like myself, out of the 80 series. My biggest hope for the 5080 is that Nvidia gets back to pricing that allows me to continue to be an 80-series consumer. The 4080 Super reduced it back to $999, but that is still too expensive.
"My biggest hope for the 5080 is that Nvidia gets back to pricing that allows me to continue to be an 80 series consumer"
I don't understand that phrase, from a marketing point-of-view.
 
"My biggest hope for the 5080 is that Nvidia gets back to pricing that allows me to continue to be an 80 series consumer"
I don't understand that phrase, from a marketing point-of-view.

Some people get great joy from looking at the number on their graphics card box.
 
What do I expect... Uhm... I expect to keep my 4090 and buy a 6080/6090 on TSMC 3 nm or better, or RDNA5 if AMD re-enters the high-end GPU market by then.

5000 series in a nutshell:

Won't face competition at the 5070 level and higher. The top RDNA 4 SKU will compete with the 5060 series at most. RDNA5 is probably not going to happen before 2026, close to the RTX 6000 series.

Uses the same 5 nm-class process as the 4000 series, meaning they can only increase die size and power usage to gain performance.

GDDR7 will add little performance below 4K/UHD, and might only be a 10% difference from GDDR6X anyway, while the price goes up 50%.

The 5090 will be a beast for sure, in all areas, power consumption included.
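For context on the GDDR7 point, peak memory bandwidth is just bus width times per-pin data rate. A minimal sketch; the 22.4 Gbps figure matches the 4080's GDDR6X, while the 28 Gbps GDDR7 rate is an assumption based on expected early parts:

```python
# Sketch: memory bandwidth math behind the GDDR7 vs GDDR6X comparison.
# Data rates: 22.4 Gbps matches 4080-class GDDR6X; 28 Gbps is an
# assumed early-GDDR7 rate, not a confirmed 5080 spec.

def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s for a given bus and per-pin rate."""
    return bus_width_bits * data_rate_gbps / 8

g6x = bandwidth_gb_s(256, 22.4)   # ~716.8 GB/s
g7  = bandwidth_gb_s(256, 28.0)   # ~896.0 GB/s
print(f"GDDR6X: {g6x:.0f} GB/s, GDDR7: {g7:.0f} GB/s "
      f"(+{g7 / g6x - 1:.0%} raw bandwidth)")
```

Note that a ~25% raw bandwidth gain rarely translates 1:1 into frame rate, especially below 4K where games are less bandwidth-bound, which is consistent with the ~10% figure suggested above.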
 
Now, I am not the biggest Nvidia fan. It is not because of their cards; they make great cards, but I hate Nvidia's drivers compared to AMD's. There is one Nvidia card that I kept over the years: a rare Leadtek 5900 XT with a custom Japanese BIOS. My friend's father worked there and got me one. Compared to other 5900-series Nvidia GPUs, that thing was godlike in performance. Sadly the card died, but I keep her as a trophy. I am looking to see if I can restore her, but that has to wait until I have time for it.
 