Nvidia's GPU Classes Through the Years: What to Expect from the RTX 5080

All the 5080 needs to do is deliver 4090 performance, or close to it, for less.

AMD is not competing in this segment.

Expect $1,200 at release.
Eventually the price might drop to $1,000.

AMD's best SKU in the Radeon 8000 series will probably barely compete with the RTX 5060.

Rumours claim the top RDNA4 SKU will land around 7900 GRE/4070 performance with improved RT (compared to RDNA3), so let's hope the price comes in below $500.

Rumors are nice for some people to parrot and feel better about themselves, but I'll wait for actual reviews. Note that the 4090 does not outperform the 4080 in games by as much as its specs alone would suggest, which is why reviews are everything.

There is a tiny chance of the 5080 being $999, considering Nvidia is using a TSMC 5nm-class node again. However, I don't see why Nvidia would settle, as it will sell easily at $1,199.

4090 performance for 500 dollars less? Many people will take that deal.

So much wishful thinking. A 10K-core GPU on the same node will not compete well with a 16K-core GPU unless there's some magically huge IPC improvement, which is highly unlikely to happen.
 
For you.

Most people want the most performance for money spent.
Then people should consider TCO and include the power bill; most of the savings will vanish over time anyway.

Everyone in the industry knows performance per watt is the single best metric for comparing GPUs.

If you save 100 bucks but spend 200 bucks more on power over the next few years, the joke is on you.
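For anyone who wants to sanity-check that power-bill math, here is a minimal sketch of the arithmetic; the wattage delta, gaming hours, and electricity price are placeholder assumptions, not figures from any review.

```python
# Rough TCO sketch for the power-bill argument above.
# The inputs below are hypothetical placeholders -- plug in your own numbers.

def extra_power_cost(delta_watts, hours_per_day, years, price_per_kwh):
    """Extra electricity cost of a GPU that draws `delta_watts` more."""
    kwh = delta_watts / 1000 * hours_per_day * 365 * years
    return kwh * price_per_kwh

# Example: a card that draws 100 W more, gamed 3 h/day for 4 years at $0.35/kWh.
extra = extra_power_cost(delta_watts=100, hours_per_day=3, years=4, price_per_kwh=0.35)
print(f"Extra power cost: ${extra:.0f}")  # ~$153 in this scenario
```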
 
The 5080 will probably get a 3 GHz boost clock out of the box and use GDDR7 on top, with custom cards hitting 3.2-3.4 GHz.

The 5000 series uses TSMC 4NP rather than the 4N used for the 4000 series. That means roughly 10% better efficiency, which should allow higher clocks at the same power draw, yet power draw is also increased, plus architectural improvements on top.

We won't know where performance lands before reviews hit. Obviously the 5080 won't be much slower than the 4090.

The 5080 will get very close to the 4090 for sure, especially at 1440p. You really want the 5090 if you are into 4K or higher; it will be a 4K monster, no doubt, delivering a 40-50% uplift over the 4090.

However, the RTX 6000 series on TSMC N3/N2 will be the right upgrade path for 4090 owners, unless they splash out on the 5090 in three months' time.
 

Industry and consumer use are very different things. If consumer GPUs were designed and marketed for efficiency, they wouldn't be run at over 1.0 V, well out of their efficiency zone. But all of them are, aside from those 70 W PCIe slot-powered GPUs that have no choice.

Gamers want max FPS and very few care or are even aware of power usage/efficiency.
 
Good efficiency means you can drive chips harder, which means more performance.

Having superior performance per watt is ALWAYS BETTER!

If the 4000 series did not have good efficiency, clock speeds would be massively lower, and performance would be lower as well.

The 4000 series clocks 1 GHz higher than the 3000 series due to BETTER EFFICIENCY!
 
If you are worried about $200 in electricity over 5 years, you shouldn't be buying $1000+ GPUs, period.
 

I'm sure all-caps cheerleading is fun, but what is "drive chips harder"? How are higher clock speeds driving a chip "harder"? They run at the same voltages as the previous five generations, so they are driven no harder as far as I can tell. The increased clock speeds come mostly from newer, smaller nodes, which Nvidia has no input into other than purchasing capacity at whichever fab makes the node it wants to pay for.

Driving a chip "harder" matches more closely with increasing the voltage to increase clock speed, which destroys efficiency and is what you were arguing against just a couple of posts above. Which is it to be?
 
Are you blind? 4000 is listed all the time in the top 50 GPU list. Fun fact: 7900XTX did not even make top 50.
WTF are you trying to argue here? That simply because a GPU is in the top 50 list, it automatically sold well? There are many AMD models in the top 50 too; does that mean AMD is selling well in your world?

Do you see those numbers to the right of the GPU models? Do you know what those numbers mean? Because it seems you don't. They are the percentages of people using those GPUs. For the 4000 series, those numbers are as small as the ones from the 2000 series (another disappointing series) and the 3000 series (availability tarnished by the pandemic/crypto), and a fraction of what the 1000 series (which was actually successful) achieved.

I don't know how to make this simpler for you.

The only reason 3000 is higher is because it's a generation older and sold for cheaper, plus the second-hand market.
I'm not comparing the 4000 series today to the 3000 series today. I'm comparing the 4000 series 2 years after its launch (today) to the 3000 series 2 years after its launch (late 2022), and the 4000 series still loses.

4000 is better than 3000 series by far.
It literally isn't. That is the whole point.

The 4060 launched at the same price as the 3060, is only 15% faster, and has less VRAM.

The 4060 Ti was the same price as the 3060 Ti and offered no performance improvement whatsoever, and in a few cases even a performance reduction due to the smaller memory bus.

The 4070 was 20% faster than the 3070 but also 20% more expensive ($500 vs $600), and matched the 3080 in performance while costing just $100 less, which is better than nothing but still very disappointing after 2 years.

For perspective, the 3070 matches the 2080 Ti for less than half the price ($500 vs $1200). The GTX 1060 matched the GTX 980 for half the price ($250 vs $500) while also having 50% more VRAM. Those are the kinds of gen-on-gen upgrades that people expect, not the wet fart that was the 4000 series.
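Putting those value comparisons on one footing, here is a quick sketch of the perf-per-dollar arithmetic using only the prices and relative-performance claims stated above (equal performance for the 1060/980 and 3070/2080 Ti pairs, +20% for the 4070 over the 3070):

```python
# Perf-per-dollar comparison using the figures claimed above.
# prices: launch MSRP in USD; rel_perf: new card's performance relative to the old card.
pairs = [
    ("GTX 1060 vs GTX 980",  {"old_price": 500,  "new_price": 250, "rel_perf": 1.00}),
    ("RTX 3070 vs 2080 Ti",  {"old_price": 1200, "new_price": 500, "rel_perf": 1.00}),
    ("RTX 4070 vs RTX 3070", {"old_price": 500,  "new_price": 600, "rel_perf": 1.20}),
]

for name, p in pairs:
    # How much more performance per dollar the newer card offers over the older one.
    value_gain = (p["rel_perf"] / p["new_price"]) / (1.0 / p["old_price"]) - 1.0
    print(f"{name}: {value_gain:+.0%} perf per dollar")
# Prints roughly +100%, +140%, and +0% respectively.
```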

To me, you sound like a silly 3000 series owner not willing to accept that 4000 series is better.
No, I'm a 2000 series owner (2060 Super) who is one of the people that Nvidia is incapable of enticing to upgrade, because they have only released garbage in the $300~$400 segment for the past 4 years. The 3060 wasn't enough of an upgrade to warrant it, and the 4060 and 4060 Ti are both awful 8 GB products that I'm not touching with a 10-foot pole.
 

Nah, there are not really that many AMD cards in the top 50; maybe you should look at the total percentage. AMD's best-selling cards are old and cheap.

The 3000 series is built on a trash node, is cheap, and is soon to be 5 years old; obviously market share will be bigger when most PC gamers are cheapskates and buy second hand or wait for sales. Nothing new here.

AMD's most popular GPUs are 6-8 years old too. Logic, do you understand it?

AMD's two most widely used GPUs on Steam are iGPUs, not dGPUs.

Maybe you should start making some money so you are not locked into the $300-400 bracket.

The 4060 runs circles around the 2060 and you say the 4060 is slow. Oh, the irony.

The 2000 series, and the 2060 especially, is so slow and dated that it is not listed in most reviews anymore, so how do you know?
 
The 3000 series is built on a trash node, is cheap, and is soon to be 5 years old; obviously market share will be bigger when most PC gamers are cheapskates and buy second hand or wait for sales. Nothing new here.
Buddy, this is incredible; I'm baffled. I have explained to you multiple times already that I'm not comparing the 4000 series to the 3000 series today, but comparing both two years after their respective launches (the 4000 series today vs the 3000 series in October 2022), and you have failed to comprehend this every single time.

Maybe you should start making some money so you are not locked into the $300-400 bracket.
LMAO
So the solution to Nvidia releasing awful value products is to just be a good consoomer and give them even more money? I'd rather save my money for other more fulfilling aspects of my life.

The 4060 runs circles around the 2060 and you say the 4060 is slow. Oh, the irony.
No, it literally doesn't.

Even if it did, I'm not buying an 8 GB GPU in 2024.

The 2000 series, and the 2060 especially, is so slow and dated that it is not listed in most reviews anymore, so how do you know?
Google "techspot 4060 review" and your mind will be blown. Spoiler: in the 15 games tested, the 4060 averaged 91 FPS while the 2060 Super averaged 72 FPS, making the 4060 only 26% faster. I'm not paying $300 in 2024 for a 8 GB card that is only 26% faster than the 6 year old one that I already own, that is a pathetic improvement.
 
Yeah, it does. The 4060 literally stomps the 2060 in most new games with ease.

So in 4 years you still make the same money and can't afford anything other than low-tier GPUs. Impressive.

You should look into AMD cards or step up a tier to the 70 series.

The 4060 performs pretty much like an RTX 2080.

The 2060 is far behind, at RTX 3050 level.
 

🤦

It's like talking to a door.
 
Anyone postulating high prices for likes is probably a paid troll at this point!

Just because the 4090 sold well at $1,599 (and less so above $1,999) doesn't mean the 5090 will have the same success, especially with the rumored power requirements and a smaller improvement on a similar node.
The performance delta would have to be similar to generate similar hype, imo.
Wait one or two quarters post-launch, after the limited whale pool gets satisfied, and watch supply outpace demand.
 
And you think most PC gamers care about 4K/UHD with RT enabled, why? 99% of PC gamers use 1440p or lower, and barely anyone enables RT. Those who do enable upscaling as well in most cases.

Most people praising high amounts of VRAM are AMD users, and AMD users can't use RT to begin with.

See the irony here?

NOT A SINGLE GAME today needs more than 12GB at 4K/UHD maxed out unless you enable RAY TRACING or PATH TRACING on top, AT NATIVE 4K that is. The only GPU capable of that is the 4090. Most RASTER-ONLY games barely use 8GB at 4K ULTRA.

Link me a SINGLE RASTER-ONLY GAME that struggles on a 12GB GPU at native 4K. I am waiting. By struggle I mean LOW MINIMUM FPS, not high VRAM USAGE, because many game engines today simply allocate most of the available VRAM; that does not mean it is needed. This is why the 4090 can show 16-20GB in use in some games that run fine on a 12GB 4070 anyway. SIMPLE ALLOCATION.
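As a side note on measuring this: a minimal sketch (assuming the pynvml package and an Nvidia driver are installed) shows that what monitoring tools report is memory allocated on the device, which is exactly the allocation-vs-need distinction made above; it cannot tell you how much a game strictly needs.

```python
# Minimal sketch: NVML reports VRAM *allocated* on the device across all
# processes, not the amount a game strictly needs to run without stutter.
from pynvml import nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo

nvmlInit()
try:
    handle = nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
    mem = nvmlDeviceGetMemoryInfo(handle)    # totals in bytes
    print(f"allocated: {mem.used / 2**30:.1f} GiB of {mem.total / 2**30:.1f} GiB")
finally:
    nvmlShutdown()
```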

99% of PC gamers use 1440p or lower.
99% of PC gamers are not enabling RT at native res, if at all.

That is reality.
Why all the yelling? Relax.
If you play at 1440p you don't need a 5080.
 

If you render games at higher res using DLDSR, yes you do.

If you pair it with a high-end CPU like the upcoming 9800X3D for maximum fps on a 240+ Hz monitor, yes you do.

1440p at 480 Hz exists, you know. Not everyone is satisfied with a locked 60 fps!
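To make the high-refresh-rate argument concrete, here is the frame-time budget at the refresh rates mentioned; it is plain arithmetic, not benchmark data.

```python
# Frame-time budget at the refresh rates discussed above.
for hz in (60, 144, 240, 360, 480):
    budget_ms = 1000 / hz
    print(f"{hz:3d} Hz -> {budget_ms:.2f} ms per frame")
# 480 Hz leaves only ~2.08 ms per frame, versus ~16.67 ms at 60 Hz.
```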
 
Right? Brick wall is what I would say...


No, you don't. A 4070 is more than enough for 1440p, maybe a 4070 Super. A 5080 would be overkill, assuming it performs at the levels it is expected to.

Once again you show that you know absolutely nothing about high-end hardware and have low demands.

Come back when you have seen 1440p running at 360-480 Hz and tell me a 4070 is more than enough.
 
If I had a dime for every time a brand new account showed up here making a string of nonsensical comments in the last month or so, I'd have two dimes. Which isn't a lot, but it's funny that it happened twice.

There's this guy, and that other one who goes into every article that mentions AMD or Intel to talk about how great Intel is and how much better than AMD they are.
 
Check my post, it is all there...
No... your post just compared one high-end card to another for each generation... for an article of this "depth", I'd expect the average performance increase of each generation based on TechSpot's benchmarks over the years...

Not just 780 vs 980... but 760 vs 960, 3060 vs 4060, etc...

It's probably something ChatGPT could whip up in a minute...
 