GPU Pricing Update, April 2023: Is the Nvidia RTX 4070 Another Flop?

Curious to know how the 1080 stacks up against the 4070. If, for a moment, you set aside product names, which GPU delivers more performance for the money? Honestly, if you paid $600 in 2017, that would equate to roughly $745 in today's dollars. That would seem to make the 4070 the better buy, perhaps?
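To sanity-check that, it's just a CPI ratio; a rough sketch with approximate CPI-U values (not official BLS figures):

```python
# Rough check of the "$600 in 2017 is about $745 today" figure.
# CPI-U values below are approximate, not exact BLS numbers.
CPI_2017 = 245.0        # approx. 2017 annual average
CPI_APRIL_2023 = 303.0  # approx. April 2023

def in_2023_dollars(price_2017: float) -> float:
    """Scale a 2017 price by the CPI ratio to express it in 2023 dollars."""
    return price_2017 * CPI_APRIL_2023 / CPI_2017

print(round(in_2023_dollars(600)))  # ~742, in line with the ~$745 mentioned above
```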
I've actually commented on TS GPU comparison reviews a couple of times, asking them to do just this. Even if they only do it "virtually," by pulling numbers from older reviews of similar games, it would be interesting to see how the 1080 Ti stacks up against the 20xx, 30xx, and 40xx series. I suspect the 1080 Ti is more than a comfortable match for the 2080 and 3080 at 1440p without ray tracing, and that it's only finally surpassed consistently by the 4080. Even then, I can see it not being a royal trouncing of the 1080 Ti; ignoring ray tracing, it probably takes 4K settings to make a noticeable difference.
 
it would be interesting to see how the 1080 Ti stacks up against the 20xx, 30xx, and 40xx series. I suspect the 1080 Ti is more than a comfortable match for the 2080 and 3080 at 1440p without ray tracing, and that it's only finally surpassed consistently by the 4080.
It's sort of been done before: averaged across 33 games, the 1080 Ti wasn't vastly better than a 3060.

Of course, in some games, it was quite a lot faster, but the 3060 isn't anything like a 3080.
 
Thanks for the link. This is the line from that review that caught my eye:

Upon release, the RTX 3080 was almost 60% faster than the GTX 1080 Ti at 1440p and almost 80% faster at 4K ...
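For anyone converting those percentages the other way: if card B is X% faster, card A delivers 1 / (1 + X/100) of B's frame rate. A quick sketch using the review's launch-day numbers:

```python
def relative_fps(percent_faster: float) -> float:
    """If the newer card is `percent_faster` percent faster, the older card
    delivers this fraction of the newer card's frame rate."""
    return 1.0 / (1.0 + percent_faster / 100.0)

# Launch-day figures quoted above: RTX 3080 vs. GTX 1080 Ti
print(f"1440p: 1080 Ti at ~{relative_fps(60):.0%} of a 3080")  # ~62%
print(f"4K:    1080 Ti at ~{relative_fps(80):.0%} of a 3080")  # ~56%
```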
 
What Nvidia considers "not selling well" would be a dream for AMD.
This is why Nvidia sits at 85% dGPU marketshare and AMD barely competes.

The 6950 XT uses something like 400 watts on average in gaming; people who want an efficient card won't look at it for a second.

Huh?
Why do you think gamers care about "market share"? (You sound absurd.)


RDNA is the gaming community:
RDNA3 is in the new ROG Ally
RDNA3 is in the new AYANEO
RDNA2 is in the Steam Deck
RDNA1.5 is in the Xbox Series X/S
RDNA1.5 is in the PlayStation 5

RDNA3 discrete Navi 31 is unmatched in the gaming world, except by the AD102 die, which is 50% more costly ($1600) and only 17% ahead in performance. So a win for AMD anyway.

Meanwhile, AMD's RDNA2 (Navi 21) still beats 60% of the RTX 40 lineup for $599, and AMD has margin to go lower.



But we are all pulling for you and your five other friends who are afraid of performance because of the cost of energy.
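Taking those figures at face value (1.5x the price for 1.17x the performance), the value gap is simple arithmetic; a rough sketch with those ratios only, not benchmark data:

```python
# Ratios from the post above: AD102 at ~1.5x the price of Navi 31
# for ~1.17x the performance. Rough value comparison only.
price_ratio = 1.50
perf_ratio = 1.17

perf_per_dollar = perf_ratio / price_ratio
print(f"AD102 delivers {perf_per_dollar:.2f}x the perf per dollar of Navi 31")  # ~0.78x
```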
 
I gotta say that for 1440p, even my 1080 Ti is plenty for high settings at 60+ FPS. I'm far less impressed with current cards' rather stilted performance increases; it's like the 5000/7000 (FX/GeForce era, for you young 'uns) series all over again.

Sure, my 7900 XT and 3080 Ti are faster, but in most scenarios the average person won't perceive a huge difference in the most popular games like CS:GO/R6S/Dota, and they would have to fiddle with graphics settings in most newer games to force the 1080 Ti to perform noticeably worse. I bet the average person just uses the default settings presented.
 
Huh?
Why do you think gamers care about "market share"? (You sound absurd.)


RDNA is the gaming community:
RDNA3 is in the new ROG Ally
RDNA3 is in the new AYANEO
RDNA2 is in the Steam Deck
RDNA1.5 is in the Xbox Series X/S
RDNA1.5 is in the PlayStation 5

RDNA3 discrete Navi 31 is unmatched in the gaming world, except by the AD102 die, which is 50% more costly ($1600) and only 17% ahead in performance. So a win for AMD anyway.

Meanwhile, AMD's RDNA2 (Navi 21) still beats 60% of the RTX 40 lineup for $599, and AMD has margin to go lower.

But we are all pulling for you and your five other friends who are afraid of performance because of the cost of energy.

Gamers probably don't care, but reality exists.

AMD will soon dip below 10% dGPU market share on Steam.

AMD owns the console market because Intel and Nvidia don't care about it. There are very low profits there for AMD; Microsoft and Sony are the winners, not AMD.

Yet you claim AMD is doing well? Lmao :joy:

AMD is doing worse and worse in the dGPU PC segment.



 
I gotta say that for 1440p, even my 1080 Ti is plenty for high settings at 60+ FPS. I'm far less impressed with current cards' rather stilted performance increases; it's like the 5000/7000 series all over again.

Sure, my 7900 XT and 3080 Ti are faster, but in most scenarios the average person won't perceive a huge difference in the most popular games like CS:GO/R6S/Dota, and they would have to fiddle with graphics settings in most newer games to force the 1080 Ti to perform noticeably worse. I bet the average person just uses the default settings presented.
And this is also why Nvidia dominates.

If you ask 100 people on the street, 95 or more will choose Nvidia over AMD.

AMD just has a bad rep, and stuff like this sticks and spreads fast.

I am 100% sure that Nvidia has better drivers overall (outside of popular games, early access titles, etc.). Nvidia also has better features than AMD.

AMD is cheaper for a reason. And this is why AMD should focus on performance per dollar and RASTER ONLY instead of trying to compete on RT performance; it's a lost battle.
 
Sure. But now nVidia is getting a bad rap for price gouging and small amounts of VRAM (which they really have no excuse for; VRAM is cheap as dirt). I have a 1080ti now and have always bought nVidia GPUs, but I am fully expecting my next GPU to be an AMD one because they actually give you enough VRAM to display 4K textures.
 
There are zero issues if you buy a GPU that can actually handle 4K to begin with.

The 1080 Ti is slow as dirt today, and it would be even if it had 22GB of VRAM. This is why it's pointless to try to future-proof on VRAM; the GPU will be the limiting factor anyway.

No 8, 10, or 12GB cards are aimed at 4K.

Not even the 6950 XT does well at 4K on high settings in demanding games in 2023. The 3090 and 3090 Ti suck as well.

Pointless, really.

However, do you even run 4K? The 1080 Ti is barely a 1440p option in 2023...

In reality, the 3070 8GB beats the 6700 XT 12GB at 4K across 25 popular and demanding games.

1% low / Minimum fps: https://www.techpowerup.com/review/nvidia-geforce-rtx-4070-founders-edition/36.html

AMD started this VRAM crap because they have nothing else on Nvidia right now. They lose in pretty much everything: performance, features, etc.

VRAM is all they have, and it makes no difference in 99.99% of games, as you can see.

Yes, you can force an 8GB card to max out at 1440p in a handful of games. HOWEVER, those settings would be unplayable even if the card had twice the VRAM, because the GPU is too slow. It's all talk from AMD, and YouTubers ramble on about it to get views (= money).

It's a non-issue for actual gamers.
 
Gamers probably don't care, but reality exists.

AMD will soon dip below 10% dGPU market share on Steam.

AMD owns the console market because Intel and Nvidia don't care about it. There are very low profits there for AMD; Microsoft and Sony are the winners, not AMD.

Yet you claim AMD is doing well? Lmao :joy:

AMD is doing worse and worse in the dGPU PC segment.




Holy hell... *laughing*

Dude! Gamers don't care about the stock market or where you have your money. They care about their games!


Secondly, you keep referencing the past, not the current reality, where RDNA and its drivers are multi-threaded and much better than nVidia's GFE. I run three gaming rigs and deal with both. nVidia has fallen behind AMD, and that's the reason RDNA is in everything, not CUDA.

Nobody is talking about Intel or nVidia because nobody cares.



What matters is that more people are using RDNA for gaming than anything else. And no amount of crying or talking about the past will change the era of RDNA in everything.
 
...

However, do you even run 4K? The 1080 Ti is barely a 1440p option in 2023...
...
What on earth are you smoking? My 1080ti handles 1440p, even on Ultra settings, just fine. I think the only game that makes it cry is Cyberpunk 2077, but that game brings most GPUs to their knees if you push the settings.

The simple fact is that the XBSX and PS5 have 16GB of VRAM. Games that are built to run on those consoles are designed with that much VRAM in mind. Whether you have enough VRAM is a fairly binary data point - you either have enough or you don't - because you can't really compress the game textures while they are in use. At this point, 16GB should be considered the minimum a mid-tier GPU should be shipping with, but nVidia is still trying to get away with 8-12GB cards.
 
The simple fact is that the XBSX and PS5 have 16GB of VRAM. Games that are built to run on those consoles are designed with that much VRAM in mind.
Not the full amount, though. Depending on which platform and the OS settings selected by the developers, games can use 12 to 14 GB -- and that's for everything. So in terms of working VRAM, for rendering, the latest consoles realistically have 10 to 12 GB for graphics. Obviously more than the de facto 8 GB on graphics cards, though.
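Treating those figures as rough estimates, the budget arithmetic is straightforward; the OS reservation and CPU-side split below are illustrative numbers, not documented platform specs:

```python
def graphics_budget_gb(total: float, os_reserved: float, cpu_side_game_data: float) -> float:
    """On a shared-memory console, what's left for textures and render targets
    once the OS reservation and CPU-side game data are taken out."""
    return total - os_reserved - cpu_side_game_data

# Illustrative split: 16 GB shared, ~2-4 GB reserved by the OS, ~2 GB of CPU-side data
print(graphics_budget_gb(16, 3, 2))  # ~11 GB realistically available for graphics
```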
 
Besides, the PS5 and XSX don't have 16GB of VRAM. They have 16GB of shared RAM, meaning the OS and game take up about 10GB, leaving 6GB tops for graphics.
 
What on earth are you smoking? My 1080ti handles 1440p, even on Ultra settings, just fine. I think the only game that makes it cry is Cyberpunk 2077, but that game brings most GPUs to their knees if you push the settings.

The simple fact is that the XBSX and PS5 have 16GB of VRAM. Games that are built to run on those consoles are designed with that much VRAM in mind. Whether you have enough VRAM is a fairly binary data point - you either have enough or you don't - because you can't really compress the game textures while they are in use. At this point, 16GB should be considered the minimum a mid-tier GPU should be shipping with, but nVidia is still trying to get away with 8-12GB cards.
Dude, your 1080 Ti doesn't handle new, demanding games on ultra settings at 1440p, LMAO. Let's be real now. Only if your fps goal is low, meaning console-level low. Older and less demanding games might run fine with playable fps on high settings, yeah, but not demanding ones. My guess is that you don't play new and demanding games.

I replaced my 1080 Ti years ago because it was too slow for 1440p. I demand 100+ fps at times, on high settings, not ultra; I always tweak settings manually. The ultra preset often forces crap like motion blur and DoF... useless.

Cyberpunk 2077 is not demanding without RT. What are you smoking? If you think it is, that just shows the age of your 1080 Ti. My 3080 maxes it out easily without RT, and DLSS Quality mode works wonders in this game, allowing for 125-150 fps on very high settings. Without DLSS Quality, I still run close to 100 fps.

A 2080S replaced my 1080 Ti (pretty much a free upgrade, because I was offered a lot for my 1080 Ti and got the 2080S cheap), and a 3080 replaced that one for $699 on launch day. Huge upgrades both times. Tons of games will make your 1080 Ti bleed: try RDR2 on max settings at 1440p and you will be looking at 25-30 fps minimums and ~40-45 fps averages. That's a 5-year-old game.

Just because you avoid demanding games or lower your settings doesn't mean the 1080 Ti is still a high-end GPU. It's not even mid-range today, more like low-end. A card like the 3060 Ti completely smashes it, and has the option of DLSS on top.

GTX is outdated, and the 1000 series gets zero optimization in newer games either, not from game devs and not from Nvidia. It's a 2016 architecture, 7+ years old at this point. Obsolete tech. Nvidia's full focus is on the RTX series; GTX is pretty much dead at this point, with no attention at all. Just because Nvidia supports older cards in their drivers does not mean they are doing anything to improve performance or fix errors in games. This is true for AMD as well. When a GPU is 3+ generations old, forget about optimization. The card might die any day as well. It's pretty much trash.

The fact that you think the PS5 and XSX have 16GB of VRAM shows me you have no clue. They barely use 6GB for actual graphics. Games are not built for AMD just because consoles use AMD. Lmao. We are using standards here: x86/x64 and the DX/OGL/Vulkan APIs.
 
Sounds to me like you have a bottleneck somewhere else in your system if the 1080 Ti isn't handling 1440p.
 
Nah, my 5800X3D with 32GB of 3200/CL14 memory and a 2TB WD SN850X does fine, thank you. Zero CPU or memory bottlenecks, not even with my 3080, which destroys a 1080 Ti completely.

Nothing about the 1080 Ti is considered high-end today. It's a dated GPU by any measure: a 7+ year old architecture, forgotten even by Nvidia at this point, and it gets zero optimization in new games.

You just have low requirements. As I said, I demand 100+ fps at all times. You don't, it seems...
 
Naw, I hit 100 fps at 1440p pretty easily in most games. I think you just didn't have the right complementary hardware. Probably a weak PSU.
 
Sure. But now nVidia is getting a bad rap for price gouging and small amounts of VRAM (which they really have no excuse for; VRAM is cheap as dirt). I have a 1080ti now and have always bought nVidia GPUs, but I am fully expecting my next GPU to be an AMD one because they actually give you enough VRAM to display 4K textures.
So far that hasn't notably affected Nvidia's bottom line, and that is the only thing that matters to them, given the huge market dominance they have.
 
Generally I agree, but I do think it's hitting a tipping point right now. I think most people assumed that 4K gaming was still beyond their budget with the 20xx and 30xx, especially given the scalping that happened with the 30xx series cards. But the 40xx is the first time that 4K seems like it might be cheap enough for mid-level gamers, except the GPUs are still priced way above what they're actually worth (all have around a 50% markup from where they should be, imo), and nVidia seems to be trying to tier their cards based not just on the processor but on VRAM as well.

The fact that nVidia has chosen to scale back production due to weak demand, rather than cut prices, I think indicates that we are at the tipping point right now. Scaling back only makes sense if you anticipate a permanent loss in demand, since you can't easily throttle production lines up or down. And reducing the number of units produced also reduces your profit margins, since it nearly always erodes the savings from scale. nVidia has likely acknowledged internally that their mid-tier cards for this generation missed the mark. Hopefully they also acknowledge that this is mostly due to their own pricing and not due to "the economy".
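To put that rough 50% markup figure into numbers (it's my own guess, not a cost analysis), the implied "should-be" price is just the MSRP divided by 1.5:

```python
def implied_fair_price(msrp: float, markup: float = 0.50) -> float:
    """Back out a 'should-be' price assuming current MSRPs carry a given markup."""
    return msrp / (1.0 + markup)

print(round(implied_fair_price(599)))  # ~$399 for the 4070 under a 50% markup guess
```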
 