Nvidia GeForce RTX 4070 Review: $600 Mid-Range is Here

As long as we're making wild predictions, I think the 4070 will end up outselling the 6950XT by at least 5 times, if not 10 or more.
I don't think that's so easy, so it's a welcome challenge. Let's see how customers are willing to vote with their wallets in this strange video card market, at these prices.
 
Oh jeez, not THIS crap again! If "inflation" had ANYTHING to do with it, we'd see CPU prices skyrocketing as well because they're both made of silicon. Except, oh yeah, that hasn't happened.

An examination of AMD's Ryzen pricing from 2017-2022 shows that there's really no excuse:

2017:
Ryzen 5 1600X - $249
Ryzen 7 1700X - $399
Ryzen 7 1800X - $499

2018:
Ryzen 5 2600X - $229
Ryzen 7 2700X - $329

2019:
Ryzen 5 3600X - $249
Ryzen 7 3700X - $329
Ryzen 7 3800X - $399

2020:
Ryzen 5 5600X - $299
Ryzen 7 5800X - $449

2022:
Ryzen 5 7600X - $299
Ryzen 7 5700X - $299
Ryzen 7 7700X - $399

So, between 2017 and 2022, with some slight ups and downs, the Ryzen 5 x600X has increased by $50 and the Ryzen 7 x700X has remained exactly the same. Meanwhile, between 2017 and 2020, the price of the Ryzen 7 x800X has decreased by $50.
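For reference, here's a rough sketch that converts those 2017 launch prices into 2022 dollars so they can be compared directly against the 2022 list; the CPI-U annual averages used are approximate, rounded figures assumed for illustration.

```python
# Rough inflation cross-check for the Ryzen launch prices listed above.
# CPI-U annual averages are approximate, rounded figures assumed for illustration.
CPI = {2017: 245.1, 2022: 292.7}

launch_prices_2017 = {
    "Ryzen 5 1600X": 249,
    "Ryzen 7 1700X": 399,
    "Ryzen 7 1800X": 499,
}

factor = CPI[2022] / CPI[2017]  # ~1.19 cumulative inflation, 2017 -> 2022

for cpu, usd_2017 in launch_prices_2017.items():
    print(f"{cpu}: ${usd_2017} in 2017 is about ${usd_2017 * factor:.0f} in 2022 dollars")
```

That puts the 2017 parts at roughly $297, $476 and $596 in 2022 dollars, versus the actual $299 and $399 parts in the 2022 list, so the CPU lineup has tracked at or below inflation.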

Inflation my posterior! It's just greed. Anyone who thinks otherwise is incredibly naive.
The best part of your post has been diluted in the many replies. Good way to destroy the inflation narrative.
 
Are you kidding me? Beside the 4090, this whole lineup by Nvidia has been unimpressive.

If you put the MSRPs of the 4000 series in perspective, it's a rather abysmal showing from Nvidia.

Add in the not-enough-VRAM problem on most of Nvidia's SKUs and you have all the ingredients to skip every SKU besides the 4090. The 4080 would be interesting, but it is $300-400 overpriced.
Unimpressive overall, but performance in isolation had been pretty good until this GPU.
 
Oh jeez, not THIS crap again! If "inflation" had ANYTHING to do with it, we'd see CPU prices skyrocketing as well because they're both made of silicon. Except, oh yeah, that hasn't happened.

An examination of AMD's Ryzen pricing from 2017-2022 shows that there's really no excuse:

2017:
Ryzen 5 1600X - $249
Ryzen 7 1700X - $399
Ryzen 7 1800X - $499

2018:
Ryzen 5 2600X - $229
Ryzen 7 2700X - $329

2019:
Ryzen 5 3600X - $249
Ryzen 7 3700X - $329
Ryzen 7 3800X - $399

2020:
Ryzen 5 5600X - $299
Ryzen 7 5800X - $449

2022:
Ryzen 5 7600X - $299
Ryzen 7 5700X - $299
Ryzen 7 7700X - $399

So, between 2017 and 2022, with some slight ups and downs, the Ryzen 5 x600X has increased by $50 and the Ryzen 7 x700X has remained exactly the same. Meanwhile, between 2017 and 2020, the price of the Ryzen 7 x800X has decreased by $50.
Intel also has been reasonably consistent with their pricing, being not that far off US inflation.

i7
2600K - $320 Released in Q1 2011
3770K - $310
4770K/4790K - $340
5775C - $365
6700K - $350
7700K - $340
8700K - $380
9700K - $420
10700K - $400
11700K - $420
12700K - $420
13700K - $450 Released Q4 2022

Compared to Nvidia
xx80 + Memory and Die Size
GTX 480 - $500 - 1536MB - 529mm² Released in Q4 2010
GTX 580 - $500 - 1536MB - 520mm²
GTX 680 - $500 - 2GB - 294mm²
GTX 780 - $650 - 3GB - 561mm²
GTX 980 - $550 - 4GB - 398mm²
GTX 1080 - $600 - 8GB - 314mm²
RTX 2080 - $700 (FE $800) - 8GB - 545mm²
RTX 3080 - $700 - 10GB - 628mm²
RTX 4080 - $1200 - 16GB - 379mm² Released in Q4 2022
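If you want a quick feel for how detached the 4080's price is from its silicon, here's a small sketch computing launch price per mm² of die from the figures in that list. It deliberately ignores memory, PCB, cooler, and per-node wafer-cost differences, so treat it as a rough indicator only.

```python
# Launch price per mm² of GPU die for the xx80 cards listed above.
# Ignores memory, PCB, cooler, and per-node wafer-cost differences.
xx80_cards = [
    ("GTX 480",   500, 529),
    ("GTX 580",   500, 520),
    ("GTX 680",   500, 294),
    ("GTX 780",   650, 561),
    ("GTX 980",   550, 398),
    ("GTX 1080",  600, 314),
    ("RTX 2080",  700, 545),
    ("RTX 3080",  700, 628),
    ("RTX 4080", 1200, 379),
]

for name, usd, die_mm2 in xx80_cards:
    print(f"{name}: ${usd / die_mm2:.2f} per mm² of die")
```

The 3080 works out to roughly $1.11 per mm² while the 4080 jumps to about $3.17 per mm², a far bigger step than any earlier transition in that list.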
 
The best part of your post has been diluted in the many replies. Good way to destroy the inflation narrative.

The price increases on graphics cards likely outstrip the inflation seen in many other types of goods, but the comparison in that post is just poor.

Inflation has a cumulative effect and there are more parts from various suppliers involved in building a graphics card vs a CPU. The vendors of memory chips, circuit boards, connectors, heatsinks and fans are all affected by rising costs of raw materials, labor and transportation. And despite the price increases, Nvidia's net margins have in fact been decreasing.

 
In a few years people will say: this works fine on the 6800 XT but not on the 4070, just because of VRAM.

As opposed to saying today: This works fine on the 4070 but not on the 6800XT because it doesn't do upscaling and raytracing very well. And it doesn't do shader execution reordering.

The VRAM size talking point is certainly all the rage these days. Just like the old 'fine wine' narrative, will it even matter during the actual relevant lifetime of the product?
 
Oh jeez, not THIS crap again! If "inflation" had ANYTHING to do with it

Inflation has something to do with everything. It exists, whether or not you like it. If CPUs have not seen increases in their prices as measured in dollars, that simply means they have gotten cheaper in real terms.
 
The best part of your post has been diluted in the many replies. Good way to destroy the inflation narrative.

Inflation isn't a narrative. It's an objective fact. If you think GPUs should be getting cheaper and cheaper rather than holding their value, that is an entirely separate topic.
 
Compared to Nvidia
xx80 + Memory and Die Size
GTX 480 - $500 - 1536MB - 529mm² Released in Q4 2010
GTX 580 - $500 - 1536MB - 520mm²
GTX 680 - $500 - 2GB - 294mm²
GTX 780 - $650 - 3GB - 561mm²
GTX 980 - $550 - 4GB - 398mm²
GTX 1080 - $600 - 8GB - 314mm²
RTX 2080 - $700 (FE $800) - 8GB - 545mm²
RTX 3080 - $700 - 10GB - 628mm²
RTX 4080 - $1200 - 16GB - 379mm² Released in Q4 2022

Yes... the xx80 series is a good example of prices outpacing inflation, so criticism is warranted here. Not so much with the xx70 series.
 
Impressive how they target all sectors of the market at once by producing a mid-range card at a high-end price and equipping it with a low-end amount of RAM. The marketing people must be patting themselves on the back.
 
Yes... the xx80 series is a good example of prices outpacing inflation, so criticism is warranted here. Not so much with the xx70 series.
GTX 470 - $350 - 1280MB - 529mm² Released in Q1 2010
GTX 570 - $350 - 1280MB - 520mm²
GTX 670 - $400 - 2GB - 294mm²
GTX 770 - $400 - 2GB - 294mm²
GTX 970 - $330 - 4(3.5)GB - 398mm²
GTX 1070 - $379 - 8GB - 314mm²
RTX 2070 - $500 (FE $600) - 8GB - 445mm²
RTX 3070 - $500 - 8GB - 392mm²
RTX 4070 - $600 - 12GB - 295mm² Released in Q2 2023
RTX 4070Ti - $800 - 12GB - 295mm² Released in Q1 2023
Important to note that the 4070 is the worst generational improvement since the 670 to 770, which were basically the same GPU.
Looking at the generational improvement you'd expect from a new architecture, it should have landed just below the 4070 Ti.
Inflation would put it at under $500, by the way. Also, looking at this list reminds me how trash the 2070 was.
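To put a number on that "under $500" remark, here's a sketch adjusting the earlier xx70 launch prices into 2023 dollars; the CPI-U annual averages are approximate, rounded figures assumed for illustration.

```python
# Adjust earlier xx70 launch prices into 2023 dollars.
# CPI-U annual averages are approximate, rounded figures assumed for illustration.
CPI = {2014: 236.7, 2016: 240.0, 2018: 251.1, 2020: 258.8, 2023: 304.7}

xx70_launches = [
    ("GTX 970",  330, 2014),
    ("GTX 1070", 379, 2016),
    ("RTX 2070", 500, 2018),
    ("RTX 3070", 500, 2020),
]

for name, usd, year in xx70_launches:
    adjusted = usd * CPI[2023] / CPI[year]
    print(f"{name}: ${usd} in {year} is about ${adjusted:.0f} in 2023 dollars")
```

The 970 and 1070 come out around $425 and $481 in 2023 dollars, so a 4070 priced along the pre-Turing trend would indeed land under $500.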
 
I just want to say that a few days ago I called it: the 4070 would be slightly faster than the 3080 at 1440p and slightly slower at 4K. It was an intuitive guess based on the fact that the 30 series scaled better at 4K and the 3080 has much higher memory bandwidth.

So there you have it: the 4070 is the best value Nvidia has to offer at the moment, but if you are looking for a significant value uplift over the 30 series, it's not here. It's just $100 cheaper than the 3080 with equal performance, 2.5 years later. Part of the reason for this is that the 3080 was based on GA102 when it was originally supposed to use GA103, which made the 3080 roughly 12% faster than it was originally meant to be. If Nvidia had released a GA103-based 3080 instead, the 4070 would be roughly 10-15% faster for $100 less and would seem like a much better value.

It's also worth noting that the 6800 XT is on par in rasterization performance but is nearly as expensive. I think we'll see both the 3080 and 6800 XT decrease in resale value because of the 4070, at least to around or under $500. It should pull down 3070 and 3070 Ti prices as well.
Yeah, the smaller L2 cache is probably what limits it at higher resolutions, since at 4K the working set spills out of it too much and the lower VRAM bandwidth (and maybe the shader core and ROP/TMU counts as well) can't keep up.
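For what it's worth, here's the raw bandwidth math behind that guess; the bus widths, data rates and L2 sizes are taken from public spec listings and assumed here.

```python
# Memory bandwidth in GB/s = bus width (bits) / 8 * effective data rate (Gbps).
# Spec figures assumed from public listings: RTX 3080 = 320-bit @ 19 Gbps GDDR6X
# with 5 MB of L2; RTX 4070 = 192-bit @ 21 Gbps GDDR6X with 36 MB of L2.
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

rtx_3080 = bandwidth_gbs(320, 19)  # ~760 GB/s
rtx_4070 = bandwidth_gbs(192, 21)  # ~504 GB/s

print(f"RTX 3080: {rtx_3080:.0f} GB/s, RTX 4070: {rtx_4070:.0f} GB/s "
      f"({rtx_4070 / rtx_3080:.0%} of the 3080's bandwidth)")
```

The 4070 leans on its much larger L2 to hide that roughly one-third bandwidth deficit; at 1440p that mostly works, but at 4K the working set spills out of the cache more often and the raw bandwidth gap shows, which matches the 1440p/4K split in the review.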
 
GTX 970 - $330 - 4(3.5)GB - 398mm²
GTX 1070 - $379 - 8GB - 314mm²
RTX 2070 - $500 (FE $600) - 8GB - 445mm²
RTX 3070 - $500 - 8GB - 392mm²
RTX 4070 - $600 - 12GB - 295mm²
Although there's no easy way to do this (or at the very least, to get an accurate figure for it), accounting for the die fabrication costs is worth considering here. The GM204 in the GTX 970 was made on TSMC's 28nm node, which was a lot cheaper than those used for the models that came afterward. For example, the TU106 in the 2070 was made on a custom node (12FFN, an Nvidia-specific derivative of TSMC's 16nm-class process), so not only does the normal higher wafer cost need to be absorbed, but the cost of the one-off-for-Nvidia manufacturing process needs to be included too. Plus, the relatively large die size didn't help matters either.

I should imagine that's part of the reason why Nvidia switched to Samsung, despite using another custom node (8N is an Nvidia-specific version of Samsung's 8LPU), rather than going with TSMC's N7. The latter, at the time, was expensive and in high demand from AMD, and as big as Nvidia is, it has nothing like the wafer order size that AMD does. Of course, using a cheaper node doesn't mean that the cost reduction ever gets passed down to the end user. Given that the complaints about Turing revolved around not getting much of a performance boost for the big jump in price, it was a no-brainer for Nvidia to price Ampere roughly the same as Turing but reap the benefits of lower manufacturing costs.
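As a rough illustration of why die size and node pricing matter so much, here's a sketch using the standard dies-per-wafer approximation on a 300 mm wafer; the wafer prices used are purely hypothetical placeholders, not real quotes, since actual foundry pricing is confidential.

```python
import math

# Candidate dies per 300 mm wafer, using the standard approximation
# pi*(d/2)^2 / A - pi*d / sqrt(2*A); ignores edge exclusion and defect yield.
def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Wafer prices below are purely hypothetical placeholders for illustration;
# real foundry pricing is confidential and varies by contract.
examples = [
    ("GM204 (GTX 970, 28nm)",   398,  3000),
    ("TU106 (RTX 2070, 12FFN)", 445,  7000),
    ("AD104 (RTX 4070, 4N)",    295, 15000),
]

for name, area, wafer_cost in examples:
    n = dies_per_wafer(area)
    print(f"{name}: ~{n} candidate dies per wafer, "
          f"~${wafer_cost / n:.0f} per die at a hypothetical ${wafer_cost} wafer")
```

Even when the die shrinks, the per-die cost can still climb sharply if the wafer price rises faster than the die area falls, which is the dynamic being described above.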
 
Inflation isn't a narrative. It's an objective fact. If you think GPUs should be getting cheaper and cheaper rather than holding their value, that is an entirely separate topic.
I should have said "the inflation narrative for GPUs"; I thought that was obvious from the quote. But as for inflation in general, corporate profits also clearly show a weaponization of inflation.
 
As opposed to saying today: This works fine on the 4070 but not on the 6800XT because it doesn't do upscaling and raytracing very well. And it doesn't do shader execution reordering.

The VRAM size talking point is certainly all the rage these days. Just like the old 'fine wine' narrative, will it even matter during the actual relevant lifetime of the product?


But "TODAY" the 4070 was just released... and it doesn't have the VRAM for higher resolutions & lush environments and is priced above AMD's 2.5 year old 6800xt., which happens to beat the 4070 in seven of the games reviewed today.

So who is going to "upgrade" to a 4070..?


GTX, RTX 20, and RTX 30 owners have ZERO interest in RTX 40 cards... those people will just spend their money on UPGRADING their visual experience with the many new super-monitors available... and continue gaming using upscaling.

Who is buying a new GPU to upscale..? (dERP!)
 
Looking at the performance gap between the 4080 and the 4070/4070 Ti, the core counts, and the small bump in memory size, what we're looking at here is an RTX 4060 that shareholders decided to slap a 4070 sticker on. There won't be a "best bang per buck REAL 4070" this generation. It's canceled. The 4060 will be a rebranded bottom-of-the-barrel 4050, and if they release a 4050, it will be last-gen, off-the-shelf silicon that has already depreciated to zero.

That's from someone who's been following closely since the Nvidia GTX400/ AMD HD5000 series (when I got my first paycheck).

These are extremely anti-consumer companies with anti-consumer products. The lack of competition over the last six years allowed the shareholders to take control and focus on profit alone.

Buy an Xbox/Steam Deck now or wait for the PS5 Pro/Switch 2. They're all great products, and if everyone actually boycotts gaming PCs starting today, I guarantee you that next year you will see the RTX 5090 for $1,000 with 75% more performance than the 4090, the 5080 slightly below that for $800, a "proper" 5070 slightly below the 5080 for $600, a 5060 that doubles the performance of this 4070 for $400, and a $200 RTX 5050 playing games at 2K at 120 FPS.
 
I had the chance to be at the right place at the right time, and he gave me this amazing price because I had already bought a lot of hardware components for a client of mine.
Well, if you ever find any high-end Radeons lying around, let me know! :laughing:
 
Techspot says "The Radeon 6800 XT can be had for around $570 and it offers more VRAM, but you'll have to weigh that up against the lack of DLSS support and inferior ray tracing performance - we'd probably still go with the RTX 4070."
Yeah, I thought that this statement was absurd as well. Things like DLSS and RT are FRILLS, nothing more.
16GB can save you from playing a game with stutters, versus RTX and DLSS (a better FSR, lol).
Come on. In a few years people will say: this works fine on the 6800 XT but not on the 4070, just because of VRAM.
Hear, hear! It's the reason I always say "When you're buying hardware, buy hardware!" because all of these software frills are just that, frills. A lack of VRAM is extremely crippling; a lack of frills like DLSS and RT is not. I don't know what the hell Steve's talking about, but it seems to me that his priorities are bass-ackwards in that regard.
 
Looking at the performance gap between the 4080 and the 4070/4070 Ti, the core counts, and the small bump in memory size, what we're looking at here is an RTX 4060 that shareholders decided to slap a 4070 sticker on. There won't be a "best bang per buck REAL 4070" this generation. It's canceled. The 4060 will be a rebranded bottom-of-the-barrel 4050, and if they release a 4050, it will be last-gen, off-the-shelf silicon that has already depreciated to zero.
You know, that does make a lot of sense when I think about it. I think that you're right.
That's from someone who's been following closely since the Nvidia GTX400/ AMD HD5000 series (when I got my first paycheck).
Ah yes, ol' Fermi! The source of the meme:
"nVidia, the way it was meant to be delayed!"

And Evergreen, whose #1 son, Hemlock, "brutally sodomized" the GTX 295! :laughing:
These are extremely anti-consumer companies with anti-consumer products. The lack of competition over the last six years allowed the shareholders to take control and focus on profit alone.
This is true, but the lack of competition is also the fault of lazy and/or brain-dead consumers who just throw money at nVidia without actually looking at which card is best for their use case. Right now, there is literally no reason to buy a GeForce card from a price/performance perspective (I mean, it's not even close), but it doesn't have the effect that it's supposed to in a non-broken market.

I'm willing to bet that consumers today are sooo stupid that, even at the same price, the RTX 3070 Ti is outselling the RX 6950 XT. That's the only way its price could have gotten that high to begin with. Even with all of the hard evidence against getting an 8GB or 10GB RTX card that can be found everywhere, I still see some muppets talking about getting one (even on here, where you'd think people would know better). It blows my mind, but it also explains why we're in the situation that we're in. The lack of competition isn't because of nVidia or AMD but is instead the result of consumers buying nVidia no matter what. I think this has been the case for three generations now, and AMD has given up on actually trying to compete by offering better value because consumers haven't responded to it, like, at all.
Buy an Xbox/Steam Deck now or wait for the PS5 Pro/Switch 2. They're all great products, and if everyone actually boycotts gaming PCs starting today, I guarantee you that next year you will see the RTX 5090 for $1,000 with 75% more performance than the 4090, the 5080 slightly below that for $800, a "proper" 5070 slightly below the 5080 for $600, a 5060 that doubles the performance of this 4070 for $400, and a $200 RTX 5050 playing games at 2K at 120 FPS.
^^^ THIS ^^^

As long as things don't get any worse than they are now, I'll probably be a member of the PC Master Race until I die (because I've already been in it for almost 40 years) but I'm unable to recommend it to anyone I know because when a video card costs more than an entire console, it's just a bad financial decision to get one.

If things do get worse, I will eventually buy a console. Fortunately, I expect that I'll have at least five years to make that decision because, at this point, I'm not buying any new hardware until my R7-5800X3D and RX 6800 XT become completely unusable for games. Unless things get A LOT better, I won't upgrade until my current PC can't even game at potato settings. If we're still in the same boat that we're in now, I won't even upgrade my PC; I'll just use it as an HTPC and buy a PlayStation.
 
The best part of your post has been diluted in the many replies. Good way to destroy the inflation narrative.
I'm confused. What do you mean? (You have to be simple with me, I'm not that bright!) :laughing:
 
Intel also has been reasonably consistent with their pricing, being not that far off US inflation.

i7
2600K - $320 Released in Q1 2011
3770K - $310
4770K/4790K - $340
5775C - $365
6700K - $350
7700K - $340
8700K - $380
9700K - $420
10700K - $400
11700K - $420
12700K - $420
13700K - $450 Released Q4 2022

Compared to Nvidia
xx80 + Memory and Die Size
GTX 480 - $500 - 1536MB - 529mm² Released in Q4 2010
GTX 580 - $500 - 1536MB - 520mm²
GTX 680 - $500 - 2GB - 294mm²
GTX 780 - $650 - 3GB - 561mm²
GTX 980 - $550 - 4GB - 398mm²
GTX 1080 - $600 - 8GB - 314mm²
RTX 2080 - $700 (FE $800) - 8GB - 545mm²
RTX 3080 - $700 - 10GB - 628mm²
RTX 4080 - $1200 - 16GB - 379mm² Released in Q4 2022
Exactly. Since the manufacture of GPUs and CPUs is the exact same process, the fact that one skyrocketed in price but not the other just shows that "inflation" is not the cause.

This is what I call a high-quality post. A statement is made and evidence is used to demonstrate it. I notice that the people who just cry "inflation" never have any evidence to demonstrate it. This is because they're not smart enough to actually understand what they're talking about or they're just lying to shill for nVidia.
 