Nvidia GeForce RTX 5070 Ti Review

If I had to guess, probably because:

a: There is a greater time crunch to get the review ready by the time the embargo lifts... and when you're testing 16 games across 12 cards, I imagine certain talking points like overclocking get put aside for a later article.

b: Overclocking is extremely subjective and dependent on many variables, and it skews results when compared against other cards of different makes and generations unless you also overclock those, which is a ton of work. Overclocking benchmarks, I feel, are best applied against the same card's stock results rather than mixed with data from "unrelated" cards.

c: Overclocking results imply what could be, not what is. As a consumer, I would much rather have my expectations tempered to what is typically experienced out of the box than to what is possible after tinkering, especially at release. It also takes weeks and months for drivers to mature and optimize, so overclocking results now versus a few weeks down the line could be drastically different (one reason reviewers like Mr. Watson often revisit cards later on).

And to be honest, even assuming the 5070 Ti could gain another 10% of performance from overclocking, it would not change the current reality of supply and prices at all, the lack of perceived generational uplift among reviewers and the public alike, or the fact that nearly every claim Nvidia made about the 5000-series cards has been misleading at best.
Why would Nvidia go for 1x margins when it can get 10x margins with 1/10th of the supply? Nvidia math 101.

I really hope there is a silver lining in game development. Gamers will not buy games that require brute-force hardware they can't purchase, so devs will have no choice but to code for the current, stagnant hardware over the next two years.
 
Again, more proof that AMD has a huge opportunity this year; let's see if they can win our hearts once more after the 5700 XT, or just screw us over again xD
It is the biggest they have, but I am going to bring everyone back to reality quite fast...

1. The BoM for the 9070 XT is similar to that of the 7800 XT, meaning it should cost about the same to manufacture.

2. AMD waited for Nvidia in order to see the value proposition of the 5070 Ti and 5070. This is not just a pro-consumer move; it can also backfire heavily as a pro-corporate move. Frank Azor's initial comments to PCWorld ("Price will please", "We learned our lesson", "It is a 70-class card") were quickly walked back by David McAfee, putting them in doubt.

3. Given today's value proposition, AMD has NO pressure to propose a disruptive $550-$600 MSRP for the 9070 XT when the 5070 Ti retails for $900.

4. GPU supply is gone, even for AMD's GPUs at retailers right now. They are all sold out, meaning AMD can set whatever price it wants at first, since they are going to sell everything and Nvidia cannot provide supply.

5. AMD is going to have supply. They have been stockpiling for almost 3 months, and nothing else will be available. AMD will release their GPU at about the same time as the 5070. They will sell everything because there will be no alternatives, so they can ask whatever price they want as long as it stays close to the 5070 Ti.

6. Tariffs are hitting TSMC chips at 100% and Chinese imports at 10%. That works out to roughly a 25% increase on a GPU, which is what AIBs demonstrated with the 5070 Ti, 5080, and 5090. AMD will suffer from this too, meaning a $550 MSRP is long gone.

7. AMD can adjust their pricing, LATER... and they will eventually...

For these reasons, I predict the MSRP for the 9070 XT will be $650 for the Made-by-AMD card and $700 for the AIB versions.
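Point 6's tariff arithmetic can be sketched as a simple pass-through model. The chip-cost share below is a pure assumption (foundry pricing and BoM splits are not public); a share in the 15-20% range is what would reproduce the ~25% figure cited above.

```python
# Hypothetical tariff pass-through sketch for point 6 above.
# chip_share is an ASSUMED fraction of the card's retail price attributable
# to the TSMC chip; the remainder is treated as other imported parts.
# Both tariff rates are the figures quoted in the post.

chip_share = 0.17        # ASSUMPTION: chip is ~17% of the card's price
other_share = 1.0 - chip_share
chip_tariff = 1.00       # 100% tariff on TSMC chips (post's figure)
import_tariff = 0.10     # 10% tariff on other imports (post's figure)

card_increase = chip_share * chip_tariff + other_share * import_tariff
print(f"estimated card price increase: {card_increase:.0%}")  # ~25%
```

A smaller or larger chip share moves the result roughly linearly, so the ~25% claim hinges entirely on how much of the card's price the chip represents.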
 

In this review they overclocked and got nearly the same performance as the 5080, but the counterargument is that overclocks aren't stable for everyone (as you pointed out), and you can also overclock the 5080. https://www.pcgamer.com/hardware/graphics-cards/nvidia-rtx-5070-ti-review-msi-ventus-3x/
 
These days, reviewing high-end GPUs on price feels like reviewing a Ferrari on value for money. It's always going to be **** and the buyers don't care.

Budget cards are what should be judged on cost per frame. And by budget, I mean $500 or less.
I would agree for the xx90 series and maybe for the xx80 series. However, the frustration is that the xx70 series used to be the perfect blend of price and performance. Not the cheapest "budget" card (that's the 50 and 60 series), but a lot of performance for the price. The 70 series now occupies the price range of what used to be the high end. The iconic 1080 Ti cost $699 MSRP in 2017 ($900 in today's dollars). The 1070 Ti cost $450 (about $590 today). More importantly, pre-crypto and pre-AI, you could routinely get them on sale. I bought my 1070 Ti on Black Friday for $300 ($400 today). The 5070 Ti is supposedly $750, but more like $900-$1,000 in actuality. We are a far cry from "affordable" nowadays.

FYI, I have owned a 760, 1070 Ti, and now run a 3070 Ti (had to buy it used).
 
What can we infer from this?
The 4080 is still slightly faster than the 5070 Ti despite having only 768 more CUDA cores (around 9%), and outside of 4K, where the extra memory bandwidth helps, the 5070 Ti trails by nearly the same percentage at lower resolutions.

This means the 4070 S at $600 is likely going to beat the 5070 at $549 by a significant margin. The 4070 S has 7168 CUDA cores while the 5070 has only 6144. Based on everything we have seen, the core-count-to-performance uplift is not much better than 1:1, so the 4070 S will probably still hold roughly a 15% average performance advantage over the 5070... which actually makes the 4070 S at $599 the better deal, since the 5070 would need to be priced below about $510 to match it. So much for "4090 performance" (impossible with and without AI, apparently) on the 5070.
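The core-count arithmetic above can be checked directly. The CUDA-core counts are the published specs and the price is the MSRP quoted in the post; the roughly 1:1 core-to-performance scaling is the post's own assumption, not a measured result.

```python
# Sanity check of the 4070 Super vs. 5070 value argument, assuming roughly
# 1:1 scaling between CUDA-core count and performance (the post's premise).

cores_4070s = 7168   # RTX 4070 Super CUDA cores (published spec)
cores_5070 = 6144    # RTX 5070 CUDA cores (published spec)
price_4070s = 599    # USD MSRP

core_ratio = cores_4070s / cores_5070   # ~1.17, i.e. ~17% more cores

# Price at which the 5070 would match the 4070 Super in price-per-core,
# a proxy for price-per-frame under the scaling assumption.
parity_price = price_4070s / core_ratio

print(f"4070 S core advantage: {core_ratio - 1:.1%}")
print(f"5070 value-parity price: ${parity_price:.0f}")   # ~ $513
```

Scaling is never exactly 1:1 (clocks, cache, and memory differ between the cards), so treat this as a rough bound on the argument rather than a prediction.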

In other words, if you are looking for a mid-range GPU, you should buy the 4070 S before the 5070 reviews land. Oh wait! You can't; Nvidia made sure it was out of stock long before now.

It also means the RTX 5070 will not be much better than the RTX 4070 and will likely be on par with the 3080. Think about that for a second: two generations later and the 70-class (non-Ti) card still won't pass the 80-class card from 5 years ago by any significant margin. Nvidia is giving the mid-range nothing, and it's not because it isn't in their power to do so. It's because they have had zero competition.

Even if you are a diehard Nvidia fanboy, it's probably time to start rooting for AMD. If serious competition doesn't arrive, Nvidia has no reason to give you performance increases without significant cost increases, and history has shown they won't.
 
I wish these #@$)(*& reviews would STOP EVEN DISCLOSING THE FAKE MSRPs.
I see these cards selling for $900, which puts the cost per frame at $11.38.
That value is roughly the same as a discounted 4080 Super ($970).
The 4080 Super is not a terrible value; it's a poor value, but not a terrible one.
This card just doesn't move the needle very far from $12/frame for NVidia cards these days!
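The $/frame figure above is simply street price divided by average benchmark fps. Back-calculating from the post's own numbers (the fps value below is implied by them, not taken from the review):

```python
# Cost-per-frame back-calculation from the post's figures.
street_price = 900.0     # USD, observed street price for the 5070 Ti
cost_per_frame = 11.38   # USD per average fps, as stated in the post

implied_avg_fps = street_price / cost_per_frame
print(f"implied review-average fps: {implied_avg_fps:.1f}")  # ~79
```

The same division applied to a $970 discounted 4080 Super explains why the post calls the two values "roughly the same" despite the different price tags.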

One place NVidia is definitely LEADING the market is the top-5 most overpriced cards at street prices.
There's a reason they get called "NGreedia".
 
The BoM is not exactly the same: the 9070 XT's chip costs about $45 more to manufacture at TSMC (at 90% yields) than the 7800 XT's. So the BoM for the 9070 XT, including the extra power stages and connectors, plus reasonable margins, puts its floor price around $580 ($100 more than the 7800 XT at $480). And that is only once yields reach 90%; they are usually around 70% at the start of production. You can plug the numbers into a VLSI calculator (390mm² vs. 346mm² die size; $20,000 for a 300mm N4 wafer vs. $16,000 for a 300mm N5 wafer) to verify the $45 difference.
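The die-cost claim above can be reproduced with the standard dies-per-wafer approximation. The die sizes, wafer prices, and 90% yield are the post's figures (real foundry pricing is not public), and the 7800 XT is treated as a single die even though it is actually a chiplet design:

```python
import math

# Rough die-cost comparison using the classic dies-per-wafer approximation:
# gross-area term minus an edge-loss term. Inputs are the post's figures.

WAFER_D = 300.0  # mm, wafer diameter

def dies_per_wafer(die_area_mm2, d=WAFER_D):
    """Approximate whole dies per wafer, accounting for edge loss."""
    return math.floor(math.pi * (d / 2) ** 2 / die_area_mm2
                      - math.pi * d / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(wafer_cost, die_area_mm2, yield_frac):
    """Wafer cost spread over the good dies at the given yield."""
    return wafer_cost / (dies_per_wafer(die_area_mm2) * yield_frac)

navi48 = cost_per_good_die(20_000, 390, 0.90)  # 9070 XT die, N4 wafer
navi32 = cost_per_good_die(16_000, 346, 0.90)  # 7800 XT die, N5 wafer

print(f"9070 XT die: ${navi48:.0f}, 7800 XT die: ${navi32:.0f}, "
      f"delta: ${navi48 - navi32:.0f}")
```

With these inputs the delta lands at roughly $45 per good die, matching the post's figure; at 70% early-production yields both per-die costs rise by the same ~29% factor.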

I tend to believe AMD will ask $750 for the 9070 XT. That will not make me very happy, because I have seen 7900 XTX cards on sale under $800, and at $750 it's effectively a price increase: you lose 8GB of RAM and gain only a few ray-tracing frames (which AMD owners don't care about). Assuming 7900 XTX performance, it would cost $8.92 per frame, which is good, but once again it's an overpowered card for the very limited RAM it has. They waited to exploit the current tight market conditions. They waited to squeeze more blood out of the turnip.
 
That's pretty standard for any product... Selling your old stock out.

The point is that the $549 asking price for the 5070 will not even be a better value than the MSRP of the 4070 S, and the 4070 S is no longer available to buy. No, companies do not always sell out of old product before launching a new one; the 40 series is out of production because it's on the same node as the 50 series. The 5070 should have had at least 56 SMs like the 4070 S. That would still have left a significant delta between the 5070 Ti and 5070, and it would have given those looking for a mid-range GPU something to be excited about. Only 48 SMs when the 5070 Ti has 70! That's roughly 46% more SMs for about 36% more money; the 5070 is going to be a horrible price-to-performance GPU.
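Worked out from the published SM counts and MSRPs ($749 for the 5070 Ti, $549 for the 5070), the gap is:

```python
# SM-count vs. price gap between the 5070 Ti and 5070, using published specs.
sm_5070ti, sm_5070 = 70, 48        # streaming multiprocessors
msrp_5070ti, msrp_5070 = 749, 549  # USD MSRPs

sm_delta = sm_5070ti / sm_5070 - 1         # ~46% more SMs
price_delta = msrp_5070ti / msrp_5070 - 1  # ~36% more money

print(f"{sm_delta:.0%} more SMs for {price_delta:.0%} more money")
```

At street prices well above MSRP the money side of the ratio only gets worse for the 5070 Ti, but the SM-per-dollar comparison still favors it over the cut-down 5070.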
 
The 4070 was the WORST price-to-performance GPU of the 4000-series generation, and I see no reason for NVidia not to repeat that mistake by making the 5070 the WORST price-to-performance card of the 5000 series. AMD was actually DESTROYING NVidia in the midrange in the autumn of 2023, outselling them both in cards sold per month AND in average selling prices (at Mindfactory in Germany). NVidia rushed the Super cards to market as a desperation move, basically applying a $200 price cut to the 4070, 4070 Ti, and 4080 to cover up their big mistake!
 
The BOM is not exactly the same, the 9070xt has a chip that costs $45 more to manufacture at TSMC (at 90% yields) than the 7800xt. So the BOM for the 9070xt, including extra power stages and connectors, and reasonable margins, means the 9070xt has a floor price of $580 ($100 more than the 7800xt @ $480). But this is only when yields reach 90% - they are usually 70% at the start of production. You can plug the numbers into a VLSI calculator - 390mm^2 vs. 346mm^2 chip size, $20,000 for a 300mm N4 wafer vs $16,000 for a 300mm N5 wafer, to see that the 9070xt is $45 more expensive to manufacture than the 7800xt.

I tend to believe AMD will ask $750 for the 9070xt. That will not make me very happy because I have seen 7900xtx cards for sale under $800, and $750 will be a price increase because you'll lose 8GB of RAM but gain a few raytracing frames (AMD owners don't care). Using 7900xtx performance, it will cost $8.92 per frame which is good but - once again - an overpowered card for the very limited RAM it has. They waited to exploit the current tight market conditions. They waited in order to squeeze more blood out of the turnip.
You both are close.

AMD is going to play NVidia's game: keep the "suggested" MSRP low but allow AIBs to price their aftermarket cards at whatever they want, reaping the rewards. AMD is giving AIBs what NVidia won't: profit margins and sales.

OC'd top-tier RX 9070 XT cards are said to tuck right under the 7900 XTX, so they will price accordingly...
 
When AMD built RDNA3, the goal was 3GHz+ clock speeds, but they ran out of time and couldn't get it above 2.5GHz, so the entire 7000-series generation was a crippled mistake. Now that they have straightened out the performance, we can see that the 7800 XT was originally meant to run like a 7900 XT, and this 9070 beast with only 64 CUs runs like a 7900 XTX. If a 9090 XTX had been REAL (with 96 CUs), it would run FIFTY PERCENT FASTER than the 7900 XTX and would easily nuke a 5090.

This is why I like to follow AMD's architecture. They are the only ones that can poke NVidia in the eye. Intel is still struggling to get out of the 2010s with their very poorly designed GPUs. It turns out that laying off all your VLSI architects to fund bonuses for the marketing folks (2014) gave Intel a really bad reputation, and their current designs are done by second-tier engineers, which no longer works when your process tech is no longer two generations ahead...
 
Glad to see more and more of the press calling out Nvidia on their used-car-salesman tactics, although collectively they could go further. Products that are not actually on shelves should not get reviews and "launch day" coverage; they should be treated as "technology previews" with minimal fanfare. The fake MSRP shouldn't get coverage at all, and/or articles should be written so they can automatically adjust to actual price and availability.

I'm not mad at the world though, just Nvidia's sleazy conduct. If the market has determined that limited semiconductor capacity should go first to AI, then so be it. Ultimately I'm happy with a gaming industry where almost every existing game was designed to be fully enjoyable at much more modest graphics levels. Really the only thing out of sync here is that large televisions went 4K many years ago, way before GPUs were remotely ready to take full advantage of that resolution, and most PC gaming rigs haven't been able to catch up yet. But I can wait more generations if I have to; ultimately it's a luxury rather than a core part of the gaming experience.
 
The only reason I disagree is that I'm quite sure the xx70 class will sell very well at NVidia's prices, and that's the only pricing metric that matters. Nvidia doesn't owe gamers anything; they owe their shareholders profits. If the card sells, it's not priced too high. That's it. Just because the xx70 was good value in the past doesn't mean Nvidia has to do that again.

Also, isn't everyone getting paid way more these days? They are where I live, in the PNW of the USA. I do the same job and I've seen my salary more than double over the last 5 years. The sign outside my local McDonald's has gone from advertising jobs at $14 an hour to $25 an hour in that time. For me $1,000 is affordable; my monthly car payment is more.

I'm actually running a 3070 Ti. I'm happy with it for 1440p, but it's a bit miserable when I plug in my shiny new 4K OLED TV that supports 144Hz. I won't buy the new GPUs at the prices they are selling at, but I don't feel angry or upset at Nvidia for this. To me it's obvious what's going on: the market for these things has exploded, and now there are way more people with way more money out there buying graphics cards.
 
Correct^
I've explained this many times over, since RDNA and its new software stack versus Ampere and its GFE/CUDA hilarity for gamers.

When you had the (at the time) $679 6900 XT beating the $1,800 RTX 3090 in Call of Duty, everyone knew that RDNA was smaller and offered higher performance; everyone, that is, except the lemmings bemused by Nvidia's hand-waving and glitzy marketing. Sadly, many consumers live in a vacuum and don't play alongside PC friends, so they have no real-life hardware experience and are left using the internet for reviews and opinions, most often filtered through marketing bias, again without any real-life experience. Just marketing fluff.

The fluff is over:
The RX 9070 XT (RDNA4) is going to be the new hot hors d'oeuvre, and it will serve up 80% of the gaming community. Price/performance is a winning ratio.

And late this summer, when everyone's mouth is watering over RDNA4's dominance...

Dr. Lisa Su will slap the entrées on the table (and flip the Blackwell/CUDA paradigm on its head), as AMD will reportedly announce high-end custom chiplet gaming cards by the end of the year (based on the MI400 chiplet with HBM), starting at $1,300? ~$2,400? (*shrugs)

AMD's thinking is: if someone is going to spend $1,500+ on a gaming card (not wanting a CUDA workstation with pseudo-gaming marketing), THEN gamers will surely spend ~$1,500 on an actual gaming card built specifically for THEM, using second-generation chiplet-based modular architecture built on AMD's CDNA "Next" architecture, sharing resources with the PRO cards.

AMD might beat NVidia at their own game in high-end gaming cards. Or so the rumor goes...
 
Strange measurements... I can't get more than 280W while mining on a 4070 Ti Phantom, and you measured what, 360W during gaming?? lol. I never drew that much while gaming even with a 3090 FTW3; mining, no problem, 380W.
 
I would like to thank Steve for his review, another excellent set of baseline numbers and breakdowns (albeit missing a few data points, XTX power consumption, etc.).

I would like to point out that my non-EVGA RX 7900 XTX is 2 years old, sitting next to my outdated Ryzen 7700X. Both are itching for an upgrade...
RTX 50 is almost a joke..

That 7700X is probably holding your GPU back. If you don't want to buy a new GPU, seriously think about getting a 9800X3D and giving your GPU more legroom to perform.
 
The reason Nvidia was able to push their frequencies so high is that they were on 4nm, not because AMD screwed up; it was the 5nm process that clocked lower.

Not to mention that the 7000 series was MCM, adding to the complexity.

As for the 7800 XT, it was also an MCM chip, so the packaging cost was much higher. In the end, it would not change much in terms of pricing versus a monolithic mid-range GPU design. The BoM is pretty close, and if AMD really wanted to, they could charge $550 as an MSRP, but they will not. They would be fools to do so when they can sell it at a price similar to the 5070 Ti's, given the supply situation and the disappointing generational uplift.
 
You know, people accuse AMD all the time of blowing opportunities left and right (and with some justification, to be fair). But it's not like NVIDIA doesn't do the same. Look at the "two RTX 4080 models" debacle from last generation. That was nothing compared to the horrendous missteps of this generation. Let's count them off:

1. A "generation" produced on the exact same node that fails to provide a true generational uplift
2. The worst stock release they have ever done
3. The failure to advance their own technological capabilities to a usable level (i.e., ray tracing)
4. The most unaffordable pricing scheme ever implemented for their product
5. Dubious engineering and QA control resulting in damaged connectors and PSUs, a serious fire hazard, and possibly an upcoming class action lawsuit

The sum of all the failures above is that they are now unable to maintain their ironclad grip on the gaming market. There's no supply of either past- or current-generation cards to support it, and most people can't afford the cards when they do appear. Were it not for Intel's own woes, Intel could have swooped in and taken a huge amount of midrange market share.

Now AMD has the opportunity staring them in the face and the field completely wide open. Yes, they could mess it up but the fact that they actually delayed their own launch, however inconvenient it was to suppliers, vendors and customers, shows to me that they are at least assessing the situation carefully and are poised to strike. They may mess it up, but let's not pretend like they're the only company in this space making mistakes. The reason why they're in a prime opportunity spot right now is due to NVIDIA's own missteps.

P.S. And for those who say that GPU prices are essentially handing the gaming market right back to consoles, I agree with you, and once again, that is solely NVIDIA's fault. We went from "the PS5 Pro is too expensive" to "the PS5 Pro is a bargain" in a flash. Oh, and what company benefits, aside from the console makers themselves, if NVIDIA drives the PC gaming market into the console arena? That's right: AMD.
 
IMO, the performance of the RTX 5070 Ti is quite good, and it could be an upgrade option for owners of the 2000 series, like myself..
but it's a shame that the price is unreasonable..
 