Radeon RX 7900 XT vs. GeForce RTX 4070 Ti: 50 Game Benchmark

I completely agree with you about the pricing and ray-tracing being a gimmick. I did want to ask though, have you tried using FSR and if so, is there really that much difference between FSR and DLSS? I've never been able to tell from looking at reviews.
It depends; in most cases you won't notice the difference unless you zoom in. Some games may have scenes with a bit of flicker that you wouldn't see with DLSS.
 
I completely agree with you about the pricing and ray-tracing being a gimmick. I did want to ask though, have you tried using FSR and if so, is there really that much difference between FSR and DLSS? I've never been able to tell from looking at reviews.
I've tried and used FSR in some games, but if DLSS is present I typically use that; DLSS seems clearer, with fewer artifacts. TechSpot and TechPowerUp have plenty of side-by-side screenshots with a slider that you can check.
 
Hear, hear (or is that here, here?)
You were right the first time, it is indeed "Hear, hear". (y) (Y)
Anyway, I've nothing but good to say and think about AMD's rise from what most certainly looked like ashes within the last decade. Further, coming from the near-mandatory Intel/Nvidia ecosystem during that period, my jump to Zen 3 and RDNA 2 has been worthwhile and regret-free... never mind the incredible price difference for what amounted to the same thing when I had to make a necessary upgrade in May 2021 (my 6800 XT: £1200... pricey, sure, but any 3080 10GB: £1800-2400).
Ah, another member of the RX 6800 XT club! That's exactly what I have too. :D
But yeah, I'll agree... any goodwill they may have accrued (even against the hype and bias of many in the Nvidia camp, however misinformed) or could gain is in real danger of disappearing due to some duff moves of late. I'd still take the 7900 XTX over a 4080 (should I need or see fit to upgrade my 6800 XT at 3440x1440 this gen), certainly given that RT is neither here nor there for me and the latter is still some £200 more for an equally premium card. But there's a real risk of Intel stealing the show from them by the time next gen rolls out.
Yep. It just makes me wonder WTF happened in AMD's boardroom that would prompt them to shoot themselves in the foot so badly and so many times. It's hard to understand how a company under the leadership of Dr. Lisa Su could do so well for so long and then just do an about-face like that. I can tell you that if I were in charge of that company, margins would be lower, market share would be much higher, and total revenue and profit would be up. I would have consumers lining up around the block to adopt the AM5 platform, and AMD's position in both the consumer CPU and GPU spaces would be secure.

Interestingly, AMD has managed to not drop the ball on the server side with EPYC and Radeon Instinct. Maybe Lisa decided to focus her attention on that (more profitable) side of the business and left the consumer side to some incompetent lackey of hers. Whoever it is, I hope that their career is over because they made an absolute mess of things.
 
I've tried and used FSR in some games, but if DLSS is present I typically use that; DLSS seems clearer, with fewer artifacts. TechSpot and TechPowerUp have plenty of side-by-side screenshots with a slider that you can check.
Yeah, I've looked at both and I couldn't see anything that I'd notice while gaming either way. I asked because I figured that you'd have tried both and might know how big the difference is. I'm sure that DLSS is better, I just think that the difference is not really enough to be noticeable when actually gaming.

After all, I haven't heard about anyone actually complaining about either implementation so I figure that it's probably a matter of "GOOD" (DLSS) and "GOOD ENOUGH" (FSR).

Thanks for your reply. :)(y) (Y)
 
Yeah, I've looked at both and I couldn't see anything that I'd notice while gaming either way. I asked because I figured that you'd have tried both and might know how big the difference is. I'm sure that DLSS is better, I just think that the difference is not really enough to be noticeable when actually gaming.

After all, I haven't heard about anyone actually complaining about either implementation so I figure that it's probably a matter of "GOOD" (DLSS) and "GOOD ENOUGH" (FSR).

Thanks for your reply. :)(y) (Y)
Yeah, it's hard to spot differences in still images, easier in motion. Both can be good, but I prefer DLSS most of the time.

Both can be bad as well; for me, a bad implementation is when the image gets blurry and soft-looking. DLSS 1 was kinda terrible, but DLSS 2 is generally good, though it still requires a good implementation, just like FSR :)
 
Both cards are too expensive for what you get.

Please don't include RT performance in the same chart as raster! I am using RTX but I never use RT because I prefer a much higher fps; I could not care less about RT performance, actually. I like RTX mostly because of DLSS and DLDSR, not RT. I think RT is a gimmick, even on the most expensive cards like the 4080 and 4090. You don't buy a 4090 for 1440p RT gaming; you buy a 4090 for 4K gaming, and enabling RT there just destroys performance. The 4090 can deliver good frame rates at 4K in pretty much all games maxed out - until you enable RT...

I don't really like DLSS 3 either, so that is not going to help. DLSS 2 at Quality or Balanced can work in some games at 4K, not lower than that, and no fake frames from DLSS 3, thank you. I want smooth gameplay with low input lag, and AI-generated frames won't take my input into account... They might work in slow single-player games, but they suck for fast-paced games.


This^^ pretty much sums it up.
Additionally, I have never met or known anybody who spends $800+ on a GPU to get fewer frames. I touch base with a mix of gamers daily, and nobody cares about anything other than more frames. Understand that many pros do NOT use DLSS or even G-Sync.

 
By all accounts, the 7900 XTX compared to the 6900 XT should seem like a better deal than it does: you are getting nearly 50% more performance for the same MSRP. The problem is that the 6950 XT was selling for under $899 when it launched, and the 6900 XT could be found for as low as $699. So it definitely did not feel like AMD was keeping things the same from one gen to the next. The 7900 XT is 25% faster than the 6900 XT and 15% faster than the 6950 XT, but it still doesn't feel like a bargain, even at $800. I think the 7900 XT should have been the 7800 XT, as it would have been 50% faster than the previous gen and would have felt really good priced at $699 (which seems to be where it's heading anyway).

AMD sort of played the same game here as Nvidia did. The 7900 XT is named incorrectly; however, AMD was not as egregious about it. But the 7900 XT is basically their 4080 12GB (why did they not get nailed for this like Nvidia did?). By adding the extra X to the XTX, they were able to avoid Nvidia's "same name" mistake, but it's practically the same thing. AMD's generation-over-generation increase is actually pretty good, not as great as Nvidia's, but Nvidia really overreached with the 4090 to make sure they stayed in control of the performance segment. AMD missed a great opportunity, as I think a $699 7900 XT would have been a no-brainer. AMD claims to be keeping things lower priced, but honestly, the 7900 XT price tag actually validates Nvidia's prices. It was not enough of a generational leap to warrant the price increase, especially considering it should have been a 7800 XT. At $699, AMD would have invalidated Nvidia's claims and prices and had the most desirable GPU on the market. Maybe the margin just is not there; maybe we're blaming AMD/Nvidia for high prices when it is TSMC that should shoulder a lot of the blame. I don't know... But none of these new cards are the least bit attractive.
 
while the cheapest 4070 Ti is $830, with most priced at $850 or more.

A very quick check of B&H Photo showed 4 models listed at $799, 3 out of stock but listed nonetheless. Also there was a model priced at $819. NewEgg has 2 at $799 and one at $809. And, yes, many priced well above MSRP.
 
The current price is decent; it's a fu**ing 20 GB VRAM GPU. But Nvidia's GPU is expensive relative to manufacturing costs.
Do you have those costs? From what I'm seeing Nvidia isn't driving high margins right now. As of Jan 2023 their margins were running around 16%. Now, that is everything, not just gaming GPUs. But, I am very confident that gaming GPUs have far less profit than embedded and data center products.
 
By all accounts, the 7900 XTX compared to the 6900 XT should seem like a better deal than it does: you are getting nearly 50% more performance for the same MSRP. The problem is that the 6950 XT was selling for under $899 when it launched, and the 6900 XT could be found for as low as $699. So it definitely did not feel like AMD was keeping things the same from one gen to the next. The 7900 XT is 25% faster than the 6900 XT and 15% faster than the 6950 XT, but it still doesn't feel like a bargain, even at $800. I think the 7900 XT should have been the 7800 XT, as it would have been 50% faster than the previous gen and would have felt really good priced at $699 (which seems to be where it's heading anyway).

AMD sort of played the same game here as Nvidia did. The 7900 XT is named incorrectly; however, AMD was not as egregious about it. But the 7900 XT is basically their 4080 12GB (why did they not get nailed for this like Nvidia did?). By adding the extra X to the XTX, they were able to avoid Nvidia's "same name" mistake, but it's practically the same thing. AMD's generation-over-generation increase is actually pretty good, not as great as Nvidia's, but Nvidia really overreached with the 4090 to make sure they stayed in control of the performance segment. AMD missed a great opportunity, as I think a $699 7900 XT would have been a no-brainer. AMD claims to be keeping things lower priced, but honestly, the 7900 XT price tag actually validates Nvidia's prices. It was not enough of a generational leap to warrant the price increase, especially considering it should have been a 7800 XT. At $699, AMD would have invalidated Nvidia's claims and prices and had the most desirable GPU on the market. Maybe the margin just is not there; maybe we're blaming AMD/Nvidia for high prices when it is TSMC that should shoulder a lot of the blame. I don't know... But none of these new cards are the least bit attractive.
My guess is that AMD, because they sell far fewer units than Nvidia, felt the need to drive profitability and chose higher prices. It's clear they could have priced the XT $100 lower at release. But they only did so when it was apparent that no one was buying it and the Nvidia 4070Ti was seen as a better value.

I think there are several factors keeping prices high. Costs have gone up, sales are down, profits are down and manufacturers are still drunk on crypto-mining revenues. I'm predicting prices will remain high until the second half of the year. Then you'll see a lot of talk about new generation cards and prices will slowly trickle down to where they should have been at release.
 
Do you have those costs? From what I'm seeing Nvidia isn't driving high margins right now. As of Jan 2023 their margins were running around 16%. Now, that is everything, not just gaming GPUs. But, I am very confident that gaming GPUs have far less profit than embedded and data center products.
The chip should cost around $125-130, and the XT's main chip is similar in size, so we can assume the same price; however, there are also six cache chiplets, each costing around $14 ($84 total).

The last price I found for GDDR6 16 Gbps was $16/GB. GDDR6X should be a bit more expensive (a guess); let's say $18/GB, so $216 for 12 GB. Other costs should add up to about $150.

4070 Ti = $130 + $216 + $150 = $496
7900 XT = $130 + $84 + $320 + $150 = $684

The XT's margins are pretty thin considering development and design costs.
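Just to make those sums easy to check, here's a minimal sketch of that bill-of-materials estimate; every figure in it is the rough guess from above (die, cache chiplets, memory, ~$150 "other"), not an actual AMD or Nvidia cost.

```python
# Rough BOM sketch using the guesses above; none of these are confirmed costs.

def board_cost(gpu_die, cache_dies, mem_gb, mem_price_per_gb, other=150):
    """Sum an estimated bill of materials for a graphics card."""
    return gpu_die + cache_dies + mem_gb * mem_price_per_gb + other

# 4070 Ti: ~$130 die, no separate cache chiplets, 12 GB GDDR6X at ~$18/GB
print(board_cost(gpu_die=130, cache_dies=0, mem_gb=12, mem_price_per_gb=18))       # 496

# 7900 XT: ~$130 GCD, six MCDs at ~$14 each, 20 GB GDDR6 at ~$16/GB
print(board_cost(gpu_die=130, cache_dies=6 * 14, mem_gb=20, mem_price_per_gb=16))  # 684
```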
 
The chip should cost around $125-130, and the XT's main chip is similar in size, so we can assume the same price; however, there are also six cache chiplets, each costing around $14 ($84 total).

The last price I found for GDDR6 16 Gbps was $16/GB. GDDR6X should be a bit more expensive (a guess); let's say $18/GB, so $216 for 12 GB. Other costs should add up to about $150.

4070 Ti = $130 + $216 + $150 = $496
7900 XT = $130 + $84 + $320 + $150 = $684

The XT's margins are pretty thin considering development and design costs.
I think those costs for the 7900 XT are a bit too high.

GCD: assuming a 304 mm² die on TSMC 5nm ($17k/wafer) at 90% yield (the 7900 XT is cut down from the 7900 XTX die), that works out to around $100.

MCD: assuming a 38 mm² die on TSMC 6nm ($10k/wafer) at 95% yield (an ultra-small die), that works out to around $7.

Keeping the other figures the same, that would make $100 + $42 + $320 + $150 = $612. That makes more sense IMO.

For the 4070 Ti, Nvidia uses a custom process that probably costs more than the "normal" one.
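For anyone who wants to sanity-check those per-die numbers, here's a minimal sketch of the usual dies-per-wafer arithmetic, assuming a 300 mm wafer and the wafer prices and yields quoted above; the edge-loss formula is a common textbook approximation, not foundry data.

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Gross dies per wafer using a standard edge-loss approximation."""
    r = wafer_diameter_mm / 2
    return math.floor(math.pi * r**2 / die_area_mm2
                      - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(wafer_price, die_area_mm2, yield_fraction):
    """Spread the wafer price over the dies that come out usable."""
    return wafer_price / (dies_per_wafer(die_area_mm2) * yield_fraction)

gcd = cost_per_good_die(17_000, 304, 0.90)  # ~$97, close to the ~$100 above
mcd = cost_per_good_die(10_000, 38, 0.95)   # ~$6, close to the ~$7 above
print(round(gcd), round(mcd), round(gcd + 6 * mcd))
```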
 
MCD: assuming a 38 mm² die on TSMC 6nm ($10k/wafer) at 95% yield (an ultra-small die), that works out to around $7.
While the MCDs are tiny, there’s no usage (at present) for dies with any defects — for the 7900 cards, it’s the full die, at the required clock speeds, or nothing. So the yield is unlikely to be quite that high.

The same is also to be true of the GCD, although not to the same extent — the 7900 XTX uses a full Navi 31 and the 7900 XT only has 5 WGPs disabled (rest of the die is normal). One wafer is unlikely to yield 90% for both SKUs, as any sporting cache defects or more than 5 WGPs with problems can’t be used.
 
I think those costs for the 7900 XT are a bit too high.

GCD: assuming a 304 mm² die on TSMC 5nm ($17k/wafer) at 90% yield (the 7900 XT is cut down from the 7900 XTX die), that works out to around $100.

MCD: assuming a 38 mm² die on TSMC 6nm ($10k/wafer) at 95% yield (an ultra-small die), that works out to around $7.

Keeping the other figures the same, that would make $100 + $42 + $320 + $150 = $612. That makes more sense IMO.

For the 4070 Ti, Nvidia uses a custom process that probably costs more than the "normal" one.
As already mentioned, a chip of that size would not reach 90% yield; I think it is somewhere between 70% and 80%.
 
While the MCDs are tiny, there’s no usage (at present) for dies with any defects — for the 7900 cards, it’s the full die, at the required clock speeds, or nothing. So the yield is unlikely to be quite that high.

The same is also to be true of the GCD, although not to the same extent — the 7900 XTX uses a full Navi 31 and the 7900 XT only has 5 WGPs disabled (rest of the die is normal). One wafer is unlikely to yield 90% for both SKUs, as any sporting cache defects or more than 5 WGPs with problems can’t be used.
We don't actually know whether the MCD needs to be a perfect die or whether it can tolerate small defects. If it can, then 95% is quite possible. Again, 37.5 mm² for just 16 MB is a pretty big area and IMO leaves room for tolerating defects.

Probably the yields for the 7900 XTX are lower and for the 7900 XT higher, since the 7900 XT can tolerate more defects. Again, we have basically no way of telling how much redundant circuitry manufacturers actually put on a chip, but it rarely makes sense to manufacture ultra-large dies that must be "perfect" (zero defects on the whole die) to be usable. So while the yield of zero-defect chips may not be 90%, the effective yield might well be.

I'd gladly accept more information about this, since it's pretty hard to find any actual data.
As already mentioned, a chip of that size would not reach 90% yield; I think it is somewhere between 70% and 80%.
Again, the zero-defect yield might not be that high, but it's hard to know how many defects actually make a chip unusable. Because of that, it makes sense to put some redundancy on the die, considering current prices, but that's something I rarely find real information on. As for the MCD, while the packaging interfaces take some space, it just seems like a pretty big chip considering the small amount of memory. For comparison, the 6700 XT (Navi 22) is 335 mm² and includes 96 MB of Infinity Cache in that, while the MCDs on the 7900 XT total 225 mm².
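Since the disagreement is really zero-defect yield versus effective yield with redundancy, here's a minimal sketch using the classic Poisson defect model; the defect density is a placeholder assumption on my part, since real N5/N6 figures aren't public.

```python
import math

def zero_defect_yield(die_area_mm2, defects_per_cm2):
    """Poisson model: probability that a die has no defects at all."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100)

def tolerate_one_defect(die_area_mm2, defects_per_cm2):
    """Effective yield if redundancy lets a die absorb one defect."""
    lam = defects_per_cm2 * die_area_mm2 / 100
    return math.exp(-lam) * (1 + lam)

D0 = 0.07  # defects per cm^2 (placeholder, not a published TSMC number)

print(f"GCD 304 mm2, zero-defect yield: {zero_defect_yield(304, D0):.0%}")   # ~81%
print(f"MCD  38 mm2, zero-defect yield: {zero_defect_yield(38, D0):.0%}")    # ~97%
print(f"GCD if one defect is tolerable: {tolerate_one_defect(304, D0):.0%}") # ~98%
```

With those placeholder numbers, the ~70-80% and ~90% positions above aren't actually far apart: both can be true depending on whether a die has to be defect-free or merely repairable.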
 
One major detractor here is the drivers themselves; AMD's drivers are a massive mess, while Nvidia's are well organized. Nvidia's drivers are also far, far more robust than AMD's when it comes to something like DSR; AMD's VSR is pretty much non-existent for 21:9 monitors, while Nvidia offers plenty of resolution modes for the exact same display. You can't even do 2560x1440 on a 3440x1440 ultrawide monitor with AMD drivers; 1920x1080 is the only available option. It's really too bad AMD is such a sh*tshow on driver features and interface; I will probably be going green next time due to them being so tits up.
 
So, between these two cards, I guess it depends on which company you hate less (or adore more). Performance is a moot subject between certain gpus, at this point.

I stick with NVidia simply because their products have seemed "ready for showtime" right out of the box more so than AMD's. And DLSS and RT, which may be BS marketing to some, do seem more interesting and advanced to me. I've never had an issue with an NVidia card, which does carry some weight for me...

Yes, NVidia is a much larger company, but I tire of AMD fans who seem to portray the rivalry as a David vs Goliath scenario. AMD is a huge and rich company in its own right.

I just want the best value and trouble-free product for the handsome sums they charge. And sorry, but NVidia does seem to have the edge here, in general.
 
A very quick check of B&H Photo showed 4 models listed at $799, 3 out of stock but listed nonetheless. Also there was a model priced at $819. NewEgg has 2 at $799 and one at $809. And, yes, many priced well above MSRP.
Out-of-stock prices are always wonky. Ignore them. Once back in stock, the prices are generally adjusted back up.
 
One major detractor here is the drivers themselves; AMD's drivers are a massive mess, while Nvidia's are well organized. Nvidia's drivers are also far, far more robust than AMD's when it comes to something like DSR; AMD's VSR is pretty much non-existent for 21:9 monitors, while Nvidia offers plenty of resolution modes for the exact same display. You can't even do 2560x1440 on a 3440x1440 ultrawide monitor with AMD drivers; 1920x1080 is the only available option. It's really too bad AMD is such a sh*tshow on driver features and interface; I will probably be going green next time due to them being so tits up.
AMD's drivers are fine, and their software package is leagues ahead of Nvidia's GeForce Experience application. They do lack some features, but that's generally not a major issue.

Features aside, I don't know why you have a problem with the interface when it is so much better than what you get from the ancient Nvidia interface.
 
AMD's drivers are fine, and their software package is leagues ahead of Nvidia's GeForce Experience application. They do lack some features, but that's generally not a major issue.

Features aside, I don't know why you have a problem with the interface when it is so much better than what you get from the ancient Nvidia interface.
AMD's control panel looks more modern, but the features inside don't really beat Nvidia's.

What is AMD's answer to DSR, and especially DLDSR? VSR is not it.

I think AMD lacks too many features in general; most of their features are really rip-offs of Nvidia's: G-Sync > FreeSync, RT performance (AMD is 2-4 years behind), DLSS > FSR, etc.

And performance in less popular titles, and especially early-access games, just seems far superior on Nvidia.
 
AMD's drivers are fine, and their software package is leagues ahead of Nvidia's GeForce Experience application. They do lack some features, but that's generally not a major issue.

Features aside, I don't know why you have a problem with the interface when it is so much better than what you get from the ancient Nvidia interface.

The interface is a clunky mess designed for the "generation something shiny" crowd. BLUF (bottom line up front) is always the right answer for interfaces, and that's what the Nvidia control panel is: concise, with everything on one screen in a row/list. AMD's features are also much more primitive; something like ultrawide support or DSR on Nvidia just smokes AMD's equivalent (or non-existent) options. On a 3440x1440 display with Nvidia I can easily switch from 1920x1080 to 2560x1440 to 3440x1440 to 3768x1577 to 4213x1764 to 4587x1920 to 4865x2036 to 5160x2160 to 5958x2494 to 6880x2880; on the AMD side you are SOL, with only 1920x1080 and 3440x1440 as available options. Try to enter anything above 3440x1440 as a custom resolution with AMD and it will be invalid on my 3440x1440 monitors.
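For what it's worth, that list of Nvidia resolutions lines up with DSR's pixel-count scale factors; the sketch below is my own illustration (not driver code) that just multiplies each axis of 3440x1440 by the square root of the factor.

```python
import math

# DSR factors scale total pixel count, so each axis grows by sqrt(factor).
NATIVE = (3440, 1440)
DSR_FACTORS = [1.20, 1.50, 1.78, 2.00, 2.25, 3.00, 4.00]

for factor in DSR_FACTORS:
    scale = math.sqrt(factor)
    width, height = round(NATIVE[0] * scale), round(NATIVE[1] * scale)
    print(f"{factor:.2f}x -> {width}x{height}")

# The output lands within a few pixels of the list above; the driver's own
# rounding of the 1.78x (16:9) factor accounts for the small differences.
```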
 
I bought my 1080 Ti in 2017 for $750, which was retail price at the time. Second-best card in the world at that time. Inflation since then has been over 22%. That same $750, adjusted for inflation to today, would mean spending $920 on the card.

Conversely, this $850 card, today, is the equivalent of spending $692 on a card in 2017. See for yourself: https://www.usinflationcalculator.com/

Y'all can complain as much as you like, but this seems fair enough to me at this point. Yeah, 22% inflation in just six years is INSANE, but you can hardly expect AMD to fix that themselves. Sure it's an expensive card, but it's once again the second-best card you can buy from them. Why are we all expecting them to charge $450 for that in this day and age?
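The arithmetic behind those numbers is just cumulative CPI scaling; here's a minimal sketch assuming roughly 22.6% cumulative US inflation from 2017 to early 2023 (my approximation of the "over 22%" figure above, not an exact BLS value).

```python
# Cumulative US CPI inflation from 2017 to early 2023, roughly 22-23% (approximate).
INFLATION = 0.226

# $750 in 2017 expressed in today's dollars:
print(round(750 * (1 + INFLATION)))   # ~920

# $850 today expressed in 2017 dollars:
print(round(850 / (1 + INFLATION)))   # ~693, about the $692 quoted above
```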
 
Out-of-stock prices are always wonky. Ignore them. Once back in stock, the prices are generally adjusted back up.
It was a sale. I don't know if they had any stock before I saw the page. Either way, the article is off on the pricing. There are 4070 Ti cards selling at MSRP; not many, but they do exist.
 
The chip should cost around $125-130, and the XT's main chip is similar in size, so we can assume the same price; however, there are also six cache chiplets, each costing around $14 ($84 total).

The last price I found for GDDR6 16 Gbps was $16/GB. GDDR6X should be a bit more expensive (a guess); let's say $18/GB, so $216 for 12 GB. Other costs should add up to about $150.

4070 Ti = $130 + $216 + $150 = $496
7900 XT = $130 + $84 + $320 + $150 = $684

The XT's margins are pretty thin considering development and design costs.
So, those are estimated costs. A quick check on eBay shows GDDR6X to be closer to $22/GB, which would add another $50 or so (12 GB × $4/GB ≈ $48). But, as I am sure you know, manufacturing costs are just one piece of the puzzle; shipping, marketing, and more contribute to the overall real cost of any product. As I stated, Nvidia is showing around 16% margins recently. How that applies to GPU sales is unknown; both Nvidia and AMD tend to obscure specific details by product line. I did see that Nvidia's gross margins are around 57% versus AMD's at 43%.
 