AMD Radeon RX 7900 XTX Review: RDNA 3 Flagship is Fast

Looking at the spec sheet (thx to neeyik), it's even more disappointing. The flagship Radeon has a wider memory bus, more total cache, and more transistors overall under the hood than the competition. The 7900 XTX also consumes more power than the 4080.

IDK the reason; what's to blame? Probably first-gen MCM, weak drivers?
Just hoping to see some competition in the mid-range, where both RDNA 3 and Ada use monolithic GPUs.

Somewhere...
In a galaxy far, far away...
There's Intel crying, but no one can hear it.
 
The price is too high, just like the 4080's.
$799 maybe; $699 is what they're worth and what would sell well.
Just because people spent an arm and a leg during the GPU shortage doesn't mean these cards are worth it. But I digress; those that did spend a ridiculous amount back then have opened Pandora's box.

I guess we will have to wait and see what comes in the mid-range tier; $799 for 50% less performance?
If gamers were smart, no one would buy these cards.



 
IDK the reason; what's to blame? Probably first-gen MCM, weak drivers?
The RTX 4080's L2 cache is really helping out here, and the SM design is pretty mature now, having gone through three iterations of tweaking (Turing > Ampere > Ada Lovelace). Beyond the Tensor and RT cores, there were virtually no changes to the SM.

For RDNA 3, there are some significant alterations, the most important of which is the doubling of ALUs in the Streaming Processors. One can look at this as being akin to doubling the number of SPs, but it's not quite the same thing. While the caches in the Compute Units have doubled in size, the register file is only 50% bigger (which, in theory, shouldn't be a problem, but if games are hitting the register limit in Navi 21, they'll be absolutely bouncing off it in Navi 31). The doubled FP32 throughput figure is only achievable in specific circumstances, and even when those circumstances apply, it's only one part of the long rendering process.
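
To illustrate what "specific circumstances" means, here's a rough sketch in plain C (not shader code, compiler output, or actual RDNA 3 ISA; the function names are made up for illustration): the doubled rate only applies when there are pairs of FP32 operations that don't depend on each other's results.

/* Hypothetical illustration only: dual-issue can pair independent FP32 ops. */
void independent_fmas(int n, const float *a, const float *b, const float *c, float *out)
{
    for (int i = 0; i + 1 < n; i += 2) {
        out[i]     = a[i]     * b[i]     + c[i];      /* FMA #1                   */
        out[i + 1] = a[i + 1] * b[i + 1] + c[i + 1];  /* FMA #2 is fully independent,
                                                         so the two could, in
                                                         principle, be co-issued   */
    }
}

float dependent_chain(int n, const float *b, const float *c)
{
    float acc = 0.0f;
    for (int i = 0; i < n; i++)
        acc = acc * b[i] + c[i];  /* each step needs the previous result, so
                                     there is nothing to pair and no doubled rate */
    return acc;
}

Real shaders sit somewhere between these two extremes, which is why the on-paper doubling rarely shows up as a doubling in games.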

If one leaves the double ALU aspect out of the equation, the 7900 XTX has 20% more CUs and an 8% higher boost clock than the 6950 XT, double the L1 and L2 cache, plus a lot more global memory bandwidth. So the difference of 34%, across the 16 games at 4K, seems to be a reasonable fit to those aspects, if a touch lacking. Improvements to the compiler in the drivers should help over time, though.
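
As a rough sanity check (assuming the usual 96 vs 80 CU counts behind that 20% figure), unit count and clocks alone already predict most of that gap:

#include <stdio.h>

/* Back-of-the-envelope scaling check using the figures quoted above. */
int main(void)
{
    double cu_ratio  = 96.0 / 80.0;  /* 7900 XTX vs 6950 XT CU count: +20% */
    double clk_ratio = 1.08;         /* ~8% higher boost clock             */

    printf("CU count x clock alone: ~%.0f%% uplift\n",
           (cu_ratio * clk_ratio - 1.0) * 100.0);
    /* Prints ~30%, so the measured 34% at 4K is roughly what unit count and
       clocks predict, with the extra cache and bandwidth adding only a little. */
    return 0;
}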
 
the difference of 34%, across the 16 games at 4K, seems to be a reasonable fit to those aspects, if a touch lacking. Improvements to the compiler in the drivers should help over time, though.
If AMD can make up the 16% performance difference through driver updates, would that make Steve happy?
 
If my 980 Ti had made it this far and I was looking for a new card right now, I wouldn't be against a 7900 XTX or XT... if they were both $100 less. The current 4080 and 4090 are so far out of my price range that they're not even cards I would consider paying for.

I don't care about RT. I don't use DLSS. I put settings at a spot where I enjoy the game's performance and I play the game.

RT is a gimmick for me - you're generally going through a game too fast to see the slight benefits of what RT can offer and since I don't use RT, DLSS is moot.

I can't really say I have much good to say about this generation. The cards are overpriced if you ask me. Glad I'm not looking for a new GPU; I'll get another 3-5 years out of my 3080 and I'm okay with that.
 
I can't say that it's disappointing considering the context. The RT performance is actually better than I expected it to be (I don't care but many do). Sure, it's behind nVidia but that's to be expected since nVidia has an extra generation (2+ years) of RT development. As a result, hoping that they're at nVidia's level is just plain unrealistic.

Anyone who is disappointed that AMD isn't on par with nVidia in RT is expecting AMD to somehow advance in RT faster than nVidia. With everyone and their mother throwing money at nVidia, being disappointed that AMD hasn't caught nVidia is basically being disappointed that a miracle hasn't occurred. That's just plain stupid.

What the focus should be on is the fact that they've made a good step up from the RX 6000 series, to the point that RT is actually usable in several games without needing FSR enabled. That's a huge thing because it's the difference between "good enough" and "not good enough". Let's also not move the goalposts here, because the RT performance was "good enough" on the RTX 3000 series to the point that it was a big selling feature. Now the RX 7900 XTX is roughly on par with the RTX 3090 Ti in RT performance. There were enough people who bought the RTX 3080 specifically for RT, and the RX 7900 XT is better at RT than that card, which means it's usable. It would've been disappointing if there wasn't a significant improvement in RT performance, but that's not the case.

Then of course, there's the elephant in the room... When the RX 7900 XTX is clearly hamstrung in Forza Horizon, Steve rightly points it out:
"We think AMD's dealing with a driver related issue for RDNA 3 in Forza Horizon 5 because performance here wasn't good. There were no stability issues or bugs seen when testing, performance was just much lower than you'd expect. At 1440p, for example, the 7900 XTX was a mere 6% faster than the 6950 XT. And that sucks."
Yes, it does suck but it's also not a permanent thing. However, Steve makes sure that he bashes the performance numbers at higher resolutions despite being fully aware that there's a driver issue making a good showing impossible.

Then, where the RX 7900 XTX beats both the RTX 4080 and the RTX 4090, Steve dismisses it as an outlier, which could be called fair:
"These results are certainly outliers in our testing, but Modern Warfare 2 and Warzone 2 are very popular games, so this is great news for AMD."
The question is, why wasn't Steve equally dismissive of the Forza Horizon results when he knew that something wasn't right? Instead, he continued testing the game at higher resolutions, despite the fact that a bad result was already a foregone conclusion. If he had also dismissed this score as an outlier the way he did the others, I would have nothing to say here, but that did not happen:
"This margin improved a lot at 4K, but even so the 7900 XTX was just 16% faster than the 6950 XT, a far cry from the 50% minimum AMD suggested. This meant the new Radeon GPU was 5% slower than the RTX 4080, so a disappointing result all round."
Instead, he calls it disappointing that the RX 7900 XTX came within 5% of a card that costs 20% more while hamstrung by a driver issue (that is by no means permanent). Just imagine what the result will be when the bug is ironed out. Sure, it's disappointing for now, but it's clear that the RX 7900 XTX will be the faster card in this game.

NOTE: I have since noticed that the Forza Horizon test had Ray-Tracing TURNED ON:
[Attached charts: Forza Horizon 5 results at 1440p and 4K]

"This margin improved a lot at 4K, but even so the 7900 XTX was just 16% faster than the 6950 XT, a far cry from the 50% minimum AMD suggested. This meant the new Radeon GPU was 5% slower than the RTX 4080, so a disappointing result all round."
So, according to Steve's "logic", if the RTX 4080 is a paltry 5% faster at 4K, with RAY-TRACING SET TO HIGH, than the RX 7900 XTX, a card that is US$200 cheaper, then the Radeon's result is "disappointing". Are you fracking kidding me? The RX 7900 XTX comes to within 5% of the RTX 4080 at 4K with ray-tracing set to high? That's fracking INCREDIBLE, not disappointing! However, people who don't know better won't notice this insanity, and that's what you're counting on, eh?

ANY Radeon card that costs US$200 less than the RTX 4080 and is only 5% behind it in a game is ALREADY A WIN for the Radeon. The fact that RT was on and set to "HIGH" makes it a HUGE WIN for the Radeon. I don't know how this could be considered disappointing in the least.
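
To put a rough number on that, here's an illustrative cost-per-frame sketch at MSRP, using normalized frame rates and the ~5% gap cited above rather than the article's actual FPS data:

#include <stdio.h>

/* Illustrative cost-per-frame sketch at MSRP, assuming the ~5% gap cited above. */
int main(void)
{
    double xtx_fps   = 100.0;  /* normalize the RX 7900 XTX to 100 fps     */
    double n4080_fps = 105.0;  /* ~5% faster, per the quoted 4K RT result  */

    printf("RX 7900 XTX ($999):  $%.2f per frame\n", 999.0 / xtx_fps);
    printf("RTX 4080 ($1,199):   $%.2f per frame\n", 1199.0 / n4080_fps);
    /* ~$9.99 vs ~$11.42: the 4080 costs roughly 14% more per frame here. */
    return 0;
}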

Then there's this little pot-shot that I still don't understand:
"The 4K data is much the same, as in the 7900 XTX and RTX 4080 are on par, though this time that meant the 7900 XTX was just 14% faster than the 3090 Ti and 30% faster than the 6950 XT."
Yeah, but it also means that a card that costs $200 MORE is also just 14% faster than the 3090 Ti and 30% faster than the RX 6950 XT. Steve's wording is clearly misleading here because he's framing this as a negative for the RX 7900 XTX when it's a much bigger negative for the RTX 4080, because it costs an extra $200.

Let's also remember that Techspot originally gave the RTX 4080 a score of 90/100. Truthfully, this is simply a case of the cards all being terrible values, and while my review of the RX 7900 XTX wouldn't be glowing either, it would at least be comparable to my review of the RTX 4080. That wasn't the case here at Techspot, however; just look at the sub-headings from the two articles:
AMD Radeon RX 7900 XTX Review
RDNA 3 Flagship is Fast

I've never seen a more generic sub-heading in my life. I don't even know why you bothered. It's clear that you didn't consider it to be worth the effort.

Now the RTX 4080:
Nvidia GeForce RTX 4080 Review
Fast, Expensive & 4K Gaming Capable All the Way

Huh, look at that. You actually put effort into this one.

So, I guess then that the RX 7900 XTX is not "Expensive" or "4K Gaming Capable All the Way", eh? Then of course, there's also the fact that you completely ignored the 8GB VRAM difference between the RX 7900 XTX and RTX 4080. I guess that when you're paying through the nose, longevity isn't important, eh?
- This is just inexcusable

When you add up all of these little (and some aren't so little) problems with this article, it makes the RTX 4080 article look like an nVidia love-in by comparison. It's pretty clear which card Techspot is trying to promote, and it's not the RX 7900 XTX. I have never wanted to say something like this, but I can't deny it any longer.
 
I'm not looking for a new GPU, but if I were, I just couldn't get past AMD's utterly lousy ray tracing performance, and the $200 price differential to the 4080 isn't big enough to convince me otherwise.

Of course, if all you care about is raster performance, then you may as well take the $200 saving and spend it elsewhere.
Depending on how you look at it, you're not wrong. Consider that if I wanted to build a new PC from the ground up, I could go Intel 13th gen, DDR4, and a less expensive mobo with a 4080 and come out cheaper overall. If you're just upgrading the GPU, then that $200 might matter.

I have to wonder what Nvidia will do. If they drop the 4080 price to $1000 or even $950, that could spell trouble for AMD. If the 4070 Ti comes in at $700-750, it could hurt 7900 XT sales.

Overall, the XTX falls just about where I thought it would. It's a 4080 competitor and its primary value is that it is less expensive. Of course, we haven't seen the AIB pricing yet and I've heard that they will be a couple hundred more than the Reference design. At that point, the 4080 FE might make more sense.
 
But design of the cooler? ... Irrelevant.
Not to me! The 4080 taking 3.5 slots would be a major negative as I can only just fit a 2.5-slot card. Monster size is no small thing; literally and figuratively. And while it can be justified for the 4090, it apparently isn't even needed for the 4080.
 
Imo it looks like a good enough card. If I was going to replace my 3070 Ti, I'd wait a few weeks to see how all of AMD's software stacks up against Nvidia's; in my experience I've had more problems on that side than with actual performance, and with the prices all these cards are going for, the whole package needs to be worth it.
 
The elephant in the room should be addressed. RDNA 2, though it lagged behind in RT, was otherwise competitive in raster all the way up to the 3090 tier. That was an amazing showing for AMD. The 6900 XT was $1000 and went toe to toe with Nvidia's $1500 GPU, not quite as good at 4K and far behind at RT, but still $500 cheaper. But RTX 40 left RDNA 3 in the dust here, and there is no denying that. The 7900 XTX is not competitive with the 4090. And even at $200 less than the 4080, it is not nearly the value proposition that even the 6900 XT was. There was a reason AMD seemed so low-key and defeated when announcing these cards: they really are. I hate to admit this, but Nvidia cleaned their clock with the 4090. Nvidia might need to drop the 4080 price a little, but certainly not as low as some of us might have hoped. On top of that, Nvidia has plenty of room for a 4080 Ti on AD102; they have a whole tier of GPUs that AMD can't touch. Nvidia has essentially won, at least for now.

There are rumors that a more powerful AMD GPU might be in the works that uses V-Cache tech for some apparently big gains. We'll just have to wait and see, but at the moment, Nvidia's advantage grew this generation; it did not shrink as many of us hoped it would.
 
The elephant in the room should be addressed. RDNA 2, though it lagged behind in RT, was otherwise competitive in raster all the way up to the 3090 tier. That was an amazing showing for AMD. The 6900 XT was $1000 and went toe to toe with Nvidia's $1500 GPU, not quite as good at 4K and far behind at RT, but still $500 cheaper. But RTX 40 left RDNA 3 in the dust here, and there is no denying that. The 7900 XTX is not competitive with the 4090. And even at $200 less than the 4080, it is not nearly the value proposition that even the 6900 XT was. There was a reason AMD seemed so low-key and defeated when announcing these cards: they really are. I hate to admit this, but Nvidia cleaned their clock with the 4090. Nvidia might need to drop the 4080 price a little, but certainly not as low as some of us might have hoped. On top of that, Nvidia has plenty of room for a 4080 Ti on AD102; they have a whole tier of GPUs that AMD can't touch. Nvidia has essentially won, at least for now.

There are rumors that a more powerful AMD GPU might be in the works that uses V-Cache tech for some apparently big gains. We'll just have to wait and see, but at the moment, Nvidia's advantage grew this generation; it did not shrink as many of us hoped it would.
Leatherjacket told Nvidia the 4090 must beat AMD's RDNA 3 at all costs, even if it requires a nuclear reactor to power it...
 
I think the elephant in the room is that PC gaming is turning back into a boutique industry. Nvidia and AMD noted that PC gamers with money were more than happy to drop THOUSANDS during the mining years, and those operating within a budget would complain but still cough up $500+ for mid-grade gear.

The clear message to the masses is simple. Get a console.

My six-year-old 1080 is still worth hundreds, and such old gear holding good market value pushes newer-generation prices ever higher.
 
Stock of Navi 23, 22 and 21 has started to shrink here; looks like today's launch and reviews had some effect, but not the expected one. This morning there were four "open box" Navi 21 cards that had been sitting there for more than a week; they're gone now.
Navi 31, after all the hype and AMD not delivering again, must lose some "weight" to sell.
 
At first I thought this review was too negative; it's actually the most negative review I've seen today, but as usual, Steve is right and I was wrong. The bottom line with all of these new GPUs is that they're too expensive.

I think ray tracing is generally over-emphasized in tech reviews, but that argument goes out the window at the 7900 XTX's $1,000 price point. If you're spending four figures on a GPU (which, to be clear, seems ridiculous to me), you don't want to be told to dial down settings. Add CUDA and NVENC to the equation, and suddenly even the 4080, which is an absolute turd in terms of value, starts to look at least as compelling as the 7900 XTX.

I do quibble with the idea that DLSS's advantage over FSR is a major selling point, though; I've seen variations of this comment repeated half a dozen times in today's reviews, and as far as I can tell there's no basis for it. If you have to zoom in 4x and play the footage at half speed to notice a difference, then what we're discussing isn't important. The recent Techspot/HUB review of FSR 2.4 backs me up here. Given that FSR is an open standard and DLSS isn't, I also have a hard time imagining that DLSS will age better.

Leaving that quibble aside, the 7900 XTX really isn't impressive as a value proposition, except in comparison to the ludicrously bad value Nvidia's offering on Ada GPUs. Steve hasn't reviewed it yet, but the $900 7900 XT is apparently worse value than its XTX cousin, which means AMD is basically just following Nvidia's anti-consumer practices at a slightly lower price scale. Both companies appear to be leaving very little headroom for cards lower down the stack to offer meaningful perf-per-dollar uplifts over prior-gen offerings.
 
Leaving that quibble aside, the 7900 XTX really isn't impressive as a value proposition, except in comparison to the ludicrously bad value Nvidia's offering on Ada GPUs. Steve hasn't reviewed it yet, but the $900 7900 XT is apparently worse value than its XTX cousin, which means AMD is basically just following Nvidia's anti-consumer practices at a slightly lower price scale.
I don't see these prices lasting long at all. They are not going to sell great, and they will drop very quickly, I think.

I bet the 7900 XT will be $700 in less than six months... maybe sooner.
 
Judging by this rasterization performance, AMD's mid-tier cards will sell like hotcakes in the segments where RT is currently useless anyway.

In this $1000 segment though, while the 7900 XTX can still appeal to 4K120/1440p180 setups, rasterization fps is gradually saturating monitor and human-eye capabilities. We're also starting to see fairly viable framerates with RT/DLSS, so RT is indeed becoming an important decision factor. A discounted RTX 4080 could still be the favorite.
 
I'll stick with my 3070 Ti for now. It works great with my excellent MSI 2K 165 Hz monitor, so nothing here is compelling; all these products are overpriced. Manufacturers saw that people were buying hugely overpriced GPUs because of COVID, so now they take things for granted.

Well, not with me; enough is enough. Time to slow down now. I've got a cool rig and literally tons of games I have not played, so the arms race is over for me.
 
I think the elephant in the room is that PC gaming is turning back into a boutique industry. Nvidia and AMD noted that PC gamers with money were more than happy to drop THOUSANDS during the mining years, and those operating within a budget would complain but still cough up $500+ for mid-grade gear.

The clear message to the masses is simple. Get a console.

My six-year-old 1080 is still worth hundreds, and such old gear holding good market value pushes newer-generation prices ever higher.
I mean, I agree with you. The "elephant in the room" comment is that Nvidia's win here is our loss. AMD has justified Nvidia's pricing because their $1000 GPU does not bring a better value proposition to the table. While you might get similar performance in many games with Nvidia's $1200 GPU, the Radeon has enough disadvantages that Nvidia's GPU is at least as good a value for many users; it is more of a trade-off than a value win for AMD. Which means the only way for consumers to win is to stop buying these expensive cards.
 
Anyone who is disappointed that AMD isn't on par with nVidia in RT
I will admit I am not an expert on RT because, to be honest, I can't justify the visuals given the insane hardware requirements, so I haven't dived fully into the tech side.

That said, observing the performance in Portal RTX (another wonderful Nvidia appropriation, branding it RTX instead of just RT) and the new Fortnite on the new Unreal Engine kind of tells me that perhaps RT is being done the hard way instead of the smart way.

I hate using this video, because DF is one of the biggest nvidia shills, but as they say, even a broken clock is right twice a day.

Anyways, this is interesting:

 
I don't see these prices lasting long at all. They are not going to sell great, and they will drop very quickly, I think.

I bet the 7900 XT will be $700 in less than six months... maybe sooner.
Considering it's only about 40% faster than the 6900 XT, which is already available for under $700, I'm inclined to agree. At the very least, you'll be able to pick them up for under $800. There is no reason for this card to cost $1000, nor for the 4080 to cost $1200, but the 4080 will maintain a higher selling price regardless, even if it ends up under $1000.
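
A quick sanity check on that, taking the ~$700 street price and ~40% uplift mentioned above at face value:

#include <stdio.h>

/* Gen-on-gen value check using the street price and uplift mentioned above. */
int main(void)
{
    double rx6900xt_price = 700.0;  /* assumed current street price         */
    double uplift         = 1.40;   /* "~40% faster" per the comment above  */

    printf("Price for equal perf-per-dollar: $%.0f\n", rx6900xt_price * uplift);
    /* ~$980: at $999 the 7900 XTX delivers essentially zero perf-per-dollar
       improvement over a discounted last-gen card, hence the pressure to drop. */
    return 0;
}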
 
The current market is not set up for a new GPU buying spree, as consumers are spending less.

So these GPUs are here to prove a point people already know. I am pretty sure no one came here and immediately thought to buy one ASAP, and there aren't many who would buy a GPU at the moment.

They will battle it out when price cuts and offers appear, after both GPU makers read people's minds and decide whether their assumptions were right, so they can make new assumptions for next gen. Whether or not people bought the hype this gen, we will see it in prices later.


It is kind of a moot point.

Those benchmarks are mostly about older releases. New games still run well on both GPUs...

Only in the RT sphere is there doubt, because RT was mostly pushed by Nvidia, but new games don't come out with cool, game-changing RT in their engines... except Unreal 5.1.


What have we had so far in RT that made a difference? Reflections are not impressive, and screen space is a good alternative.
RT lighting seemed to be something related to shadows and not to real-time GI.

This raises the question of how well these cards would fare in games that actually look good. PS: if AMD is bad at RT, why is Ratchet & Clank one of the best-looking games today...

I would have liked to see some Unreal benchmarks with the Lumen and Nanite technologies.
To me, these made the most impact, a much bigger impact in fact; you could easily sacrifice reflections in favor of real-time GI.
In fact, Lumen is somewhat based on data from previous frames and has roots in raster performance, I reckon.

So bring on some Unreal 5.1 benchmarks, please.




 
I will admit I am not an expert on RT because, to be honest, I can't justify the visuals given the insane hardware requirements, so I haven't dived fully into the tech side.

That said, observing the performance in Portal RTX (another wonderful Nvidia appropriation, branding it RTX instead of just RT) and the new Fortnite on the new Unreal Engine kind of tells me that perhaps RT is being done the hard way instead of the smart way.

I hate using this video, because DF is one of the biggest nvidia shills, but as they say, even a broken clock is right twice a day.

Anyways, this is interesting:

Yeah. Who here wants to cut their frame rate in half and introduce distance popping and edge shimmering, all to make the lighting more technically correct? The difference between 60 fps and 120 fps was really glossed over in that video.
 
Slight performance edge over the 4080 at a superior price point? Sounds good to me.
Very slight, and based on the averages, none at all. Add RT and it loses on performance. If Nvidia is smart, and I'm not sure they are, a simple discount or "rebate" on the 4080 fixes this for them, and it will sell well, or as well as any $1000 GPU can.
 