AMD Radeon RX 7900 XTX Review: RDNA 3 Flagship is Fast

Well that was disappointing, and on multiple levels. Marketing, power consumption, issues, outright performance... I certainly wanted it to be better, and AMD's slides certainly led me to believe it would be better than it is.
And all the AMD diehards are in meltdown mode over the conclusions, not just Steve's but those of others across the net who aren't immediately calling this an outright win. Because on balance, it really isn't one, and everything that's launched so far in 2022 at $899+ is overpriced.
 
Yeah. Who here wants to cut their frame rate in half and introduce distance popping and edge shimmering, all to make the lighting more technically correct? The difference between 60fps and 120fps was really glossed over in that video.
Yeah, the frame rate thing is particularly amusing given that we're talking about a competitive shooter, in which many if not most players minimize quality settings to increase FPS. Still, Fortnite does offer an impressive demonstration of Unreal 5.1.

The buried lede in that video, though, IMO, is that even the software implementation of Lumen looks fantastic; of course the hardware RT looks better, but not by leaps and bounds. It's possible that widespread adoption of Unreal 5 will ultimately do as much to diminish Nvidia's RT lead as AMD's engineers do.

But yeah, RT is still a long way from becoming the mainstream standard. It's impressive tech, and I think it is the future, but the vast majority of gamers aren't rocking cutting edge hardware, and thus developers will continue to expend max effort on purely rasterized visuals--which still look great--for the foreseeable future.

Then there's the larger question about the quality of games generally. It's all well and good to rhapsodize about ultra-realistic reflections or illumination, but you start to feel pretty silly about it when an NPC starts T-posing, or glitches through a wall, or simply demonstrates brain dead AI, examples of which are nearly ubiquitous in modern games. Or maybe you marvel at the stunning vistas in the newest title only to find that its gameplay loop is stale and its micro-transaction model is depressing.

Don't get me wrong; I like following the latest developments in visual fidelity as much as anyone, and obviously AMD and Nvidia have no influence over AI, gameplay, or the predatory business practices of game publishers. Yet I sometimes wonder whether the pursuit of immersion through visuals is a fool's errand in today's environment. To say that we've passed the point of diminished returns is an understatement.
 
Unless the drivers are garbage at this stage, colour me majorly underwhelmed. There was so much hype for this card, and frankly, with a few exceptions, this just makes the 6900 XT look like a good buy right now. This does not augur well for the 7900 XT, IMO. RT is still a joke and hasn't even caught up to Ampere. AMD's 50% improvement in efficiency also still leaves it trailing. If only Intel weren't so late with Alchemist and were getting close to Battlemage cards targeting the higher end. Zen 4 and RDNA 3 are somewhat of a letdown IMO, but RDNA 3 could improve a lot with driver updates, so I'll hold off. I'm in no rush to upgrade my 1080 Ti, but I want to get a 4K monitor soon, so I'm looking at something quite strong at 4K.

One thing that may save AMD is third-party boards with higher clocks, if the rumours are true. However, given that power draw is already high, this will just skyrocket power consumption further. The reference cooler is also pretty poor, with stupidly high fan speeds and coil whine galore.

Honestly, if the 4080's price dropped $250 I'd buy it, and that is something I don't want to do.
 
I can't say that it's disappointing considering the context. The RT performance is actually better than I expected it to be (I don't care but many do). Sure, it's behind nVidia but that's to be expected since nVidia has an extra generation (2+ years) of RT development. As a result, hoping that they're at nVidia's level is just plain unrealistic.

Anyone who is disappointed that AMD isn't on par with nVidia in RT is expecting AMD to somehow advance in RT faster than nVidia. With everyone and their mother throwing money at nVidia, being disappointed that AMD hasn't caught nVidia is basically being disappointed that a miracle hasn't occurred. That's just plain stupid.

What the focus should be on is the fact that they've made a good step from the RX 6000-series to the point that RT is actually usable in several games without needing FSR enabled. That's a huge thing because it's the difference between "good enough" and "not good enough". Let's also not move the goal posts here because the RT performance was "good enough" on the RTX 3000-series to the point that it was a big selling feature. Now the RX 7900 XTX is roughly on par with the RTX 3090 Ti in RT performance. There were enough people that bought the RTX 3080 specifically for RT and the RX 7900 XT is better at RT than that card which means that it's usable. It would've been disappointing if there wasn't a significant improvement in RT performance but that's not the case.

Then of course, there's the elephant in the room... When the RX 7900 XTX is clearly hamstrung in Forza Horizon, Steve rightly points it out:
"We think AMD's dealing with a driver related issue for RDNA 3 in Forza Horizon 5 because performance here wasn't good. There were no stability issues or bugs seen when testing, performance was just much lower than you'd expect. At 1440p, for example, the 7900 XTX was a mere 6% faster than the 6950 XT. And that sucks."
Yes, it does suck but it's also not a permanent thing. However, Steve makes sure that he bashes the performance numbers at higher resolutions despite being fully aware that there's a driver issue making a good showing impossible.

Then, where the RX 7900 XTX beats both the RTX 4080 and the RTX 4090, Steve dismisses it as an outlier, which could be called fair:
"These results are certainly outliers in our testing, but Modern Warfare 2 and Warzone 2 are very popular games, so this is great news for AMD."
The question is, why wasn't Steve equally dismissive of the Forza Horizon results when he knew that something wasn't right? Instead, he continued testing the game at higher resolutions, despite the fact that a bad result was already a foregone conclusion. If he had also dismissed this score as an outlier the way he did the others, I would have nothing to say here, but that did not happen:
"This margin improved a lot at 4K, but even so the 7900 XTX was just 16% faster than the 6950 XT, a far cry from the 50% minimum AMD suggested. This meant the new Radeon GPU was 5% slower than the RTX 4080, so a disappointing result all round."
Instead, he calls it disappointing that the RX 7900 XTX came within 5% of a card that costs 20% more while hamstrung with a driver issue (that is by no means permanent). Just imagine what the result will be when the bug is ironed out. Sure, it's disappointing for now but it's clear that the RX 7900 XTX will be the faster card in this game.

Then there's this little pot-shot that I still don't understand:
"The 4K data is much the same, as in the 7900 XTX and RTX 4080 are on par, though this time that meant the 7900 XTX was just 14% faster than the 3090 Ti and 30% faster than the 6950 XT."
Yeah, but it also means that a card that costs $200 MORE is also just 14% faster than the 3090 Ti and 30% faster than the RX 6950 XT. Steve's wording is clearly misleading here because he's framing this as a negative for the RX 7900 XTX when it's a much bigger negative for the RTX 4080, which costs an extra $200.

Let's also remember that TechSpot originally gave the RTX 4080 a score of 90/100. Truthfully, this is simply a case of all the cards being terrible values, and while my review of the RX 7900 XTX wouldn't be glowing either, it would at least be comparable to my review of the RTX 4080. That wasn't the case here at TechSpot, however. Just look at the sub-headings from the two articles:
[HEADING=2]AMD Radeon RX 7900 XTX Review[/HEADING]
[HEADING=2]RDNA 3 Flagship is Fast[/HEADING]
I've never seen a more generic sub-heading in my life. I don't even know why you bothered. It's clear that you didn't consider it to be worth the effort.

Now the RTX 4080:
[HEADING=2]Nvidia GeForce RTX 4080 Review[/HEADING]
[HEADING=2]Fast, Expensive & 4K Gaming Capable All the Way[/HEADING]
Huh, look at that. You actually put effort into this one.

So, I guess then that the RX 7900 XTX is not "Expensive" or "4K Gaming Capable All the Way", eh? Then of course, there's also the fact that you completely ignored the 8GB VRAM difference between the RX 7900 XTX and RTX 4080. I guess that when you're paying through the nose, longevity isn't important, eh?
This is just inexcusable.

When you add up all of these little (and some not-so-little) problems with this article, it makes the RTX 4080 article look like an nVidia love-in by comparison. It's pretty clear which card TechSpot is trying to promote, and it's not the RX 7900 XTX. I have never wanted to say something like this, but I can't deny it any longer.

Wow, I always knew you were a hard-core AMD fan and a forever NVIDIA hater, but you keep impressing me. I usually don't dig deep into reviews like this to find misleading tactics because, in the end, they need whatever sponsors they can get to survive.

What I'm curious about is: if what you're saying is true, then why did NVIDIA treat Hardware Unboxed the way they did last year? I mean, I might be the dumbest guy here, but Hardware Unboxed and TechSpot are closely related, right? All their themes and info are the same.

Btw, back to the topic, which is the 7900 XTX: I think they tried, and their road looks really promising. Especially once drivers are updated and programs and games are optimized, the 7900 XTX might show its true value. Right at the beginning they said they would compete with the 4080, and I think they did well.
 
Yet I sometimes wonder whether the pursuit of immersion through visuals is a fool's errand in today's environment. To say that we've passed the point of diminished returns is an understatement.
I feel this way too. World of Warcraft is an example of a game with almost 18 years of zones. The newer zones are far more technically impressive. Still, for me, it's some of the older zones, almost cartoony in their appearance at this point, that are the most evocative and immersive. Sometimes it pays to let the player's imagination do most of the work.
 
Blablabla:

Conclusion: Nothing is wrong with the card, and it beat the 4080 (its real competition) countless times, but it was beaten by a single card that costs (at least) twice as much and has 2x the VRAM, if it didn't melt first.

If this card was $350 cheaper, it would have been crowned the New King by everyone on the planet (well, except for the blinders-always-on Nvidia fans).

End of story.
 
Who cares about the RTX gimmick and DLSS? I just need raw performance, with minimum fps exceeding 60fps at the highest quality settings at 4K. Looking at the graphs, the 7900 XTX clearly gives the RTX 4080 a run for its money. And in another 6 months, the expected 7950 XTX will give the 4090 a run for its money.
 
Nice card. I don't like this article, though. We already knew that AMD is behind Nvidia in ray tracing. However, the 7900 XTX has RT performance similar to a 3090 Ti, while its rasterization sits between the 4080 and 4090. Everyone needs to compare it with the 6950 XT. The improvements are decent. Sure, if you are one of the gamers who likes to max out ray tracing at 4K, Nvidia with DLSS is the way to go. That's what Nvidia is counting on too, hence the huge price difference.
 
Let's not forget, this gen of Radeon cards was not designed to go head to head against the RTX 4080 or 4090. They improved upon their own 6900 XT and 6950 XT series. That said, the ability of the 7900 XTX to cruise up to the 4080 and easily overtake it in many charts speaks volumes about the raw power of these cards. I expect the 4090 will be humbled by the eventual 7950 XTX, which should be here in 6 months.
 
Here's my prediction: next-gen AMD cards, at least at the top end, will be larger in size. Think 4090 large. The 7900 XTX occupies only around 73% of the volume, at 287×121.9×53.3 mm, while the 4090/4080 are at 304×137×61 mm. This is not a small difference.
The reason is simple: the space available matters a lot. It affects the design not just in respect of thermals, and therefore stability, clocks, voltage, etc., but also allows the use of relatively less costly materials and technologies, resulting in a higher chip count.
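That ~73% figure can be sanity-checked from the dimensions quoted above (a quick Python sketch; the dimensions are as given in the post, treating each card as a simple box):

```python
# Quick check of the volume comparison above (dimensions in mm, from the post).
xtx = (287, 121.9, 53.3)    # RX 7900 XTX reference card: L x W x H
ada = (304, 137, 61)        # RTX 4090/4080 Founders Edition: L x W x H

def box_volume(dims):
    length, width, height = dims
    return length * width * height

ratio = box_volume(xtx) / box_volume(ada)
print(f"7900 XTX volume: {box_volume(xtx) / 1000:.0f} cm^3")
print(f"4090/4080 volume: {box_volume(ada) / 1000:.0f} cm^3")
print(f"Ratio: {ratio:.0%}")  # ~73%, matching the figure quoted above
```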
 
I really wish the 90 FPS Locked data included GPU card power (not just full system power). Since the other charts showing power are on different games at different resolution, it's very hard to work backwards to suss out the card power differences.

Is a 90 FPS Locked chart of card power available anywhere?
 
If this card was $350 cheaper, it would have been crowned the New King by everyone on the planet (well, except for the blinders-always-on Nvidia fans).

End of story.

If the 4080 were $350 cheaper, nobody would buy a 7900. Alternate realities do not seem particularly relevant.

This release has retroactively made the 4080 a better value proposition. Not easy to pull off!
 
I really wish the 90 FPS Locked data included GPU card power (not just full system power). Since the other charts showing power are on different games at different resolution, it's very hard to work backwards to suss out the card power differences.

Is a 90 FPS Locked chart of card power available anywhere?
At 90 fps I don't think anybody tested, but at 60 fps, here you go:

power-vsync.png


And also the rest of the power figures, measured directly at the plugs:

power-consumption.png


We all know that HU can't use a multimeter and goes for the AC power socket thing.
 
Wait a couple of months and see how it all pans out.

Steve needs to up his game, as I pointed out a couple of months ago, and as someone else did just a while ago.

Cost per frame is a stupidly blunt metric. Look at the graphs for your needs.
You want 144 fps at 4K? Get a 4090.

Wow, 267 fps at 4K, so cheap. Who cares? That type of gamer, with mega-fast screens, is gaming at 1440p or 1080p to risk no latency and win the fight.
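For what it's worth, the cost-per-frame metric being criticized here is just price divided by average fps; a minimal sketch of why it's blunt (the card names, prices and fps values below are made-up placeholders, not review data):

```python
# Illustrative cost-per-frame calculation (the metric being criticized above).
# Prices and fps numbers are hypothetical placeholders, not review results.
cards = {
    "Card A": {"price_usd": 1000, "avg_fps_4k": 100},
    "Card B": {"price_usd": 1200, "avg_fps_4k": 105},
}

for name, card in cards.items():
    cost_per_frame = card["price_usd"] / card["avg_fps_4k"]
    print(f"{name}: ${cost_per_frame:.2f} per frame")

# The metric collapses everything into a single number: it ignores which
# fps target and resolution you actually play at, RT, features, and power
# draw -- which is exactly the "blunt" complaint above.
```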

RT at 4K? Say what! For so few games, why even go there? Just game at 1440p and get decent fps.

Really, how many 4080 users will run RT at 4K if they want 144 Hz? I have a 3080 and wouldn't dream of running RT at 4K.

RT is good enough for a lark. At 1440p vs 4K, you cannot easily tell the difference at a distance on a big screen.

Steve's conclusion was dreadful: better in nearly every metric, then he belatedly mentions the most important one, rasterization, which AMD won at 1080p, 1440p and 4K. Yes, only a draw in effect, but it's $200 or 17% cheaper. Media engine? Say what. I do hardware encoding; most of us don't, and AMD can stream to multiple channels at the same time, Nvidia only 3.
RT is in only 0.00001% of games, though yes, new titles going forward will add it.

I'm well impressed by the 4080's power usage, but all things considered the 7900 XTX is better than a 4080.

Going forward, AMD is looking really good vs Nvidia.

More games will be made using features that favor AMD, where development normally favours Nvidia.
AMD has been improving its drivers quite well now, and for a long time.

The cards are cheaper to produce, so Nvidia is having to catch up with their own chiplets/modules going forward.
Yes, we wanted AMD to be better, but as I said, wait a couple of months for sales and driver improvements.

What will Nvidia do for a 5090? It needs to stay under 500 watts and under $2,000, so they will need to get creative: die shrinks, faster and more memory, better controllers, and better RT in hardware.
Same for AMD: how to get those 50% improvements while keeping price and power usage good?

So, really interesting.
What is good for us: 4K is enough, so now it's on to RT, more engine features, and sound.
I for one will completely ignore 8K as BS, so we should be getting quite lifelike 4K, with power and cheats (shortcuts), in the next decade.

Anyway, all these cards have great performance; we are just getting hyped for more and annoyed at price hikes.
 
I think if they want to be competitive they need to stop being "almost as fast as Nvidia", because it's making them look bad. Everything they do is almost as good, like some sort of Chinese Nvidia knockoff. It's an image problem they have.
 
The only thing going for the Radeon 7900 XTX is the fact that it's 17% cheaper, but when spending $1,000 on a graphics card, do you really care about $200? Wouldn't you just spend the extra money to get the superior product?
I'm not sure about that. I bought a new laptop last week, and in the end I chose a cheaper option despite it not having a dedicated GPU. Once I had narrowed down what I needed, that minuscule 10-20% difference in price mattered a lot.

I can't comment on that in this context because 1) I'm not rich enough to even consider buying an $800 GPU; and 2) I'm a Linux user, so by default I favor AMD. But if the GPU does what the person wants and is cheaper, then it can make sense to buy it, and that small price difference suddenly matters.
 
I think if they want to be competitive they need to stop being "almost as fast as Nvidia", because it's making them look bad. Everything they do is almost as good, like some sort of Chinese Nvidia knockoff. It's an image problem they have.
Maybe at the $1,000+ luxury / epeen price point, where if you've decided to waste that much money, you at least want unqualified bragging rights even on junk you'll probably never use.

Once you get back to mainstream price levels, I prefer AMD's positioning: better cost, better compatibility with the case and power supply you already have, competitive performance for the features you actually care about (raster), and less fluff that you don't. Sure, it'd be nice if they had lower energy usage and fewer driver worries, but the wider audience won't be worrying about either anyway.

The $200 savings vs a $1,200 luxury card that you'd only buy because the $1,600 card is sold out is almost beside the point. But if they can deliver a 15-20% discount at the same raster performance on, say, a $300 mainstream card, they'll look very attractive to those buyers. Even more so if they maintain a VRAM lead as well.
 
Considering it's only 40% faster than the 6900 XT, which is already available for under $700, I'm inclined to agree. At the very least, you'll eventually be able to pick these up for under $800. There is no reason for this card to cost $1,000, nor for the 4080 to cost $1,200, but the 4080 will maintain a higher selling price regardless, even if it ends up under $1,000.
I think they are both pricing these cards high to sell off old stock, the remaining RTX 3000 and RX 6000 series cards, before the holidays.
 
And also the rest of the power figures, measured directly at the plugs:

We all know that HU can't use a multimeter and goes for the AC power socket thing.
TechPowerUp doesn't use a multimeter. I've not seen the name of the specific system they use, but it will be very similar to the one made by Cybenetics Labs. Such systems require additional hardware for logging the data and then time to process it all. It's obviously a lot quicker to simply monitor the system's power consumption, and given that this setup never changes across all of the GPUs tested, it's a perfectly legitimate means of comparing relative GPU power draw.

Let's not forget, this gen of Radeon cards was not designed to go head to head against the RTX 4080 or 4090.
Indeed not the 4090 but it absolutely was designed to go head-to-head with the 4080 -- the launch presentation material makes this patently clear.
 
First off, well done for packing in this review, Steve. I'm sure it wasn't easy.

I think it'll be a case of driver optimisations bringing some major gains. It would have been nice for you to test the games AMD mentioned in their slides and compare them to Nvidia's latest products, such as Resident Evil Village with RT on.
I did feel that the RT testing was limited in game choice. The number 1 RT game a lot of people play outside of Cyberpunk is Control, so I found it weird that you didn't include it, but I understand the time constraints of getting a video out on YouTube for that day 1 mad rush of streams.

I think it's pretty obvious the extra memory and interconnects add extra power usage to the 7900 XTX over the 4080, but I do still think the XTX is more efficient gen-on-gen, just not to the extent of the 4080/4090, which are on the latest node.

Going off a lot of different sites' data, I think it's easy to see where the 7900 XTX will sit once driver optimisations have been done for most of the popular games out there, which is clearly above the 4080.

RT performance on par with the 3090 Ti is still decent; it's just a question of whether or not someone cares enough to use it.

As for any argument about DLSS vs FSR 2.2, I think it's moot, as Tim's video footage clearly shows that they're very similar in visual output.

So I think anyone thinking of upgrading should wait and see what happens, because right now both Nvidia and AMD are taking the mickey with the pricing of these products.

Let's see if AMD fine wine sets in later next year.
 
Considering it's only 40% faster than the 6900 XT, which is already available for under $700, I'm inclined to agree. At the very least, you'll eventually be able to pick these up for under $800. There is no reason for this card to cost $1,000, nor for the 4080 to cost $1,200, but the 4080 will maintain a higher selling price regardless, even if it ends up under $1,000.
Yeah, I have no doubt at all AMD will drop prices a lot, but it will be interesting to see what happens with Nvidia. I really dislike Nvidia and that turd of a CEO they have, but honestly I would take a 4080 if it were priced decently.

I just wish people would stop supporting these crazy high prices. It wasn't that long ago that you could buy a fairly decent card for $300 and that would get you a higher resolution (for that time of course) and about 60 fps....not so much these days.
 