AMD Radeon RX 7900 XTX Review: RDNA 3 Flagship is Fast

Underwhelming at 350 W, underwhelming efficiency and cost, disappointing RT (it can match a ~3090, but not in heavy RT games, plus that bar was set two bloody years ago), driver issues at launch, overpromised and underdelivered, how on earth did you mess this up aga.... oh right, AMD. Enjoy two years of beta testing an underwhelming product, and the XT? Rofl, don't get me started. Now to sit back and enjoy the mental gymnastics.
 
Underwhelming at 350 W, underwhelming efficiency and cost, disappointing RT (it can match a ~3090, but not in heavy RT games, plus that bar was set two bloody years ago), driver issues at launch, overpromised and underdelivered, how on earth did you mess this up aga.... oh right, AMD. Enjoy two years of beta testing an underwhelming product, and the XT? Rofl, don't get me started. Now to sit back and enjoy the mental gymnastics.

It's faster than an RTX 4080 in nearly every game and as fast as the 3090 Ti in ray tracing... not bad ray tracing for a card that costs $1,000 less than the 3090 Ti.
 
It's faster than an RTX 4080 in nearly every game and as fast as the 3090 Ti in ray tracing... not bad ray tracing for a card that costs $1,000 less than the 3090 Ti.
Yeah, still pretty underwhelmed given the asking prices and the hype, especially considering the other points I mentioned. FWIW, I'm not particularly impressed by the 4080 either, especially at the current price.
 
Agree. I wish I saw a reason to upgrade from my 6800 XT, but IMO this just isn't it.
Why on earth would you want a reason to upgrade (= spend money)? If newer cards don't bring anything to the table that you really need, then it means you'll be able to save money, keep playing at very high framerates, and that's it! ......
But they give the 4090 a 90/100.

But I do agree with the logic behind your comment.
The 4090 and 4080 are the best products, period. They should receive a very high score on everything *except* the price tag or the price/frame ratio.

If the 4080 cost €900 it would sell like hotcakes, and if the 4070 (previously the 4080 12 GB) sold for €600 it would sell like hotcakes too.

AMD has some room to sell because:
- there are people who are team red, whatever happens
- raster performance for the price is better than nVidia's

But AMD has been plagued by a weak media engine, no 3D/CAD app support, and much worse RT performance.
"Oh, ray tracing is not that important, bla bla bla": it will be important as soon as AMD gets it right. It's the same as with Apple: big screens, better cameras, battery life... never important until they're as good as or better than the competition!
 
I honestly don't care about that, and neither do people like Steve Burke. If a "review" is tainted, I WILL attack it, and this is as tainted as I've ever seen. It's even more overt than when Hardware Canucks tried to do a hit-piece on the R7-1800X and I tore SKYMTL a new one for his "trouble". Just like here, I backed up everything I said, and instead of someone trying to kiss SKYMTL's butt the way you're doing here with Steve, they, like most of the people here, couldn't say that I was wrong.

It "put the fear of God" into them because they've been slowly moving in a green direction ever since. Did you not notice how, in the Spider-Man benchmark, Steve "conveniently" forgot to mention that nVidia was involved in the development of the game? No, he just said that AMD's drivers sucked, trying to stoke the flames of fear that have kept people buying nVidia for so many years. Somehow, nVidia got to Steve and while I don't think he's happy about it, I won't sit silently while a travesty like this "review" gets published.

I've defended Steve on more than one occasion, but on this occasion he deserves no defence. That's why we have this forum.

Read my post further up to discover why it received over 10 likes and yours received a fat goose-egg. You're wrong, I'm right, and that's life. Whenever I have to say something that would be considered controversial or hostile, I make damn sure that I back up every word of it with evidence. Arguing with me here is a fool's errand.

Ha ha ha, chill bro, I have no interest in arguing with you or anybody here. I'm just a customer trying to find helpful information for my future purchases, or simply updating my knowledge for my hobby. I may share your point of view about bad or misleading reviewers, but obviously you took it to another extreme. Hopefully they take your opinions, and those from passionate users like you, into consideration so that they can do better.
 
Shame there are no more benchmarks using Assetto Corsa Competizione; it was always the one I was looking forward to the most, since I mostly play racing sims and they tend to deviate from the norm on results.
 
If the 4080 were $350 cheaper, nobody would buy a 7900. Alternate realities do not seem particularly relevant.

This release has retroactively made the 4080 a better value proposition. Not easy to pull off!

Supposing that all the new cards were $350 cheaper? What'd be your take?
 
The 4090 and 4080 are the best products, period. They should receive a very high score on everything *except* the price tag or the price/frame ratio.

If the 4080 cost €900 it would sell like hotcakes, and if the 4070 (previously the 4080 12 GB) sold for €600 it would sell like hotcakes too.

AMD has some room to sell because:
- there are people who are team red, whatever happens
- raster performance for the price is better than nVidia's

But AMD has been plagued by a weak media engine, no 3D/CAD app support, and much worse RT performance.
"Oh, ray tracing is not that important, bla bla bla": it will be important as soon as AMD gets it right. It's the same as with Apple: big screens, better cameras, battery life... never important until they're as good as or better than the competition!
THAT^...

IS the exact reason nVidia's GPUs cost more and perform worse than AMD's. The Ada Lovelace architecture is not for gaming... and those who are buying Ada 102 dies... are not doing it for gaming. People are spending $2k+ on an RTX 3090/4090 because they don't want to spend $4k on a Quadro prosumer card. (So it doesn't matter whether it's $1,700 or $2,400; it's still cheaper for them.)

Gamers do NOT care about CAD/Enterprise/Creator software... If you personally need to save $2k by buying a 4090 instead of a Quadro card, just admit it. Don't pretend you're special and/or that Gamers care.

Gamers care about Games... and want frames. AMD wins!
 
Gamers care about Games... and want frames. AMD wins!
This would be a three-way relationship where everybody wins if it were a normal market. Now only GPU manufacturers and AIB partners win. Gamers lose big money to have their digital entertainment. "Gamers want FPS", but we also care about prices. I know I do, and I don't want to spend $1,000 on a single PC part. Let those cards gather enough dust in stores and they will be begging us to buy their stuff. The first sign of low sales is already here: look at the Zen 4 price cut in the first month after release. Game bundles with new products are next.

Nvidia and AMD can go...........you know where.
 
Ha ha ha, chill bro, I have no interest in arguing with you or anybody here. I'm just a customer trying to find helpful information for my future purchases, or simply updating my knowledge for my hobby. I may share your point of view about bad or misleading reviewers, but obviously you took it to another extreme. Hopefully they take your opinions, and those from passionate users like you, into consideration so that they can do better.
I'm sorry, Marcus. I was pretty livid after discovering that the Forza Horizon test had ray tracing in it. When I saw this:
"Wow, I always know you are a AMD hard-core fan"
It triggered me a bit because being called a "fan" of any company is a huge pet peeve of mine. I think that fanboys are the dumbest fools on the planet. I have no delusions that AMD is a group of saints; they're a capitalist corporation trying to make a profit. When I went back and read your post again, I realised that I had misunderstood you. I was wrong to do that, and I always admit when I'm wrong about something.

Having said that, AMD, Intel and nVidia are by no means equivalent in their practices. AMD has demonstrated FAR fewer instances of anti-consumer behaviour while Intel and nVidia commit anti-consumer and anti-competition acts seemingly on the daily, aided by the ignorant masses who are so easy to fool that this meme completely applies to them (especially the blurb at the bottom):
[attached meme image]

What Intel and nVidia specialise in is getting people to think that they need to spend more on stuff that doesn't matter. The average gamer doesn't use CUDA or QuickSync so why would they pay extra for it? The RTX 3070 gives horrible RT performance and RT on the RTX 3060 is borderline unusable, so why pay extra for those?

It's because Intel and nVidia (along with their army of paid shills) say that they do of course!

Then you get the people who buy Intel and nVidia because they're insufferable snobs who get off on paying more for something than they need to. Either way, I don't want to support either of those companies and I personally want to have as little in common with insufferable snobs as possible.
 
Why on earth would you want a reason to upgrade (= spend money)? If newer cards don't bring anything to the table that you really need, then it means you'll be able to save money, keep playing at very high framerates, and that's it! ......
Because I enjoy being able to max everything out at 4K, preferably at 120 fps. This is pretty much the only thing I spend money on, so I don't mind upgrading.
 
This would be a three-way relationship where everybody wins if it were a normal market. Now only GPU manufacturers and AIB partners win. Gamers lose big money to have their digital entertainment. "Gamers want FPS", but we also care about prices. I know I do, and I don't want to spend $1,000 on a single PC part. Let those cards gather enough dust in stores and they will be begging us to buy their stuff. The first sign of low sales is already here: look at the Zen 4 price cut in the first month after release. Game bundles with new products are next.

Nvidia and AMD can go...........you know where.
I understand your sentiment but it's a bit misplaced. AMD isn't to blame for the fact that cards cost what they do now. The blame lies with nVidia and the people who bought nVidia. I'll explain how this happened:

1) nVidia is the poster-child for unbridled greed in the tech industry.
2) nVidia uses their marketing machine to make people think that they need nVidia cards even though gamers don't use CUDA and ray-tracing is just terrible below the RTX 3080.
3) Demand for nVidia products skyrockets because people are basically stupid.
4) This demand is recognised by nVidia so they jack prices higher and create artificial shortages by selling entire skids of cards directly to miners. Prices soar even higher.
5) nVidia increases the MSRP of its 80-level card by 40% in a single generation, based on the fact that its 90-level halo card is $1,600.
6) AMD decides that if people are willing to cut their own throats for GeForce cards, then they can slash their own wrists for Radeons. Radeon prices rise but not to the same degree as GeForce cards.

People seem to think that AMD raised its prices out of greed, and to a certain degree they'd be correct, but only to a very small degree. AMD had to raise its prices because, with all of the money that nVidia is getting thrown at it by the people it has hoodwinked with its marketing BS, if AMD didn't raise prices as well, it would have no hope of keeping up with nVidia's R&D. Remember what happened with Intel in the 2010s: everyone and their mother was buying overpriced Intel parts, and then they had the nerve to admonish AMD for falling behind.

If you don't support the underdog with more than just words, no industry can survive. I've bought only Radeon since 2008, not because I thought that GeForce was a bad product but because it was terribly overpriced and nVidia was doing some slimy things behind the scenes back then.

When I worked at Tiger Direct, nVidia was offering TD employees big discounts on the GeForce 9800 GTX+ (aka GTS 250) or GTX 260 if we promised to always try to sell an nVidia card to any prospective customer before recommending an ATi card, regardless of their needs (Intel tried the exact same thing with their i7-920).

Something about that didn't sit right with me and I basically told the rep "Thanks but no thanks." which kinda shocked her because the vast majority of my co-workers accepted the terms. I can't tell you how much respect I lost for many of my co-workers at that time.

One evening, as we were closing, I saw the nVidia rep outside. She was smoking a cigarette (I swear, ALL of the tech reps back then smoked like chimneys) and seemingly waiting for us. It turned out that it was me she was waiting for (I guess she got some pie-in-the-sky bonus if she managed to turn an entire store) and she tried to get me interested in the deal again. I remember thinking that she was pretty (which seems to be par for the course when a sales rep is a woman), but I also remember saying to myself, "The black mamba is pretty too, but I wouldn't want to get close." ( :laughing: )

She was clearly trying to charm me but I remained stone-faced (You wouldn't want me as a poker opponent). I told her that even with the discount (about $100), the cards weren't worth what I'd have to pay for them. I'll never forget what she said in response to that. She said "But these are nVidia cards and no cards are made better than nVidia." to which I replied "Sure, but none are made worse either." which actually got a chuckle out of her. I told her that I had owned cards by ATi, Matrox, CirrusLogic, nVidia and Oak Technologies. I pointed out that they all did their jobs as video cards pretty much the same.

She realised that, unlike a lot of the teenagers that worked at my store, I actually knew what I was talking about because I'd already been building PCs for almost 20 years at that point. I told her that I was certain that these actions were probably illegal at worst and damaging to the market at best. She said "Who cares, as long as you get cheap cards?" to which I replied "I care because I want to always be able to get cheap cards and if what you're trying to do succeeds, there would be only overpriced nVidia cards left. Equally important is that I will not lie to my customers because my job is to help them make the best decision they can to win their loyalty in the future." and continued to my car.

I bought my first XFX Radeon HD 4870 for $100-$300 (depending on the brand chosen) less than her "discounted" GTX 260 would've ended up costing me. More importantly, I kept my self-respect and my intellectual sovereignty remained intact. Over time, I saw nVidia doing more and more terrible things (like arm-twisting reviewers), and when they blocked the use of PhysX if their drivers detected a Radeon card in your system along with a GeForce, I began to actively hate them. They (like Intel) have never given me a reason to stop.

AMD never tried anything like that with me. Their rep was friendly and liked me a lot because I sold more Phenom CPUs than the rest of the store combined. This was because I was easily the most knowledgeable one there and was thus perfectly comfortable selling any brand of anything, because I knew that they all did the same thing. He and I had a mutually respectful relationship, and he never tried anything slimy like the nVidia rep did. The Intel rep actually tried pretty much the same thing and I turned him down as well.

So yeah, they're all corporations trying to make a profit but the way that they go about it isn't even close to being the same.
 
Unfortunately, this is the reality. $1k+ flagships are becoming the norm. Not just graphics cards but almost everything else, including smartphones.

The bread that we buy today is a far cry, affordability-wise, from what we paid three decades ago.
The thing is, the five years in which it went up is a far cry from three decades. As a matter of fact, for three decades the cost of video cards stayed relatively stable, increasing by an average of about 18¢ per week overall.
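Just to unpack what "about 18¢ per week" amounts to over that stretch, here's a quick bit of arithmetic (illustrative only, using the figure exactly as stated above):

```python
# Unpacking the "about 18¢ per week over three decades" figure.
# Illustrative arithmetic only; the rate is taken as stated above.
dollars_per_week = 0.18
years = 30
weeks_per_year = 52

total_increase = dollars_per_week * weeks_per_year * years
print(f"~${total_increase:.0f} total increase over {years} years")  # ≈ $281
```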
No matter what we say, the price will keep increasing. No thanks to the consumers who are ready to splash 1k or 2k.
Actually, it's completely because of them. However, there's not a chance that I'll be paying that much and I think most users would agree. I don't expect the high prices to remain as long as people exercise self-control and refuse to pay. That's exactly what happened with the Zen4 CPUs. People weren't buying so AMD dropped the price.

There may be some good news too because according to Moore's Law is Dead, there are no supply issues with RDNA3 whatsoever. A crap-tonne have been made and apparently 30,000 reference models were available on launch day and 200,000 cards have been shipped for Q4 (which is NOW), according to a tweet from HardOCP's Kyle Bennett. Here's the video, pre-cued at the beginning of the relevant chapter:
People's earnings have also increased. Hence, the buying power.
Actually, that's wrong. People's earnings have increased, but barely enough to keep up with inflation (sometimes not even that). Plus, the cost of living has exploded in most places with prices going up while wages remain stagnant.

People who aren't rich that buy these cards are fools that are willing to take on debt to get them. A lot of people like that end up bankrupt or worse.
Yeah, I too would probably be buying the 7900 XTX or the eventual 7950 XTX. I bought my current 5700 XT at MSRP. At that time, the scalper phenomenon hadn't taken off yet.
Yeah, I got my RX 5700 XT for $490 CAD, also before the $hit hit the fan.
I think I will wait another 6 months to upgrade, though.
With an RX 5700 XT, you'll be good for longer than that. I would still be using mine but I managed to get my hands on an OG reference RX 6800 XT. It was the only RDNA2 card that I would've been interested in because it was the first time that ATi hadn't used a blower cooler on their consumer-grade reference Radeons. The Radeon VII also had a non-blower cooler, but that was more like a prosumer card (think Titan) than a gamer card. In retrospect, it was probably a really stupid decision on my part because it was an emotional purchase, not a rational one.

I expect the RX 5700 XT to be perfectly usable for AAA gaming for at least two more years.
 
This would be a three-way relationship where everybody wins if it were a normal market. Now only GPU manufacturers and AIB partners win. Gamers lose big money to have their digital entertainment. "Gamers want FPS", but we also care about prices. I know I do, and I don't want to spend $1,000 on a single PC part. Let those cards gather enough dust in stores and they will be begging us to buy their stuff. The first sign of low sales is already here: look at the Zen 4 price cut in the first month after release. Game bundles with new products are next.

Nvidia and AMD can go...........you know where.
Hunh..?
You only have to spend $1k on a video card if you have a $1,300+ 4K monitor.

You can easily buy a $300 GPU for 140 fps @ 1080p.
 
Hunh..?
You only have to spend $1k on a video card if you have a $1,300+ 4K monitor.

You can easily buy a $300 GPU for 140 fps @ 1080p.
A 4K screen is not what I want to put on a desk; on a wall, maybe. Even at 42" it's too big for this close focal distance under 1 meter (3 feet, I guess). At 32", 4K text is too small and I have to scale to 125% or more. I tested many screens before settling on a 32" 1440p 165 Hz, which I got for $350. I never had a 1080p screen; I jumped from 2 x 19" to a 27" UWHD since Excel files looked better and movies were 21:9 back then. Now all media is 16:9 and 1440p is OK for this usage.
 
A 4K screen is not what I want to put on a desk; on a wall, maybe.
Yeah, no screen that could fit on a desk is big enough to benefit from 2160p resolution. Hell, a 17" screen can barely benefit from 1080p. I think that a screen would have to be at least 32" at the bare minimum to have any benefit from 1440p. Even then, 1440p would still look pretty much the same as 1080p.
Even at 42" it's to big for this close focal distance under 1 meter (3 feet I guess). At 32" 4K text on it it's too small and have to scale to 125% or more.
This is true. I have my Windows Desktop resolution set to 720p because it's the easiest to read from about 8m away on my 55" panel.
I tested many screens before settling on a 32" 1440p 165 Hz, which I got for $350. I never had a 1080p screen; I jumped from 2 x 19" to a 27" UWHD since Excel files looked better and movies were 21:9 back then. Now all media is 16:9 and 1440p is OK for this usage.
It's funny how aspect ratios change over time, eh? When I worked at Tiger Direct, the aspect ratio of a TV was 16:9 but the standard aspect ratio for a monitor was 16:10. The idea of the 16:9 monitor had only started to gain traction.

I'm expecting the next big aspect ratio to be 1.9:1 because that's "wide-field" IMAX (tall IMAX is 1.43:1).
 
People's earnings have increased, but barely enough to keep up with inflation (sometimes not even that). Plus, the cost of living has exploded in most places with prices going up while wages remain stagnant.
Exactly. People's wages have increased much less than living costs. In Europe (though it depends on the country), a couple of decades ago middle-class people could save around 50-60% of their wages; nowadays it is estimated at around 20-30%.
In southern European countries, some decades ago one person in a household could stay at home and raise the kids because one wage was more than enough; nowadays BOTH working is a must and that household barely saves any money...
But another study shows that farmers and other producers get very little of the money; the intermediaries get the big cut...

Yeah, no screen that could fit on a desk is big enough to benefit from 2160p resolution.

The size/resolution thing depends on what you do and at what distance. I have a 27" 4K monitor (for work and editing 4K video, not for playing games), and between 1080p, 1440p, and 2160p the difference can easily be seen! Of course, from 1440p to 4K the difference is much smaller, but it is there.

Resolution is much more important when multitasking or working than when playing; personally, if I play at 1440p or 4K at standard sizes, I see no difference:

1080p: OK for playing very fast games, nothing else
1440p: the ideal middle ground
4K: the best option for work (monitor) or for playing games (TV/huge monitor). On my setup I have no scaling issues.
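To put rough numbers on the size/distance argument, here's a quick Python sketch. The screen sizes and the ~1 m (about 39 in) viewing distance are just illustrative assumptions, not anyone's actual setup; it computes pixel density and the angle a single pixel subtends at that distance:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a given resolution and diagonal size."""
    diag_px = math.hypot(width_px, height_px)
    return diag_px / diagonal_in

def pixel_arcmin(ppi_value, distance_in):
    """Angle (arc-minutes) one pixel subtends at a given viewing distance."""
    pixel_size_in = 1.0 / ppi_value
    return math.degrees(math.atan(pixel_size_in / distance_in)) * 60

# Rough comparison of desk-sized panels viewed from ~1 m (~39 in).
# The combinations below are illustrative examples only.
for name, w, h, diag in [('27" 4K', 3840, 2160, 27),
                         ('32" 1440p', 2560, 1440, 32),
                         ('42" 4K', 3840, 2160, 42)]:
    d = ppi(w, h, diag)
    print(f"{name}: {d:.0f} PPI, ~{pixel_arcmin(d, 39):.2f} arcmin/pixel at 1 m")
```

On those assumptions, the 27" 4K panel lands around 160 PPI while a 42" 4K panel is closer to 105 PPI, which is roughly why the same resolution can look sharp on one desk and too coarse on another.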
 
Fact 1: All game developers still make games with the GTX 1060 in mind. Fact 2: Human eyes cannot distinguish above 24 fps. Fact 3: Average time per month spent gaming is negligible (about 10 hours for most people over 30).

So the real question is what the GPU can do for applications. And for applications, only Nvidia exists thanks to CUDA, and their newer models don't have enough memory.
I can easily see the difference in fps, which is why I got a 144 Hz monitor. Below 45-ish fps it becomes annoying. Where did you hear you can't notice anything above 24 fps?
 
Yeah, I listened to Moore's Law is Dead on the way to work this morning, and it does sound like there's a chance the drivers just aren't there yet for these cards. Then again, it also sounded like there could be a more fundamental flaw with the MCMs. I would not be surprised if AMD ends up getting a bit more performance out of these cards; even more reason, though, not to be an early adopter. You might get one later for $600 and the performance originally promised. Then again, you might not.
Agree, I've never been first on the block to get new tech, though I was hopeful this round I might get in early. As it is, I'm now waiting. There's no way I want to spend $1,000 on a GPU.
 
That's how averages work: yes here, no there, but taken all together it's faster than the 4080, even if only by a little. A win is a win.

I only care about rasterization performance, not RT. Even with a 4090, 4K RT is still not there yet, and by "not there" I mean high fps (100+) on ultra without upscaling.
The averages I've seen put the XTX on par with the 4080. I can't recall the game mix, but a 16-game average showed 181 fps for both cards. The problem with averages is that not every reviewer uses the same mix of games. If you want to make the XTX look good, you'll choose a mix of games in which AMD wins, but if you want to make Nvidia look good you would choose a different mix.

For me, any GPU within +/-5% is basically equal. So many factors can impact those results, from the choice of mobo, RAM, SSD, room temperature, and more, that a 5% delta is within the margin of error.
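To illustrate how much the chosen game mix can swing a head-to-head average, here's a small Python sketch. The per-game fps numbers are made up purely for demonstration, not real benchmark results for either card:

```python
from statistics import geometric_mean

# Hypothetical per-game fps results -- NOT real benchmark numbers,
# just to show how the choice of game mix changes the "average".
fps = {
    #  game             (card A, card B)
    "Raster-heavy A":   (190, 170),
    "Raster-heavy B":   (160, 150),
    "RT-heavy C":       (70, 95),
    "RT-heavy D":       (55, 80),
}

def avg(games, idx):
    # Geometric mean is a common way to average fps across games,
    # so a single outlier title doesn't dominate the result.
    return geometric_mean([fps[g][idx] for g in games])

raster_mix = ["Raster-heavy A", "Raster-heavy B"]
mixed_mix = list(fps)

for name, mix in [("raster-only mix", raster_mix), ("mixed mix", mixed_mix)]:
    a, b = avg(mix, 0), avg(mix, 1)
    print(f"{name}: card A {a:.0f} fps vs card B {b:.0f} fps")
```

With the raster-only mix the hypothetical card A comes out ahead, while adding the two RT-heavy titles flips the result, which is exactly why two honest multi-game averages can disagree.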
 
There's not a chance in hell that he would ever get anything other than nVidia and it has nothing to do with price, quality or value. Read his post history.
I actually bought quite a few ATI/AMD cards in the past, which is a large part of the reason I currently do not.

That, along with Nvidia consistently delivering as far as the end-user experience is concerned.
 
Supposing that all the new cards were $350 cheaper? What'd be your take?
I would say a $200 premium for Nvidia would be much harder to justify at lower-end pricing. But on the high end it isn't all that much of a problem - for me.

And seeing how the 4080 has become the top seller at Newegg after the release of RDOA3, I think that's a notion shared by others.
 