AMD Radeon RX 7900 XTX Review: RDNA 3 Flagship is Fast

Let's tell it like it is:

ALL new graphics cards are heavily OVERPRICED.

It's the result of the coin-mining mania, inflation, and companies seeing that people will still buy GPUs at $2,000 just to PLAY games. They realized there are a LOT of people ready to pay very high prices.
So they adjusted their prices accordingly.

Now, back to today's new arrival.

In raster, it flat out beats the RTX 4080 on a price/performance basis.

When we include ray tracing, it loses heavily to that same 4080. BUT it does bring almost double the performance of AMD's last flagship, the 6950 XT.

Don't forget that drivers are not yet optimized for this GPU, while they are for all the others.
If the past is anything to go by, you will see improvements across the board, and it will get closer to the 4080 in RT and to the 4090 in raster.

For the price, if you do not need IMMEDIATE RT performance, it is a better buy than the 4080, especially if we take street prices into account rather than MSRPs.
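To make that price/performance point concrete, here's a rough back-of-the-envelope sketch in Python. Every price and average-FPS figure below is a placeholder assumption, not measured data; plug in the numbers from whichever review and retailer you trust.

# Dollars per average 4K frame, at MSRP and at an assumed street price.
# All figures are illustrative placeholders, not measurements.
cards = {
    "RX 7900 XTX": {"msrp": 1000, "street": 1100, "avg_fps_4k": 100},  # assumed numbers
    "RTX 4080":    {"msrp": 1200, "street": 1300, "avg_fps_4k": 95},   # assumed numbers
}
for name, c in cards.items():
    for price_kind in ("msrp", "street"):
        dollars_per_frame = c[price_kind] / c["avg_fps_4k"]
        print(f"{name}: ${dollars_per_frame:.2f} per average 4K frame ({price_kind})")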

That being said, I am disappointed that I would need to pay double to get double the performance. That is not what I'd call technological advancement.

Should we all ignore all of these new cards and push manufacturers to lower their prices?
While I would love to see this and I am willing to wait, the reality is they will be sold out and nothing will happen.

So stop whining about prices; we brought this upon ourselves by not being able to resist that "buy" button on our screens.
Considering all of Europe is supposedly allotted only 10,000 of these cards, I'm pretty sure they'll sell out no problem.

I think the question is, did AMD do this on purpose? Did they look at 4080 sales and release early in response, getting the most profit that way? Or are they trying to play the market by intentionally ordering only a small quantity to avoid looking silly like Nvidia with unsold stock?
IMO, graphics cards have had outrageous pricing starting with the RTX 2000 series. The only card that actually seems decently priced is the RX 6700 XT. Or at least it is in the US; here in Europe it's only decently priced on the used market.
 
Fact 1: All game developers still make games with the GTX 1060 in mind. Fact 2: Human eyes cannot distinguish above 24 fps. Fact 3: Average time spent gaming per month is negligible (about 10 hours for most people over 30).

So the real question is what the GPU can do for applications. And for applications, only Nvidia exists thanks to CUDA, and their newer models don't have enough memory.


Your facts have been stuck under a rock for a decade.....

-Fact 1: All game developers are making games for the new RDNA Consoles (XSX/PS5).
-Fact 2: The human eye can distinguish above 1000 frames per second. It's a continuous analog scan.
-Fact 3: As you get older, you start to game more than you did in your 30s. You sell off your ATVs, watercraft, and boats and build simulators for a racing league... so you can still jam with your friends virtually, now that you're older, have money, and the kids have moved out.


So the real question is: when you work hard for your money and want the best gaming experience, do you get caught up in little-boy marketing? Or do you see through it and buy based on price/performance as a logical criterion?
 
I think they are both pricing these cards high to sell off old stock, the remaining RTX 3000 and RX 6000 series cards, before the holidays.
At my local Micro Center, 6950 XTs are selling for about $100 under the 7900 XT.

If you still want a solid AMD card but want to save $100, look at the 6950 XT: it only falls behind the 7900 XT by around 5-10% (depending on resolution), and power draws are very close (the 6950 XT wins multi-monitor by 40W but loses the 20 ms spikes by 40W; everything else is within 10-20W).
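A quick way to sanity-check that trade-off is to compare the percentage you save against the percentage of performance you give up. The sketch below uses the 5-10% deficit mentioned above; the $900 street price for the 7900 XT is only an assumed placeholder.

# Does a $100 saving beat a 5-10% performance deficit? Prices are assumed placeholders.
price_7900xt = 900                      # assumed street price, placeholder only
price_6950xt = price_7900xt - 100       # the ~$100 saving mentioned above
for deficit in (0.05, 0.10):            # 6950 XT is 5-10% slower depending on resolution
    perf_6950xt = 1.0 - deficit         # 7900 XT performance normalized to 1.0
    value_ratio = (perf_6950xt / price_6950xt) / (1.0 / price_7900xt)
    print(f"{deficit:.0%} slower -> 6950 XT gives {value_ratio:.2f}x the perf per dollar")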
 
Let's be clear. >$1k GPUs are really bad value. All of them. Irrespective of what the model number is, or how fast or slow these products are relative to each other, there's really no way to rationally justify their value. Having missed two generations with my relative frugality, I'm probably going to end up buying a 4080/7900XTX, but I'm not going to pretend that I'm not being utterly dumb and frivolous.




 
I've read other reviews in addition to this one, and there are mixed feelings: some view the 7900 XTX as a near-perfect card earning the editor's choice award, others are a bit more ambivalent but positive, like this one. So which is correct? Spoiler alert: they both are.

Since ray tracing remains largely niche, this card really puts the RTX 4080 in a less desirable place than it was before. That's the XTX's main success, actually, and every reviewer seems to agree that the 4080 is a weaker option than it was before in the majority of (but not all) use cases.

The only workloads where it makes sense to use the RTX 4080 (like ray tracing or Blender) make one seriously consider getting the RTX 4090 instead of the 4080. The XTX and 4080 are neck and neck, with the XTX coming out slightly ahead in rasterization, and if you care about ray tracing or creative work and are willing to spend $1200, why not wait another month or so to save up and get the 4090 instead?

The reviewers aren't wrong here: if you care about DLSS, ray tracing, and power efficiency, the 4080 is a better product, but at the price point it is at, it's a hard sell. On the other hand, the XTX at $1000 doesn't help the performance per dollar figure much compared to prior generations, but at least you do get considerably more performance.

That said, if Nvidia lowers its price for the 4080, which I suspect it may be forced to do to resolve weak sales, AMD may do the same depending on how significant the price cut is. If that happens, that will make the XTX a much more palatable card when considering raw price.

Until that happens, I would say the XTX remains a great card, but its $1,000 price tag is a sticking point. So is it "just an 80" as the reviewers here on TechSpot say? Yes. Is it also an Editor's Choice as TechRadar, PCMag, and others say? Also yes. But which way you're personally inclined to feel about the card is going to depend entirely on what is important to you (ray tracing, price, etc.), and for falling short of wowing everyone, a score of 80 or 4 out of 5 stars seems like the more reasonable one to me.
 
Let's be clear. >$1k GPUs are really bad value. All of them. Irrespective of what the model number is, or how fast or slow these products are relative to each other, there's really no way to rationally justify their value. Having missed two generations with my relative frugality, I'm probably going to end up buying a 4080/7900XTX, but I'm not going to pretend that I'm not being utterly dumb and frivolous.
Absolutely... Mid-range cards allow you to upgrade more often and retain more resale value when you do upgrade. This has been my philosophy for GPUs, cars, and most products that you'll want to upgrade later. High-end stuff always comes at a premium and loses more resale appeal because it is no longer the top-tier product when the next thing comes out.
 
This is the biggest elephant in the room for me:

Our time with the Radeon 7900 XTX wasn't flawless either. We ran into a few game crashes and we spoke with other reviewers who suffered from the same kind of issues. This could simply be an issue with prerelease drivers that AMD will sort out in time for public release, or it could be a taste of something gamers will experience for weeks or months to come. We also ran into a frustrating black screen issue, that required us to disconnect and reconnect the display, the game didn't crash, but the display would flicker and go blank. This was rare and only happened twice in our testing, but it's worth mentioning given the other stability issues with the review driver.

I bought the 6900 XT on release with the reference cooler, and while it is a great card, it is still to this day (2 years on) riddled with random crashes, driver timeouts and other weird stuff. And the power consumption with multiple monitors is ridiculous. I love everything about the 7900 XTX and was ready to jump on it, but my main reason for replacing my 6900 XT was the stability issues. I am NOT going into another beta test of new AMD tech. There are also rumours about the MCM design and RDNA3 having hardware issues that will not be solved by driver updates. No thank you to first-gen MCM with tons of issues.
 
Let's be clear. >$1k GPUs are really bad value. All of them. Irrespective of what the model number is, or how fast or slow these products are relative to each other, there's really no way to rationally justify their value. Having missed two generations with my relative frugality, I'm probably going to end up buying a 4080/7900XTX, but I'm not going to pretend that I'm not being utterly dumb and frivolous.
Hear, hear! I dumped the cash into a 4080, and I love it. I walk around my home with a new vigor. Fourteen new chest hairs sprouted since I installed it. It's your money, spend it however you want.
 
Maybe at the $1,000+ luxury / epeen price point, where if you've decided to waste that much money, you at least want unqualified bragging rights even on junk you'll probably never use.

Once you get back to mainstream price levels, I prefer AMD's positioning: better cost, better compatibility with the case and power supply you already have, competitive performance for the features you actually care about (raster), and less fluff that you don't. Sure, it'd be nice if they had lower energy usage and fewer driver worries, but the wider audience won't be worrying about either anyway.

$200 in savings vs a $1,200 luxury card that you'd only buy because the $1,600 card is sold out is almost beside the point. But if they can deliver a 15-20% discount at the same raster performance on, say, a $300 mainstream card, they'll look very attractive to those buyers. Even more so if they maintain a VRAM lead as well.
I prefer AMD everywhere except the high end. In this case, if you're willing to throw a grand or more at a GPU, why not go all out and get the fluff? I'm not their target audience for these types of cards, but if I were, I would also pick Nvidia. They "feel" like the safer option to many people, me included, which is an underrated quality. I'm very interested in the RX 6700 XT's successor, as that is the most I'm willing to spend and I can just about max out its potential.

I would like to see AMD go beyond being the budget-friendly option and become a serious contender at these price points.
 
Let's be clear. >$1k GPUs are really bad value. All of them. Irrespective of what the model number is, or how fast or slow these products are relative to each other, there's really no way to rationally justify their value. Having missed two generations with my relative frugality, I'm probably going to end up buying a 4080/7900XTX, but I'm not going to pretend that I'm not being utterly dumb and frivolous.


How bad of a value is the 4K Asus ROG Swift OLED PG42UQ...?


The value market is at 1080p... and for $280, you can get an RX 6650 XT that runs Warzone 2.0 at 140-180 fps!
 
Yeesh, driver issues even on newer releases? That's a very bad sign. AMD is bad on older releases, but people usually brush that off. If they're bad on newer releases and have the poorer upscaling, RT, and encoding, why get the GPU, especially at these price points? As the writer states, at these price points why not spend the extra $200 for a much better product? That's also an issue for the 4080 vs the 4090: if you're going to spend four figures already, what's a couple of hundred more? Even at $500, most people will spend $50 more for a superior product, even $100 more. I think it's only around the $150-300 level that people go with the value option.

Also, what this excellent article showed me is that realistically, 1080p is still the target resolution for smooth framerates on more reasonably priced hardware (it's not reasonable right now overall, but more so). I got a 1440p 165 Hz monitor recently and I somewhat regret it; these framerates just aren't really to my liking. I guess FSR/DLSS is still a must since most people will take the frames, but it's questionable whether it isn't better to just run at a lower resolution, especially since you're reliant on the game having the option available.

I hope we get value-oriented GPUs for the mid-range at some point, and I hope we get a similar article for that. If I wanted to spend this much money, I'd just get the 4090, especially after reading this.
 
How bad of a value is the 4K Asus ROG Swift OLED PG42UQ...?


The value market is at 1080p... and for $280, you can get an RX 6650 XT that runs Warzone 2.0 at 140-180 fps!
The 6650 XT is a great 1080p card and even a good 1440p card for most games. I can't believe the MSI one on Newegg for $270 is still available. And of course there is also a $360 6700 XT as well. Either one of those is an excellent value option. To put it in perspective, the 6700 XT is about an XSX-level card and the 6650 XT is right about at PS5 level as far as GPUs go. The 6650 XT and 6700 XT beat the 3060 ($370) by about 12% and 24%, respectively, and the 6700 XT is on par with the 3060 Ti, with more VRAM. RT in this range is pointless anyway, so why pay a premium for mid-range?

For whatever reason, the 6800 XTs are just not there. I guess AMD is just not putting many of them out. If they followed this pricing you should be able to get one for about $499, but you can't even find those at MSRP. So the next option is the 6900 XT, which can be found for as low as $620.
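For what it's worth, here's a tiny sketch ranking those three cards by performance per dollar, using the prices and the rough +12% / +24% figures over a 3060 from the post above. Treat all numbers as approximations rather than benchmark data.

# (name, price in USD, performance relative to an RTX 3060 = 1.00, per the post above)
cards = [
    ("RX 6650 XT", 270, 1.12),
    ("RX 6700 XT", 360, 1.24),
    ("RTX 3060",   370, 1.00),
]
_, baseline_price, _ = cards[-1]        # normalize value to the RTX 3060
for name, price, rel_perf in sorted(cards, key=lambda c: c[2] / c[1], reverse=True):
    value = (rel_perf / price) * baseline_price
    print(f"{name}: {value:.2f}x the 3060's performance per dollar")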
 
At first I thought this review was too negative; it's actually the most negative review I've seen today, but as usual, Steve is right and I was wrong.
No, you really weren't. I thought that I was imagining things too but then I went back and looked at the review of the RTX 4080. They initially gave that card a score of NINETY only to lower it to 80 because they got a tonne of pushback on it. Techspot isn't reviewing fairly when the card that costs more but performs worse gets a positive review over the card that costs less but performs better.

Then there's that "driver issue" with Forza Horizon... Steve conveniently neglected to mention that he had RT turned on and claimed a "driver issue". Note the "RT High" in the graph:
[Forza Horizon 5 charts, 1440p and 4K, showing "RT High"]

NOTE: These were stuck in with the rasterisation benchmarks, not the RT benchmarks. That's pretty sneaky because I didn't see it right away and I bet nobody else did either.

Interestingly, TechPowerUp didn't experience anything like that with the RX 7900 XTX being about 1fps behind the RTX 4080, which is a tie:
[TechPowerUp Forza Horizon 5 4K chart]

There's no way that this was an accident. This was nothing short of an attempt by Steve Walton to mislead his audience and it pisses me off. The fact that he's so good at what he does means that he doesn't get to plead "it was an honest mistake" because we know damn well that he's not that stupid. This is as bad as when he "failed" to mention that nVidia was involved in the development of Spider-Man while he was showing us just "how much better nVidia is". After all those years of great reviews, I honestly can't believe that Steve would destroy the trust that he'd earnt over the last 15 years, but he did.

The waters get even muddier because "according to Steve", the fastest card in Far Cry 6 was the RTX 4090:
[Far Cry 6 4K chart from this review]

Yet somehow, TechPowerUp found something else entirely with several versions of the Radeon RX 7900 XTX, those being the OG, XFX MERC and ASUS TUF:
[TechPowerUp Far Cry 6 4K charts: reference, XFX MERC and ASUS TUF cards]

Now, I have no idea what settings TechPowerUp was using, but Guru3D has almost exactly the same numbers and specifies Ultra High Quality:
[Guru3D Far Cry 6 4K chart]

Here's Techgage agreeing:
[Techgage Far Cry 6 4K chart]

So, Steve didn't use ultra settings to test 2160p. That makes no sense to me because why on Earth would you test halo-level cards with anything below Ultra quality? The fact that Steve used High settings instead of Ultra is the reason that the RTX 4090 got a higher score than the RX 7900 XTX.

However, it's clearly a flaw in the testing methodology because the only people who will really be gaming at 4K will be doing it with halo-level cards, so ultra settings would be the most appropriate. Hell, I have an RX 6800 XT and I choose to game at 1440p because I can run any game maxed out, and I can't really tell the difference between 1440p and 4K from a graphical fidelity standpoint, even on a 55" panel. This means that, to me, 1440p ultra looks better than 4K high.
The bottom line with all of these new GPUs is that they're too expensive.
Yes they are, but you'll notice that the RTX 4080 is even worse, and the article he wrote about it was positively glowing compared to this. He's holding the RX 7900 XTX to a higher standard than the RTX 4080 and that's wrong, especially considering the outrageous MSRP of the RTX 4080. Considering how nVidia treated Steve and Tim when it tried blacklisting Hardware Unboxed, Steve has no reason to be so kind to nVidia. I honestly think that nVidia got their message across that HU had better fall in line, or else.

One of those demands may have been that Far Cry 6 was only to be shown with high settings at 4K because nVidia clearly has a weakness there. Note that for the RTX 3090 Ti, Steve used 4K ULTRA settings:
[Far Cry 6 4K Ultra chart from the RTX 3090 Ti review]

However, he then inexplicably switched from ultra with HD textures to simply "high" for the review of the RTX 4080:
[Far Cry 6 4K High chart from the RTX 4080 review]

This change in testing methodology was never explained and it doesn't make any sense either. When everything down to the RX 6800 can get >60fps at ultra, there is no reason to change from that. If you want to know a GPU's performance, you use 4K ultra. It's just like if you want to know a CPU's performance, you use 720p low or 1080p low depending on a game's options and the level of GPU you're testing with.

This is bad because it's not consistent. If the standard testing methodology changes, you need to explain to your audience that it has and why. Sure, it says ultra and it says high but who the hell can remember what the settings were on the last card?

In this case, the change was critical because it directly affected the outcome, and it's dishonest. If the RX 7900 XTX is faster than the RTX 4090 in Far Cry 6 at 4K ultra settings, then it IS the faster card. The fact that the RTX 4090 is faster at 4K high is irrelevant, just as the 1440p results, while important, do not decide which card is ultimately the fastest, because 1440p is not as strenuous a test as 4K ultra.

It gets even worse than that because there's no Far Cry 6 RT chart. Techgage showed that there's at least one game in which the RX 7900 XTX is every bit nVidia's equal:
[Techgage Far Cry 6 4K ray tracing chart]

Sure, it's just ONE game but this is seriously important because it completely defies expectations.

It actually gets better for the RX 7900 XTX because, as it turns out, the RTX 4080 is only 15% faster in RT performance than the RX 7900 XTX according to TechPowerUp:
[TechPowerUp relative RT performance chart, 4K]

This actually surprised the hell out of me because the impression I got from Steve was that the RT performance of the RX 7900 XTX was somewhere around that of the RTX 3080. Clearly, this is NOT the case.

Sure, the RTX 4090 destroys the RX 7900 XTX, but it also destroys the RTX 4080. A difference of 15% is like comparing an RX 6800 XT with an RX 6950 XT. Sure, it's there but it's not worth paying extra for, something that Steve himself said when reviewing the RX 6950 XT. Now suddenly he changes his tune when it comes to RT? Remember that nVidia tried banning Hardware Unboxed for not paying enough attention to RT and DLSS so I don't buy it for a second that he really believes that.

I don't know what it is, but there's something rotten in the state of Denmark and nVidia's involved. My faith in Steve Walton has been completely shattered by this hit piece that he dared to call "a review". I used to believe that he was beyond things like this but the more I look at this "review", the more I find wrong with it.

Steve turned the settings down in Far Cry 6 from the RTX 3000-series to the RTX 4000-series. That makes ZERO sense unless nVidia wanted him to hide a weakness that exists in Ada Lovelace. There can be no other reason that I can think of based on the facts available.

Then he even had the nerve to stick a RAY-TRACING test in with the rasterisation tests, blame some mythical "driver issue" for its seemingly weak performance, and then go on to bash it further. What could be the reason for that if not to re-stoke the fears of the uninitiated that Radeon drivers are bad? This is LITERALLY the most dishonest hardware review that I've ever seen. This is JayzTwoCents-level bad, made even worse by the fact that we know what to expect from Jay. From Steve Walton... I never expected this.

Feel free to fact check everything that I said.
 
The AMD hype around this launch created high expectations. AMD wanted to go after Nvidia's best, but after the 4090 came out they deflated and came up with "it's not a 4090 competitor". The marketing team changed direction toward the unreleased 4080.

After watching many hours of video reviews and reading other articles about this, I came to the conclusion that it's not such a bad card; the price is just wrong for the current market. Right now it trades blows with the 4080, and maybe with some driver fixes it will perform better. But knowing AMD, it will be at least 5-6 months before any noticeable improvement.

And if a rebate on AMD's side is expected in the coming months, it will increase the value even more for those not willing to pay a grand right now. Only the sales figures over the next 2-3 months will tell the full story.

These days, small YouTubers tend to tell the truth more than the big names do. Some big ones got dirty on the way to the top.

After many years of reading Hexus, X-bit Labs, Guru3D, AnandTech, Tom's Hardware and TechPowerUp reviews, the HU ones look like they don't quite have it right yet.
Also, the editors here are not arranging the information into an easy-to-read format; I have to look 3-4 times to find where the test system and the drivers used are described. But the staff here are responsive to user requests, and that's a good sign.
 
No, you really weren't. I thought that I was imagining things too but then I went back and looked at the review of the RTX 4080. They initially gave that card a score of NINETY only to lower it to 80 because they got a tonne of pushback on it. Techspot isn't reviewing fairly when the card that costs more but performs worse gets a positive review over the card that costs less but performs better. [...]

Feel free to fact check everything that I said.
My biggest gripe is that Steve is upset that AMD's uplift doesn't match what they claim on their slides:

COD: Modern Warfare 2 150%
Watch Dogs: Legion 150%
Cyberpunk 2077 170%
Resident Evil: Village 150%
Metro Exodus 150%
Doom Eternal 160%


Then he tests 3 of the 6 games they listed and gets upset that the games he tested didn't get close to a 1.5x improvement. But then he calls COD MW2 an outlier.
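For anyone curious how far off the slide numbers are, here's a small sketch that averages AMD's claimed per-game uplifts (geometric mean) and compares them against a measured average. The 1.30x "measured" figure is only an illustrative placeholder, not a number from this review.

import math

claimed = {                              # uplift multipliers from AMD's slide, as listed above
    "COD: Modern Warfare 2": 1.5,
    "Watch Dogs: Legion": 1.5,
    "Cyberpunk 2077": 1.7,
    "Resident Evil: Village": 1.5,
    "Metro Exodus": 1.5,
    "Doom Eternal": 1.6,
}
geo_mean = math.prod(claimed.values()) ** (1 / len(claimed))
measured = 1.30                          # placeholder for the tested average uplift over the 6950 XT
shortfall = (geo_mean - measured) / geo_mean
print(f"Claimed average uplift: {geo_mean:.2f}x, measured: {measured:.2f}x, shortfall: {shortfall:.0%}")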
 
What's up with all the negativity in the review? It almost reads as an editorial for Nvidia. Faster than the 4080 for $200 less.

It seems AMD needs to be 30% faster than Nvidia while costing 50% less for some reviewers to think it's worth it. I'm exaggerating for effect here, but the mentality holds.
It's not faster than the 4080, on a consistent basis. Some games, yes, others no. It's pretty much on par with the 4080, not the 4090 as people had hoped. And that is raster only. Turn on RT and the 4080 wins. The only "value" here is that it's $200 less, but at $1,000, there's no real value here. That is why people are unhappy. They were looking for a $1,000 4090 but only got 80 class performance, with no RT.
 
My biggest gripe is that Steve is upset that AMD's uplift doesn't match what they claim on their slides:

COD: Modern Warfare 2 150%
Watch Dogs: Legion 150%
Cyberpunk 2077 170%
Resident Evil: Village 150%
Metro Exodus 150%
Doom Eternal 160%


Then he tests 3 of the 6 games they listed and gets upset that the games he tested didn't get close to a 1.5x improvement. But then he calls COD MW2 an outlier.
AMD's overall marketing was misleading. People were expecting 50-70% gains across the board when what they got was closer to 25-30%. 6 games, cherry-picked, is neither accurate nor truthful. I think even AMD is surprised at the performance, or lack thereof. Drivers may help down the road, but then again, they may not.
 
It's not faster than the 4080, on a consistent basis. Some games, yes, others no. It's pretty much on par with the 4080, not the 4090 as people had hoped. And that is raster only. Turn on RT and the 4080 wins. The only "value" here is that it's $200 less, but at $1,000, there's no real value here. That is why people are unhappy. They were looking for a $1,000 4090 but only got 80 class performance, with no RT.
That's how averages work: yes here, no there, but taken all together it's faster than the 4080, even if only by a little. A win is a win.

I only care about rasterization performance, not RT. Even with the 4090, 4K RT is still not there yet, and by "not there" I mean high fps (100+) on ultra without upscaling.
 
AMD's overall marketing was misleading. People were expecting 50-70% gains across the board when what they got was closer to 25-30%. 6 games, cherry-picked, is neither accurate nor truthful. I think even AMD is surprised at the performance, or lack thereof. Drivers may help down the road, but then again, they may not.
Yeah, I listened to Moore's Law is Dead on the way to work this morning, and it does sound like there's a chance the drivers just aren't there yet for these cards. Then again, it also sounded like there could be a more fundamental flaw with the MCMs. I would not be surprised if AMD ends up getting a bit more performance out of these cards, which is even more reason not to be an early adopter. You might get one later for $600 with the performance originally promised. Then again, you might not.
 
No, you really weren't. I thought that I was imagining things too but then I went back and looked at the review of the RTX 4080. [...] Interestingly, TechPowerUp didn't experience anything like that with the RX 7900 XTX being about 1fps behind the RTX 4080, which is a tie. There's no way that this was an accident. [...]

Feel free to fact check everything that I said.

Remember that W1zzard at TPU doesn't use built-in benchmarks when he tests games. He finds an area in a game that is closer to actual gameplay than the benchmarks generally are. Most folks don't realize that the built-in benchmarks in games don't really represent the experience you'd see in the game.

I can't speak for other testers, but I do know W1zzard does not use any built-in benchmarks in games. Sometimes that's why you'll see differences in his results in some games compared to other sites.
 
Looking at the hardware specs, this card will only grow in the benchmarks, because it has a lot more raw potential than the 4080. With better drivers and more games using its features to their full potential, it will squash the 4080 and come closer to the 4090 in everything except RT.

This will be a rerun of the RX 590 vs the GTX 1070, where in early benchmarks the 1070 wiped the floor with the 590, not even in the same class, only to fall behind in later games and benchmarks. AMD has always lagged with drivers and developer support, but eventually they get there.
 
But they give the 4090 a 90/100.
That's because nVidia would accept nothing less. Haven't you noticed that their pro-nVidia bias happened after nVidia tried banning them? I think that nVidia succeeded at scaring them into capitulation.
The 7900 XTX's cost per frame is really good; however, RT performance takes a hit.
It's not as bad as you think because it's only a measly 15% behind the RTX 4080 in RT performance overall according to TechPowerUp, not that Steve Walton would ever say it:
[TechPowerUp relative RT performance chart, 4K]

And get this, according to Techgage, the RX 7900 XTX actually defeated the RTX 4080 in RT performance in Far Cry 6 by 13% and was only 2fps slower than the RTX 4090:
[Techgage Far Cry 6 4K ray tracing chart]

The 4090 has great performance, but it is the most ridiculous GPU I have ever seen. The price is beyond stupid and the design is a joke.
Yup, I couldn't agree more. The thing is, there are plenty of *****s out there who have a lot of money but no cocaine habit to curb their spending. :laughing:
IDK the reason, what's to blame? Probably 1gen MCM, weak drivers?
Unless the drivers are garbage at this stage, colour me majorly underwhelmed.
Wow, I always knew you were a hard-core AMD fan and a forever NVIDIA hater, but you keep impressing me. I usually don't dig deep into reviews like this to find misleading tactics because, in the end, they need whatever sponsors they can get to survive.
I honestly don't care about that and neither do people like Steve Burke. If a "review" is tainted, I WILL attack it and this is as tainted as I've ever seen. It's even more overt than when Hardware Canucks tried to do a hit-piece on the R7-1800X and I tore SKYMTL a new one for his "trouble". Just like here, I backed up everything I said and they, like most of the people here, couldn't say that I was wrong.
What I'm curious about is: if what you say is true, then why did NVIDIA treat Hardware Unboxed the way they did last year? I mean, I might be the dumbest guy, but Hardware Unboxed and TechSpot are closely related, right? All their themes and info are the same.
It "put the fear of God" into them because they've been slowly moving in a green direction ever since. Did you not notice how, in the Spider-Man benchmark, Steve "conveniently" forgot to mention that nVidia was involved in the development of the game? No, he just said that AMD's drivers sucked, trying to stoke the flames of fear that have kept people buying nVidia for so many years. Somehow, nVidia got to Steve and while I don't think he's happy about it, I won't sit silently while a travesty like this "review" gets published.

I defended Steve on more than one occasion but on this occasion, he deserves no defence. That's why we have this forum.
BTW, back to the topic, which is the 7900 XTX: I think they tried, and their road looks really promising. Once drivers are updated and programs and games are optimized, the 7900 XTX might show its true value. Right from the beginning they said it would compete with the 4080, and I think they did well.
Yes they did. In fact, they made the RTX 4080 essentially irrelevant or at least they would have if there weren't enough crazy people to keep buying them just because they're in a green box.
My biggest gripe is that Steve is upset that AMD's uplift doesn't match what they claim on their slides:
Steve may be pretending that this is his gripe, but he's far too competent and professional to be affected by that. I'll tell you what happened: he didn't want to talk about RT or DLSS, and nVidia scared him with that ban. Since then, he has made DAMN SURE to have SEVERAL references to RT and/or DLSS in ALL of his reviews. He was even dishonest when he didn't tell us that nVidia was involved in Spider-Man's development, leaving many wondering why the Radeon drivers weren't ready but the GeForce drivers were.
COD: Modern Warfare 2 150%
Watch Dogs: Legion 150%
Cyberpunk 2077 170%
Resident Evil: Village 150%
Metro Exodus 150%
Doom Eternal 160%


Then he tests 3 of the 6 games they listed and gets upset that the games he tested didn't get close to a 1.5x improvement. But then he calls COD MW2 an outlier.
AND he snuck in an RT test in the rasterisation test section and tried to say that AMD had a "driver issue" in Forza Horizon. This isn't a review, it's a hit-piece, brought to you by nVidia.
Remember that W1zzard at TPU doesn't use built-in benchmarks when he tests games. He finds an area in a game that is closer to actual gameplay than the benchmarks generally are. Most folks don't realize that the built-in benchmarks in games don't really represent the experience you'd see in the game.

I can't speak for other testers, but I do know W1zzard does not use any built-in benchmarks in games. Sometimes that's why you'll see differences in his results in some games compared to other sites.
That's not what happened here. Steve wasn't using ultra settings in Far Cry 6 like the others were. His numbers for ALL cards were higher. Plus, he snuck in a Forza Horizon RT test where it didn't belong and claimed that there was a "Radeon Driver Problem". There's no excuse for that. Believe me, I wish I were wrong because I've trusted Steve since I first read his HD 4870 review. This is not a happy day for me in the least but I serve the truth, not a person or a concept.
 
Let's be clear. >$1k GPUs are really bad value. All of them. Irrespective of what the model number is, or how fast or slow these products are relative to each other, there's really no way to rationally justify their value. Having missed two generations with my relative frugality, I'm probably going to end up buying a 4080/7900XTX, but I'm not going to pretend that I'm not being utterly dumb and frivolous.
Unfortunately, this is the reality. $1k+ flagships are becoming the norm, not just for graphics cards but for almost everything else, including smartphones.

The bread we buy today is a far cry, affordability-wise, from what we paid three decades ago.

No matter what we say, prices will keep increasing, no thanks to the consumers who are ready to splash $1k or $2k. People's earnings have also increased, hence the buying power.

Yeah, I too will probably be buying the 7900 XTX or the eventual 7950 XTX. I bought my current 5700 XT at MSRP; at that time, the scalper phenomenon hadn't taken off yet.

I think I will wait another 6 months to upgrade, though.
 
At my local Micro Center, 6950 XTs are selling for about $100 under the 7900 XT.

If you still want a solid AMD card but want to save $100, look at the 6950 XT: it only falls behind the 7900 XT by around 5-10% (depending on resolution), and power draws are very close (the 6950 XT wins multi-monitor by 40W but loses the 20 ms spikes by 40W; everything else is within 10-20W).
If the price difference is about $100, I would go for the newer-tech card. Future driver optimizations will only make it better, and using modern tech is better for future-proofing.
 
Unfortunately, this is the reality. $1k+ flagships are becoming the norm, not just for graphics cards but for almost everything else, including smartphones.

The bread we buy today is a far cry, affordability-wise, from what we paid three decades ago.

No matter what we say, prices will keep increasing, no thanks to the consumers who are ready to splash $1k or $2k. People's earnings have also increased, hence the buying power.

Yeah, I too will probably be buying the 7900 XTX or the eventual 7950 XTX. I bought my current 5700 XT at MSRP; at that time, the scalper phenomenon hadn't taken off yet.

I think I will wait another 6 months to upgrade, though.
On the other hand, it means that for many people upgrades will become less frequent.
The main reason I have an old PC is that I just couldn't spend that much money on cool new parts.
I also plan to keep my phone longer, as long as it works fine. Things are getting more expensive, and purchases are becoming less frequent.
 