Enthusiast proves AMD's RX 7900 XTX can reach RTX 4090-like performance levels

nanoguy

In brief: AMD's philosophy with RDNA 3 was to develop better-value products that draw gamers away from Team Green's RTX 40 series offerings rather than to go for the performance crown. However, removing the power limit on the RX 7900 XTX reveals that the underlying architecture is technically capable of reaching a higher performance tier.

Nvidia's GeForce RTX 4090 graphics card is the undisputed performance king and it probably will be for a while. Our own Steven Walton called it "brutally fast," which is an accurate description for a card that can be bottlenecked by relatively capable processors like the Ryzen 7 5800X3D even when playing games at 1440p.

AMD has chosen not to launch a competitor to the RTX 4090 and instead opted to offer an alternative to Nvidia's RTX 4080. During a Q&A session last year, the company explained that the main reason behind this decision was its philosophy of building great value products for gamers, which is also why it uses a chiplet architecture for its RDNA 3 GPUs. At the same time, we were told it didn't want to create a higher-end product with extreme power and cooling requirements just to prove that it could.

That said, an enthusiast who goes by "jedi95" on the AMD subreddit sought to test the limits of the Radeon RX 7900 XTX, and their experiment led to some interesting findings. For one, removing the power limits allowed the GPU clock speed to climb as high as 3,467 MHz. The average clock speed was around 3,333 MHz during benchmarks, which is also an impressive figure.

More importantly, the card was able to achieve a graphics score of 19,137 points in 3DMark's Time Spy Extreme test. For reference, the RTX 4090 across all its incarnations achieves an average of 19,508 points in the same benchmark, with the world record standing at 24,486 points at the time of writing. As you'd expect, the modded Asus TUF RX 7900 XTX used in this experiment was a massive power sink, with a peak power draw of 696 watts.
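
For a rough sense of how close that puts the card to Nvidia's flagship, here is a minimal back-of-the-envelope sketch in Python using only the figures quoted above; the 450-watt value is the 4090's rated power limit (mentioned further down in the discussion), used as an assumed reference point rather than a measured draw.

# Back-of-the-envelope comparison of the modded RX 7900 XTX result against
# the RTX 4090's average Time Spy Extreme graphics score, using the numbers
# quoted above. The 450W figure is the 4090's rated power limit, assumed
# here as a reference point, not a measured power draw.
xtx_score, xtx_peak_power_w = 19_137, 696        # modded Asus TUF RX 7900 XTX
rtx4090_avg_score, rtx4090_limit_w = 19_508, 450
world_record_score = 24_486

print(f"XTX vs 4090 average score: {xtx_score / rtx4090_avg_score:.1%}")   # ~98.1%
print(f"XTX vs world record:       {xtx_score / world_record_score:.1%}")  # ~78.2%
print(f"Peak power vs 4090 limit:  {xtx_peak_power_w / rtx4090_limit_w - 1:+.0%}")  # ~+55%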

The enthusiast used a high-end custom water cooling loop to tame the GPU during testing, though they believe the stock cooler on the TUF RX 7900 XTX should also be able to handle thermals when power consumption is pushed beyond the 430-watt power limit.

Still, it's clear AMD optimized the RX 7900 XTX for better performance per watt, as most users aren't going to run this card with an exotic cooling solution. As for the potential performance improvements in actual gaming scenarios, jedi95 says the 700-watt RX 7900 XTX was around 15 percent faster in Cyberpunk 2077 at 1440p using the built-in benchmark with graphics set to the Ultra preset. Warzone 2 gains compared to stock were a touch over 17 percent at the same resolution, which is also an impressive result.
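
To put those gains against the extra power involved, here is a minimal sketch of the scaling implied by the numbers above, assuming the stock 430-watt limit and the 696-watt peak as the two operating points.

# Diminishing returns implied by the reported gaming results: raising the
# power ceiling from the 430W stock limit to the 696W peak (~62% more power)
# bought roughly 15-17% more performance at 1440p. "Scaling efficiency"
# below is simply performance gain divided by power gain.
stock_limit_w, modded_peak_w = 430, 696
gains = {"Cyberpunk 2077 (Ultra, 1440p)": 0.15, "Warzone 2 (1440p)": 0.17}

power_gain = modded_peak_w / stock_limit_w - 1          # ~0.62
for game, perf_gain in gains.items():
    print(f"{game}: +{perf_gain:.0%} performance for +{power_gain:.0%} power "
          f"(scaling efficiency ~{perf_gain / power_gain:.2f})")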

It's unlikely that AMD will release a higher-end RDNA 3 SKU, but a recent leak suggests the company may have been toying with an RX 7950 XTX. After all, Team Red did previously say the RDNA 3 architecture was designed to achieve clock speeds of 3 GHz and beyond. We'll have to wait and see, but then again, such a card would be an expensive, power-hungry product in a GPU market that desperately needs better mainstream offerings.


 
Maybe on the next process node - of course, they are likely working on improvements that go beyond the process node, too.
And Nvidia will likely be moving to 3nm too, bringing even more power efficiency. I'm looking forward to the 5090. Hopefully AMD can keep up. Intel is off in a lake somewhere.
 
Not bad, if only AMD could offer this tier of performance at lower power consumption...
I think that's the fundamental problem. To get this level of performance you need more power. AMD alluded to this in one of the interviews that I saw. What we need is a technology breakthrough that can deliver higher performance at lower power consumption.
 
You mean like the 4090? At half that 700W power consumption, Nvidia's card is faster.

In fact, in some games the 4090 consumes LESS power than the Sapphire Nitro+ 7900 XTX at its stock configuration, and is 20% or more faster.

I have RDNA 2 and wanted to pull the trigger on the Nitro+ 7900 XTX, but its high idle power consumption with dual high-refresh-rate displays, coupled with the high gaming power draw (which, as I said, can exceed the 4090's at times), made me skip the RDNA 3 generation.

In fact, I'm probably going to ditch AMD and come back to GeForce with the 5090, but I will wait to see what RDNA 4 has to offer.
 

Exactly. While this was a nice science experiment, a card that consumes 700W is not practical, especially when you can get a 4090.
 
Well, to be fair to AMD, the 7900 XTX consumes roughly 380 to 500W depending on the game and on whether it's the stock AMD card or an overclocked aftermarket card, but nonetheless, the 4090 is still relatively power efficient.

Of course, there's the issue of the Nvidia tax, but from a technological point of view, Lovelace is impressive.
 
Something strange: I think I missed the announcement from AMD stating that the 7900 XTX was a direct competitor to the 4090.

Or that both cost the same amount of money, or worse, that the 7900 XTX's MSRP is actually higher than the 4090's?
 
You mean like the 4090? At half that 700W power consumption, Nvidia's card is faster.

In fact, in some games the 4090 consumes LESS power than the Sapphire Nitro+ 7900 XTX at its stock configuration, and is 20% or more faster.

I have RDNA 2 and wanted to pull the trigger on the Nitro+ 7900 XTX, but its high idle power consumption with dual high-refresh-rate displays, coupled with the high gaming power draw (which, as I said, can exceed the 4090's at times), made me skip the RDNA 3 generation.

In fact, I'm probably going to ditch AMD and come back to GeForce with the 5090, but I will wait to see what RDNA 4 has to offer.
No, unfortunately not like the 4090. The 4090 may be more efficient than AMD's cards, but it is still a power hog. That's not to say it's a bad card, just power hungry (and expensive). What I'm talking about is a major breakthrough: 4090 performance at 250 watts, that kind of thing. AMD may meet or exceed 4090 performance, but I guarantee you it will come with a hefty power bill.
 
Something strange: I think I missed the announcement from AMD stating that the 7900 XTX was a direct competitor to the 4090.

Or that both cost the same amount of money, or worse, that the 7900 XTX's MSRP is actually higher than the 4090's?
I think that was coming more from the 'peanut gallery' than from AMD. I'm pretty sure AMD was pushing 80% of the performance at 30% less cost, or dollars per frame.
 
"During a Q&A session last year, the company explained the main reason behind this decision was its philosophy of building great value products for gamers..."

So when are they going to do that? So far it's been nothing but high-end and overpriced cards, just like Nvidia.
 
The 12VHPWR connector can deliver 600W of total power. The card is capped at 450W, but I've seen overclockers getting above 550W.
Keywords: can deliver
Actual consumption while gaming is closer to 350W.
550W with an OC is still a far cry from 700W.
 
I said 600W. I did mistype 700W in one response, but to you I said 600W. 550 is ~90% of 600. The point is that power is driving more performance, and we are at or near a threshold where getting to the next level of performance is going to require more power. I'm sure that's how AMD is going to get to 4090 levels of performance. I wouldn't be surprised if that card is drawing 500W+.

I don't think this is sustainable in the long run for either AMD or Nvidia. Maybe we will see more power optimization in the next go-round. I suspect software will play a big part as well.
 
I saw the 600W you wrote. 12VHPWR, got it. But the 4090 uses around 350W, which is half of the 700W the XTX needed to get close to it. You asked for better efficiency and I gave you a killer example, and you don't seem happy? How come?
 
In my opinion, AMD's RDNA series is not as power efficient as it seems. For example, when comparing RDNA 2 with Ampere, the former uses TSMC's 7nm, while the latter uses a mature Samsung 10nm node refined enough to be called 8nm. I feel that if Ampere had been built on TSMC 7nm, it would have been substantially more power efficient and faster. Now that Nvidia is using TSMC's 4nm for Ada, we can start to see AMD falling behind. I feel AMD clearly missed their performance-versus-power target with RDNA 3. The positives about RDNA 3 are that it does not get bottlenecked by the CPU as quickly, and there are some titles in which the RX 7900 XTX actually leads the RTX 4090.
 
Performance is only part of the deficit; drivers are the other, and no amount of voltage can redress that. I ended up putting my EVGA RTX 3080 Ti back in my main rig after getting tired of AMD driver issues, ranging from forced anisotropic filtering not working, to Combat Mission series games crashing, to disabling vsync not working in Shadow of Chernobyl. Also, in my case the 7900 XT was about even overall in performance with my 3080 Ti on both my i9-10920X and i7-10700KF across 70 games tested at 1920 and 3440.
 
Something strange: I think I missed the announcement from AMD stating that the 7900 XTX was a direct competitor to the 4090.

Or that both cost the same amount of money, or worse, that the 7900 XTX's MSRP is actually higher than the 4090's?

Just an IMHO, but this assumption that the 7900 XTX failed in trying to compete with the 4090 is as much a bad-faith myth as the continuing one about AMD drivers. It holds as much water as a sieve... or a 4090's heat sink.

It's an elitist opinion spread by those who can afford Nvidia's best to those who can't, even those who'd actually be better served buying AMD. That's not to say the 4090 is bad itself, just (for me) not worth the £500-700 premium (all prices depending on card model/status) over a 7900 XTX, with 4080s already £200 more.
For me, even if I keep my favoured 3440x1440 resolution through any upgrade, the only real difference between my current 6800 XT and a 7900 XTX would be that one of them will more consistently max out my 144Hz refresh rate at high-to-ultra settings. The price of either, the 6800 XT (when I bought it, May 2021, not a good time but needs must, etc.) or a 7900 XTX, would be about the same, £1,200 max. Again, the main difference there being that the 3080 and its better RT versus the 6800 XT (for which upscaling was and still is compulsory) wasn't worth 50-100% more, any more or less than the 4080 would be worth £200 more than a 7900 XTX for a smaller difference this gen.

But this is where the PC/gamer mindset is (even after AMD came from nowhere to first kick Intel into shape and then catch up to Nvidia). The idea that second best in a hotly contested space might as well be miles behind is as foolish as Nvidia fans and more casual types alike putting RT performance ahead of base/raster performance or VRAM caps. Never mind that, with 'better' RT being Nvidia's main remaining hype/mindshare advantage and as yet unchallenged, they still aren't equipping their more expensive cards with enough assets to meet that demand without a huge performance loss. I dunno, seems counter-something to me.
 
Very well said.

The problem is, the media will never say anything like what you just did.

Instead, they keep pushing the 4090 down our throats with a very clever play on words and facts.

You never hear one of them say whether the performance hit of enabling RT is worth it for the (many times barely visible) graphical improvements that result from enabling it. Instead, they double down by saying "but just use DLSS 3's glorious frame generation!", which is simply cheating in my eyes, since you are only getting "inflated" FPS with fake frames inserted.

Lastly, as I originally posted, AMD never stated that the 7900 XTX was competing with the 4090, so why is that always ignored?

So in the end, I don't place the blame entirely on gamers, since many of them follow what the media tells them and, as stated above, they are not receiving the whole "story".
 
I saw the 600W you wrote. 12VHPWR, got it. But the 4090 uses around 350W, which is half of the 700W the XTX needed to get close to it. You asked for better efficiency and I gave you a killer example, and you don't seem happy? How come?
Here's an interesting article regarding 4090 power draw. As you can see from the chart, some games are pushing well above 400W, and close to 500W on average. Look at the peak power draw and you have games drawing over 600W, and in some cases over 700W. Even Tom's Hardware stated that getting that last 5% out of the 4090 increases power consumption by 15-20%. Average power draw is all well and good, but a good design accounts for those peaks, otherwise your PSU could shut down.

This is not to say the 4090 is a bad GPU; I'm just trying to show the correlation between power and performance, and high-end GPUs are drawing a lot of power. Nvidia seems focused on monolithic chip designs, which may be optimized but still draw significant power. AMD went the chiplet route, but they don't seem to have optimized the power draw yet.

I think to get that next level of performance we are going to see a jump in power. We saw that from the 3090 (350W) to the 4090 (450W), and I can only imagine what the 5090 is going to require. I don't think this is a sustainable path for GPUs.
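
As a quick illustration of that trade-off, here is a minimal sketch that takes the quoted "last 5% for 15-20% more power" figure at face value and shows what it does to performance per watt; the inputs are the figures above, not measurements.

# Perf-per-watt cost of chasing the last few percent on a 4090, taking the
# quoted "about 5% more performance for 15-20% more power" at face value.
base_perf, base_power_w = 1.00, 450          # normalized performance at 450W
extra_perf = 0.05
for extra_power in (0.15, 0.20):
    pushed_ppw = (base_perf + extra_perf) / (base_power_w * (1 + extra_power))
    baseline_ppw = base_perf / base_power_w
    print(f"+{extra_power:.0%} power for +{extra_perf:.0%} perf: "
          f"perf/W changes by {pushed_ppw / baseline_ppw - 1:+.1%}")   # -8.7% / -12.5%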
 
Here's an interesting article regarding 4090 power draw. As you can see from the chart, some games are pushing well above 400W, and close to 500W on average. Look at the peak power draw and you have games drawing over 600W, and in some cases over 700W.
That's a 4090 that's had its BIOS flashed to one with a 500W power limit -- standard 4090s are capped at 450W, before throttling kicks in. TechPowerUp's power analysis of the Founders Edition version shows how such a card behaves.

We saw that from the 3090 (350W) to the 4090 (450W), and I can only imagine what the 5090 is going to require. I don't think this is a sustainable path for GPUs.
The 3090 Ti has a 450W TDP, so the jump isn't as large as it might seem at face value.
 