AMD Radeon RX 9070 XT Review: Have They Finally Done It?

Absolutely not, it won't "be a great upgrade option for those who have skipped a few generations."

I am sitting on a GTX 1080.

I will only consider upgrading AFTER I see the NVIDIA 6th Gen cards and the corresponding AMD ones.

Right now, you are in the midst of a huge cluster-party, similar to the one that happened during Covid with ETHEREUM.

Even RTX 3050s sell for $500.
While I agree that GPUs are getting too expensive, it seems odd for someone who paid ~$599 for a GPU 9 years ago to then argue against the cost of a newer, much better $599 GPU. I am not trying to be sarcastic or bust your chops; I am merely trying to understand your thought process.

I have not seen a 3050 cost $500 in two years.
 
The GTX 1080 launched at a $599 MSRP in 2016 ($599 for the custom AIB cards; the Founders Edition was $699), before the crypto, NFT, and AI BS craze. If a $600 MSRP for a pretty good performer (especially with inflation!) is somehow not good enough in 2025, then you may be holding on to that 1080 forever.
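
To put the inflation point in numbers, here's a rough back-of-the-envelope check in Python; the ~31% cumulative US CPI figure for 2016 to 2025 is an approximation I'm assuming, not an official number:

```python
# Rough real-terms comparison of the GTX 1080's 2016 MSRP.
# ASSUMPTION: ~31% cumulative US CPI inflation, 2016 -> 2025 (approximate).
msrp_2016 = 599
cumulative_inflation = 0.31

msrp_in_2025_dollars = msrp_2016 * (1 + cumulative_inflation)
print(f"${msrp_in_2025_dollars:.0f}")  # ~$785, so a $599 card today is cheaper in real terms
```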
 
It's your test that's broken, Steve, not AMD's. TPU obtained better figures (9070 XT vs the 7900 GRE) than those presented by AMD:

Stalker
- 1440p: +35%
- 4K: +40%
https://www.techpowerup.com/review/sapphire-radeon-rx-9070-xt-nitro/27.html

Starfield
- 1440p: +26%
- 4K: +33%
https://www.techpowerup.com/review/sapphire-radeon-rx-9070-xt-nitro/28.html

Dragon Age
- 1440p: +30%
- 4K: +31%

And it's no use saying that W1zzard tested a light scenario; he always chooses the most GPU-intensive scenarios.

Your conclusion is misleading because, as well as being far below other reviewers' numbers as I've already mentioned, it includes obviously broken or irrelevant games that run at 400 fps.
Beware... I was banned from the TechSpot forums for a week for criticizing Steve.
 
I think that no matter how you score these cards, none are worth the asking price, never mind scalped pricing.

GPUs are just not getting better every launch like they used to. Sad state for PC gaming currently.
 
Come on! Not a word about the Indiana Jones result being completely bizarre??
Both Indiana Jones and Black Myth: Wukong are a joke on non-NVIDIA cards. Are the highest settings in these games path tracing? So now that the 9070 XT finally does okay at ray tracing, it still cannot do path tracing?

I really think these games are either totally broken, or NVIDIA paid bribes to make them effectively NVIDIA exclusives at high ray tracing levels (complete with for(i = 0; i < 100000; i++) busy-loops when AMD cards are detected). In either case they should not be included in any rational benchmark.
 
So it has been a week, and there have been plenty of options to pick up a 9070 XT or 9070: just over £600 for the 9070, or just shy of £700 for the XT (including 20% VAT/sales tax). 5070s are sitting around £640, and most 5070 Tis are just shy of £1,000.
If I hadn't bought an RX 7800 XT last year, I know what I'd be looking at to replace my 1080 Ti.
 
@Steve
I’m not entirely sure how you assessed the “overinflated” AMD claims…
As per your 4K graphs, the 7900 GRE’s average frame rate is 56 fps and the 9070 XT’s is 74 fps.
Now, if the older card is held as the 100% yardstick, the math works as follows:

If 56 fps is 100% performance, what is 74 fps expressed as a percentage? More precisely:
56 fps ….. 100%
74 fps ….. x%
Therefore x = (74 × 100) / 56 = 132.143%

So it looks like AMD’s claim of being 35% faster was not all that far from the actual measured performance (+32.1%), right?

Now, if you go the other way around and hold 74 fps as the 100% yardstick, the result is:
x = (56 × 100) / 74 = 75.676%

The right assessment here is that the older card delivers only 75.676% of the new card’s performance, which makes the old card 24.32% slower than the new one.
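
To make the distinction concrete, here’s a minimal Python sketch of both calculations (the 56 fps and 74 fps inputs are the 4K averages quoted above):

```python
# Relative performance depends on which card you take as the baseline.
gre_fps = 56.0  # 7900 GRE average at 4K (from the review graphs)
xt_fps = 74.0   # 9070 XT average at 4K

# New card vs old card (old card = 100% baseline):
faster_pct = (xt_fps / gre_fps - 1) * 100
print(f"9070 XT is {faster_pct:.1f}% faster than the 7900 GRE")  # 32.1%

# Old card vs new card (new card = 100% baseline):
slower_pct = (1 - gre_fps / xt_fps) * 100
print(f"7900 GRE is {slower_pct:.1f}% slower than the 9070 XT")  # 24.3%
```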

I sure hope you understand the difference here. You may want to update that review section.

(Edit) I’m sure your concern that AMD’s claimed 35% did not materialize also subtracted some points from the final score… it is only fair to add them back.

(Edit 2) The 7900 XTX is actually 46.4% faster than the 7900 GRE… sorry, but that whole section needs to be updated to reflect the mathematical truth that AMD has largely hit its target.
 
No improvement in power efficiency over RDNA 3... performance per watt is poor compared to the RTX 40 and RTX 50 series... AMD fanboys mocked NVIDIA for Blackwell's power consumption. It turned out that NVIDIA GPUs (even the older RTX 40 series) are still the most power-efficient GPUs on the planet.

16GB of VRAM did not help the 9070 XT in Indiana Jones with full RT... AMD fanboys mocked the NVIDIA 5070 for having 12GB, which is not enough to max out Indiana Jones with full RT... but no AMD GPU can do it either, including the latest RDNA 4... LOL

What? The FPS/W efficiency of the 9070 is EXACTLY the same as NVIDIA's 40 and 50 series: 0.21. And the 9070 XT is only slightly less, at 0.18 FPS/W. That is almost double the efficiency of the RX 7000 series, which makes sense, since the 9000 series has 24% fewer cores, a 33% smaller die, 20% less cache, and 20% less VRAM than the 7000 series, as the review states at the top.
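
For anyone wanting to sanity-check those figures, FPS/W is just the average frame rate divided by board power. A minimal sketch with made-up round numbers (illustrative inputs, not measurements from the review):

```python
# Efficiency metric discussed above: average FPS divided by board power (W).
def fps_per_watt(avg_fps: float, board_power_w: float) -> float:
    return avg_fps / board_power_w

# Illustrative inputs only, not review data:
print(round(fps_per_watt(63, 300), 2))  # 0.21 -> a 63 fps average at 300 W
print(round(fps_per_watt(57, 317), 2))  # 0.18 -> a 57 fps average at ~317 W
```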

I just upgraded from the 7800 XT to the 9070 XT and I'm really excited. Will be doing my own benchmarks of the games I play that aren't really benchmarked much by the mainstream outlets: EA WRC Rally, Forza Motorsport, Resident Evil Village, Hell Let Loose, and GTAV Enhanced.
 
Looking for some input.

Current system: 5800X3D, 32GB RAM, and an RTX 4070 (12GB VRAM).

Fairly certain my CPU is a bit too weak for the XT, so I'm wondering how much of a performance uplift I'd see replacing the 4070 with the 9070. Right now they're priced at about 1,000 CN, and my current system is fine, running most games at a good FPS and graphical settings. Still, if I'd see a good improvement, I might pull the trigger.

No, the 5800X3D is still very solid. The uplift you'd see moving from the 4070 to the 9070 would be close to what these benchmarks show. Only certain games will be appreciably bottlenecked, assuming you don't game much over 100 FPS.
 