AMD Radeon RX 9070 XT Review: Have They Finally Done It?

Hang on, you gave this card an 85 when it, like the 5070 (which got 60), also had inflated performance claims? It’s also got worse efficiency than the 5070 Ti and worse RT than a 5070. Like the 5070, it achieved only a 20% perf uplift at its price point. What gives?
 
I get the value perspective here. It's essentially a 7900 XT with less RAM and 50% better RT: a good value, I guess, especially if you can actually find it at MSRP. But for those hoping AMD was finally going to release the GPU we've all been waiting for, perhaps not.

Maybe GPUs really are maxing out and we can't expect major upgrades going forward. At least the next generation will be on a new node, and you'll likely see some uplifts there... for a price.
 
No improvement in power efficiency over RDNA3... performance per watt sucks compared to the RTX 40 and RTX 50 series. AMD fanboys mocked Nvidia for Blackwell's power consumption, but it turns out Nvidia GPUs (even the older RTX 40 series) are still the most power-efficient GPUs on the planet.

16GB of VRAM did not help the 9070 XT in Indiana Jones with full RT... AMD fanboys mocked the Nvidia 5070 for having 12GB, which is not enough to max out Indiana Jones with full RT, but no AMD GPU can do it either, including the latest RDNA4... LOL

 
The joke is on you, because FPS numbers don't show missing textures and massive latency spikes, not to mention the ROP scam.

Enjoy worshiping at the altar of the missing ROP and the scam that is the 5xxx series.
 
I'm impressed that it comes within striking distance of, or even matches, the 4080 in raster performance while the (supposed) MSRP is half of what the 4080's was. It'll be a great upgrade option for those who have skipped a few generations. We'll have to see how things actually shake out in regard to price, but if the supply is there at around MSRP, I'll congratulate AMD on a job well done.
 
They traded raw raster for RT performance: slower than the 7900 XTX in the former, faster in the latter.

I wanted it to work, but I don't feel the performance is quite enough to be compelling at $599. The RT performance still collapses in important games, and I wonder how much of that is a driver or game-level issue rather than a raw hardware limitation.

In Cyberpunk with RT it's right up there with the 4070 Ti and 5070. In Indiana Jones or Black Myth it's a fail. People are still going to go for the 5070 because it has more consistent performance, even though it's an underwhelming product with too little memory.

Perhaps the 340W version of this card is the one to consider. It'll be a bit power-thirsty but might deliver that little extra performance I had hoped for.
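
Back-of-the-envelope, the perf-per-watt tradeoff of a higher-TBP model looks something like the sketch below (the FPS figures, the +5% gain, and the 304W reference TBP are placeholder assumptions on my part, not review data):

```python
# Rough perf-per-watt check for a hypothetical higher-power 340W variant.
# All figures are placeholder assumptions, not measured review data.

def perf_per_watt(avg_fps: float, board_power_w: float) -> float:
    """Average FPS delivered per watt of board power."""
    return avg_fps / board_power_w

stock = perf_per_watt(avg_fps=100.0, board_power_w=304.0)  # assumed reference config
oc = perf_per_watt(avg_fps=105.0, board_power_w=340.0)     # assumed 340W model, +5% perf

print(f"stock: {stock:.3f} FPS/W, 340W model: {oc:.3f} FPS/W")
print(f"efficiency change: {(oc / stock - 1) * 100:+.1f}%")
```

With those made-up numbers, roughly +12% power for +5% performance works out to about a 6% efficiency loss, which is the "bit power-thirsty" tradeoff in question.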
 
Good card, but it all depends on availability and price. If I could get a nice model below $700 it would be a win; if not, I will just wait. Still the best option for Linux systems.
Still, the RT improvements they made are very good. Let's see the upscaling and so on, but this actually gives me hope for AMD making its GPUs as competitive as its CPUs; right now they seem to be at the Ryzen 2000 stage, so two more generations are still required.
 
Availability at or close to the $600 MSRP is the key to AMD's success.

The performance is decent, and the step forward over the previous AMD generation, especially in the areas of former weakness, is a better one than Nvidia's generational quasi-stagnation. It would be a pretty good, almost automatic buy for someone who, like me, was previously carefully weighing a 7900 XT against a 4070 Ti Super.

For. The. Right. Price.

I'm also one of the undoubtedly many who are playing catch-up with a pretty sizeable library of older games that run perfectly well on older configurations, sometimes on GPUs several generations old. If AMD catches a bout of Nvidia-itis, I will be perfectly content to run some of my library on my media PC (GTX 1080), the more demanding titles on my main PC (RX 6800), and skip this generation (yet again) altogether.
 
They traded raw raster for RT performance: slower than the 7900 XTX in the former, faster in the latter.

I wanted it to work, but I don't feel the performance is quite enough to be compelling at $599. The RT performance still collapses in important games, and I wonder how much of that is a driver or game-level issue rather than a raw hardware limitation.

In Cyberpunk with RT it's right up there with the 4070 Ti and 5070. In Indiana Jones or Black Myth it's a fail. People are still going to go for the 5070 because it has more consistent performance, even though it's an underwhelming product with too little memory.

Perhaps the 340W version of this card is the one to consider. It'll be a bit power-thirsty but might deliver that little extra performance I had hoped for.

There is literally nothing better in the $600-800 range, unless you can actually find a 5070 Ti for $750. 16GB is fine for a $600 card. (It is NOT acceptable for the $1200+ 5080.)

Yes, it's a bit sad to see AMD missing from the high end of the charts this gen, but we knew this was coming. It's correctly marketed as a 70-series card: better than their (true) 80-series card (the 7900 XT) from last gen, and the obvious choice over the dumb 5070. They've found a niche NV is unable to fill: a $600 midrange card with 16GB, solid raster performance, and finally usable ray tracing.

Drivers might fix Indy and Wukong. Fingers crossed for an actually decent launch tomorrow.
 
The RT has improved, but not by enough. A 4070 beats this thing with RT on in some cases, which means Radeon will keep its rep for being **** at RT. If they wanted that to change, they would need to take the RT crown from Nvidia.

Personally, I'm not that excited by this product. Maybe if we get the cards at MSRP it might be good, but even then I would rather have a 5070, purely because from past experience I always have a better time with GeForce than I do with Radeon. I'm also not sure why you would bother with these cards: they are not quite good enough for a flawless 4K experience, and if you are at 1440p you don't need to spend this much.

Maybe RDNA4 will be better at the budget end of the market.
 

The only card good enough for flawless 4K is the 5090, and it's $4000. I'm loving the 7900 XT (at 4K with FG) that I picked up for $650, and the 9070 XT is better and possibly cheaper.
 
Yeah, I wonder what these cards are really for: they are kinda overkill at 1440p, but you have to turn settings down at 4K. That's the conclusion I came to. I want a 5090, I'm just not prepared to pay for it, and the cards below it are not worth it for my needs.

So I'll stay on my 3070 Ti, which still offers an almost flawless 2K experience.
 
"we agreed that AMD needed to offer at least 20% better cost-per-frame value for a Radeon GPU to earn our recommendation"

And why is that?
For enthusiasts: Better RT, better upscaling, better AI support, better power efficiency, etc.

For the masses: better consumer mindshare from Nvidia's long-dominant market position (90% of the GPU market).

AMD has been 20%-ish cheaper than Nvidia on several cards over the last few generations, and it has gone from selling cards to 17% of consumers down to 10%. Don't blame HUB for recognizing this historical fact; blame the 90% of people who don't think even 20% is enough of a discount to get the not-as-good card. At this point, AMD needs to sell cards cheap so that it can convert enough Nvidia customers to be a real competitive threat, or it will sink into irrelevancy regardless of how "almost as good" it is. When AMD is back at 30% market share it will be very different, but they aren't even close to that now.
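
To make that 20% cost-per-frame rule concrete, here's a minimal sketch of the math (the prices and FPS figures are made-up illustrations I picked, not the review's measurements):

```python
# Cost-per-frame: the review's rule of thumb is that a Radeon card should be
# at least 20% cheaper per frame than the Nvidia alternative to earn a
# recommendation. All prices and FPS below are illustrative assumptions.

def cost_per_frame(price_usd: float, avg_fps: float) -> float:
    """Dollars paid per frame of average performance."""
    return price_usd / avg_fps

radeon = cost_per_frame(price_usd=600.0, avg_fps=100.0)   # hypothetical 9070 XT
geforce = cost_per_frame(price_usd=750.0, avg_fps=105.0)  # hypothetical 5070 Ti

advantage = 1 - radeon / geforce  # fraction cheaper per frame
print(f"Radeon: ${radeon:.2f}/frame, GeForce: ${geforce:.2f}/frame")
print(f"Radeon is {advantage:.0%} cheaper per frame; "
      f"{'meets' if advantage >= 0.20 else 'misses'} the 20% bar")
```

With those made-up numbers the Radeon comes out 16% cheaper per frame, so a card can undercut on price and still miss the value bar, which is exactly the dynamic being argued about here.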
 
"we agreed that AMD needed to offer at least 20% better cost-per-frame value for a Radeon GPU to earn our recommendation"

And why is that?

If I recall correctly, one of the YouTube tech channels was discussing pricing before the official announcement. Essentially, AMD has been consistently losing market share, and for various reasons the majority of consumers simply will not choose AMD over Nvidia unless there is a large discount. At a 20% discount they still struggle to sell against Nvidia; unfortunately, something like 30% seems to be the sweet spot for reaching a broader audience.
 
Come on! Not a word about the Indiana Jones result being completely bizarre??

Yesterday the wailing was that 12GB cards fail horribly in this test, and today we have a 16GB card that collapses completely! We can assume the game is horribly bugged (remember HUB's obsessive wailing about Hogwarts Legacy and 12GB cards, which magically went away with patches?).

Once the devs get off their asses and patch it, it'll probably be like HW, where even a 3070 can play it decently now...
 
The actual RT improvements look pretty good. Shame AMD has decided not to release high-end cards this generation.
"we agreed that AMD needed to offer at least 20% better cost-per-frame value for a Radeon GPU to earn our recommendation"

And why is that?
Because you are trading significantly worse RT, a worse feature set (FSR vs. DLSS), and worse efficiency for that lower price.

When Nvidia's efficiency tanked (Ampere vs. the RX 6000 series), AMD got significant praise. Pretty much everyone liked the RX 6000s.
How? TS gave the 5070 a pretty poor score, whereas the 9070 XT got a pretty nice score.
"shill" gets thrown around like "fascist" in a reddit thread. Techspot's been accused of shilling for basically everyone, that's how you know their numbers are pretty accurate, because everyone who meatshields for their favorite corpo cant stand them.
 
The best value you can get in 2025... and honestly, unless you are buying a 5090, the rest of the stack at the high end is just a bunch of 4080s.
 