AMD Radeon RX 9070 Review: One Hit, One Miss?

I actually prefer this card because with it, I won't have to buy a new PSU. The performance difference is not huge. I would like it to be $50 cheaper, but it's not like MSRP means anything. I suspect the RX 9070 will drop in price eventually and the gap between it and the XT will be bigger.
 
Since I don't require ultra-top-level performance in the games I play, the lower cost wins out for me, though I agree the cost-to-performance ratio should be a bit better than it is. If I were buying RIGHT NOW, the 9070 would be my choice. Considering how often and how long I game, the 9070 would easily have paid off the $50 price difference within a month on my electric bill. Its efficiency really appeals to me.
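How quickly that efficiency pays off depends entirely on usage and electricity rates. A rough back-of-the-envelope sketch; every input value here is an assumed example, not a figure from the review or the comment:

```python
# Rough payback estimate for a lower-power GPU.
# All inputs are assumed example values, not figures from the review.
watt_savings = 100      # W less under gaming load (assumed)
hours_per_day = 4       # gaming hours per day (assumed)
price_per_kwh = 0.40    # electricity rate in $/kWh (assumed, high-cost region)
price_gap = 50          # $ difference between the two cards

kwh_saved_per_day = watt_savings / 1000 * hours_per_day
savings_per_day = kwh_saved_per_day * price_per_kwh      # $ saved per day
payback_days = price_gap / savings_per_day               # days to recoup the gap
print(f"Saves ${savings_per_day:.2f}/day; payback after about {payback_days:.0f} days")
```

Plugging in different hours and rates moves the result a lot, which is why two gamers can reach opposite conclusions from the same wattage gap.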
 
This time I kind of disagree with, and am disappointed by, this review; I agree more with GN's.

The 9070 is still a hit. The XT's power consumption is a deal breaker for some people, including me, while the non-XT sits around 250 W, similar to my 1080 Ti. That means less of a space heater and a cheaper electricity bill over the long run, especially since I keep my cards for 8 years or longer. The 9070 non-XT is the obvious choice for me.

While I agree that the price is the problem here, once demand settles down, AMD's prices usually drop faster than Nvidia's, so I'm looking forward to that moment to grab a 9070.
 
"However, no one outside of AMD seems to understand this approach, and even some within AMD appear equally confused." -> Maybe AMD has good yields for the chip and doesn't want to sell too many good chips as 9070s, so they price it accordingly.
I find the 9070 a better card than the 9070 XT because of its much better power efficiency. Limiting the 9070 XT to 950 mV would probably bring power consumption way down (while losing a bit of performance).
Regarding the recent reviews, I don't understand the double standard: bash Nvidia for overpromising, then a week later give AMD a pass for the same thing, less performance than promised and higher power consumption than advertised.
These are still good products, but the current market (wafers are allocated to high-margin products) does not allow for better pricing, and the old node does not allow for performance increases.
Better power efficiency is always good, I agree. But I think what you're missing is that the RX 9070 and 9070 XT are based on the same Navi 48 chip. So if the RX 9070 can draw less power, the XT variant should be able to as well. I suspect the RX 9070 appears more efficient mainly because of its significantly lower stock clock speed. Hence, if power draw is your concern, you can always undervolt and underclock the chip to achieve comparable results. But for USD 50 more, the XT version offers incrementally more performance when you need it, which gives it better value.
 
"Maybe AMD has good yields for the chip and does not want to sell too many good chips in the 9070, so they price it accordingly."
I agree. AMD may want low demand for the non-XT version if they are expecting high yields on the XT chips. They don't want a lot of complaints about low non-XT stock, or to have to disable cores on perfectly good XT chips to meet demand.
 
A rating of 70? You're being ridiculous. I purchased it at £539 including next-day delivery, which is a ridiculously good price for the performance of this card. I'll take 100 W less power usage over the 13 percent performance increase any day, regardless of the price difference. If the 13 percent actually mattered you could have an argument, but it simply doesn't, because everything plays splendidly on the non-XT anyway. It's always made into this "ohhh, but this card gives you an extra 20-30 fps, you must buy it", when I'm playing at 200 fps in most games with frame gen if I wish to use it.
 
Hello to the community.
I’ve been following you for a long time through YouTube and articles, but I’ve never posted before.
I was watching this review and I think there’s something wrong with it.
I can only see 9 game benchmarks, and you’re not mentioning any other games (maybe in a list?), yet you give an 18-game average? How is that possible? What are the data for the rest of the games?
Another thing I'd like to mention: isn't it misleading to include games like CS in the game average? With such a small sample of games, one extreme result can skew the numbers.
For example, we have the FPS of the 9 listed games at 1440p: (77, 55, 404, 93, 113, 108, 79, 127, 90).
The average comes out to around 127 FPS, but if you exclude the extreme CS result (404 FPS), then the average drops to 93 FPS.
Technically, this is probably a better representation of the card's real performance.
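The commenter's arithmetic can be reproduced directly. The FPS list is the one quoted above; the median is added here as one common outlier-resistant alternative, not something the review itself uses:

```python
from statistics import mean, median

# 1440p FPS results for the 9 listed games, as quoted in the comment above
fps = [77, 55, 404, 93, 113, 108, 79, 127, 90]

avg_all = mean(fps)                              # plain average, outlier included
avg_trimmed = mean(f for f in fps if f != 404)   # drop the extreme CS result
med = median(fps)                                # robust to a single outlier

print(f"mean: {avg_all:.0f}, without CS: {avg_trimmed:.0f}, median: {med:.0f}")
```

The plain mean lands around 127 fps while both the trimmed mean and the median sit near 93 fps, which supports the point that a single 404 fps result dominates a 9-game average.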
 