AMD Radeon RX 7900 XT Re-Review: What Should Have Been

doubt it.

Depending on the uplift, the 4090 Suprim Liquid, selling for $1,529 from Lenovo's website with a code last week, might be similar in performance per dollar to what you'll be paying in Q1 2025, with significant efficiency gains on top, naturally.
Sad times ahead. The 4090 might be a 4-year card for some. If anyone ever wondered whether AI is good for humanity, look no further than what PC gaming will look like in the near term (4-year outlook).
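The performance-per-dollar comparison above can be sketched in a few lines. This is a back-of-the-envelope illustration only: the $1,529 price comes from the post, but the frame rates and the hypothetical future card are made-up placeholders, not benchmark data.

```python
# Hypothetical performance-per-dollar comparison.
# Only the $1,529 Suprim Liquid price comes from the post above;
# all frame rates and the second card are placeholder assumptions.

def perf_per_dollar(avg_fps: float, price_usd: float) -> float:
    """Average frames per second bought per dollar spent."""
    return avg_fps / price_usd

cards = {
    "RTX 4090 Suprim Liquid": (140.0, 1529.0),   # assumed 4K avg fps, quoted price
    "Hypothetical Q1-2025 card": (160.0, 1999.0),  # placeholder values
}

for name, (fps, price) in cards.items():
    print(f"{name}: {perf_per_dollar(fps, price):.4f} fps/$")
```

Whether the future card "wins" on this metric depends entirely on the numbers you plug in, which is the poster's point about waiting versus buying now.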
 
doubt it.

Yeah? What's the difference between the RTX 4090 and RTX 4080? The RTX 4090 has more RT units and is a bigger chip.

AMD could make a big GPU containing mostly RT-capable hardware. Not a problem at all. Then AMD would have the fastest RT card available. No problem at all. Another question is who would want to buy a GPU that sucks everywhere outside RT.

As for my claim: adding more RT compute power is trivial, so AMD would have absolutely no problem doing it. Whether it would make any sense is another matter, and I didn't say anything about that.
 
AMD could make a big GPU containing mostly RT-capable hardware. Not a problem at all. Then AMD would have the fastest RT card available. No problem at all.
RT is still compute limited. The 4090 wins by that much because dual-issue FP32 doesn't work on RDNA 3, not because Nvidia's cards have 3x the RT cores.
 
RT is still compute limited. The 4090 wins by that much because dual-issue FP32 doesn't work on RDNA 3, not because Nvidia's cards have 3x the RT cores.
Of course RT is compute limited. You make it sound like Nvidia did something magical with RT performance, which they did not. Make a bigger chip, fill it mostly with RT-specialized hardware, tilt the architecture further toward RT: with any of those, the RTX 4090 is easily beaten in RT. The fact that there are no "RT-only" GPUs on the consumer market tells you the time isn't yet right for RT. That is why non-RT performance still matters more; therefore AMD doesn't care as much about RT performance, and putting too much effort into RT doesn't make sense.

This is pretty much the same situation as there being no high-end GPU in RDNA 1, not even one that beat Nvidia's best GPU at the time. It's not that AMD couldn't make one; AMD decided it wasn't worth it.
 
If these were my only two options, I would still upgrade to the 4060 instead of the 7600. The simple fact is that at the same VRAM buffer, the Nvidia card is still worth the 10% more because of DLSS and better RT performance. In this comparison, RT was actually an option in a lot of those games at 1080p. Also, I know AMD has FSR 2.0, but DLSS is still slightly better, and since it's exclusive to Nvidia cards, it's better to go with Nvidia knowing that if FSR is not included but DLSS is, you still have an option there. AMD has the advantage against the 4080 with the 7900 XTX due to price and VRAM, and the 7900 XT also has an advantage over the 4070 Ti given 20 GB vs 12 GB of VRAM, but the tables do turn at the 4060/7600 level.
 
With any of those, the RTX 4090 is easily beaten in RT. The fact that there are no "RT-only" GPUs on the consumer market tells you the time isn't yet right for RT.
?????????
What are you talking about? Nothing beats the 4090 in RT, and a card that could only run path tracing would play zero games, except for Q2RTX.

therefore AMD does not care about RT performance so much and so putting too much effort on RT does not make sense.
but they still make you pay for it... you think AMD's RT hardware comes free because it's bad, lol?
Also, the 7900 XTX loses to the 4090 in rasterization while drawing more power, so your point is very much invalid.

This is pretty much the same situation as there being no high-end GPU in RDNA 1, not even one that beat Nvidia's best GPU at the time. It's not that AMD couldn't make one; AMD decided it wasn't worth it.

They couldn't, or else they just would have. Full Navi 10 on 7 nm couldn't beat the heavily cut TU104 in the 2070 Super; that was the problem. So they stood no chance against TU102 in the 2080 Ti, and they decided to leave the 2080/2080 Ti to sell with no competition.

The truth is, AMD absolutely ruined ATI's legacy: 50% market share when the deal happened, now 8%, soon behind Intel.
 
What are you guys paying in euros/dollars for a custom 7900 XT these days, anyway?
€848 for the Pulse here in Central Europe. About the same for the 4070 Ti. I don't even want to say how much the 4080 is, because it's just plain stupid.
 
A long time ago (1998) I set my personal limit for any PC part at $300. Any part above that limit is not for me. I would rather buy used than raise my limit.

"It's not the one who asks who is stupid; it's the one who pays."
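A fixed dollar limit set in 1998 erodes with inflation, which is part of why it feels harder to stay under it today. A quick sketch of the compounding (the 2.5%/yr rate is an assumed round number, not official CPI data):

```python
# Carry a fixed 1998 price limit forward under constant inflation.
# The 2.5%/yr rate is an assumed placeholder, NOT official CPI data.

def inflate(amount: float, annual_rate: float, years: int) -> float:
    """Compound a dollar amount forward by a constant annual inflation rate."""
    return amount * (1 + annual_rate) ** years

# $300 set in 1998, carried forward 25 years at an assumed 2.5%/yr:
limit_today = inflate(300.0, 0.025, 25)
print(f"${limit_today:.2f}")  # roughly $556
```

In other words, holding the nominal $300 line for 25 years is an effective budget cut of roughly 45% in purchasing power under that assumed rate.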
 
I think AMD graphics cards, particularly in the high-end segment, need to be priced much lower than Nvidia's because of the lack of features; once AMD implements these things, they can legitimately price closer to Nvidia.

Another thing: you'd expect more games to run better on AMD GPUs, because developers must optimise for consoles based on RDNA 2.

AMD used to be the well-priced underdog, but recently they seem to charge a premium. Mind you, there were some expensive CPUs, like the top-end model launched when the Athlon 64 first debuted.
It's coming: ROCm support for GPU compute is being adopted by more and more video editing and rendering libraries. DaVinci Resolve and Adobe Premiere work great on AMD. The performance of ROCm is not quite as good as CUDA's, but it's way better than at any time in the past 10 years. They need to improve upscaling and H.264 encoding (still; again), and then they'll be darned close to Nvidia, but Nvidia fans will never admit that, because they wasted all that extra money on their underperforming, poor-rasterization cards.

Also, in regards to the 7900 XT: it's a whole new generation of video card with a different architecture. If you want to compare, the 7900 XT is most comparable to Intel's A750/A770 series of cards, because those also have newly written drivers being developed for the first time. Nvidia hasn't really updated their architecture in a decade. AMD has some new patents for AI-based ray-tracing tree pruning that will likely disrupt Nvidia with its 8000-series cards. When the tables are turned, we'll see how much Nvidia owners truly think ray tracing matters!
 