AMD launches the $549 Radeon 9070 and $599 9070 XT: Another fail or great success?

The thing that really gives away the massive blind loyalty to Nvidia is this concept:

"AMD screwed up big again. Don't they realize unless they match the 5090 raster with a $400 price tag, the 9070XT is DOA"

So, if it lands anywhere near 5070 Ti raster, with decent RT performance, and realistically sells for $300-$400 less, it should be no contest to sell everything they can produce. But then again, PC buyers are hardly rational consumers.
 
Compared to their CPU line, the GPU division is lagging far behind these days. No substantial strides seen for the past few generations.

Either that or Nvidia took bigger leaps ahead, and AMD couldn't reach that far immediately.

The only thing worth considering is the comparatively lower pricing.
 
I'll be interested to see how far the pricing moves between AIB card models at release, especially as AMD isn't producing any reference cards themselves.

For example, I'm curious to see which model between the Sapphire RX 9070 XT Nitro+ and the Sapphire RX 9070 XT Pulse will attract the $599 price tag.
 
I'll be interested to see how far the pricing moves between AIB card models at release, especially as AMD isn't producing any reference cards themselves.

For example, I'm curious to see which model between the Sapphire RX 9070 XT Nitro+ and the Sapphire RX 9070 XT Pulse will attract the $599 price tag.
I think the market rate for the 9070 XT will be $650-700 for OC models from board partners, with the 9070 staying close to MSRP. What I think will end up happening is that the 9070 will be popular with overclockers, as it's already clocked about 500 MHz below the XT. We're going to see lots of XT OC models for $650-700, but you'll be able to pick up a 9070 for $550 reliably and overclock it yourself. There are rumors of this chip being highly overclockable, so the OC models might actually offer a 10% performance increase for a 10% price increase, unlike what we're seeing from Nvidia right now, where you're getting a 2-5% performance increase for a 50% price increase.
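A quick back-of-the-envelope sketch of that trade-off (the prices and uplift percentages below are the hypothetical figures from the post above, not measured data):

```python
# Price/performance sketch using the hypothetical figures from the post above.
# "perf" is normalized relative performance; none of these are measured numbers.

def perf_per_dollar(perf, price):
    """Relative performance per dollar spent."""
    return perf / price

base_9070xt = perf_per_dollar(1.00, 600)   # ~$599 MSRP, baseline performance
fair_oc     = perf_per_dollar(1.10, 660)   # +10% performance for +10% price
steep_oc    = perf_per_dollar(1.05, 900)   # +5% performance for +50% price

print(f"baseline : {base_9070xt:.5f} perf/$")
print(f"fair OC  : {fair_oc:.5f} perf/$")
print(f"steep OC : {steep_oc:.5f} perf/$")
```

By this arithmetic, a proportional price/performance bump leaves value unchanged, while a small uplift at a large premium destroys it.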
 
Why are people so sold on DLSS? It's a feature like any other feature graphics vendors have tried before. DLSS had so many artifacts, with each version trying to polish the image... But remember, it's still a feature, not a must!

Also, with ray tracing, the most impressive implementation I've seen was on PlayStation...

Each RT implementation tried to add something, but each time it seemed small and insignificant. Yet people say it's the holy grail...

The cost of these features is that whoever implemented them first somehow made them slower for the competition... like in the old days... features that were small but costly in terms of GPU power. I understand there needs to be a reason to upgrade... but still!

So people bought the first GeForce RTX cards with the first version of DLSS and early RT that didn't have a wow factor anyway, but did carry a performance penalty...
It's happening again.

I think people just don't know what they want. They let Nvidia choose for them.
We degrade image quality and make it seem like a pro feature. :facepalm:

We add RT reflections with a 40% performance penalty, but we accept it because of DLSS.

We already had screen-space reflections that were fine, but we feel the itch for those better reflections... because it's the new cool!
 
I fully expect AMD to tune the RX 9070's price down to $499 very soon to make the lineup more logical.

As for the RX 9070 XT - they've basically been given a golden opportunity to claw back market share. If it does indeed perform like a 5070 Ti at, well, $300 less AND it's actually available for purchase, they will gain some traction in the market.

I'm inclined to believe that working within only a single segment will greatly improve driver support and reduce issues. If you think about it, it's almost console-level optimization when you only have two cards in play for the entire generation.
Uhhm, no on this being the only two cards, and no on console-like. On consoles the optimizations come from the game developers, not AMD.
They've already announced "multiple RX 9060 products," which likely means an XT and a non-XT, maybe an XTX.
 
Uhhm, no on this being the only two cards, and no on console-like. On consoles the optimizations come from the game developers, not AMD.
They've already announced "multiple RX 9060 products," which likely means an XT and a non-XT, maybe an XTX.
The developers work extremely closely with the manufacturers. I happen to have a family member who worked on a rather large AAA game, and they had two Nvidia employees working with them on site for the last six months of development. So if you see a GPU logo at the beginning of a game, that is more than just pure fluff. As for additional models coming out later - sure, but they're not even announced yet. This is probably the longest period in AMD's history spent optimizing just two cards that aren't meant to compete at the upper tier, but simply to offer the best bang for the buck in the midrange market.
I am quietly optimistic, for the first time in six years, about what AMD has managed to get out of their GPU lineup.
We desperately need more competition in the GPU segment.
 
Looks like Dr. Lisa Su has another architectural win on her hands!

The RDNA4 uArch was engineered and built specifically for gaming, and is much more powerful than Blackwell (which leans enterprise). So there is only so much Nvidia is going to be able to do, because RDNA4 is playing with Jensen's head.

Expect another discount from NV as RDNA4 AIB cards roll out and punish Blackwell at 300+ watts.
 
Considering the rumored performance, the 9070 series at MSRP seems very tempting.
Just waiting for the benchmark results for the various 9070 series models,
and anxiously hoping that the prices set by resellers where I live don't follow suit like the 5000 series.
 
Looks like I get to wait for the 9070 pricing to drop. I want my upgrade from my 3080 to have a drop in power usage, and the XT uses the same power, but the non-XT isn't worth it at $550.
 
Radeon's two biggest problems are consistency and uncertainty.
They got the price down for the most part, but what other reason is there to buy it over the other guys? What if they make a monster (for AMD) GPU next year, but then target mainstream again the next? Will they change the naming again for no reason? Will they have more marketing blunders? I'm always guessing.
 
Generally, I agree with TechSpot: if the 5070 Ti remains at $900, that's just too much more than this card for buyers to bother with it. But at $750, it's not much more for a much better product.

What's really impressive is how Nvidia has effectively bumped its cards up a performance tier simply by using features like AI upscaling and real-time lighting. Those things are certainly not easy or cheap to deliver, but they are paying Nvidia back big time by allowing their cards to compete a tier above their raw hardware.

Also, I don't really understand the logic that AMD should sell cheap now in order to increase their market share and win customers in the future. The vast majority of PC gamers are not brand loyal. If AMD goes cheap now and sells a consumer a GPU, it's not a given that the same consumer will go AMD the next time around. What AMD needs to win future customers is a better product. They don't need a cheaper product - consumers have plenty of money - they need something consumers actually want, like ray tracing and DLSS. They need to make sure they get ahead of Nvidia on the next major 3D breakthrough, or develop and innovate something the competition can't deliver. It's not like they don't have the money.
 
The developers work extremely closely with the manufacturers. I happen to have a family member who worked on a rather large AAA game, and they had two Nvidia employees working with them on site for the last six months of development.
They used to do this even more intensively back in the day with Nvidia GameWorks, with whole teams helping the big studios implement it. Nowadays they'll settle for a small performance uplift over the competition, early testing to ensure a smooth release, and helping out with support for DLSS and ray tracing.

Still, comparing a graphics card release whose narrow scope allows for optimization to console development is comparing apples to onions, imo.
Microsoft/Sony work with AMD and probably have contracts that let game developers get in contact with AMD's engineers in some form through them, but AMD isn't involved in any big way with the studios themselves. They're just there for when the odd bug or behavior pops up.

As for additional models coming out later - sure, but they're not even announced yet.
The company confirmed multiple "RX 9060 products" will arrive in the second quarter of this year, which could be anytime between April and June.
They're coming pretty soon.

Afaik they only announced that they don't want to compete in the high end, so an RX 9050 might come at some point as well, and perhaps some OEM / China-only models.
 
This seems like a shill article for Nvidia. For two generations Nvidia has barely achieved a 10% improvement while the hype and lying from that company has exploded. The 9070 XT will outperform the 4080 Super in raster and will be closer to the 4070 Ti Super than the 4070 Super in ray tracing. AMD cut the price for this class of card by 20% to only $600, stuffed the channel with actual cards at actual MSRPs, quadrupled AI TOPS, introduced FP8 and FSR4 (just as much a prototype as DLSS4), and improved performance by 22%. The only FAIL I can see is this article, not AMD.
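Taking the post's own figures at face value (the 22% uplift and 20% price cut are the poster's claims, not verified benchmarks), the generational perf-per-dollar change compounds like this:

```python
# Value change when performance rises and price falls at the same time.
# The 22% / 20% figures are the post's claims, not verified benchmarks.
perf_gain = 1.22   # +22% performance vs. the previous generation
price_cut = 0.80   # price reduced by 20%

value_gain = perf_gain / price_cut   # relative performance per dollar
print(f"perf/$ improvement: {(value_gain - 1) * 100:.1f}%")   # ~52.5%
```

A modest uplift combined with a price cut multiplies into a much larger value jump than either number suggests on its own.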
 
Compared to their CPU line, the GPU division is lagging far behind these days. No substantial strides seen for the past few generations.
Let me explain what I know. The 6000 series was an outstanding generation and the 6950 XT blew away the 3090 Ti; however, weakness in ray tracing AND the lack of H.264 B-frame support in the encoder led most influencers to adopt Nvidia for streaming (fixed in the 7000 series).

Then they really botched RDNA3 / the 7000 series. The goal was 3 GHz clocks and they only hit 2.5 GHz. The chip drew too much power and/or manufacturing yields were too variable. Meanwhile they were STILL on 5 nm, so they got criticized for power inefficiency, but that was the 5 nm node (vs. Nvidia on 4 nm). And in the 7000 series they stretched to innovate with chiplets, but the drivers were slow to accommodate the new architecture and 1080p suffered due to interconnect latency, so the 7900 cards gave them a black eye. The 7800 XT was the bright spot.

RDNA4 is about getting RDNA3 to reach its potential. They've reached Nvidia parity in almost everything - power, performance, plus 90-95% in ray tracing, 105% in raster, and FSR matching DLSS. I have to laugh out loud at NV*****s who pay $200 extra for their fake frames and fake bragging rights - a fool and their money are easily parted. Jensen says... The only games that ray trace heavily are the ones DEEPLY SPONSORED BY NVIDIA (Cyberpunk 2077). The only place they don't match Nvidia is greed.

Nvidia sends onsite engineers to get games locked in to their GPUs. There is nothing altruistic about it. Imagine if we asked Russia to help us shut down the war in Ukraine - that would be treasonous. Same thing.

AMD is back and imho their VLSI designs are within 6 months of NVidia if not ahead. Meanwhile Intel is provably 4 YEARS behind both NVidia and AMD in GPU design!!
 
Hopefully this means I'll be able to get an NVIDIA RTX card for a better price. Competition is always good when it drives the cost of the superior product down.
 
Generally, I agree with TechSpot: if the 5070 Ti remains at $900, that's just too much more than this card for buyers to bother with it. But at $750, it's not much more for a much better product.

What's really impressive is how Nvidia has effectively bumped its cards up a performance tier simply by using features like AI upscaling and real-time lighting. Those things are certainly not easy or cheap to deliver, but they are paying Nvidia back big time by allowing their cards to compete a tier above their raw hardware.

Also, I don't really understand the logic that AMD should sell cheap now in order to increase their market share and win customers in the future. The vast majority of PC gamers are not brand loyal. If AMD goes cheap now and sells a consumer a GPU, it's not a given that the same consumer will go AMD the next time around. What AMD needs to win future customers is a better product. They don't need a cheaper product - consumers have plenty of money - they need something consumers actually want, like ray tracing and DLSS. They need to make sure they get ahead of Nvidia on the next major 3D breakthrough, or develop and innovate something the competition can't deliver. It's not like they don't have the money.

All Nvidia has done is increase the power usage of Blackwell and fix the cable issues of Ada. Jensen is trying to sell gamers on software features (i.e., gimmicks) that any GPU has, while trying to play down Blackwell's actual lack of raw performance.

Coincidentally, AMD is not selling a cheap GPU; they are selling a smaller, more powerful, and inexpensive die (Navi 48), and leveraging that economy of scale over NV's larger Blackwell dies to sell a less expensive product.
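A rough sketch of that economy-of-scale point, using the standard dies-per-wafer approximation; the die areas here are placeholder values for illustration, not confirmed measurements for Navi 48 or Blackwell:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Standard dies-per-wafer approximation (ignores defects and yield)."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r * r / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Placeholder die areas in mm^2 -- illustrative only, not confirmed specs.
small_die = dies_per_wafer(350)   # smaller die: more candidates per wafer
large_die = dies_per_wafer(600)   # larger die: fewer candidates per wafer

print(small_die, large_die)
```

Fewer candidate dies per wafer means each large die carries a bigger share of the wafer cost, before even accounting for yield, which also falls as die area grows.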

rdna4>blackwell

If you are talking about "cheap" GPUs, please see:
- Nvidia's GeForce RTX 5090 GPUs are now said to be experiencing another fiasco, as the onboard "Blackwell" GB202 turns out to be defective in many units out there.
- Nvidia's GeForce RTX 5080 has now been "officially" confirmed to have the "missing ROP" problem, which means that every RTX 50 series model has been affected.
- Nvidia is reportedly working on fixing "black screen" issues with GeForce RTX 50 series GPUs, as the firm prepares a new driver update that will fix the problem for some users.

edit: CPU-Z Validator will now warn about GeForce RTX 50 cards with missing ROPs.
 