AMD confirms RDNA 4 to launch in early 2025 with improved ray tracing: the "strongest PC portfolio we've had"

midian182

In brief: AMD's confirmation that it won't prioritize competing with Nvidia's top gaming GPUs with the RDNA 4 graphics cards has left some unenthusiastic about the next-gen GPUs. However, according to CEO Lisa Su, they will be part of what she says is the strongest PC portfolio in the company's history. Su also confirmed that RDNA 4 will launch early next year with improved ray tracing performance.

There have been plenty of rumors claiming that AMD will join Nvidia in revealing and launching its next-gen graphics cards at CES 2025. While Su never named the event in her prepared remarks to analysts on AMD's third-quarter 2024 conference call, she did say the company is "on track to launch the first RDNA 4 GPUs in early 2025." This marks the first time Team Red has given an official launch timeline.

In September, Jack Huynh, AMD's senior vice president and general manager of the Computing and Graphics Business Group, was asked if AMD was committed to competing with Nvidia in the top-end gaming card market. He said that the priority was to focus on creating the best midrange products, which make up the majority of the gaming card market.

Also read: AMD's Navi 44 "RDNA 4" GPU package size could be much smaller than Navi 33

When asked directly if AMD would release cards in the RX 8000 series that are part of the enthusiast market, Huynh said, "One day, we may. But my priority right now is to build scale for AMD."

Despite the apparent lack of competition for Nvidia's RTX 5080 and RTX 5090 cards, Su has confidence in AMD's upcoming PC products. "I think the main point is, I mean, this is the strongest PC portfolio we've had sort of in our history, I think, across desktop and notebook."

Jean Hu, AMD's Chief Financial Officer, revealed that AMD's data center segment delivered record quarterly revenue of $3.5 billion, up 122% year-over-year. It accounted for just over half of the company's total revenue in the third quarter, while gaming revenue was down to $462 million, a 69% decline compared to the previous year. The fall was mainly due to console makers, which use AMD's GPUs in their machines, reducing their inventory.

There was also a revenue decline from AMD card makers and vendors as they wait for the RDNA 4 cards to arrive. Su said that in addition to a strong increase in gaming performance, RDNA 4 delivers significantly higher ray tracing performance and adds new AI capabilities. That contradicts reports from March claiming RDNA 4 graphics would receive only a minor bump in ray tracing abilities.

With AMD pretty much confirming its presence at the event, it looks as if we're going to have one of the most exciting CES shows in years. Nvidia has confirmed that Jensen Huang will deliver the CES opening keynote in January, marking the first time that the CEO has appeared at the event since 2019. It strongly suggests that rumors of an RTX 5000 reveal at the beginning of January are also true.

""strongest PC portfolio we've had""

Really? No high end or halo options, only mid range and below, is the "strongest"? IDK about that chief. Also, calling BS on the raytacing. Guys, you should know by now to not believe marketing hype. rDNA3 was a "revolutionary' step up in RT that hardly did anything.

No, I'm not just salty that my next GPU will have to be nvidia asince AMD has decided to leave me high and dry. Again.
 
""strongest PC portfolio we've had""

Really? No high end or halo options, only mid range and below, is the "strongest"? IDK about that chief. Also, calling BS on the raytacing. Guys, you should know by now to not believe marketing hype. rDNA3 was a "revolutionary' step up in RT that hardly did anything.

No, I'm not just salty that my next GPU will have to be nvidia asince AMD has decided to leave me high and dry. Again.

Tons of games use software RT today. AMD has to improve RT performance or they will be competing in the low end only a few years from now.

https://www.techpowerup.com/review/silent-hill-2-fps-performance-benchmark/5.html

An example of a game that uses software RT even if the hardware RT option is turned off.
AMD is hammered here due to weak RT performance.

Avatar also had forced RT elements - https://www.techpowerup.com/review/avatar-fop-performance-benchmark/5.html

This game was even sponsored by AMD.

So, RT performance matters even if you don't want to enable it in all games. More and more engines use forced RT elements.

If AMD did nothing and left their RT performance as it is, they would be out of the GPU business in 5-10 years.

Game developers can't wait to stop faking lighting in games. Ray-traced lighting and reflections will eventually do it all automatically. Game devs spend a lot of time baking lighting into games, and most of the time it doesn't react dynamically to the environment, which sucks in games with destruction.

So yeah, the future is RT. AMD can't ignore it.
 
Tons of games use software RT today. AMD has to improve RT performance or they will be competing in the low end only a few years from now.

https://www.techpowerup.com/review/silent-hill-2-fps-performance-benchmark/5.html

An example of a game that uses software RT even if the hardware RT option is off.

AMD is hammered here due to weak RT performance.

Avatar also had forced RT elements - https://www.techpowerup.com/review/avatar-fop-performance-benchmark/5.html

This game was even sponsored by AMD.

So, RT performance matters even if you don't want to enable it in all games. More and more engines use forced RT elements.

If AMD did nothing and left their RT performance as it is, they would be out of the GPU business in 5-10 years.
You should slow down and read my comment again. I was referring to RDNA 3's RT improvements, which were worthless, not RT as a tech. AMD's claims of RDNA 4 having "huge gains" in RT are the same thing they said about RDNA 3, and that went nowhere.

Womp Womp.
 
""strongest PC portfolio we've had""

Really? No high end or halo options, only mid range and below, is the "strongest"? IDK about that chief. Also, calling BS on the raytacing. Guys, you should know by now to not believe marketing hype. rDNA3 was a "revolutionary' step up in RT that hardly did anything.

You're misquoting her to make a point. She said 'strongest PC portfolio' - that means across all ranges of hardware: laptops, PCs, etc. (servers/DC if you want to include that as well).

Okay, graphics cards are the only negative, but that segment will need building up again after years of underinvestment. Looking at the whole market, though, she is correct/entitled to make that statement...


 
You're misquoting her to make a point. She said 'strongest PC portfolio' - that means across all ranges of hardware: laptops, PCs, etc. (servers/DC if you want to include that as well).

Okay, graphics cards are the only negative, but that segment will need building up again after years of underinvestment. Looking at the whole market, though, she is correct/entitled to make that statement...
I didn't "misquote" her. With RDNA 2, AMD had a stronger portfolio: they had everything you just said PLUS an entire range of GPUs. Also, Intel was floundering, shoveling amps into parts trying to compete.

Today they have Arrow Lake, which is more impressive in TDP-limited scenarios than it is at full throttle. The server tech built on TSMC 3nm is gonna be a bear for AMD to contend with, and mobile is gonna be an interesting fight with Lunar Lake's efficiency.
 
You should slow down and read my comment again. I was referring to RDNA 3's RT improvements, which were worthless, not RT as a tech. AMD's claims of RDNA 4 having "huge gains" in RT are the same thing they said about RDNA 3, and that went nowhere.

Womp Womp.
AMD did not hype up RT performance with RDNA 3. They talked about second-generation ray tracing accelerators, not really promising a huge uplift. They promised an "up to" 80% increase and delivered 50% or so at 1080p/1440p.

If you check out the Silent Hill 2 performance numbers, you will see that the 7900 XTX performs about 50% better than the 6900 XT at 1440p and close to 75% better at 4K/UHD.

So they obviously delivered on this promise, for the most part.
 
AMD did not hype up RT performance with RDNA 3. They talked about second-generation ray tracing accelerators, not really promising a huge uplift. They promised an "up to" 80% increase and delivered 50% or so at 1080p/1440p.
80% increase isn't "huge". K den.
If you check out the Silent Hill 2 performance numbers, you will see that the 7900 XTX performs about 50% better than the 6900 XT at 1440p and close to 75% better at 4K/UHD.

So they obviously delivered on this promise, for the most part.
That wasn't achieved because of RDNA 3's arch changes. That's because they threw more cores at the problem. The 7900 XTX is significantly larger than the 6900 XT was.

The actual comparison of arch would be the 6900 XT and the 7900 GRE; they are far closer, with the 7900 GRE having 11% more SMs than the 6900 XT. And would you look at that, it averages closer to 15%, not 50%, and certainly not 80%. So that's, what, a 4% increase thanks to arch, and the other 11% comes from having more cores? Truly revolutionary, LMFAO.

Compare this to Nvidia. The RTX 3060 12GB has 28 SM units. The 4060 has 24 SM units. So the 3060 has 16% more cores. And the 4060 shows... about a 12% average uplift in RT vs the 3060.

Wait.....

OH HEY, LOOK, AN ARCH IMPROVEMENT! The 4060 has FEWER cores than the 3060 yet is FASTER than the 3060 in ray tracing. WOWZERS.

So, no, AMD did not "clearly deliver on this promise". They pretty clearly did almost nothing with RDNA 3 and dropped the ball on RT, then covered it up by fiddling with core names and shoving more SMs into the chip to make it look like they did something. But when you actually do the math (see the quick sketch below), nope, they sure didn't manage much!
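
A quick back-of-the-envelope sketch of that math in Python, using the rough core counts and uplift percentages quoted above (all ballpark figures, not exact benchmark numbers):

```python
# Back-of-the-envelope: split RT uplift into "more cores" vs "better architecture".
# All figures are the rough ones quoted above -- treat them as ballpark estimates.

def per_core_gain(core_ratio: float, perf_ratio: float) -> float:
    """Performance change per core once the core-count difference is factored out."""
    return perf_ratio / core_ratio - 1

# AMD: 7900 GRE vs 6900 XT -- ~11% more cores, ~15% average RT uplift (as above)
amd = per_core_gain(1.11, 1.15)

# Nvidia: RTX 4060 (24 SMs) vs RTX 3060 (28 SMs) -- ~12% average RT uplift (as above)
nvidia = per_core_gain(24 / 28, 1.12)

print(f"RDNA 3 per-core RT gain: {amd:+.1%}")     # roughly +4%
print(f"Ada per-core RT gain:    {nvidia:+.1%}")  # roughly +31%
```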

I hope this helps :)
 
80% increase isn't "huge". K den.

That wasn't achieved because of RDNA 3's arch changes. That's because they threw more cores at the problem. The 7900 XTX is significantly larger than the 6900 XT was.

The actual comparison of arch would be the 6900 XT and the 7900 GRE; they are far closer, with the 7900 GRE having 11% more SMs than the 6900 XT. And would you look at that, it averages closer to 15%, not 50%, and certainly not 80%. So that's, what, a 4% increase thanks to arch, and the other 11% comes from having more cores? Truly revolutionary, LMFAO.

Compare this to Nvidia. The RTX 3060 12GB has 28 SM units. The 4060 has 24 SM units. So the 3060 has 16% more cores. And the 4060 shows... about a 12% average uplift in RT vs the 3060.

Wait.....

OH HEY, LOOK, AN ARCH IMPROVEMENT! The 4060 has FEWER cores than the 3060 yet is FASTER than the 3060 in ray tracing. WOWZERS.

So, no, AMD did not "clearly deliver on this promise". They pretty clearly did almost nothing with RDNA 3 and dropped the ball on RT, then covered it up by fiddling with core names and shoving more SMs into the chip to make it look like they did something. But when you actually do the math, nope, they sure didn't manage much!

I hope this helps :)

Nah, 80% is not that huge when performance was low to begin with :)
 
If they can achieve parity with the 4080's RT performance for $500, then I believe they will gain significant market share, and mind share as well.
 
If they can achieve parity with the 4080's RT performance for $500, then I believe they will gain significant market share, and mind share as well.
They absolutely won't do that. Would love to see it, but nah. AMD is not spending huge amounts of R&D money on gaming GPUs right now.

They might hit 4070-series RT performance with the top RDNA 4 offering.
 
AMD hasn't been able to make any real improvements in a couple of generations other than throwing more cores into virtually the same cards...

This is the main reason they can't compete with Nvidia at the high end... They probably COULD make a card that performs similar to the 5090 - but it would run at 200 degrees and weigh 20 pounds... not to mention cost $5,000...

Innovation is dead in their GPU division - hopefully this doesn't spill over into their CPU brand... Intel has dropped the ball over and over, giving AMD a fighting chance - but they can't rely on Intel being incompetent forever.
 
AMD hasn't been able to make any real improvements in a couple of generations other than throwing more cores into virtually the same cards...

This is the main reason they can't compete with Nvidia at the high end... They probably COULD make a card that performs similar to the 5090 - but it would run at 200 degrees and weigh 20 pounds... not to mention cost $5,000...

Innovation is dead in their GPU division - hopefully this doesn't spill over into their CPU brand... Intel has dropped the ball over and over, giving AMD a fighting chance - but they can't rely on Intel being incompetent forever.

Intel really first dropped the ball with Arrow Lake, since TSMC 3nm was used here. Intel had been competing with a node disadvantage for years and still delivered top performance. The problem was power draw.

TSMC 3nm fixed the power draw; sadly, the performance was not good. Arrow Lake obviously has some problems that might or might not get fixed.

In some games, the 285K performs worse than an i5-12600, for example, while in others it beats even the 7800X3D. The 285K is first in Cinebench ST and MT as well.

Performance is all over the place in many games. Some run great, others don't. Memory latency issues might be causing it. Some tech sites need to dig deep.

Intel might have fixed or improved these issues by the time 9800X3D reviews hit. We will see.

Productivity performance is good tho, so I don't really see Arrow Lake as a complete failure. Depends on the use case.

Intel can't really compete with Ryzen 3D in gaming, so it makes sense that they aim for productivity performance while still delivering decent gaming performance.

It's not like AMD offers a true do-it-all chip either. You buy the 9950X for productivity and the 9800X3D for gaming. The 9950X3D might deliver something close to both overall, but it will still lose.
AMD still struggles with dual-CCD latency issues, and according to recent rumours, they won't put 3D cache on both CCDs with the 9900X3D and 9950X3D either.
 
Really? No high-end or halo options, only midrange and below, is the "strongest"?
If you're AMD, ask yourself... Why would you want to target the high end when nVidia dominates that segment of the market, not only in technological expertise but in mindshare as well?

AMD could put out the best damn card there is, one that absolutely kills everything nVidia makes, but will it sell? Nope. Because there are too many nVidia fanboys out there who'll say... "Oh, it's AMD, it must be crap." With that in mind, why would AMD want to target what is a lost cause?

Besides, only a small portion of the market buys ultra-high-end products like the 4090. Better for AMD to target the midrange portion of the market, where there are a lot more people buying.
 
If you're AMD, ask yourself... Why would you want to target the high end when nVidia dominates that segment of the market, not only in technological expertise but in mindshare as well?
Because consistency sells. If you keep walking away from the market, the market will walk away from you.

Polaris marked a steep decline in AMD GPU market share into the single digits, where it resides today.
AMD could put out the best damn card there is, one that absolutely kills everything nVidia makes
Name the last time they did that. The last one I can remember was Hawaii. The 290X. From 2013. And that sold well.
but will it sell? Nope. Because there are too many nVidia fanboys out there who'll say... "Oh, it's AMD, it must be crap." With that in mind, why would AMD want to target what is a lost cause?
There are plenty of complaints that have been made, far more than "it's AMD so it's bad". The fact that you refuse to state those arguments tells me a lot. Just as an example, there were MANY who were pissed that the Evergreen lineup of GPUs was de-prioritized for drivers in favor of GCN, then abandoned early. Those people, instead of buying Hawaii, went to the faster and more efficient Maxwell GPUs instead. After that, those who did buy Hawaii and wanted an upgrade found AMD was making nothing, so they went with Pascal, and later Turing. When AMD finally did show up, Vega was hot, late, and slow, AND more expensive!

Here's a hint: the 6800/XT/6900/XT and the 7900 XT/XTX all sold well and were frequently out of stock. AMD themselves admitted that they de-prioritized GPUs in favor of EPYC chips during the lockdowns. Which was not the wrong move. But to then claim low sales are why you pulled out IS disingenuous. Breaking back into markets you have previously left is actually a really hard thing to do.
Besides, only a small portion of the market buys ultra-high-end products like the 4090. Better for AMD to target the midrange portion of the market, where there are a lot more people buying.
This was the same argument AMD used with Polaris.

The result: the 1080 Ti, the 1080, the 1070 Ti, and the 1070 ALL, individually, outsold the entirety of the Polaris lineup. At a minimum price nearly twice what the top Polaris card cost.

If AMD decides to ignore the higher end and focus only on $400 and less, that's their prerogative. How well is it working for Intel? Oh, right, 0% market share. Companies build halo products for a reason.
 
I didn't "misquote" her. With RDNA 2, AMD had a stronger portfolio: they had everything you just said PLUS an entire range of GPUs. Also, Intel was floundering, shoveling amps into parts trying to compete.

Today they have Arrow Lake, which is more impressive in TDP-limited scenarios than it is at full throttle. The server tech built on TSMC 3nm is gonna be a bear for AMD to contend with, and mobile is gonna be an interesting fight with Lunar Lake's efficiency.

She simply said 'strongest PC portfolio', which is a somewhat vague statement/spin delivered to analysts on the earnings call. It could mean whatever she wanted it to mean: largest range of hardware, depth, market coverage, etc. You're just spinning it to your own definition.

Arrow Lake should be that efficient. It's a two-node jump, is it not, with some architecture updates? Given that, it's actually disappointing it's not better and is only a minor upgrade in speed terms. Intel is still struggling in DC (though catching up), and Granite Rapids is not going to change that, according to recent reviews. Turin Dense will also be on TSMC's 3nm node. Lunar Lake will be a difficult one, though, granted...
 
Besides, only a small portion of the market buys ultra-high-end products like the 4090. Better for AMD to target the midrange portion of the market, where there are a lot more people buying.
The 4090 crowd isn't that small though, is it? Have you seen the percentage on the Steam Survey? More people are running a 4090 than a 1080, 1660, 2070, 4080, 2080, or 1650 Ti, and I think there are more 4090s out there than the entirety of AMD's current offerings combined.

Makes you wonder as well: at £1,500+ for a 4090, and that many out there, what made Nvidia more money - a 4060 at 4x the market share (according to the Steam Survey) or the 4090?

I'm not saying the share of 4090s is large, but I struggle to call it small either, compared to literally the entire AMD lineup.
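
To put rough numbers on that revenue question - a minimal sketch, assuming a ~£300 street price for the 4060 (only the £1,500+ 4090 price is quoted above, and margins are ignored):

```python
# Illustrative revenue comparison. The £300 4060 price is an assumption for this
# sketch; only the £1,500+ 4090 price appears above, and margins are ignored.
PRICE_4090 = 1500   # GBP, as quoted above
PRICE_4060 = 300    # GBP, assumed typical street price
UNITS_RATIO = 4     # ~4x the Steam Survey share for the 4060, per the above

revenue_4090 = PRICE_4090                 # one 4090 sold
revenue_4060 = PRICE_4060 * UNITS_RATIO   # four 4060s sold

print(f"One 4090:   £{revenue_4090}")  # £1500
print(f"Four 4060s: £{revenue_4060}")  # £1200 -- the 4090 still wins on raw revenue
```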
 
Intel really first dropped the ball with Arrow Lake, since TSMC 3nm was used here. Intel had been competing with a node disadvantage for years and still delivered top performance. The problem was power draw.

TSMC 3nm fixed the power draw; sadly, the performance was not good. Arrow Lake obviously has some problems that might or might not get fixed.
But... Intel dropped the ball by being content with the 14nm node. Their foundry business, to be precise - they were unable to progress to 10nm or 7nm - and had to outsource eventually...

Had they been able to fix that - or outsourced to TSMC earlier (they had more clout and capital than AMD back then and could easily have superseded AMD's orders) - they probably would have driven AMD out of business...

Saying that, despite all their failures, they still had a decent counter to AMD's CPUs - but the gap has been widening year by year... Fortunately for Intel, AMD may have taken a page out of Intel's playbook by sh*tting the bed with the 9000 series... Of course, Intel being Intel, they've decided to drop the ball again with Arrow Lake...

Edit: thanks Burty - had the wrong node numbers!
 
But... Intel dropped the ball by being content with the 7nm node. Their foundry business, to be precise - they were unable to progress to 5nm or 3nm - and had to outsource eventually...
I thought it was the 14nm node they got stuck on? They ended up on 14nm++++++++++++.
Then their 10nm yields were bad and weren't any better than their massively refined 14nm.
They've been stuck like this ever since; they renamed the whole lot (10nm = Intel 7, 7nm = Intel 4), but as far as I'm aware, 7nm (Intel 4) only just started being used with Meteor Lake (first seen in laptops in late 2023).

I wouldn't call that "content with the 7nm node"; I'd argue they've only just started using the 7nm node. Let alone 5nm or 3nm.
I'm glad some people have the money to burn like that because I certainly don't! Jesus Christ!
It is pretty insane how many 4090s are in the Steam Hardware Survey, and to think, with lots of 4090s sold purely for AI training and crypto coin mining, there's still a legit, substantial number of 4090s out in the wild.
 
For those complaining that this isn't 'AMD's best' etc. due to no higher-end options: well, we all know the truth of why and how it came to this. There's been nothing wrong with their high-end answers to Nvidia's xx80 and xx90 since RDNA 2 launched, but costing as little as half as much while trading blows in raster counts for nothing against Nvidia's performance-busting RT advantage and the proprietary, gen-locked upscaler you need to offset it. Add in old memes and myths about AMD's fails and flaws, experienced first-hand or not, and for many people it didn't matter how far AMD had come or improved since the Vega and Polaris days, when they really were inferior to Nvidia. Go tour some less intelligent forums and see how much AMD is reviled on this side of things... but people forget that Nvidia has done worse things, planned and in full knowledge of the effect and outcome.

My guess is it just got to the point where there's no point in AMD investing all that R&D and production into something nobody wants or appreciates for what it is... the only existing barrier to Nvidia's gen-on-gen record-breaking prices since Turing, with AMD staying far closer to that standard with far better cards since. It's a crying shame tbh, but we can only hope this move brings AMD deserved success and maybe heralds a return to the top tiers later, with some improvements where they have been lacking to date - albeit by what is actually a small margin for, again, that pricing gap.

Me, I'm just gonna have to make my 7900XTX last until then, given there are no indications so far from Nvidia, partners, or vendors that Blackwell prices aren't simply going to jump far, far in excess of any reasonable notion of 'MSRP', or the old added cost for premium models, yet again. I had no problem paying a grand last year for a 100 fps average running AAA games at ultra at 4K (which was also a 50 fps average uplift over my 6800XT from two years before), but I'm stuck on paying another grand on top for damn near the same fps plus some better reflections and lighting in some games - effects that always drop fps hard but might not be noticed without pausing.
 
For those complaining that this isn't 'AMD's best' etc. due to no higher-end options: well, we all know the truth of why and how it came to this. There's been nothing wrong with their high-end answers to Nvidia's xx80 and xx90 since RDNA 2 launched, but costing as little as half as much while trading blows in raster counts for nothing against Nvidia's performance-busting RT advantage and the proprietary, gen-locked upscaler you need to offset it. Add in old memes and myths about AMD's fails and flaws, experienced first-hand or not, and for many people it didn't matter how far AMD had come or improved since the Vega and Polaris days, when they really were inferior to Nvidia. Go tour some less intelligent forums and see how much AMD is reviled on this side of things... but people forget that Nvidia has done worse things, planned and in full knowledge of the effect and outcome.

My guess is it just got to the point where there's no point in AMD investing all that R&D and production into something nobody wants or appreciates for what it is... the only existing barrier to Nvidia's gen-on-gen record-breaking prices since Turing, with AMD staying far closer to that standard with far better cards since. It's a crying shame tbh, but we can only hope this move brings AMD deserved success and maybe heralds a return to the top tiers later, with some improvements where they have been lacking to date - albeit by what is actually a small margin for, again, that pricing gap.

Me, I'm just gonna have to make my 7900XTX last until then, given there are no indications so far from Nvidia, partners, or vendors that Blackwell prices aren't simply going to jump far, far in excess of any reasonable notion of 'MSRP', or the old added cost for premium models, yet again. I had no problem paying a grand last year for a 100 fps average running AAA games at ultra at 4K (which was also a 50 fps average uplift over my 6800XT from two years before), but I'm stuck on paying another grand on top for damn near the same fps plus some better reflections and lighting in some games - effects that always drop fps hard but might not be noticed without pausing.
BINGO! Give this man a prize!

You just went into far more detail explaining what I mentioned before: in the eyes of the majority of the market, AMD can't do anything right no matter what they try. So AMD has thrown in the towel, and I don't blame them. Why put R&D into making a product that the market will always see as inferior - even if it isn't actually inferior - and won't buy?

We've done this to ourselves, and nVidia is laughing their asses off all the way to the bank. We have no one to blame but ourselves.

By the way, I'm sitting here with an all-AMD system: a 7700X and an RX 7900 GRE, and buying the GRE was the best thing I ever did. I chose not to feed the beast that is nVidia.
 