What's next for AMD? Building momentum

Jay Goldberg


AMD reported its quarterly earnings this week with a lot of moving parts, but most importantly, the results came in a little better than expected, which is exactly what the company needs.

Over its history, AMD has gone from the company that couldn't shoot straight to a leader in its field and probably one of the best at execution. As we highlighted a few months back, the company now needs to continue building its roadmap and find pieces of the market to claim. The company does not need to report blowout earnings; it just needs to keep building on what appears to be a solid head of momentum.

Editor's Note:
Guest author Jonathan Goldberg is the founder of D2D Advisory, a multi-functional consulting firm. Jonathan has developed growth strategies and alliances for companies in the mobile, networking, gaming, and software industries.

By the raw numbers, the quarter went just fine. AMD reported revenue of $5.4 billion, just ahead of the $5.3 billion consensus, and EPS of $0.58 versus expectations of $0.57. The stock was trading up after hours, but we would not be surprised if that proves temporary. As we said, not exciting, just steady progress.

Beyond the numbers, the one thing everyone wanted to know was how the company is progressing in the data center, specifically around AI. Here the news was generally good. They have attracted a lot of interest from the usual suspects for their newly launched Instinct MI300 series GPUs.

They only announced these a month ago, and the product has just begun sampling, but judging from executive commentary, they have gained a lot of traction in garnering demos. These are still early days – for AMD's new products and for "AI" – so it's too soon to expect a huge change in AMD's results from the new products, but they are in a good position.

Put another way, Nvidia is streets ahead of everyone in the AI market, thanks to foresight and no small amount of luck, but Nvidia will not, and cannot, be the only provider of data center AI semis, and AMD can still do great business as the leading fast follower in the market. Not for nothing, AMD is leading the way on Windows PC CPUs with built-in AI blocks, so it has not been asleep at the AI wheel like certain others.

Also read: The Rise, Fall and Renaissance of AMD

We also think there are some important takeaways for the broader market. A key trend that has emerged is that the hyperscalers are, for the most part, scrambling with their AI strategies. Some of them are increasing their capex budgets for 2023 to accommodate AI needs; others are waiting until next year.

This means that AI is cannibalizing some share of CPU spending this year, but not all of it. And it seems clear that next year's budgets will be big for all hyperscalers, both for AI and for traditional workloads. AMD expects AI inference to become a $150 billion opportunity, which is a really big number. Perhaps most interesting was the increased focus on enterprise (i.e. non-hyperscale customers). This has been a somnolent market for a decade, steadily seeping into the cloud, but it has now reignited to the point that AMD (among others) is investing in tooling up its sales effort to support it.

Sifting through all of this, AMD's quarter had a lot of moving parts. The data center and AI stories were important and sounded good, but everything else was mixed.

Gross margins were good enough but not particularly inspiring; supply challenges seem to have been worked through but are now a concern for the new products; consumer demand for PCs is still recovering; a fair portion of this year's data center build rests on a single order for a supercomputer; the 5G build-out helped but is now tapering. The list goes on. For the next few quarters AMD will remain a battleground stock, but the company itself is in a great position longer term.


 
I don't think GPUs will ever be able to completely replace CPUs in the AI space. You need some beefy CPUs to let all the GPUs communicate with each other without bottlenecks. I also believe that AMD is number 1 in the server space right now with their Epyc line of CPUs.

I'm not trying to badmouth Intel here, either. They have a fantastic lineup of products and I'd be happy with any of their CPUs, I just have secondary reasons for using AMD outside of performance.

Interestingly enough, simply having reasons outside of performance is a pretty big deal and should not be ignored.
 
Wonder if they will expand to the new diamond replacement for silicon that looks to become a standard...
 
Next for AMD is throwing all silicon at AI and RDNA 4 being limited to Navi 43.
I don't think that will happen because AMD hardware powers the Xbox, PlayStation and Switch. nVidia is only in the PC gaming space, and I think a large part of why they are "IDGAF" about their gaming cards is that gaming just isn't a large part of their market. nVidia has, allegedly, stopped producing the 40 series of GPUs aside from the 4090 and 4080, and even then they have scaled back production and are only releasing them in limited numbers to keep prices high because, surprise, people aren't buying them. To be perfectly honest, the only nVidia GPU worth its salt is the 4090, and all you have to do is look at the "open box" and "refurbished" numbers on Newegg for the 40 series to get an idea of how much people actually don't want them. They're buying them and then returning them. There are significantly more "open box" 40 series cards on Newegg than there are 6000 or 7000 series AMD cards.

I will let you extrapolate what all that means on your own.
 
Unsurprisingly, AMD GPUs do not merit a mention as a potential area of growth outside of AI.
How long can the company afford to keep launching successive hard-failing RDNA generations?

 
Unsurprisingly, AMD GPUs do not merit a mention as a potential area of growth outside of AI.
How long can the company afford to keep launching successive hard-failing RDNA generations?
How is RDNA a failure? This comment is so backwards it almost hurts. RDNA is in more gaming machines than any other architecture (consoles matter) and is used very little in AI. nVidia has started pricing their AI products such that developers have started buying AMD gaming GPUs instead of paying exorbitant amounts of money for nVidia Tesla GPUs. Simply from an investment perspective, and I literally have put my money where my mouth is, AMD has enormous room for growth in the AI GPU sector.
 
I don't think that will happen because AMD hardware powers the Xbox, PlayStation and Switch. nVidia is only in the PC gaming space, and I think a large part of why they are "IDGAF" about their gaming cards is that gaming just isn't a large part of their market. nVidia has, allegedly, stopped producing the 40 series of GPUs aside from the 4090 and 4080, and even then they have scaled back production and are only releasing them in limited numbers to keep prices high because, surprise, people aren't buying them. To be perfectly honest, the only nVidia GPU worth its salt is the 4090, and all you have to do is look at the "open box" and "refurbished" numbers on Newegg for the 40 series to get an idea of how much people actually don't want them. They're buying them and then returning them. There are significantly more "open box" 40 series cards on Newegg than there are 6000 or 7000 series AMD cards.

I will let you extrapolate what all that means on your own.
Just relaying the news from videocardz.

[Image: AMD-RDNA4-RUMORS.png]
 
How is RDNA a failure?

RDNA's market share in the dedicated space has eroded rapidly from RDNA to RDNA2 to the latest low of RDNA3. The much-touted promise of mass adoption of AMD APUs in mobile devices has also very much not happened – there are actually very few portable computing devices available that offer these, none of which are strong sellers.

This comment is so backwards it almost hurts. RDNA is in more gaming machines than any other architecture(consoles matter) and used very little in AI.

Consoles are an area where RDNA has maintained a foothold (though I have to say that your mention of the Switch makes your knowledge of this space suspect). But that market has become less important over time as younger generations of gamers have continued to gravitate towards phones and tablets on a massive scale. I think the trees are obstructing your view of the forest.

nVidia has started pricing their AI products such that developers have started buying AMD gaming GPUs instead of paying exorbitant amounts of money for nVidia Tesla GPUs.

It's cute that there are a handful of vocal developers toying with ROCm on their private hobby machines – echoes of the once-inevitable future compute stack king, OpenCL – but let's try to stick to a realistic perspective. CUDA is entrenched and here to stay for the duration (and of course, no one needs a Tesla to benefit from it, but massive deployments are out there for your code to leverage).

Simply from an investment perspective, and I literally have put my money where my mouth is, AMD has enormous room for growth in the AI GPU sector.

Oh, you hold a bit of AMD stock? I reckon some people find that impressive.
 
Navi 21 was the first team red card after the 9700 Pro for me; RDNA2 was (still is) amazing imo. RDNA1 and RDNA3 not so much, but they're far from failures. I ended up getting a cheap, used 3080 about a month ago tho cause I was disappointed in how VSR+FSR2 looks vs DLDSR+DLSS, but I still have the 6800 for when I need it.
 
RDNA's market share in the dedicated space has eroded rapidly from RDNA to RDNA2 to the latest low of RDNA3. The much-touted promise of mass adoption of AMD APUs in mobile devices has also very much not happened – there are actually very few portable computing devices available that offer these, none of which are strong sellers.



Consoles are an area where RDNA has maintained a foothold (though I have to say that your mention of the Switch makes your knowledge of this space suspect). But that market has become less important over time as younger generations of gamers have continued to gravitate towards phones and tablets on a massive scale. I think the trees are obstructing your view of the forest.



It's cute that there are a handful of vocal developers toying with ROCm on their private hobby machines – echoes of the once-inevitable future compute stack king, OpenCL – but let's try to stick to a realistic perspective. CUDA is entrenched and here to stay for the duration (and of course, no one needs a Tesla to benefit from it, but massive deployments are out there for your code to leverage).



Oh, you hold a bit of AMD stock? I reckon some people find that impressive.

But anyway, RDNA is still selling VERY well and will be in the next generation of Xbox and PlayStation consoles. CUDA is starting to become undesirable for developers because 1) Linux support is a big thing when running these systems. I don't care if CUDA has been around forever; if it doesn't have proper support for the OS that the server-grade hardware requires to run, then it's useless.

2) While nVidia has maintained a foothold for a while, they are selling their hardware at prices where it is now becoming cost effective to develop from the ground up for something OTHER than CUDA.

CUDA is not the end-all-be-all that everyone says it is. CUDA has a lot going for it, but everything comes down to cost effectiveness. At this point, people are basically paying a premium for nVidia hardware so they can copy and paste CUDA code off of GitHub.

And since you want to insultingly bring up my AMD stock, I think you'd find it funny to know that I recently sold my nVidia stock to open an AMD position. I believe that nVidia's bull run is over. Maybe not right now, but I believe in the next 3 years we will start to see it. They stopped making 40 series cards, game developers are getting tired of developing for their fragmented proprietary tech across generations, and now they're pricing themselves out of the AI market.
 
Just relaying the news from videocardz.

[Image: AMD-RDNA4-RUMORS.png]
AMD did not monopolize the market; nVidia handed it to them. I'm not thrilled there is only going to be 1 GPU manufacturer (Intel Arc is a thing, BTW), but it is nVidia's fault for essentially becoming the scalpers after the GPU craze.

And I'm also going to throw this out there: I game at 4K120 and have a 6700XT; no one needs a 4090. So not having any "high end" GPUs doesn't mean anything. I also don't think many people want to spend more on a graphics card than I did for my first car. Who cares about high end GPUs when the "mid range" ones still cost $1000?
 
But anyway, RDNA is still selling VERY well

Meanwhile, down here in reality, AMD's gaming segment revenue was down 4% year-on-year and 10% quarter-on-quarter. The company attributed it mostly to poor GPU sales. What is the basis for your claim?

and will be in the next generation of Xbox and PlayStation consoles. CUDA is starting to become undesirable for developers because 1) Linux support is a big thing when running these systems. I don't care if CUDA has been around forever; if it doesn't have proper support for the OS that the server-grade hardware requires to run, then it's useless.

Nvidia has always maintained, and continues to have, outright excellent support for Linux. Disagreements on how Wayland compositors should be supported have absolutely zero bearing on anything related to the compute stack (nor, for that matter, on their hardware's viability for contemporary production desktop use).

2) While nVidia has maintained a foothold for a while, they are selling their hardware at prices where it is now becoming cost effective to develop from the ground up for something OTHER than CUDA.

CUDA is not the end-all-be-all that everyone says it is. CUDA has a lot going for it, but everything comes down to cost effectiveness. At this point, people are basically paying a premium for nVidia hardware so they can copy and paste CUDA code off of GitHub.

Is there any evidence for these declarative statements?

And since you want to insultingly bring up my AMD stock, I think you'd find it funny to know that I recently sold my nVidia stock to open an AMD position.

I mostly thought it was funny that you brought up your portfolio's holdings in the first place. That makes them fair game, of course. It also brings to mind the concept known as 'talking your book', where investments drive your arguments rather than the other way around.
 
I game at 4K120 and have a 6700XT; no one needs a 4090.
I doubt many 2014+ games will do that on a 6700XT, unless you mean low-to-medium settings.
Anyway, glad I stocked up on GPUs in between the mining and AI madness spans; got a 3080 and a 6800. Was gonna sell the 6800, but it might skyrocket in value, or I might need a 16GB card to use in a few games. It'd be a waste to sell it; it does 2540MHz at stock 1025mV.
 
How is RDNA a failure? This comment is so backwards it almost hurts. RDNA is in more gaming machines than any other architecture (consoles matter) and is used very little in AI. nVidia has started pricing their AI products such that developers have started buying AMD gaming GPUs instead of paying exorbitant amounts of money for nVidia Tesla GPUs. Simply from an investment perspective, and I literally have put my money where my mouth is, AMD has enormous room for growth in the AI GPU sector.
I suspect he's talking about dGPUs, not all the console or handheld devices. I don't know what AMD's goals are for the 7900 XT and XTX GPUs, but I suspect they have fallen short of where they would like to be. Are they "failures"? Well, I guess that depends on your viewpoint. Clearly AMD wasn't and still isn't happy with 7900 XT sales, hence the price drops and the 7900 GRE. What's a little frustrating is that they could have dropped that GPU into the market at $650-700 from day 1 and it would have done much better. The XTX also needs a healthy price drop (as do many GPUs regardless of brand). I would like the XTX a lot better at $800, but even that is a high price, all things considered.
 
FYI, there is a current rumor that AMD canceled enthusiast/high-end RDNA 4 GPUs. I can definitely feel strings being pulled in all directions for market manipulation, salty rumors and all, with Nvidia throttling supply of the 40 series.
Any thoughts on this article from PCWorld about how AI could cause another graphics card pricing boom, like I predicted might happen when Nvidia's stock tripled in price within a short period of time?
What do you call people taking selfies of mass GPU card orders for AI compute but market manipulation, imo? Will scalpers sweep up the throttled supply, or will vendors like MSI beat the scalpers this time and scalp directly via eBay like last time?

https://www.pcworld.com/article/2020375/the-ai-boom-could-create-a-new-gpu-shortage.html
 
AMD did not monopolize the market; nVidia handed it to them. I'm not thrilled there is only going to be 1 GPU manufacturer (Intel Arc is a thing, BTW), but it is nVidia's fault for essentially becoming the scalpers after the GPU craze.

And I'm also going to throw this out there: I game at 4K120 and have a 6700XT; no one needs a 4090. So not having any "high end" GPUs doesn't mean anything. I also don't think many people want to spend more on a graphics card than I did for my first car. Who cares about high end GPUs when the "mid range" ones still cost $1000?
If there is a storm coming, just buy the cheapest 24GB VRAM card and set it and forget it. Currently the 7900 XTX is selling for as low as $939 via PCPartPicker and the 7900 XT at $769 (for reference, as of 8/5/2023). You can lock your performance to maximize long-term efficiency benefits and sell the card toward future upgrades to mitigate out-of-pocket costs. So investing in the high end is not as bad as it seems, imo.
 
If there is a storm coming, just buy the cheapest 24GB VRAM card and set it and forget it. Currently the 7900 XTX is selling for as low as $939 via PCPartPicker and the 7900 XT at $769 (for reference, as of 8/5/2023). You can lock your performance to maximize long-term efficiency benefits and sell the card toward future upgrades to mitigate out-of-pocket costs. So investing in the high end is not as bad as it seems, imo.
This.

Picked up a 7900XTX for $1200 and sold my 6800XT for $600

Cost $600 for the upgrade.
 
What did you pay for the 6800 in the first place? That cost has to factor into the upgrade cost. There is no free lunch.
The price of the 6800XT matters for the upgrade from the card before it, which was an RX580, not going forward.

I paid $1500 for the 6800XT in the middle of the mining boom and sold the RX580 for $500.

So that one cost me $1000, meaning this upgrade was cheaper than the last one.

The reason is that the crypto boom is over. And by the time I sold the RX580 it was like 5+ years old and low tier; the same can't be said for the 6800XT.
 
Mining craze people got screwed. The used 6800 I got for 400 EUR had 1250 EUR on the invoice.
Less than 1/3rd of its value in 1.5 years.
 