The AI chip boom is propelling Nvidia to new heights, gamers be damned

nanoguy

Bottom line: Nvidia sees the race to develop generative AIs like ChatGPT and Midjourney as the iPhone moment for artificial intelligence. More importantly, the company seems well-positioned to capitalize on the surging demand for GPUs and AI accelerators, and this has sent its valuation close to $1 trillion this week. Its latest financial report paints a good forward outlook, but it's also a reminder that pleasing gamers isn't a priority for the company. And it probably won't be for as long as the tech industry is obsessed with AI.

Almost every business out there is trying to jump on the AI train, and this is making Nvidia and its investors happier than ever. So happy, in fact, that the company feels confident it can soon become the first chipmaker to be valued at more than $1 trillion, thanks to the enormous demand for data center GPUs and AI accelerators. Investors have sent the share price soaring; it's hovering around $384 as of this writing, up more than 25 percent since Tuesday.

AI is one of the main reasons Nvidia's data center business grew no less than 14 percent during the first three months of this year compared to the same period last year. For reference, Intel's Data Center and AI Group recorded a staggering 39 percent drop, and AMD's data center division delivered flat revenue compared to the same quarter in 2022.

This is a big deal for Team Green, as the data center segment has accounted for more than half of its revenue for almost a year now. The company recorded $4.28 billion in sales to enterprise customers, which is also above the $3.9 billion figure expected by analysts.

This partly explains why Nvidia has moved some production of GeForce RTX 4090 GPUs to make more Hopper-based H100 enterprise GPUs. Companies like Microsoft, Oracle, OpenAI, Twitter, Amazon, and Google are all buying large amounts of the latter product to train and run generative AIs. To put things into perspective, the RTX 4090 sells for around $1,600, while used H100 cards sell for over $40,000 on eBay. This makes even Intel's 4th-gen Xeon Scalable Sapphire Rapids CPUs look cheap at $17,000.

Some companies like Microsoft, Meta, Amazon, and Google are investing in custom silicon for their AI efforts, but that won't curb their appetite for Nvidia GPUs anytime soon. During an investor call Wednesday, Nvidia CEO Jensen Huang explained that the company's 15 years of investment in hardware and software development put it in the right position at the right time to capitalize on a large investment cycle from companies big and small working on AI-based services.

Jensen is no doubt happy about generative AIs becoming the "primary workload of most of the world's data centers," but gamers are understandably less joyful about Nvidia's strategy with the RTX 40 series graphics cards. Between the high prices and low VRAM amounts on some of the new models, PC enthusiasts aren't exactly rushing to upgrade. That prompted Nvidia to announce an RTX 4060 Ti variant with a larger frame buffer, and we're also seeing small price drops on existing models.

Nvidia's gaming revenue for the first quarter of this fiscal year was $2.24 billion – a 38 percent year-over-year drop. The company blames the overall economic climate and the relatively slow rollout of RTX 40 series GPUs for the decline. However, our own Steven Walton took a look at the recently launched RTX 4060 Ti and found it to be overpriced at $400. Higher-end models go for $600 and $800, so they're not exactly appetizing options for gamers either, even if they do offer more value for the money.

Also read: Is Nvidia now a software stock? The competitive advantage of CUDA

One thing is for sure: the AI frenzy is transforming the tech industry, and Nvidia stands to benefit the most thanks to the CUDA software stack, which is exclusive to its hardware. Rivals have so far failed to create a true alternative or to convince the rest of the industry to adopt one, though companies like AMD and Intel have certainly tried with toolkits like ROCm and oneAPI.

As a result, Nvidia is optimistic about the future and expects to generate around $11 billion in revenue in the current fiscal quarter. This would be a 64 percent increase year-over-year and mark a new record in terms of quarterly revenue for the Jensen-powered company. We'll have to wait and see.


 
1) Soon(ish) Nvidia won't care about gamers and will cater exclusively to where the money is.

2) AI is a bubble waiting to happen. Nvidia is trading close to 50x revenue. Nvidia can't make enough money to justify its valuation.

3) The market can stay irrational longer than you can stay solvent.


 
@nanoguy oneAPI is not exclusive to Intel or its Xe/Arc silicon. There are oneAPI toolkits for both AMD and Nvidia, too.
https://www.codeplay.com/portal/new...pcpp-brings-sycl-support-for-nvidia-gpus.html
https://www.codeplay.com/solutions/oneapi/for-cuda/

https://arstechnica.com/gadgets/202...ty-team-up-to-bring-radeon-gpu-support-to-ai/
https://github.com/OpenSYCL/OpenSYCL

2) AI is a bubble waiting to happen. Nvidia is trading close to 50x revenue. Nvidia can't make enough money to justify its valuation.
I agree. AI is the latest fad. People will get bored with it and/or will otherwise abandon it and the bubble will burst.

Personally, I find that the AI crap integrated into Windows 11/Bing has nothing to interest me, and the company I work for has Office 365, which recently added AI features that I find decrease my productivity. I've turned off as much of it as possible.

In an effort to keep up with the FAD, companies are foisting the AI crap on us without understanding whether their users really want it or need it. IMO, that's no way to remain competitive and maintain profits. Companies will lose customers that way. Look at Microsoft: first it was Clippy, then it was Cortana - neither of which lived very long, and both are all but DEAD. Yet they keep foisting what I see as the latest versions of Clippy and Cortana on us in the current AI FAD, doing the same things over and over and expecting different results.
 
You must be joking. Do you think AI is going away? A fad? Just because YOU don't find a use for it?
ChatGPT is just the beginning. AI is here to stay, and it will become more and more integrated into everything we do. If not as a direct user experience, then at the back end.
What we see right now is people/companies scrambling to find different uses for AI and sure, some of those may turn out to be fad-ish (like making mashup images of movie franchises re-imagined in the style of Wes Anderson or putting Arnold Schwarzenegger in every movie imaginable), but AI as a whole will just grow and get more and more sophisticated. Probably exponentially so.
 
Well, no, but this growth is predictable, unlike crypto mining, which came out of left field twice and caused a backlog of more than a year and a half of orders at its peak. The only thing gamers can hope for at this point is a competitive AMD, and luckily we have Intel as well. Also, let's hope AI compute doesn't start getting "mined" too.
 
@nanoguy oneAPI is not exclusive to Intel or its Xe/Arc silicon. There are oneAPI toolkits for both AMD and Nvidia, too.
I never said those two toolkits are exclusive to hardware from either AMD or Intel, just that the industry is mostly hooked on CUDA.
 
I have to agree that AI is a fad. Is it going away? No, there are real uses for AI out there already, and I'm sure more will be found in the future. But you also have many, many companies just doing "AI-assisted fill-in-the-blank" for things where conventional algorithms have worked fine for decades.

AI-assisted thermostat? (Even if it's "learning" the heating/cooling schedule to use). AI-assisted autofocus? AI-assisted grammar and spell checker? Things like this have had perfectly serviceable conventional algorithms for decades, and I think some of these are most definitely riding a hype-train.

Side point: I wonder if AMD will take notice and make a more serious effort to clean up their ROCm software? They could get in on this market, IF they had better software!

CUDA? I used it originally on a GTX 650 (which didn't provide a huge speedup over the CPU, but was still like a 2-4x speedup for what I ran on there). Installed TensorFlow, etc., installed CUDA; found TensorFlow was no longer compiled by default with support for the GTX 650's older compute capability. Rebuilt TensorFlow with that support turned back on. Done, everything worked! Everything is layered: the actual apps (TensorFlow, PyTorch, etc.) ask CUDA what cards are available and check that each one supports the required compute capability and has enough RAM (or it doesn't, and your run fails when the card gets code it can't handle or runs out of memory), then tell CUDA to run some work and return the results.
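To illustrate that layering, here's a minimal sketch of the device query from the application side, assuming a recent TensorFlow 2.x build with CUDA support (the output obviously depends on your hardware):

```python
import tensorflow as tf

# Ask the CUDA runtime (via TensorFlow) which GPUs are visible.
gpus = tf.config.list_physical_devices("GPU")

for gpu in gpus:
    # Device details include the compute capability, which determines
    # whether a prebuilt TensorFlow binary can actually target the card.
    details = tf.config.experimental.get_device_details(gpu)
    print(gpu.name,
          details.get("device_name"),
          "compute capability:", details.get("compute_capability"))
```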

ROCm? I decided to try running it on Ryzen -- out of the box, ROCm only supports a handful of GPU models, apparently to restrict its use to compute cards (why? How will you get people to dosh out on an expensive compute card if they can't try things out at smaller scale on a regular GPU first?). The source code supports many more, though, so I rebuilt it -- that's a *34*-step build process, which included having to rebuild tensorflow-rocm itself!

In ROCm, things are NOT cleanly separated. Tensorflow-rocm DOES NOT just prepare some work and fire it off to ROCm to run on your card... the ROCm versions of TensorFlow, PyTorch, etc. end up with GPU-specific code for each and every GPU variant that ROCm release supports! Gross! Yes, that means if you go from, say, "rocm 3.0.1" to "rocm 3.0.2", you must update TensorFlow in lockstep or things may not work right. To top it off, at that point the marketing material for ROCm was using a different version number than what was actually shipping... I thought a new ROCm version was coming out imminently, and it turned out the marketroids had just decided to bump the version number by one on their marketing page!

In the end, I didn't QUITE get it to work -- I could run trivial work on the GPU, but for anything with over a five-second run time, the video driver would decide the GPU had locked up and reset it. (I'm not complaining about that bit, though; I'm complaining about the utter lack of separation in the ROCm software stack.)
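For what it's worth, newer ROCm releases have an environment-variable workaround for the narrow official support list; here's a rough sketch, with the caveats that HSA_OVERRIDE_GFX_VERSION didn't exist back in the 3.x era described above and the right value depends entirely on your card:

```python
import os

# Spoof the detected GPU ISA so the ROCm runtime treats an officially
# unsupported consumer card as a supported target. The value must match
# a gfx architecture your ROCm build ships kernels for (10.3.0 = RDNA2
# here, purely as an example). Must be set before the runtime loads.
os.environ["HSA_OVERRIDE_GFX_VERSION"] = "10.3.0"

import tensorflow as tf  # the tensorflow-rocm build

print(tf.config.list_physical_devices("GPU"))
```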
 
I remember reading an article when the NVIDIA share price was around $100 a few months back. It suggested buying NVIDIA stock immediately, and it came from a stock market expert. I see now that he was right.

More specifically, his advice was to buy NVIDIA stock when the price "approaches" $100.

I am guessing that the people who followed his advice and bought back then will have made an astonishing profit and that this guy is now rich.

That was back in Oct 22.
 
You must be joking. Do you think AI is going away? A fad? Just because YOU don't find a use for it?
Maybe, maybe not. I did say, if you actually read my post, that it may be useful to some. We've seen fads before, such as blockchain/crypto, that have gone nowhere. Arguably, there are some good uses for AI, such as in medicine.

No one can predict where it will go; however, I think the general population will get bored with it, as it is hardly more than entertainment to that audience - at least as I see it. It won't make dolts instantly smart, and anyone who thinks it will - well, the joke's on them, IMO.

There's also the whole issue of market saturation. Right now, if it takes supercomputing centers to really take advantage of the AI hardware integrated into GPUs, the average computer user can't afford a supercomputer and needs an always-on internet connection to take advantage of anything AI. And need I mention the power consumption that few have addressed?

As I see it, the market has this solution called AI that's in search of a problem. Everyone jumping on it expects to be able to apply the solution to their own problem. It's like having the right answer to the wrong question. It will get such people nowhere - except that some might stumble onto a decent application for it.

And then there are those already using it to make malware and other not-so-good things; however, few, if any, of the companies like M$, Google, and the rest see fit to consider the harm their product is capable of, since the only thing they care about is profit and being at the forefront of AI implementation.

If you want to dream that AI is the answer to everything, all the while making you rich from your investments, that's your prerogative. However, I see it as yet another instance where humanity may look back and say "gee, that wasn't such a good idea." It won't be the first time humanity has done that with something everyone thought was a good thing.
 
I remember when people and the market went crazy over tablets; that didn't pan out.

I remember when people and the market said the Internet of Things would revolutionize home life and life everywhere. I'm still wondering why the smart sink, smart toilet, and smart fridge haven't taken off.

I remember when augmented reality and virtual reality were being pushed as life-changing. I'm still waiting. Maybe Meta can do something with it.


At the moment, these AI chatbots are just the next step up from search engines. Yeah, there are some great applications, like AI art and having one write you an email, but they're still chatbots at heart. Wait until you start getting ads and sponsored replies from these AI chatbots... then we'll see what their true staying power is.
 
Training any type of AI with real data collected from real people in real life, with no moderation, spells disaster all over. Only specific domains like medicine or art could really benefit from it. For now. Reiterating known facts and ideas does not mean progress.
 
Nvidia up 24.37% overnight. The super duper hype train is real in this one. Even AMD is getting a boost, jumping 11% overnight.
 
I remember reading an article when the NVIDIA share price was around $100 a few months back. It suggested buying NVIDIA stock immediately...
That's what I did. The day the 4090 released, the stock was at $115, and I bought in a couple of days later at $122, using the money I got when I sold 1,003 shares of AMD at $144 in November 2021. I had bought those in early 2009 at $2.70; they doubled in a year and then tanked, but since it was only a $2,750 investment, I figured I may as well hold and hope for the best, even when AMD was on the verge of bankruptcy. Then first-gen Ryzen came out and the stock started to take off, but it was obviously way overvalued at $144, so I sold before the bubble could burst (Nvidia was quite high then also) and waited for everything to deflate (the bubble never actually burst), which took about 11 months.

One problem with AMD stock is that they don't pay dividends, so it's worth nothing until you actually sell, whereas Nvidia pays out a modest amount every year in dividends. Nvidia's biggest edge over its competitors is that it built a software stack to go with its hardware that is wide and deep, and it's unmatched by anyone else.
 
Good points, but Nvidia pays pennies per share in dividends; you're not going to get a big dividend payout unless you own hundreds of thousands of shares. I went big on AMD when it was at $5 in mid-2016. Should have purchased some Nvidia shares too and shorted Intel, lol. But hindsight.
 
Oct 2022 was the Nasdaq low, and Nvidia has roughly tripled since then. Nothing really shocking there, as it's the best semiconductor stock on the market. What's shocking is that Nvidia was an $8 stock in 2015, when I started following stocks. How do you think those people feel?
 