Nvidia optimistic about the future even as gaming revenue dries up

nanoguy

Why it matters: Nvidia has had a lot to be happy about lately, one thing being an interesting new partnership that will see Microsoft bring Xbox and Activision games to Nvidia's competing cloud gaming service, GeForce Now. A renewed interest in AI-infused technologies is also a saving grace for the GPU maker as gaming graphics revenue dries up.

The company's latest financial report suggests Nvidia is doing much better than Wall Street analysts had expected. During the fourth quarter, it posted revenue of $6.05 billion, down 21 percent from the same quarter a year earlier.

Unsurprisingly, gaming revenue saw the most significant drop, falling to about $1.8 billion. That's a 46 percent dive year-over-year, but up 16 percent from the previous quarter. It's also higher than the $1.6 billion average estimate from Wall Street analysts. Data center and automotive sales were up, but not enough to compensate for the sizable dip in consumer GPU sales and SoCs for game consoles.

Still, Nvidia raked in almost $27 billion in revenue for the year, roughly flat compared to the prior fiscal year, although net income fell 55 percent to a little under $4.4 billion. It's also evident that data center revenue, up 41 percent year-over-year to $15 billion, is now the primary contributor to overall revenue.
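For anyone who wants to sanity-check the segment math, here is a minimal back-of-the-envelope sketch in Python using only the rounded figures quoted above; the exact reported numbers differ slightly.

```python
# Rounded figures from the article, in billions of USD (full fiscal year).
total_revenue = 27.0   # "almost $27 billion" full-year revenue
data_center = 15.0     # data center revenue, reportedly up 41% YoY

# Data center's share of total revenue
share = data_center / total_revenue
print(f"Data center share of revenue: {share:.0%}")  # ~56%

# Prior-year data center revenue implied by the 41% growth figure
implied_prior = data_center / 1.41
print(f"Implied prior-year data center revenue: ${implied_prior:.1f}B")  # ~$10.6B
```

Even with rounding, data center comes out to more than half of total revenue, which is what makes it the primary contributor.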

Like every other chipmaker, the company is dealing with much lower consumer demand compared to previous years, which aligns with the overarching trend in the PC industry. The RTX 4080 saw a rocky launch riddled with controversy, cryptocurrency mining has migrated to ASICs, and overall GPU pricing hasn't seen any significant improvement in months.

During an investor call, Nvidia CFO Colette Kress said the lower gaming sales resulted from China's Covid-19-related problems and channel inventory corrections. Specifically, Kress pointed to retailers buying less inventory in order to clear out existing RTX 30-series stock. However, that doesn't explain why GPU prices have barely budged since last summer.

One thing is clear. Nvidia's future growth depends heavily on the latest AI renaissance and the race to train large language models like the one powering ChatGPT. Earlier this month, Nvidia CEO Jensen Huang said ChatGPT represents the "iPhone moment" for artificial intelligence, which offers a golden opportunity for his company to fill the exploding demand for AI training hardware.

Huang told investors that gaming revenue is also poised for recovery, but he spent most of the call hyping the possibilities of generative AI services like ChatGPT, which reached over 100 million active users in just two months.


 
Maybe all the used crypto GPUs flooded the market. Until the mid-range 40 series comes out, I think the slump will continue. If the mid-range 40 series cards are good and reasonably priced, that should get things moving again. Seems like Nvidia is going to wait until Intel catches up before releasing those, though. They certainly don't have to worry about AMD trying to take any market share from them.
 
I feel like nVidia, in pursuit of market capitalisation, is becoming the new IBM and will soon abandon its core field, GPU development.
Well, maybe not nVidia itself, but consumers will sooner or later choose a gaming console or a $250 "entry level" GPU from AMD or Intel, if given the choice between that and a $1K "entry level" GPU from Jensen.
 
They know they have a cult of "geniuses" who believe every marketing feature they sell is worth at least $200-300. I'm sure they'll be fine.
 
Personally, I am not surprised that Nvidia is betting on the latest industry fad to be their saving grace. They probably figure that the market for AI is going to be similar to the market for HPC or other pro-level GPUs. I wish them well, but in other industries, e.g. streaming, content providers that launched their own services to cash in on the latest fad have not found fads to be as durable or as lucrative as they hoped. In other words, they overestimated the market because they were blinded by the $$$$ signs stuck in their eyes.

Maybe Nvidia will have a different experience, but I am willing to bet that they won't, especially if they, yet again, abandon a bread-and-butter market like consumer/gamer GPUs.
 
There are two things holding the PC market back. First, people's budgets are still constrained by last year's inflation, and wages have not risen to match. Second, PC component prices are still just too high, especially GPU prices. People don't want two-year-old GPUs they can afford; they want recently launched GPUs they can afford. The keyword is afford, because neither AMD nor Nvidia has launched an affordable GPU this generation. For most people $500 is still too much for a GPU, and the cheapest new card is the 4070 Ti, which isn't even attractive at that price point.

The $1,200 4080 is a bad joke. Remember, the 80 tier started at just $700 last gen. You can argue all you want about performance and extra VRAM, but you're still paying $500 more for 16GB vs 10GB of GDDR6X (using 2GB chips, so actually fewer chips and a smaller bus, meaning the VRAM is probably no more expensive than the 3080's, or at least not by much) and about 50% more performance, which is about what you'd expect from a similarly tiered card one generation to the next. The 4080 should be $800 at most. It's 50% overpriced!
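To make the arithmetic in that argument explicit, here's a minimal sketch using the commenter's own numbers; the ~50% performance uplift is their estimate, not a measured benchmark:

```python
# Commenter's figures: 3080 launched at $700 (baseline performance 1.0),
# 4080 at $1,200 with roughly 50% more performance.
price_3080, perf_3080 = 700, 1.00
price_4080, perf_4080 = 1200, 1.50

# Performance per dollar for each generation
ppd_3080 = perf_3080 / price_3080
ppd_4080 = perf_4080 / price_4080

change = ppd_4080 / ppd_3080 - 1
print(f"Perf-per-dollar change, gen over gen: {change:+.1%}")  # about -12.5%

# The "should be $800 at most" claim implies this markup:
print(f"Markup over $800: {price_4080 / 800 - 1:.0%}")  # 50%
```

On those numbers, the 4080 actually delivers less performance per dollar than the card it replaces, which is the crux of the complaint.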
 
I think a lack of graphical innovation in games has also been a factor. Most people don't see the need for 4K, and at 1080p or 1440p you really don't need a massive graphics card for the newest games. Developers have been focusing too much on NFTs and other stupid crap most of us don't care about rather than giving us what we want: a solid game with graphics that keep getting better. UE5 is slowly making progress, but nothing major has happened in graphics since, so graphics card demand will stay somewhat stagnant. If UE6 comes out and is a significant enough step up in graphical fidelity, or another developer makes something a lot better, then the demand will be there again, but only as long as we can afford to buy it!
 
Gaming revenue hasn't dried up.

In terms of hardware: Tech companies have priced it out of the range of many people, hoping to keep that massive profit margin afloat.

In terms of software (games): the industry is still growing, but the natural adjustment has finally set in now that the "let's close everything down" stupidity has been phased out and life has gone back to normal(ish).

When these companies don't see the massive continued growth those unusual circumstances produced, they panic, cut back, and cry that they aren't meeting projected numbers. Those projections are unrealistic, but that doesn't keep them from f'ing up everyone and everything around them so they can keep trying to force high profits.
 
Yeah, but the 3080/3090 and then the 4090 probably already soaked up a lot of the people willing to splurge. Cheaper AM5 motherboards and DDR5, plus the coming AMD 3D V-Cache chips, will be another small splurge.

I think the next real splurge cycle will be the next generation, when 3080/3090 and 6900/6800 owners upgrade.

Plus we need that great bang-for-buck 4060, and I'm not sure we're getting it.
 
I also think you're forgetting something. People just don't see a need to upgrade. Mid-range hardware has been capable of high-refresh gaming for a couple of generations now. Unless you HAVE TO HAVE ray tracing, there is very little incentive to upgrade. Things are more expensive than ever and there is less incentive than ever to upgrade. I feel the entire reason ray tracing exists is that nVidia saw a trend where "hey guys, we're gonna have to find a way to use up all this GPU power or we're gonna have a hard time selling these cards."

Most people don't have a display above 144 Hz, and 1080p is still the most popular resolution; a 6600 can do that for ~$250. It won't peg the display at 144 fps 100% of the time, but you'll certainly have a good 100+ fps gaming experience. I remember when reaching 1080p60 was the holy grail, and we're long past that.
 
My buddy was running 60 FPS at 4K with the latest shooters in 2016. Nothing in the past seven years has demanded more than a 2070 or 3060 to achieve the maximum detail and FPS detectable by humans. What we need is deeper gameplay... and AI just might be the thing that gets us there. RAM speed will obviously remain a major concern, but it's making slow and steady progress. Future GPU marketing will brag about FPS, all right: Flops Per Second for the AI processing.
 
All you have to do is turn shadows to low, turn off reflections, lower the lighting, and keep textures at ultra, and games will look great and play great. Nvidia and their army of fanboys have been insisting you need RT and other gimmicks, but honestly most people don't care; they want a good game and smooth framerates. Emphasis on good game. Nintendo has an entire library of stuff that runs on cheap hardware that people love playing. PC has indies that fill the same role (and most end up on the Switch too).

And if you really really really just want to see realistic lighting and landscapes, just go outside. Still better than any video game.
 
I was doing 4K60 in a lot of games on my 1070 Ti back in 2017. Demanding titles, not at max settings, but it was possible.
"And if you really really really just want to see realistic lighting and landscapes, just go outside. Still better than any video game."
man, you had me going until you said that :(
I don't play video games to see realistic landscapes or lighting, I play video games because they are playgrounds for what is not possible in reality. As I like to say, I was born too late to explore the world, too early to explore the stars but just in time to browse dank memes.
 
Oh I totally agree. That last part was a bit tongue in cheek, because honestly it gets frustrating seeing people talk about "realistic lighting" over the most obviously fake video game screenshots. I remember seeing a Witcher 3 RTX screenshot inside a house with a giant hole in the roof in the middle of a very sunny day; the screen was completely black, and nerds were drooling over the great lighting. I was laughing at how absurd that is. It reminds me of the old Final Fantasy: The Spirits Within movie, with dorks drooling over how the rendered fictional girl had such nice hair. I'm like, just take your gf out to the park, you can see her hair in the wind; seriously, wtf is this?

Anyway despite what I wrote above I don't get actually worked up. So I write tongue in cheek stuff with some snark sometimes.
 
Those graphs show exponential growth, from $1.1 billion in Q1 2015 to $8.3 billion in Q1 2023. What the heck do they want, ALL the money in the world?
 
My buddy was running 60 FPS on 4K with the latest shooters in 2016. There's nothing that's come along in the past seven years that's demanded more than a 2070 or 3060 to achieve the maximum detail and FPS detectable by humans. What we need is deeper gameplay..and AI just might be the thing that gets us there. RAM speed will obviously remain a major concern, but its making slow and steady progress. Future GPU marketing will brag about FPS, all right - Flops Per Second for the AI processing.
120, 144, and 240 Hz monitors/displays have come along since 2016. That's what we want now: 120 fps at 4K, with all the reasonable graphics settings maxed out. Nobody says you have to turn on 64x tessellation for the underwater rocks in Crysis 4 ;)
"AI processing" all our personal data and sending it to the benevolent capitalist corporations.
 
I agree that would make a good third point. However, I do think a lot of people are satisfied only because they have to be in the current market. If your budget is $300, for example, and you have something like an RTX 2060, your upgrade options for $300 just aren't that attractive. Yes, an RX 6600 XT will beat a 2060, but not by margins that warrant an upgrade.
 
Investors do. That is how "growth stocks" work and how you get "bubbles." Investors assume that exponential growth continues forever.
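To illustrate how steep that assumption is, here's a quick sketch of the growth rate implied by the figures above, assuming they are quarterly revenue in billions of USD:

```python
# Quarterly revenue cited above: ~$1.1B in Q1 2015, ~$8.3B in Q1 2023.
start, end, years = 1.1, 8.3, 8

# Compound annual growth rate over the 8-year span
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 29% per year

# What "exponential growth continues forever" would project 5 years out
projected = end * (1 + cagr) ** 5
print(f"Naive 5-year extrapolation: ${projected:.0f}B per quarter")  # ~$29B
```

Sustaining roughly 29% compound growth indefinitely is exactly the kind of assumption that inflates bubbles.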
 
The fact that they're making money from other streams shouldn't be a justification for doing a really sh1t job in the home GPU market, should it? I just wish AMD and Intel would fill the void a little better.
 
Let's stay away from "what ifs"; the 1650 is the most popular card, followed by the 1060. That's why each PC build needs an individual approach.
 