If you haven't been paying attention, Nvidia is no longer a gaming company

Markoni35

Posts: 1,078   +442
But only when the products can be purchased.
I mean, who is really to blame for all this mess? COVID, miners, manufacturing plants, and so on?

Don't worry, Bitcoin will drop, and all the other cryptos with it. If it weren't for the secondary artificial pump that kicked in, it would already be dropping. Once mining becomes too expensive, all those cards will be selling cheaply.

As they say in Battlestar Galactica: "All of this has happened before... and will happen again".
 

terzaerian

Posts: 923   +1,318
Not sure, and I could be wrong, but don't most people hate crypto miners because of the shortage of video cards?



Do you mind telling me why you want to switch from NVIDIA?

Do you also stream online?
There would still be a video card shortage if there were no miners, because there is a chip shortage in general, to the point that automakers are shutting down production and display-makers can't get their hands on $1 chips to build displays.

The first strike was when Nvidia tried to squeeze the Hardware Unboxed youtube channel:


While it's not clear where this attempt to censor reporting originated, the intent and the results were still clear.

The second and third strikes were Nvidia's decision to make "dedicated" crypto and gaming video cards.

It'd be one thing if Nvidia were just building a dedicated mining ASIC, but their "mining" and "gaming" cards are the same hardware; their functionality is determined arbitrarily by company fiat and device drivers. That's a slippery slope toward letting hardware manufacturers dictate where and in what ways their hardware can be used, and it does absolutely nothing to solve the current crisis, beyond creating mining cards that miners will have to write off as junk once they're obsolete instead of reselling them on the used market. It serves nobody's best interest besides Nvidia's.

I don't stream, but I dabble in video editing, game, and mine casually. Mostly I just take the long view on things in tech, because for the most part, people are selfishly obsessed with the short term.

Don't worry, Bitcoin will drop, and all the other cryptos with it. If it weren't for the secondary artificial pump that kicked in, it would already be dropping. Once mining becomes too expensive, all those cards will be selling cheaply.

As they say in Battlestar Galactica: "All of this has happened before... and will happen again".
I sure hope so. I'm never going to hit my goal of getting at least one entire Bitcoin to my name if the price keeps skyrocketing like this.
 

Stoly

Posts: 89   +50
And AMD would love to have just a tiny slice of all those markets.

It's not only that nvidia is in more markets than ever.
It's that nvidia is the dominant player in many of these markets.

About the only market it tried to compete in and failed miserably was mobile, but if the ARM deal goes through, nvidia may get the last laugh.

 

CybaGirl

Posts: 55   +24
There would still be a video card shortage if there were no miners, because there is a chip shortage in general, to the point that automakers are shutting down production and display-makers can't get their hands on $1 chips to build displays.

I don't stream, but I dabble in video editing, game, and mine casually. Mostly I just take the long view on things in tech, because for the most part, people are selfishly obsessed with the short term.
Thanks for getting back to me.

I have only used AMD Radeon cards in the past, and for many years. It wasn't until the GTX 970 was released that I jumped ship.

The main reason for jumping ship back then was that power consumption on NVIDIA cards was so much better than on the AMD card I was using.

In Australia, unless you have solar on the roof, electricity prices are over the top. So I try to save where I can by going with hardware that uses less power but still performs the same or better.
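
To put rough numbers on that choice, here's a quick sketch; the wattages, tariff and hours are all just assumptions I plugged in, not real specs:

```python
# Rough yearly running-cost difference between two GPUs at Australian
# electricity prices. Every number here is an illustrative assumption.

def annual_cost_aud(watts, hours_per_day, aud_per_kwh):
    """Yearly electricity cost of a part drawing `watts` while in use."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * aud_per_kwh

RATE = 0.30   # assumed AU$ per kWh without solar
HOURS = 4     # assumed hours of gaming per day

hungry = annual_cost_aud(250, HOURS, RATE)   # assumed power-hungry card
frugal = annual_cost_aud(145, HOURS, RATE)   # assumed efficient card

print(f"hungry: ${hungry:.2f}/yr, frugal: ${frugal:.2f}/yr, "
      f"difference: ${hungry - frugal:.2f}/yr")
```

Over a few years of ownership that difference adds up, which is why power draw matters to me.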

I also like that NVIDIA has a dedicated encoder chip for streamers. I know AMD is developing one, and/or has implemented one in its latest-generation cards, but from what I have read it still has a long way to go before it catches up to NVIDIA.

That said, I guess it doesn't really matter if you're using a game capture device, like the Avermedia Live Gamer Xtreme 2.
 

CybaGirl

Posts: 55   +24
Don't worry, Bitcoin will drop, and all the other cryptos with it. If it weren't for the secondary artificial pump that kicked in, it would already be dropping. Once mining becomes too expensive, all those cards will be selling cheaply.

As they say in Battlestar Galactica: "All of this has happened before... and will happen again".
I sure hope so, as all this mining is a bit of a downer in my opinion anyway. Not so much for the resources, but more for the lack of cards and hardware we can buy.

But would you buy a second-hand mining card? I know I wouldn't, as I would prefer to pay a bit more and get something new that I know hasn't been run in a 24/7 environment, as I would worry it would fail.

I really don't understand how it is worth doing anyway. What are these people paying in electricity, including running air con to keep everything cool, compared to what they actually make? And then there is the outlay for the hardware itself.

Maybe it's just me, as I don't understand how it all works, to be honest.

Maybe in Australia, where electricity is so expensive unless you have solar panels on the roof, it's just not worth doing?

I once thought about looking into this, but I can't see how I would make any money from it with all the costs involved.
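
When I did try to sketch the sums, it looked something like this; every figure below (daily payout, wattage, tariffs, card price) is a made-up assumption, not real data:

```python
# Back-of-the-envelope mining profitability. All inputs are assumptions.

def daily_profit_aud(gross_aud_per_day, card_watts, aud_per_kwh,
                     cooling_overhead=0.2):
    """Daily profit after electricity, padding the bill for air con."""
    power_cost = card_watts / 1000 * 24 * aud_per_kwh
    power_cost *= 1 + cooling_overhead
    return gross_aud_per_day - power_cost

# Assumed: AU$6/day gross payout from a 220 W card.
cheap_power = daily_profit_aud(6.00, 220, 0.30)   # e.g. with solar offset
dear_power = daily_profit_aud(6.00, 220, 0.45)    # assumed grid tariff

print(f"cheap power: ${cheap_power:.2f}/day, "
      f"dear power: ${dear_power:.2f}/day")

# And the hardware outlay: days to pay back an assumed AU$1,500 card.
print(f"payback at the dearer rate: {1500 / dear_power:.0f} days")
```

Even with those generous assumptions the payback runs to over a year, and the gross payout moves with coin price and difficulty, so I can see why it only stacks up with cheap power.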
 

Edster

Posts: 96   +67
The headline is a little clickbaity. Nvidia is no longer just a gaming company, but it hasn't been one for a long time. The headline implies it will no longer make consumer/gamer GPUs.

And you'd be stupid not to diversify anyway, especially with the growing demand for compute workloads like AI. If you establish yourself as the enterprise or professional hardware standard, you are very hard to shake off.
 

ypsylon

Posts: 353   +273
Not to be picky, but has nVidia ever been a gaming company?

Granted, they design and manufacture graphics cards, but graphics card ASICs are hardly used only for gaming. nVidia has also never actually released a single game, so that's two. Yes, yes, nVidia APIs/architectures are supported, but in the same way that AMD, Vulkan, Intel, heck, even Matrox when they were relevant, or Apple's Metal are too.

So nVidia never was a gaming company. For the past decade nVidia has propelled HPC like no other company before it, and "gaming" cards have been but a footnote in its total revenue. When they sell containers of Teslas, or those new Axxx cards, to power supercomputers somewhere, do you really believe that selling tens of thousands of 3080s even registers on the radar compared to HPC volume shipments? One 10G HBM2 stack on a PCB costs the same as a decent "gaming" GPU.
 

Biostud

Posts: 44   +15
Intel, AMD and nvidia are all changing into diversified chip companies, and hopefully three strong companies will create competitive markets at all levels, which should benefit consumers.
At this point I'm more worried about TSMC having no competitor in bleeding-edge production, and about total chip production capacity being nowhere near demand.
 

neeyik

Posts: 1,877   +2,191
Staff member
No to be picky but have nVidia ever been a gaming company?
Arguably, yes. Or at the very least, it's been predominantly about gaming for a long time. Nvidia's financial reports since 2004 paint this picture very clearly:

image001.png

The first chart shows the reported breakdown of their various sectors - GPU covers GeForce and Quadro. MCP = media and communications processors (aka motherboard chipsets); WMP = wireless media processors; CE = consumer electronics.

image002.png

For these two years, Nvidia separated GeForce from the other chip sales. PSB = professional solutions business (i.e. Quadro and Tesla); CPB = consumer products business (e.g. Tegra). The former accounted for 15 and 20% of total revenue, while the latter was 18 and 19% respectively. GeForce income was 61 and 57%.

image003.png

Nvidia dropped motherboard chipsets after 2008; they also apparently had no revenue, only losses, in the 'Other' sector. PSB varied from 15% to 23% of total revenue.

image004.png

Here one can see the recent and massive expansion of sales in their Datacenter sector. The Gaming sector was 56% of total revenue in 2015 but dropped to just 45% in 2020; ProVis fell from 15% to just 6%, while Datacenter rose from 7% to 41%.

But wait, where's the financial information between 2012 and 2015? For those 3 years, Nvidia simply reported GPU, Tegra, and Other revenue. Why would they do this? Fortunately, the more detailed 2016 report overlaps quite nicely with the sparse 2015 information and in that year, 'GPU' accounted for 83% of total revenue. However, the latter report shows us that the Gaming sector accounted for at least 72% of the 2015 'GPU' total, ProVis around 20% and Data at 9%.

Between 2004 and 2015, the GeForce line of chips produced the lion's share of Nvidia's revenue, and by a substantial margin - the motherboard chipset sales between 2005 and 2008 were a good earner for them, at least as good as PSB/ProVis+Data. However, for the past 5 years, despite very healthy growth in revenue from GeForce chips ($2b in 2015, $5.3b in 2020), Datacenter has grown almost exponentially.
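
For anyone who wants to check the arithmetic behind those two paragraphs, here it is with the figures exactly as quoted above (nothing new, just the sums):

```python
# Cross-checking the quoted percentages and revenue figures.

# 2015: 'GPU' was 83% of total revenue, and Gaming was at least 72% of
# 'GPU', so Gaming was roughly this share of Nvidia's total revenue:
gaming_share_of_total = 0.83 * 0.72
print(f"Gaming ~ {gaming_share_of_total:.0%} of 2015 total revenue")

# GeForce revenue: ~$2b (2015) to ~$5.3b (2020) -- the implied 5-year CAGR:
cagr = (5.3 / 2.0) ** (1 / 5) - 1
print(f"GeForce 2015-2020 CAGR ~ {cagr:.1%}")
```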

If one wishes to be a little more pedantic about matters, one could say something like "If you haven't been paying attention, Nvidia is no longer a company that generates the majority of its revenue from gaming" :)

Edit: If you consider architectural designs only, Nvidia didn't start to make a serious move beyond gaming until the release of Volta in 2017 and then Turing in 2018; up to that point, there was nothing about their GPU designs that particularly favoured professional or compute work over gaming (although, to be fair, what's good for the goose is good for the gander when it comes to compute). The inclusion of the tensor cores, though, was a clear indication that their GPU focus had shifted in favour of professional applications.

I do wonder how long Nvidia will retain the same architecture for their different sectors. Ampere is perhaps the first sign of changes to come, as even though the GA100 and GA102 are both 'Ampere', they're vastly different designs.
 
Last edited:

Reehahs

Posts: 1,198   +828
Arguably, yes. Or at the very least, it's been predominantly about gaming for a long time. Nvidia's financial reports since 2004 paint this picture very clearly: [...]

Did you make the graphs just for the comment?
 

neeyik

Posts: 1,877   +2,191
Staff member
Did you make the graphs just for the comment?
Ha! I had the data already in a spreadsheet so it only took a few moments to make them.

@ypsylon raised an interesting question about whether or not Nvidia has ever really been a gaming company, as they don’t make games and GPUs are used in other markets. While the revenue figures show that, money wise, gaming is still Nvidia’s biggest earner, none of my graphs highlight the amount of R&D they put into game programming and rendering techniques. Not that it’s likely to ever happen but Nvidia does have the resources (staff, knowledge, funds) to have a go at making a game, if they really wanted to.
 

Markoni35

Posts: 1,078   +442
But would you buy a second hand mining card? I know I wouldn't as I would prefer to pay a bit more and get something new that I know hasn't been used in a 24/7 environment as I would worry it would fail.

I'm using a second-hand mining card right now. I think the previous owner missed the mining hype, so he sold it once he figured out there was no profit in it. It doesn't seem like he did too much mining; it's working fine and I'm quite happy. Especially after I uninstalled all the bigger games, since I have no time for those anymore.

But the biggest gain for us will come when NEW cards hit the market later in 2021 and 2022. Nvidia and AMD won't be able to sell many, because eBay will be flooded with the current mining cards, so they'll have to reduce the prices of the new cards. That's when I intend to replace my current card.