If you haven't been paying attention, Nvidia is no longer a gaming company

Bob O'Donnell

The big picture: For those of you who haven’t been paying close attention, Nvidia is no longer a gaming company. Oh, sure, they continue to make some of the best-performing graphics chips for gaming and host a global cloud-based gaming service called GeForce Now. But if you take a step back and survey the full range of the company’s offerings, it’s significantly broader than it has ever been—a point CEO Jensen Huang hammered home during his GTC keynote this week. In his words, “Nvidia is now a full-stack computing company.”

Reflecting how much wider Nvidia’s scope has become over the past few years, there was, ironically, probably as much news about non-GPU chips as about GPUs at GTC 2021 (GPU Technology Conference).

Between a new 5nm Arm-based CPU codenamed “Grace,” a broadening of the DPU line created from the Mellanox acquisition, new additions to its automotive chips and platforms, and discussions of quantum computing, 5G data processing and more, the company is reaching into ever broader portions of the computing landscape.

Not to be left out, of course, were new GPU and GPU-related announcements, including an impressive array of new cloud-based, AI-powered GPU software and services.

To its credit, Nvidia has been broadening the range of applications for GPUs for several years now. Its impact on machine learning, deep neural networks and other sophisticated AI models has been well documented, and at this year’s show, the company continued to extend that reach. In particular, Nvidia highlighted its enterprise efforts with a whole range of pre-built AI models that companies can more easily deploy across a wide variety of applications. The previously announced Jarvis conversational AI tool, for example, is now generally available and can be used by businesses to build automated customer service tools.

The newly unveiled Maxine project is designed to improve video quality over low-bandwidth connections and to perform automatic transcription and real-time translation—timely and practical capabilities that many collaboration tools already offer but that could likely be enhanced with the integration of Nvidia’s cloud-based, AI-powered tools. The company also made an important announcement with VMware, noting that Nvidia’s AI tools and platforms can now run in virtualized VMware environments in addition to on dedicated bare metal hardware. While seemingly mundane at first glance, this is actually a critically important development for the many businesses that run a good portion of their workloads on VMware.

As impressive as the GPU-focused applications for the enterprise are, the big news splash (and some of the biggest confusion) out of GTC came from the company’s strategic shift to three different types of chips: GPUs, DPUs and CPUs. CEO Huang neatly summarized the approach with a slide that showed the roadmaps for the three different chip lines out through 2025, highlighting how each line will get updates every few years, but with different start points, allowing the company to have one (and sometimes two) major architectural advances every year.

The DPU line, codenamed BlueField, builds on the high-speed networking technology Nvidia acquired when it purchased Mellanox last April. Specifically targeted at data centers, HPC (high-performance computing), and cloud computing applications, the BlueField line of chips is ideally suited to speed the performance of modern web-based applications.

Because these applications are split into numerous smaller containers that often run on multiple physical servers, they are highly dependent on what’s commonly called “east-west” traffic between computing racks in a data center. Importantly, however, these same software development principles are being used for an increasingly wide range of applications, including automotive, which helps explain why the latest generation of automotive SoC (codenamed Atlan and discussed below) includes a BlueField core in its design.

The new CPU line—which unquestionably generated the most buzz—is an Arm-based design codenamed Grace (for computing pioneer Grace Hopper—a classy move on Nvidia’s part). Though many initial reports suggested this was a competitive product to Intel and AMD x86 server CPUs, the truth is that Grace’s initial target is only HPC and other huge AI model-based workloads. It is not a general-purpose CPU design. Still, in the kinds of advanced, very demanding and memory-intensive AI applications that Nvidia is initially targeting for Grace, it solves the critical problem of connecting GPUs to system memory at significantly faster speeds than traditional x86 architectures can provide. Clearly, this isn’t an application that every organization can take advantage of, but for the growing number of organizations that are building large AI models, it’s still very important.

Of course, part of the reason for the confusion is that Nvidia is currently trying to purchase Arm, so any connection between the two is bound to get inflated into a larger issue. Plus, Nvidia did demonstrate a range of different applications where it’s working to combine its IP with Arm-based products, including cloud computing with Amazon’s AWS Graviton, scientific computing in conjunction with Ampere’s general purpose Altra server CPUs, 5G network infrastructure and edge computing with Marvell’s Octeon, and PCs with MediaTek’s MT819x SoCs.

Along with a next-generation BlueField DPU core, the new Atlan automotive SoC diagram incorporates a “Grace Next” CPU core, generating even more speculation. Speaking of which, Nvidia also highlighted a number of automotive-related announcements at GTC.

The company’s next-generation automotive platform is codenamed Orin and is expected to show up in vehicles from big players like Mercedes-Benz, Volvo, Hyundai and Audi starting next year. The company also announced the Orin central computer, where a single chip can be virtualized to run four different applications, including the instrument cluster, infotainment system, passenger interaction and monitoring, and autonomous and assisted driving features with confidence view—a visual display of what the car’s computers are seeing, designed to give passengers confidence that it’s functioning properly. The company also debuted its eighth-generation Hyperion autonomous vehicle (AV) platform, which incorporates multiple Orin chips, image sensors, radar, lidar, and the company’s latest AV software.

A new chip—the aforementioned Atlan—is scheduled to arrive in 2025. While many may find a multi-year pre-announcement to be overkill, it’s relatively standard practice in the automotive industry, where manufacturers typically work on cars three years in advance of their introduction.

Atlan is intriguing on many levels, not the least of which is the fact that it’s expected to quadruple the computing power of Orin (which comes out in 2022) and reach a rate of 1,000 TOPS (tera operations per second). As hinted at earlier, Atlan is also the first Nvidia product to include all three of the company’s core chip architectures—GPU, DPU and CPU—in a single semiconductor design.

The details remain vague, but the next generation of each of the current architectures is expected to be part of Atlan, potentially making it a poster child for the company’s expanded opportunities, as well as a great example of the technical sophistication of automobiles to be released in that era. Either way, Atlan will definitely be something to watch.

All told, Nvidia showcased a wide-ranging and impressive story at GTC. I haven’t even mentioned, until now, its Omniverse platform for 3D collaboration, its DGX supercomputer-like hardware platforms, and a host of other announcements the company made. There was simply too much to cover in a single column. However, this much is apparent: Nvidia is clearly focused on a much broader range of opportunities than it ever has been.

Even though gaming fans may have been disappointed by the lack of news for them, anyone thinking about the future of computing can’t help but be impressed by the depth of what Nvidia unveiled. Gaming and GPUs, it seems, are just the start.

Bob O’Donnell is the founder and chief analyst of TECHnalysis Research, LLC, a technology consulting firm that provides strategic consulting and market research services to the technology industry and the professional financial community. You can follow him on Twitter.

 
I am a shareholder in Nvidia.

Although I'm disgusted with the GPU scalping and the crypto-scam market, I am thoroughly satisfied both with Nvidia's products (the only GPU maker I buy from) and with their continued technological leaps.

The crypto market and the gaming market love Nvidia products (when they can actually get ahold of them) and I see nothing but growth with Nvidia in my portfolio and my desktop.
 
We have been paying attention: too much, in fact, given how many of their cards just go to miners, which speaks to the point of the article here.

The repercussions, however, should be very interesting: we already know what happens when there is no push for high-end, "AAA" PC gaming, as this won't be the first (or second, or third) time the PC gaming industry scales down and dominance is asserted by other devices (consoles mostly in the past, but this time around I feel mobile and cloud gaming will take over consoles soon enough too).

But in my opinion, this is eventually for the better: some of the best and most endearing PC games, entire genres even, are very niche titles. There are entire genres that came out of these kinds of "PC gaming is its own, smaller thing" periods: FPS games were born in such periods, and especially a very particular style: arena FPS games. There are also real-time strategy games, simulation games and 2D RPGs with far more emphasis on detailed and strategic choices.

All these genres, and new ones we probably haven't even imagined yet, could make a comeback once people get used to the idea that integrated graphics are the common denominator and low-powered GPUs are the norm, rather than the super-high-end GPUs we've been using for the past few years.
 
High-end Integrated graphics are great for the mobile market, but Super-high end GPUs absolutely have their own market share and support that can't ever be discounted.

A 3090, for example, ain't fitting in no laptop anytime soon.
 
I think Nvidia's ambitions were pretty clear when they announced the unified shader architecture back in 2000-and-whatever it was. They have quite the product stack now. There are competitive headwinds - the likes of Tianshu Zhixin, and closer to home, Intel. But if Nvidia continue to deliver superior products, I reckon they'll continue to do well.

In any case, gaming will remain a nice little earner. 2020/21 has shown there's enormous demand for gaming GPUs. Not bad considering PC GPUs were once considered very niche.
 
I am a shareholder in Nvidia.

Although I'm disgusted with the GPU scalping and the crypto-scam market, I am thoroughly satisfied both with Nvidia's products (the only GPU maker I buy from) and with their continued technological leaps.

The crypto market and the gaming market love Nvidia products (when they can actually get ahold of them) and I see nothing but growth with Nvidia in my portfolio and my desktop.

I wonder what you're going to do when Nvidia gets bored of selling "gaming" GPUs to peasants and gives this market up? Also, you still haven't answered my question about the 11900K: is it still your favourite gaming CPU?
 
No company gets bored of selling 100% of their GPUs to a scalper market.

Not to mention that they can overprice them because THEY KNOW they will sell 100% of them.

If Nvidia were to dump 10,000 3080 and 3090 cards on the market right now with a 5% markup, you and I both know they would sell out faster than you could clear checkout.
 
That's true, or they could sell those same chips to data centres for five times the price from the start, which is what Nvidia would rather do and wants to do.
 
We have been paying attention: too much, in fact, given how many of their cards just go to miners, which speaks to the point of the article here.

The repercussions, however, should be very interesting: we already know what happens when there is no push for high-end, "AAA" PC gaming, as this won't be the first (or second, or third) time the PC gaming industry scales down and dominance is asserted by other devices (consoles mostly in the past, but this time around I feel mobile and cloud gaming will take over consoles soon enough too).

But in my opinion, this is eventually for the better: some of the best and most endearing PC games, entire genres even, are very niche titles. There are entire genres that came out of these kinds of "PC gaming is its own, smaller thing" periods: FPS games were born in such periods, and especially a very particular style: arena FPS games. There are also real-time strategy games, simulation games and 2D RPGs with far more emphasis on detailed and strategic choices.

All these genres, and new ones we probably haven't even imagined yet, could make a comeback once people get used to the idea that integrated graphics are the common denominator and low-powered GPUs are the norm, rather than the super-high-end GPUs we've been using for the past few years.

I agree, as long as future integrated chips can run the AAA games that are already out with good performance (at least at 1080p); otherwise it would be sad.
 
QUOTE - “Nvidia is now a full-stack computing company.”

It isn't unusual at all for tech companies to make such broad statements after falling flat on their faces in other ways.
 
I agree, as long as future integrated chips can run the AAA games that are already out with good performance (at least at 1080p); otherwise it would be sad.

This is not unfeasible: if we could use something akin to DLSS 2.0 that offloads the compute to the CPU or some other solution, it might work. Maybe even a dedicated coprocessor that's not quite a GPU, not quite as expensive, and just helps with machine learning tasks exclusively (the only problem with that is that it'd be a miner's dream).

But ultimately I think a more comprehensive cloud-based service will make it worthwhile: that tech is already beyond proof of concept, but if adoption hasn't grown it's *PRECISELY* because gamers just don't want to jump into Nvidia DRM, and we would need to be trusted with at least some control of the virtual machine to enable things like modding on a remote instance.

But again, all we need is somebody who gets cloud gaming right in all aspects, and we can let Nvidia, AMD, and probably Intel too, battle for the data center compute market while we just lease power as required.
 
I wonder what you're going to do when Nvidia gets bored of selling "gaming" GPUs to peasants and gives this market up? Also, you still haven't answered my question about the 11900K: is it still your favourite gaming CPU?


I have no idea what the 11900k has to do with Nvidia, but rest assured, if I was building a PC now, it would be Intel I went with over anything AMD has.

Furthermore, my next build will be with DDR5 on an appropriate motherboard, and I may be using a 13900K by then.
 
The irony of hating crypto miners for using too many finite resources, and in the next breath waxing prosaic about cloud gaming... which relies on data centers... which use exponentially more resources than crypto mining farms.

As for Nvidia, I've had enough of them. I'd get an AMD card to replace my 1070... if I could find one.
 
Gaming has never been the core of nVidia. They are, and always have been, in business to make money. How they do so is incidental.
 
I have no idea what the 11900k has to do with Nvidia, but rest assured, if I was building a PC now, it would be Intel I went with over anything AMD has.

Which is just dumb, given that AMD has surpassed Intel in all but a few niche applications that still show a preference for Intel's architecture. AMD has better multi-threaded performance, matched single-thread performance, superior power efficiency, and a better platform in X570.
 
Which is just dumb, given that AMD has surpassed Intel in all but a few niche applications that still show a preference for Intel's architecture. AMD has better multi-threaded performance, matched single-thread performance, superior power efficiency, and a better platform in X570.
If I was building a PC now, I'd also probably buy an Intel... but a 10900K, because they're on sale and objectively superior to the clusterf* that is the 11900.

That said, if Intel keeps fooling around with software like the "Bleep" filter, I'll probably be moving to AMD for processors, too. I'm not in the market for processors which pre-censor speech.
 
Well, I think it's great. This is competition as we've never seen before! I definitely think this will push GPU development quicker for gaming. In 4-5 years we'll have achieved real-time photo-realism in PC games when ray tracing goes mainstream.
 
I am a shareholder in Nvidia.

Although I'm disgusted with the GPU scalping and the crypto-scam market, I am thoroughly satisfied both with Nvidia's products (the only GPU maker I buy from) and with their continued technological leaps.

The crypto market and the gaming market love Nvidia products (when they can actually get ahold of them) and I see nothing but growth with Nvidia in my portfolio and my desktop.
One thing to see is the impending cartel lawsuit coming to a court near you in 2021. Crypto is mostly a lie to "justify" it. The big question is, who's involved and who's just tagging along for the easy money?
 
In other words, Nvidia is afraid that AMD is catching up very fast and that Intel will be a decent competitor as well.

Does the writer of this article own shares of Nvidia? Just curious.
 
The irony of hating crypto miners for using too many finite resources, and in the next breath waxing prosaic about cloud gaming... which relies on data centers... which use exponentially more resources than crypto mining farms.

Not sure, and I could be wrong, but don't most people hate crypto miners because of the shortage of video cards?

As for Nvidia, I've had enough of them. I'd get an AMD card to replace my 1070... if I could find one.

Do you mind telling me why you want to switch from NVIDIA?

Do you also stream online?


 
I am a shareholder in Nvidia.

Although I'm disgusted with the GPU scalping and the crypto-scam market, I am thoroughly satisfied both with Nvidia's products (the only GPU maker I buy from) and with their continued technological leaps.

The crypto market and the gaming market love Nvidia products (when they can actually get ahold of them) and I see nothing but growth with Nvidia in my portfolio and my desktop.
So do you think they'll be able to leverage the ARM acquisition into a portfolio of market-leading, high-performance Arm-based CPUs and APUs for the datacenter, mobile and laptops, and come out ahead?

I tend to think they will succeed and the ARM acquisition will look like a bargain in the future.
 
Well, I think it's great. This is competition as we've never seen before! I definitely think this will push GPU development quicker for gaming. In 4-5 years we'll have achieved real-time photo-realism in PC games when ray tracing goes mainstream.

Competition is always great and, more often than not, is of great benefit to consumers.

But only when the products can be purchased.

I see and read lots of hype over all these new video cards and what the future holds.

But what is the point if you can't buy anything that they talk about?

I mean, who is really to blame for all this mess?

COVID, miners, manufacturing plants, etc, etc?

Thoughts?
 