AI chipmaker Nvidia hits $3.46T, overtakes Microsoft and Apple as the world's most valuable company

Shawn Knight

Staff member
What just happened? Nvidia has announced first quarter results for fiscal 2026 and judging by early numbers, the April downturn looks to be well in the rear view. Nvidia reported revenue of $44.1 billion for the three-month period ending April 27, 2025, an increase of 12 percent quarter over quarter and up a whopping 69 percent from the same period a year earlier.
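The growth figures can be sanity-checked with simple arithmetic. A minimal sketch (assuming plain percentage growth off the $44.1 billion reported revenue) that backs out the implied prior-period numbers:

```python
# Back-of-the-envelope check of the reported growth rates.
# $44.1B at +12% QoQ and +69% YoY implies the earlier revenue figures below.
current = 44.1  # Q1 FY2026 revenue, in billions of dollars

prior_quarter = current / 1.12  # implied Q4 FY2025 revenue (+12% QoQ)
year_ago = current / 1.69       # implied Q1 FY2025 revenue (+69% YoY)

print(f"Implied prior quarter: ${prior_quarter:.1f}B")   # ~ $39.4B
print(f"Implied year-ago quarter: ${year_ago:.1f}B")     # ~ $26.1B
```

These implied figures line up with Nvidia's previously reported quarters, which is a quick way to confirm the percentages are internally consistent.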

For comparison, analysts polled by LSEG were expecting $43.31 billion for the quarter. Non-GAAP diluted earnings per share were $0.81.

Nvidia's data center division continued to impress, netting $39.1 billion in revenue. That is up 10 percent over the previous quarter and 73 percent from a year ago, and is thanks almost exclusively to the AI revolution.

Source: Largest companies by market cap

Revenue from gaming reached a record $3.8 billion, up 48 percent quarter over quarter and 42 percent year over year. This division will no doubt continue to hold firm in the coming quarters as Nintendo gears up for the launch of its Switch 2 console, which is powered by hardware from Nvidia and launches on June 5.

Nvidia CEO Jensen Huang said countries around the world are recognizing AI as essential infrastructure, just like electricity and the Internet, adding that he is proud that Nvidia is standing at the center of the transformation.

Nvidia will pay a quarterly cash dividend of $0.01 per share on July 3 to shareholders of record on June 11.

Looking ahead to the current quarter, Nvidia expects to bring in $45 billion in revenue (plus or minus two percent). The chipmaker initially expected to generate an extra $8 billion from the sale of H20 products exported to China, but new US export rules now require an expensive license to do so.

Nvidia incurred a $4.5 billion charge in the first quarter for the same reason, and its non-GAAP gross margin came in at 61.0 percent rather than the 71.3 percent it would have posted without the charge. Earnings per share would have been $0.96 had it not been for the new export restrictions.

Shares in Nvidia are up nearly four percent in early morning trading.


 
I wonder how much they actually netted, though. Funny how they are using tariffs as an excuse to increase prices when they have no problem profiting and no problem selling to a country that is trying to destroy the US. As long as he makes money, he doesn't care who gets hurt or negatively impacted.
 
I wonder how much they actually netted, though. Funny how they are using tariffs as an excuse to increase prices when they have no problem profiting and no problem selling to a country that is trying to destroy the US. As long as he makes money, he doesn't care who gets hurt or negatively impacted.
Well, there is a very weird thing going on with Nvidia's AI chips that is the main reason they're selling the way they are. You have companies trying to buy as many AI chips as they can, without any real plan for how they're going to use them, just so they can delay or prevent other businesses from having access to them.

We have hit a wall where more processing power and data doesn't lead to a better AI model.
 
You have companies trying to buy as many AI chips as they can without any real plan on how they're going to use them just so they can delay/prevent other businesses from having access to them.
Which companies? Every leading-edge AI company needs more chips to run its current models, especially video generation and reasoning models, and has to limit what research avenues it pursues for future models because of compute shortages. Microsoft may be slowing its AI data center build-out, but OpenAI isn't.

We have hit a wall where more processing power and data doesn't lead to a better AI model.
No we haven't. Test-time compute means AIs can think for longer on problems and score even better on benchmarks. As companies augment their remaining high-paid doctors, lawyers, and programmers (while laying off the rest), they're not going to cheap out on lesser models.

Rich people are going to be wandering around with real-time video and audio feeds hooked up to state-of-the-art models in order to get ahead (and mask their stupidity). That's going to take an enormous amount of compute.
 
Which companies? Every leading-edge AI company needs more chips to run its current models, especially video generation and reasoning models, and has to limit what research avenues it pursues for future models because of compute shortages. Microsoft may be slowing its AI data center build-out, but OpenAI isn't.

We have hit a wall where more processing power and data doesn't lead to a better AI model.
No we haven't. Test-time compute means AIs can think for longer on problems and score even better on benchmarks. As companies augment their remaining high-paid doctors, lawyers, and programmers (while laying off the rest), they're not going to cheap out on lesser models.

Rich people are going to be wandering around with real-time video and audio feeds hooked up to state-of-the-art models in order to get ahead (and mask their stupidity). That's going to take an enormous amount of compute.
If the best we can do with AI is augmented reality Snapchat filters for rich people, then we REALLY have to re-evaluate what we're doing with AI. I don't remember the name, but the recent AI model that came out of China showed that 1) we don't need massive amounts of compute for AI, and 2) we can actually run AI models on some pretty low-end hardware.
 
"AI chipmaker Nvidia hits $3.46T, overtakes Microsoft and Apple as the world's most valuable company"

Big Deal...!
 
Which companies? Every leading-edge AI company needs more chips to run its current models, especially video generation and reasoning models, and has to limit what research avenues it pursues for future models because of compute shortages. Microsoft may be slowing its AI data center build-out, but OpenAI isn't.

We have hit a wall where more processing power and data doesn't lead to a better AI model.

While it's true that compute is a major bottleneck today, it's a mistake to assume this will create a permanent two-tiered AI society. Here's why.

We're seeing rapid improvements not just in raw silicon, but in model efficiency. Small, fine-tuned models can increasingly compete with larger ones for many use cases. You don't need GPT-5 running in real time to get high-value assistance, especially for the 80% of tasks where speed and reliability matter more than creative intelligence.

If OpenAI, Google, or Anthropic want to grow, they need to serve millions, not just a few elites. There's an economic incentive to make frontier AI widely accessible, even if it's in the form of API tiers or usage quotas. Think Chrome or Android: both immensely powerful, both free, both subsidized by larger business models.

Devices like Apple’s Neural Engine and Qualcomm’s AI chips will soon be capable of running surprisingly powerful models locally, especially with efficient architectures like Phi-3 or the next-gen Llama variants. You won’t need a data center to get real-time vision-language feedback. It may not be there at this moment but it will.

Intelligence amplification doesn't guarantee competence. Just because a billionaire has a fancy agent whispering in their ear doesn't mean they'll use it wisely. Knowledge work is still full of human judgment, taste, and ethics, things that don't scale with tokens alone.

The smartest players aren't just chasing scale, they're chasing distribution.
The real power move isn't to hoard compute, but to build the best agent experience for the largest number of users. That's where market dominance and social transformation happen.

I am definitely not a genius in this, but following AI for the last three years has been a mind-boggling and informative experience. For those truly interested, go read some articles from arXiv.org.
 