Are Microsoft and OpenAI making moves to enter the AI chip market?

Jimmy2x

Posts: 239   +29
Staff
Why it matters: Data center and AI-related hardware have become nothing short of a cash cow for chipmakers able to provide the needed resources. Nvidia maintains a strong hold on the market, while companies such as AMD, Intel, Google, and Amazon continue ramping up operations in hopes of cashing in. According to recent reports, OpenAI and Microsoft may be the latest entrants in the growing AI chip race, looking to end their reliance on third-party AI computing resources.

A new report from Reuters outlines OpenAI's plans to develop its own AI chips and secure its share of the lucrative AI chip market. While there are no reports that the company has formally begun executing those plans, sources close to the company indicate it has started identifying potential acquisition targets to help turn them into reality. According to the article, OpenAI is weighing several approaches, ranging from building its own AI chips to diversifying its supplier network beyond Nvidia's current offerings.

OpenAI isn't the only company looking to end its heavy reliance on third-party resources. According to a separate article from The Information, OpenAI backer Microsoft is also on the verge of unveiling its own AI chip, codenamed Athena, as early as next month. The announcement could be positioned as one of the highlights of this year's Microsoft Ignite conference.

Nvidia's early adoption of AI technology and existing GPU capabilities have positioned it as the clear leader in today's AI explosion. According to visualcapitalist.com, Team Green already holds more than 70% of the existing $150 billion AI market, a value that continues to grow daily. And with AMD and Intel lagging behind in the AI and data center segments, there's still plenty of room for competition to break in and provide new, cost-effective solutions.

Nvidia's AI success is attributable to more than just its legacy GPU lineup and its newest Grace Hopper chips. The company offers AI resources ranging from hardware to software and enterprise tools providing data analytics, security, and other AI modeling services.

There's certainly room for new offerings and innovation in the rapidly expanding AI market; however, any real challenge to Nvidia's AI chip dominance is likely still several years away. Even then, Nvidia's well-integrated combination of hardware and software solutions will be a tough opponent to contend with.

Image credit: Jenna Ross and Sam Parker


 
I don't really care, but as long as there are more people making AI chips to knock nVidia down a notch I'm cool with it.
 
They definitely need kicking down a few pegs. If it happens, I think everyone else who needs to use products in their space (from gaming to graphic design/CAD to simulations/compute) can actually get them at a better price, and Nvidia might care about the product again if they don't have a big monopoly on things that can do AI/ML. If other companies can get AI/ML coprocessors either built into CPUs or as separate add-in cards, then hopefully that brings more choice to the market and stems everyone's current reliance on Nvidia (which well and truly seems to have gone to their head and is causing chaos in the GPU market).
 
Where there is demand, some tech companies will surely try to get a piece of the pie. Just look at how Bitcoin ASICs evolved and how China made a ton of them. I would really enjoy AI-specific hardware growth... just to see current GPU makers come down from where they are now. But this takes time.
 
Things might be heating up...

"The really astonishing thing is it can apparently outperform GPUs and CPUs specifically designed to tackle AI inference. For example, Numenta took a workload for which Nvidia reported performance figures with its A100 GPU, and ran it on an augmented 48-core 4th-Gen Sapphire Rapids CPU. In all scenarios, it was faster than Nvidia’s chip based on total throughput. In fact, it was 64 times faster than a 3rd-Gen Intel Xeon processor and ten times faster than the A100 GPU."

https://www.techradar.com/pro/a-tin...-in-critical-ai-tests-is-it-game-over-already
 
Awesome. That just means more innovation from Nvidia.
The B100 is gonna be a monster for sure if these reports are true.
 
The thing is, everyone has been hoping for over a decade for competition against Nvidia, but no one seems able to compete. Jensen did a very clever thing by making the industry rely fully on Nvidia software.
Now there is no other way but to use Nvidia hardware to run that software.

You might be shocked to learn how many AI research papers actually come from Nvidia engineers. One way or another, Nvidia profits from AI work. It is like a decade-old plan coming to reality.

First CUDA, then RTX (Tensor Cores), then AI.

It all seems like a decade-old plan. Otherwise it makes no sense how Jensen always knows where the industry is going.

Jensen said years ago, maybe 10 years ago in an interview: "We have AI in our labs that can predict possible future outcomes!"

Really scary.
 