Larry Ellison says Oracle plans to spend billions on Nvidia GPUs, even more on Ampere...

Alfonso Maruccia

Bottom line: Oracle is predominantly recognized for its database and enterprise products. However, its founder and chairman, Larry Ellison, recently revealed the company's intentions to expand into the burgeoning realm of cloud-based, generative AI. The move will require a substantial investment in Nvidia GPUs and a considerable budget for chip procurement.

Oracle will spend billions of dollars to purchase Nvidia GPUs this year alone, driven by the rising demand for generative AI-based products, which require considerable hardware and financial investments for training and content delivery.

During a recent Ampere event, Larry Ellison, Oracle's founder and chairman, declared the company's willingness to join the AI "gold rush," despite that market currently being dominated by larger cloud competitors.

Machine learning algorithms, which underpin the most popular AI platforms, require an extensive amount of hardware and computing resources for training and prompt-based user interaction. To compete with cloud giants like Microsoft, Amazon (AWS), and Google, Oracle is investing in the creation of high-speed networks designed to enhance data delivery speed.

Ellison said that Oracle plans to procure computing units (GPUs and CPUs) from three different companies, with a significant portion of this investment going into Nvidia GPUs as graphics chips from Team Green have been designed with AI-based algorithms in mind for quite some time.

Ellison, listed among the wealthiest people in the world, also disclosed that Oracle plans to purchase a significant number of CPUs from AMD and Ampere Computing. Oracle has made a substantial investment in Ampere, a fabless company that has been developing Arm-based server chips (with assistance from TSMC) for cloud-native infrastructures.

According to Ellison, Oracle will spend "three times" as much money on AMD and Ampere CPUs as on Nvidia GPUs. He also highlighted the need for increased funding for "conventional compute".

Oracle's hefty bet on the generative AI sector is evident, as the company is not only increasing its infrastructure investments, but also securing deals with third-party companies. Last month, Oracle entered into an agreement with Cohere, an AI startup founded by former Google employees. As per the deal, Cohere will operate its AI software in Oracle data centers, utilizing up to 16,000 Nvidia GPUs.


 
Oracle has no chance against AWS and Azure.

I suspect that the opposite is true. Oracle has no chance of survival UNLESS they start competing seriously in the AI field. A lot of what Oracle does is directly impacted by AI. They need to step up their game to survive.
 
Funny, I read this somewhere else and the Nvidia part was mentioned in passing, with way more emphasis on the AMD part, but here it's the first one mentioned… oh well.
 
How does the latest Instinct compare to Nvidia's hardware?

You'd think that with CDNA (not RDNA), the compute division must have made large strides since Vega.
 
How does the latest Instinct compare to Nvidia's hardware?
Finding any independent testing of such products in this field is difficult, as the cost is prohibitive. That said, MosaicML recently published their tests of the MI250 vs Nvidia's A100:


[Image: MosaicML benchmark chart, MI250 vs A100]


The MI250 has superior theoretical compute performance compared to the A100 (e.g. FP32 throughput is 45 vs 19 TFLOPS), but the latter has much higher tensor/matrix rates (more than three times that of the MI250).
 
That is just one workload out of maybe 20 use cases, I think.

In my book, that Instinct looks promising. It comes with a board power of up to 600 W per unit.
 
That is just one workload out of maybe 20 use cases, I think.
The news article is about Oracle’s plans in the machine learning sector, so yes — it is just one example of use, but it’s the most relevant one here.

The MI250X is used in the Frontier and Lumi supercomputers, so the compute ability of CDNA 2 is clearly not in doubt.
 
Finding any independent testing of such products in this field is difficult, as the cost is prohibitive. That said, MosaicML recently published their tests of the MI250 vs Nvidia's A100:


[Image: MosaicML benchmark chart, MI250 vs A100]


The MI250 has superior theoretical compute performance compared to the A100 (e.g. FP32 throughput is 45 vs 19 TFLOPS), but the latter has much higher tensor/matrix rates (more than three times that of the MI250).
The MI200 series is FP64-oriented, as it was designed to run high-precision scientific simulations on the Frontier supercomputer.
 
The news article is about Oracle’s plans in the machine learning sector, so yes — it is just one example of use, but it’s the most relevant one here.

The MI250X is used in the Frontier and Lumi supercomputers, so the compute ability of CDNA 2 is clearly not in doubt.
The MI250X indeed has strong FP64 performance due to Frontier's high-precision scientific needs, but generative AI is 16-bit and 8-bit oriented.
 
The MI250X indeed has strong FP64 performance due to Frontier's high-precision scientific needs, but generative AI is 16-bit and 8-bit oriented.
And the latter is the area where Nvidia's A100 and H100 are stronger than the MI250/250X. The MI300 may well be better, but AMD hasn't said much about its actual internal specifications, so it's hard to make any estimates.
 
Funny, I read this somewhere else and the Nvidia part was mentioned in passing, with way more emphasis on the AMD part, but here it's the first one mentioned… oh well.

The article kinda says it… at the end.

It has a funny combination of

with a significant portion of this investment going into Nvidia GPUs

and

According to Ellison, Oracle will spend "three times" as much money on AMD and Ampere CPUs as on Nvidia GPUs

While 25% of the total GPU + CPU spend going to Nvidia isn't insignificant, I would not have expected one quarter to beat three quarters when it comes to the article headline, but oh well.
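For what it's worth, the rough arithmetic behind that "one quarter" figure, assuming the CPU spend (AMD + Ampere) really is exactly 3x the Nvidia GPU spend as Ellison's quote implies:

```python
# Normalized spend split implied by "three times as much on CPUs as on GPUs".
# All figures are hypothetical ratios, not real dollar amounts.
gpu_spend = 1.0                # Nvidia GPUs (normalized to 1)
cpu_spend = 3.0 * gpu_spend    # AMD + Ampere CPUs, per Ellison's "three times"
total = gpu_spend + cpu_spend

nvidia_share = gpu_spend / total
print(nvidia_share)  # 0.25 -> Nvidia gets ~25% of the combined chip budget
```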

Also interesting that the original news specifically mentioned that Oracle would no longer buy any Intel processors, which imho is far more significant.
 