Microsoft working on "Athena" silicon to power AI

nanoguy

Why it matters: With the scale of generative AI models growing faster than the compute capabilities required to train and run them, it's no wonder that companies like Microsoft are exploring in-house solutions as an alternative to off-the-shelf hardware from the dominant company in the dedicated GPU space.

The race for AI supremacy among technology giants has only begun, and it is already going beyond feeding enormous amounts of data to models like the famous (or infamous, depending on your perspective) ChatGPT that is part of almost every news cycle.

A report from The Information claims Microsoft is currently developing a custom AI chip to train its models. The mysterious piece of silicon is codenamed "Athena" and the company reportedly started working on it in 2019. For now, only a small group of Microsoft and OpenAI employees have access to it in order to test how it performs when used for large language models like GPT-4.

Companies big and small are grabbing every enterprise-grade Nvidia GPU they can to build the powerful systems required for training artificial intelligence models on curated data sets. The same hardware is then used to run the trained models and perform "inference" – the process that takes real-world input and generates a piece of useful content for a specific application.
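The training-then-inference split described above can be illustrated with a deliberately tiny sketch. This toy linear model stands in for a large language model; the real systems in this story run on clusters of Nvidia GPUs, and the function names here are purely illustrative.

```python
# Minimal sketch of the train-then-infer workflow, not Microsoft's stack.
# "Training" fits parameters from a curated data set; "inference" applies
# the learned parameters to new, unseen input.

def train(examples):
    """'Training': fit y = w * x by averaging the observed output/input ratios."""
    ratios = [y / x for x, y in examples if x != 0]
    return sum(ratios) / len(ratios)  # the learned weight w

def infer(w, x):
    """'Inference': apply the learned weight to a new input."""
    return w * x

# Curated data set: inputs paired with the desired outputs.
data = [(1, 2), (2, 4), (3, 6)]
w = train(data)      # training phase: the expensive step at scale
print(infer(w, 10))  # inference phase: answer a new query -> 20.0
```

The point of the split is that training is a one-off (or periodic) heavy computation, while inference runs continuously in production every time a user sends a query, which is why both phases compete for the same scarce GPUs.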

The problem is that Nvidia can only make so many A100 and H100 GPUs, and each and every one of them costs a small fortune. A100 GPUs are around $10,000 apiece, and the more modern H100 GPUs go for more than $40,000 on eBay. Team Green is so happy about the pent-up demand that it is shifting some of the production capacity for GeForce RTX 4090 GPUs to make more H100 GPUs.

Therefore, it shouldn't come as a surprise that companies like Microsoft are trying to reduce their reliance on a single vendor for their machine-learning efforts. This could also lead to significant cost savings down the line, which is why Amazon, Meta, and Google are going the same route of building in-house chips.

With 300 people working on the Athena chip and multiple generations planned for the coming years, it will be interesting to see what becomes of Microsoft's existing partnership with Nvidia. Either way, the Redmond giant has been busy baking AI tech into every product and service it offers, from the Edge browser and Bing search engine to the Microsoft 365 suite, GitHub, and more.

Interestingly, a separate report from Thurrott suggests Microsoft is also developing a neural processing unit (NPU) for its Surface line of devices. The Surface line currently uses CPUs and SoCs from Intel, AMD, and Qualcomm – so far, only the Qualcomm version of the Surface Pro 9 features an NPU.


 
AI these days is found in every newly made chip. It's beyond me how something that progressed so slowly became profitable for only one vendor.

Surely there must be other vendors that can make AI GPUs.

It would have been useful for AMD and Intel to be in the AI game themselves, so we wouldn't hear things like this about a $40,000 GPU:



"Therefore, it shouldn't come as a surprise that companies like Microsoft are trying to reduce their reliance on a single vendor for their machine-learning efforts."


 