
Report: Microsoft Spent Hundreds of Millions on OpenAI Supercomputer

Microsoft, which is pouring resources into the AI race and has a head start thanks to its bankrolling of ChatGPT creator OpenAI, has reportedly spent a sum "probably larger" than several hundred million dollars to assemble the computing power needed to support the AI company's projects.

At the hardware level, AI systems run on the computing power of Graphics Processing Units (GPUs), with market leader Nvidia's A100 chips being the most common choice for training. The chips are expensive: a single A100 costs roughly $10,000.

OpenAI logo seen on screen with ChatGPT website displayed on mobile seen in this illustration in Brussels, Belgium, on December 12, 2022. (Photo by Jonathan Raa/NurPhoto via Getty Images)


To maintain its lead in the field, Microsoft has poured $1 billion into OpenAI, a deal that included an agreement to build a massive supercomputer for the project.

Via Bloomberg:

OpenAI was trying to train an increasingly large set of artificial intelligence programs called models, which were ingesting greater volumes of data and learning more and more parameters, the variables the AI system has sussed out through training and retraining. That meant OpenAI needed access to powerful cloud computing services for long periods of time.

To meet that challenge, Microsoft had to find ways to string together tens of thousands of Nvidia Corp.’s A100 graphics chips — the workhorse for training AI models — and change how it positions servers on racks to prevent power outages. Scott Guthrie, the Microsoft executive vice president who oversees cloud and AI, wouldn’t give a specific cost for the project, but said “it’s probably larger” than several hundred million dollars.

Now Microsoft uses that same set of resources it built for OpenAI to train and run its own large artificial intelligence models, including the new Bing search bot introduced last month. It also sells the system to other customers. The software giant is already at work on the next generation of the AI supercomputer, part of an expanded deal with OpenAI in which Microsoft added $10 billion to its investment.
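A rough back-of-envelope check of the figures Guthrie hints at: if "tens of thousands" of A100s means something on the order of 10,000 to 30,000 chips at roughly $10,000 apiece, the GPU hardware alone lands in the $100 million to $300 million range, before servers, networking, and the data-center power work described above. The chip counts in the sketch below are illustrative assumptions, not reported numbers.

```python
# Back-of-envelope estimate of GPU hardware cost for an AI supercomputer.
# Assumptions (illustrative, not reported figures): chip counts spanning
# "tens of thousands" and a ~$10,000 unit price per Nvidia A100.

A100_UNIT_PRICE_USD = 10_000  # approximate per-chip price cited in the article

def gpu_hardware_cost(chip_count: int, unit_price: float = A100_UNIT_PRICE_USD) -> float:
    """Return the raw GPU spend for a cluster of `chip_count` accelerators."""
    return chip_count * unit_price

if __name__ == "__main__":
    for chips in (10_000, 20_000, 30_000):  # plausible readings of "tens of thousands"
        cost = gpu_hardware_cost(chips)
        print(f"{chips:>6,} A100s -> ${cost / 1e6:,.0f} million in GPUs alone")
```

Even before facilities and networking, the GPU bill alone is consistent with a total "probably larger" than several hundred million dollars.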

Although AI products are springing up everywhere, representing one of the few growth areas in a tech industry that seems to be moving from decay into crisis, the cost of running AI queries is high: training a leading AI language model takes millions of hours of GPU time and costs millions of dollars.
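To make that training-cost claim concrete, a similar sketch: at an assumed cloud rate of a few dollars per GPU-hour (the rates and the GPU-hour count below are illustrative assumptions, not figures from the article), millions of GPU-hours translate directly into millions of dollars.

```python
# Rough training-cost estimate: GPU-hours multiplied by an assumed hourly cloud rate.
# Both the hourly rates and the GPU-hour total are assumptions for illustration.

def training_cost(gpu_hours: float, usd_per_gpu_hour: float) -> float:
    """Estimated cloud cost of a training run."""
    return gpu_hours * usd_per_gpu_hour

if __name__ == "__main__":
    gpu_hours = 2_000_000  # "millions of hours of GPU time" (2M assumed for illustration)
    for rate in (1.50, 3.00):  # assumed dollars per GPU-hour
        print(f"{gpu_hours:,.0f} GPU-hours at ${rate:.2f}/hr -> "
              f"${training_cost(gpu_hours, rate) / 1e6:.1f} million")
```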

Allum Bokhari is the senior technology correspondent at Breitbart News. He is the author of #DELETED: Big Tech’s Battle to Erase the Trump Movement and Steal The Election.

Breitbart


