OpenAI Contemplates Developing its Own AI Chips Amidst Global Shortage

OpenAI, one of the best-funded AI startups in the world, is reportedly exploring the idea of creating its own AI chips. The news comes amidst a global chip shortage that is significantly hampering the industry's ability to train AI models.

Internal discussions about AI chip strategies have been ongoing within the company since at least last year. OpenAI is weighing several ways to advance its chip ambitions, including acquiring an AI chip maker or designing chips in-house. OpenAI CEO Sam Altman has made acquiring more AI chips a top priority for the company.

Like most of its competitors in the AI field, OpenAI currently relies on GPU-based hardware to develop models such as ChatGPT, GPT-4, and DALL-E 3. The ability of GPUs to perform many computations in parallel makes them well-suited for training the most advanced AI models. However, the generative AI boom has dramatically strained the GPU supply chain.
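To make that parallelism concrete, here is a minimal, hypothetical PyTorch sketch (not from the article) that times the same large matrix multiplication on the CPU and, when one is available, on a GPU. The matrix size and the exact speedup depend entirely on the hardware; the point is only that the GPU executes the many independent multiply-adds in parallel.

import time

import torch


def time_matmul(device: str, n: int = 4096) -> float:
    """Time a single n x n matrix multiplication on the given device."""
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)
    if device == "cuda":
        torch.cuda.synchronize()  # make sure the setup kernels have finished
    start = time.perf_counter()
    _ = a @ b
    if device == "cuda":
        torch.cuda.synchronize()  # wait for the asynchronous matmul to complete
    return time.perf_counter() - start


print(f"CPU: {time_matmul('cpu'):.3f} s")
if torch.cuda.is_available():
    print(f"GPU: {time_matmul('cuda'):.3f} s")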

The shortage of the server hardware needed to run AI is severe enough that it could lead to service disruptions, a risk Microsoft warned about in its summer earnings report. Nvidia, a leading GPU maker, has reportedly sold out of its best-performing AI chips through 2024. The shortage, along with the high cost of the hardware, has pushed OpenAI to consider creating its own AI chips.

An analysis by Bernstein analyst Stacy Rasgon revealed that if ChatGPT queries grew to a tenth the scale of Google Search, it would require roughly $48.1 billion worth of GPUs initially and about $16 billion worth of chips a year to remain operational. This sky-high cost could be mitigated if OpenAI pursued creating its own AI chips.
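To show how an estimate like that hangs together, here is a hypothetical back-of-the-envelope calculation in Python. Every constant below is an assumption invented for illustration, not a figure from Rasgon's analysis, so the output will not reproduce his numbers; it only shows how query volume, per-query compute, and hardware prices compound into multibillion-dollar totals.

# Back-of-the-envelope estimate of GPU cost for serving search-scale query volume.
# Every constant is a placeholder assumption for illustration only;
# none of these figures come from the Bernstein analysis.

QUERIES_PER_DAY = 850_000_000      # assumed: roughly a tenth of Google-scale search traffic
GPU_SECONDS_PER_QUERY = 60         # assumed: GPU-seconds consumed by one generative answer
PEAK_TO_AVERAGE_RATIO = 2.0        # assumed: capacity headroom over the average load
GPU_UNIT_COST = 25_000             # assumed: dollars per data-center GPU
ANNUAL_REFRESH_FRACTION = 1 / 3    # assumed: yearly chip replacement as a share of capex

average_qps = QUERIES_PER_DAY / 86_400
gpus_needed = average_qps * GPU_SECONDS_PER_QUERY * PEAK_TO_AVERAGE_RATIO
upfront_cost = gpus_needed * GPU_UNIT_COST
annual_chip_cost = upfront_cost * ANNUAL_REFRESH_FRACTION

print(f"GPUs needed:        {gpus_needed:,.0f}")
print(f"Up-front GPU spend: ${upfront_cost / 1e9:.1f}B")
print(f"Annual chip spend:  ${annual_chip_cost / 1e9:.1f}B")

With these placeholder inputs the sketch lands in the tens of billions of dollars up front, which is why in-house silicon that lowers the per-chip cost looks attractive at this scale.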

OpenAI Would Not Be the First

OpenAI is one of many companies to consider creating its own AI chips. Google has a processor, the TPU (short for "tensor processing unit"), which it uses to train large generative AI systems such as PaLM-2 and Imagen. Amazon offers AWS customers proprietary chips for training (Trainium) and inference (Inferentia). Microsoft is reportedly working with AMD to develop an in-house AI chip called Athena, which OpenAI is said to be testing.

Hardware is an unforgiving business, but OpenAI is in a solid position to invest heavily in R&D. The company has raised more than $11 billion in venture capital and is nearing $1 billion in annual revenue. It is also considering a share sale that could see its secondary-market valuation soar to $90 billion.

However, the path to creating its own AI chips is fraught with potential pitfalls. Last year, AI chipmaker Graphcore saw its valuation slashed by $1 billion after a deal with Microsoft fell through. Habana Labs, the Intel-owned AI chip company, laid off an estimated 10% of its workforce. And Meta's custom AI chip efforts have been plagued by problems, leading the company to scrap some of its experimental hardware.

Even if OpenAI commits to bringing a custom chip to market, the effort could take years and cost hundreds of millions of dollars annually. It remains to be seen whether the startup's investors, Microsoft among them, have the appetite for such a risky bet.

Michael Terry

Hello, I'm Michael O Terry, a specialist in artificial intelligence. My work focuses on understanding the impact of artificial intelligence on people, analyzing its potential for development, and forecasting what the future holds for us. I'm glad to share my knowledge and insights with you.