OpenAI is reportedly looking beyond Nvidia for artificial intelligence chips, signalling a potential shift in its hardware strategy. According to reports and sources familiar with the matter, the San Francisco-based company is exploring alternatives for AI inference, the process that allows systems like ChatGPT to generate responses for users.
Reports suggest that OpenAI is not entirely satisfied with the performance of some of Nvidia’s latest chips in certain inference tasks, including software development assistance and communication between AI systems and other software. Although Nvidia still leads the market for chips used to train large-scale AI models, inference has emerged as a new competitive area.
As OpenAI’s products advance, its computing requirements are also changing. The company has reportedly already held talks with AMD and with AI chip startups such as Cerebras and Groq. According to Reuters, a source said OpenAI is looking for alternative hardware that could eventually account for 10% of its inference workload.
This comes as negotiations over a significant investment by Nvidia in OpenAI appear to have slowed. Nvidia had earlier announced that it would invest as much as $100 billion in OpenAI, a deal that was expected to close soon. However, reports suggest the negotiations have dragged on for months, partly because of OpenAI’s shifting hardware strategy and inference needs.
Nvidia CEO Jensen Huang has denied rumours of any tension between the two companies, and Nvidia maintains that its chips remain the best in performance per dollar for inference.
OpenAI has reportedly said it still relies on Nvidia to power most of its inference operations. OpenAI Chief Executive Officer Sam Altman has also praised Nvidia’s technology, describing it as the best in the world.