Nvidia’s new specialized chip aims to accelerate AI processing speeds
The dedicated processor targets inference computing, the form of processing that lets AI models respond to queries with greater speed and efficiency
Nvidia has historically dominated the training phase of AI. The launch of a processor designed specifically for inference computing marks a significant shift, one intended to help OpenAI and other customers build faster, more efficient AI systems that respond to queries with greater speed.
Earlier this month, Reuters reported that OpenAI is dissatisfied with the speed at which Nvidia’s hardware generates answers for ChatGPT users, particularly for complex tasks like software development and integrating AI within other software. According to a source who spoke to Reuters, OpenAI’s goal is to acquire new hardware that will eventually handle about 10% of its inference computing requirements.
Moreover, OpenAI has been in talks with startups such as Cerebras and Groq to provide chips for faster inference. In a strategic countermove, Nvidia closed a $20 billion deal with Groq, effectively ending OpenAI's negotiations with the startup.
While Nvidia previously committed up to $100 billion to OpenAI, that arrangement has recently been restructured into a $30 billion investment. This deal provides OpenAI with the essential capital for advanced hardware while securing Nvidia’s position as a primary stakeholder.
Ultimately, these advancements pave the way for increased chip production, fostering long-term growth across the AI sector.
