Elon Musk’s xAI unveiled Colossus 2 on Friday, billing it as the world’s first gigawatt-scale AI training supercluster. The move outpaces rivals such as OpenAI and Anthropic, which are reportedly facing delays until 2027 or later.
The project’s prime motive is to provide the massive computational power required to train next-generation AI models, particularly Grok 4.
Constructed in record time using on-site gas turbines and Tesla Megapacks, the cluster already draws as much power as San Francisco at peak demand; xAI plans to reach 1.5 gigawatts of capacity by April.
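To put a gigawatt-scale power budget in perspective, here is a rough back-of-envelope sketch. The per-accelerator wattage and overhead multiplier below are assumptions for illustration (roughly H100-class figures), not numbers disclosed by xAI.

```python
def max_accelerators(site_power_gw: float, gpu_watts: float = 700, pue: float = 1.3) -> int:
    """Estimate how many accelerators a site power budget could feed.

    gpu_watts: assumed per-accelerator draw (~700 W, an H100-class guess)
    pue: power usage effectiveness, i.e. cooling/overhead multiplier (assumed 1.3)
    """
    usable_watts = site_power_gw * 1e9 / pue  # power left for compute after overhead
    return int(usable_watts // gpu_watts)

# Under these assumptions, 1 GW supports on the order of a million accelerators,
# and the planned 1.5 GW roughly half again as many.
print(max_accelerators(1.0))
print(max_accelerators(1.5))
```

The exact count shifts substantially with the assumed wattage and overhead, but the order of magnitude is the point: gigawatt-scale sites are sized for accelerator fleets in the millions.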
The rapid rollout has drawn praise from Nvidia’s CEO but criticism from activists over pollution in South Memphis neighbourhoods.
While competitors are still drafting strategic plans for 2027, xAI is already operating at major-city power levels today.
The speed of execution is striking.
Elon’s playbook has not fundamentally changed, but it has moved faster than anyone else’s: the strategy is clear, and the execution is at scale. The launch of Colossus 2 marks a significant milestone in the era of compute, setting a new record for hardware density and reshaping the way AI companies interact with energy grids and the environment.