Elon Musk gives a hint at just how much his AI chatbot cost to develop
- Elon Musk said Grok 3 will be "something special" after training on 100,000 Nvidia H100 GPUs.
- Nvidia's H100 GPUs, a key component for AI, are estimated to cost between $30,000 and $40,000 each.
- While companies may receive bulk discounts from Nvidia, that's still billions in GPU costs.
Elon Musk just hinted at how much it cost to make his AI chatbot Grok.
The billionaire replied to a post on X on Monday and said that the latest version of xAI's chatbot, Grok 3, should be "something special" after it trains on 100,000 H100s.
Musk is referring to Nvidia's H100 graphics processing unit, built on the company's Hopper architecture, an AI chip that helps handle data processing for large language models (LLMs). The chips are a key component of AI development and a hot commodity in Silicon Valley as tech companies race to build ever-smarter AI products.
Knowing how many H100 GPUs Musk is getting allows us to do some napkin math to figure out a rough estimate of the cost. Each Nvidia H100 GPU chip is estimated to cost around $30,000, although some estimates place the cost as high as $40,000.
Based on those estimates, the GPUs needed to train the upcoming version of xAI's Grok would cost between $3 billion and $4 billion. Musk could also get a volume discount from Nvidia, which would lower the cost. But even if a volume discount brought the price down to something more like $20,000 per GPU, that would still put the total at $2 billion. And that's only counting the price of the chips.
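The napkin math above is simple multiplication; a short sketch makes the three scenarios explicit (the $30,000 and $40,000 figures are the estimates cited above, and the $20,000 bulk-discount price is the article's hypothetical):

```python
# Back-of-the-envelope GPU bill for training Grok 3 on 100,000 H100s,
# under the three per-unit price scenarios discussed above.
GPU_COUNT = 100_000

scenarios = [
    ("low estimate", 30_000),          # ~$30,000 per H100
    ("high estimate", 40_000),         # some estimates run to $40,000
    ("bulk-discount scenario", 20_000) # hypothetical volume pricing
]

for label, unit_price in scenarios:
    total = GPU_COUNT * unit_price
    print(f"{label}: ${total / 1e9:.0f} billion")
# low estimate: $3 billion
# high estimate: $4 billion
# bulk-discount scenario: $2 billion
```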
100,000 GPUs would be a big step up from Grok 2. Musk said in an interview in April with the head of Norway's sovereign fund Nicolai Tangen that Grok 2 would take around 20,000 H100s to train.
xAI has so far released Grok-1 and Grok-1.5, with the latest only available to early testers and existing users on X, formerly known as Twitter. Musk said in a post on X Monday that Grok 2 is set to launch in August and indicated in the other post about GPUs that Grok 3 will come out at the end of the year.
xAI did not respond to a request for comment.
100,000 GPUs sounds like a lot — and it is. But other tech giants like Meta are stocking up on even more GPUs, at a cost of more than triple what xAI is spending. Mark Zuckerberg said in January that Meta will have purchased about 350,000 Nvidia H100 GPUs by the end of 2024. He also said Meta will own about 600,000 chips including other GPUs.
If that's the case, Meta will have spent about $18 billion building its AI capabilities.
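That $18 billion figure follows from the same back-of-the-envelope method, applying the ~$30,000 low-end H100 price to Meta's reported 600,000-chip total (treating all chips at that price is a simplifying assumption):

```python
# Rough estimate of Meta's chip spend: ~600,000 chips at the
# ~$30,000 low-end H100 price cited earlier in the article.
meta_chips = 600_000
unit_price = 30_000  # assumption: all chips priced like an H100

total = meta_chips * unit_price
print(f"${total / 1e9:.0f} billion")  # $18 billion
```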
The stockpiling of H100 chips has also contributed to how ruthless the competition for top AI talent has become over the last year.
Aravind Srinivas, founder and CEO of AI startup Perplexity, talked about getting turned down by a Meta AI researcher he was trying to poach in part because of Zuckerberg's huge collection of AI chips.
"I tried to hire a very senior researcher from Meta, and you know what they said? 'Come back to me when you have 10,000 H100 GPUs,'" Srinivas said.