Google AI boss says company is investing more than $100 billion in AI to be ahead of its competitors
The war for AI is all about money, honey. While big tech companies like Google, OpenAI, and Microsoft are busy training their large language models, they are also competing fiercely with one another, and this race for AI dominance is costing them a fortune. In fact, according to Google’s AI boss, Google is investing more than $100 billion in AI development to stay ahead of its competitors.
Demis Hassabis’s revelation came in response to questions about his competitors’ strategies in the AI race. Recently, rumours swirled that Microsoft and OpenAI were collaborating on a $100 billion supercomputer dubbed “Stargate” to power OpenAI’s AI advancements. Asked about this competition during a TED conference in Vancouver, Hassabis, who leads Google’s AI research lab DeepMind, indicated that Google’s financial commitment is greater than its competitors’, although he did not disclose specific figures. “We don’t talk about our specific numbers, but I think we’re investing more than that over time,” said Hassabis.
While the investment is huge, it is not surprising: the tech industry is experiencing a surge in AI development, with AI startups raising almost $50 billion last year alone. However, Hassabis’s comments suggest the AI race is about to get significantly more expensive, particularly for those vying to be the first to achieve Artificial General Intelligence (AGI) – AI capable of human-like reasoning and problem-solving.
But how do Google and other tech companies plan to invest this much money? A significant portion will likely go towards chip development, as these companies require ever more computing power to train large language models (LLMs) on vast amounts of data.
Currently, companies like Google and OpenAI rely on third-party chip manufacturers like Nvidia. However, these companies are now shifting their focus to designing their own chips for greater control and optimization.
But the escalating costs aren’t confined to hardware; training AI models is also getting more expensive. According to Stanford University’s annual AI Index report, training OpenAI’s GPT-4 consumed around USD 78 million worth of computing power, a substantial increase from the USD 4.3 million spent on training GPT-3 in 2020. Google’s Gemini Ultra required an even larger investment of USD 191 million for its training.
Notably, back in 2017, companies could train the initial technology behind today’s AI models for around USD 900. This exponential increase in training costs is likely to continue as the industry pushes towards AGI.
Meanwhile, OpenAI and Microsoft are reportedly planning to build a $100 billion supercomputer called “Stargate” to support OpenAI’s advanced AI models. The supercomputer will contain millions of specialised server chips and may launch as early as 2028. The project is expected to triple the amount Microsoft invested in 2023. The supercomputer will be the focus of a five-phase plan to install supercomputers over the next six years. It could be used to train the world’s most powerful AIs and may require up to 5 gigawatts to operate.