The American company OpenAI announced on Monday a $38 billion agreement with AWS, Amazon’s cloud computing division, to acquire additional development capacity for its artificial intelligence (AI) platform.
The startup behind ChatGPT thus continues its ambitious campaign to acquire computing power and storage, both from cloud providers such as AWS and from chip manufacturers, to ensure it does not fall behind in the AI race.
OpenAI aims to be the first player in the industry to develop an artificial general intelligence (AGI), a model that matches the range of human intellectual capabilities.
“Developing the most advanced AI requires very large and reliable computing capabilities,” explained OpenAI director Sam Altman, quoted in the statement. “Our partnership with AWS strengthens the ecosystem that will support this new phase and make advanced AI accessible to everyone.”
Last week, Altman revealed that OpenAI had made $1.4 trillion in commitments to cloud service providers and the semiconductor industry.
These contracts will require 30 gigawatts (GW) of electricity, equivalent to more than 2% of the total installed capacity in the United States at the end of 2023, according to figures from the EIA, the US Energy Information Administration.
Some investors are showing increasing caution in the face of OpenAI's buying spree: the company's projected revenue is around $13 billion this year, but by Altman's own admission it is not expected to be profitable before 2029.
Asked about the topic in an episode of the “BG2 Pod” podcast released Friday, Altman showed signs of irritation and responded that OpenAI will generate “much more” revenue than that estimate.
The deal with AWS was the first since OpenAI formalized its new structure, in which the company has more freedom to move away from its nonprofit origins and generate profits for its investors.
Nvidia as a leader
Under the seven-year agreement with AWS, OpenAI gains immediate access to additional cloud capacity, with full deployment before the end of 2026.
The cloud infrastructure that AWS will put at the service of the Californian company will be based mainly on GPUs (graphics processing units) from the giant Nvidia, considered the most advanced on the market.
They will be used not only for work on OpenAI's new models, but also to operate ChatGPT and handle requests from the interface's more than 800 million weekly users, the Californian company said.
The news boosted Amazon's share price, which rose nearly 4.9% at midday on the New York Stock Exchange.
Nvidia also rose (+2.68%), driven both by the agreement between OpenAI and Amazon and by another announcement: Microsoft will rent additional chips and servers from the cloud provider IREN for a total of $9.7 billion.
OpenAI opted for Nvidia chips over AWS's own Trainium processors, which, according to specialists, have now reached performance levels close to those of the world leader's GPUs.
Microsoft, OpenAI's privileged partner, which holds 27% of the startup's capital after investing more than $13 billion, has for several months accepted the idea of the San Francisco company seeking cloud capacity elsewhere.
The order placed with IREN also shows that Microsoft, although itself a cloud services provider, can no longer satisfy all of its customers' demand for data storage and processing.
The partnership announced with AWS builds on the existing collaboration between the two companies, with OpenAI's open-weight models already available on Amazon's servers.