Ever since Microsoft Corp. invested $10 billion in OpenAI in January and began integrating ChatGPT into its products, people have been watching closely to see how its biggest tech rivals would respond. Google scrambled to release its chatbot, Bard. Meta Platforms Inc. launched its own large language model, LLaMA. On Monday, Amazon.com Inc. made a move it hopes will reverse the perception that it has fallen behind in the AI arms race.
Its $1.25 billion investment in San Francisco-based Anthropic, which could grow to $4 billion and includes a minority stake, is something of a coup for Amazon and its cloud division, Amazon Web Services. Anthropic’s decision to make AWS its “primary cloud provider” for “mission critical workloads” comes just seven months after the startup’s cloud deal with Alphabet Inc.’s Google, which had been an early investor.
Gaining Anthropic as a cloud client would alone have been cause for celebration. But the bigger victory for Amazon is that Anthropic has said it will “build, train and deploy” its new models using Amazon’s Trainium and Inferentia computer chips. Amazon hopes to position these as an alternative to those made by Nvidia, whose stock has risen nearly 190% this year because of extraordinary demand for its products. With AI companies climbing over one another to get their hands on Nvidia chips, any possibility of a viable competitor comes as extremely encouraging news to the entire sector. If Anthropic, an AI frontrunner, can build and run cutting-edge models on Amazon’s chips, it sends a strong signal that Nvidia’s absolute dominance in AI won’t last forever. That’s excellent news for those who want to see development of this technology continue apace.
But to what degree Amazon’s chips can match Nvidia’s is unclear. In Monday’s announcement, Amazon and Anthropic said they would collaborate on further development. At the very least, Bloomberg Intelligence analyst Kunjan Sobhani suggested, AWS has become well positioned to handle AI tasks that don’t necessarily require the full capabilities of Nvidia’s technology. “It’s like having a Ferrari and a BMW,” he said. “The Ferrari will get you there quicker, but you don’t always need it.”
The deal is unlikely to do much for Amazon’s revenue in the near term, a possible explanation for the stock’s rise of less than 2% during the trading day. But Wall Street shouldn’t underestimate Amazon’s positioning here, because it follows a pattern investors should recognize from the company’s history.
As in retail, where Amazon slowly built up control of every part of selling online — sourcing products, running the store, handling logistics — so too is Amazon looking to be involved at every layer of the AI industry. Its data centers and chips provide the raw computing power. Its AWS cloud services, such as Amazon Bedrock, act as a trusted intermediary between companies and the AI models they want to build with. And it is also developing its own AI applications, such as the coding assistant CodeWhisperer, a competitor to Microsoft’s Copilot.