What a Japanese AI Unicorn Can Teach Silicon Valley

Silicon Valley’s “move fast and break things” mantra propelled tech innovation for the internet age. In the era of artificial intelligence, it should take a page from Japan’s playbook and slow down.

A rush to deploy AI tools to the public has resulted in embarrassing blunders, such as an AI-powered Google search feature recently recommending glue on pizza, as well as consequences that affect real people’s livelihoods: a Bloomberg analysis found that the technology behind OpenAI’s ChatGPT showed signs of racial bias when ranking job applicants.

It has also led to tech companies consuming enormous amounts of energy to power AI. The International Energy Agency estimates that in 2026, data centers across the globe will consume roughly as much electricity as Japan. Other forecasters say that by 2030, these centers are on course to use more energy than India, the world’s most populous country. Large language models, the technology underpinning the latest crop of generative AI tools, require gargantuan troves of data, and training them takes immense amounts of computing power and energy.

As the tech continues to develop, many AI firms think the key to growth is to make these large language models even larger. Some US tech titans, including Microsoft Corp. co-founder Bill Gates and OpenAI Chief Executive Officer Sam Altman, are even backing nuclear energy firms to help power AI data centers. But there are other options beyond rushing to fire up new nuclear reactors to train AI models.

Tokyo-based startup Sakana AI, which Nikkei Asia reported on June 15 will become the fastest-ever Japanese company to achieve unicorn status, has taken a different approach. When it comes to the most consequential technology of our time, it’s playing the long game.