Traders, Don’t Fall in Love With Your Machines

Gary Gensler, the top US securities regulator, invoked Scarlett Johansson and Joaquin Phoenix's movie "Her" last week to help explain his worries about the risks of artificial intelligence in finance. Money managers and banks are rushing to adopt a handful of generative AI tools, and the failure of one of them could cause mayhem, just as the AI companion voiced by Johansson left Phoenix's character and many others heartbroken.

The problem of relying on critical infrastructure isn't new, but large language models like OpenAI's ChatGPT and other modern algorithmic tools present novel and uncertain challenges, such as automated price collusion, or models that break rules and then lie about it. Predicting or explaining an AI model's actions is often impossible, making things even trickier for users and regulators.

The Securities and Exchange Commission, which Gensler chairs, and other watchdogs have looked into the potential risks of widely used technology and software, such as the big cloud computing companies and BlackRock Inc.'s near-ubiquitous Aladdin risk and portfolio management platform. This summer's global IT crash, triggered by a faulty update from cybersecurity firm CrowdStrike Holdings Inc., was a harsh reminder of the potential pitfalls.

Only a couple of years ago, regulators decided not to label such infrastructure "systemically important," a designation that could have led to tougher rules and oversight around its use. Instead, last year the Financial Stability Board, an international panel, drew up guidelines to help investors, bankers and supervisors understand and monitor the risks of failures in critical third-party services.

However, generative AI and some algorithms are different, and Gensler and his peers globally are playing catch-up. One worry about BlackRock's Aladdin was that it could influence investors to make the same sorts of bets in the same way, exacerbating herd-like behavior. Fund managers argued that their decision-making was separate from the support Aladdin provides, but that separation disappears with more sophisticated tools that can make choices on behalf of users.

If LLMs and algos are trained on the same or similar data and become more standardized and widely used for trading, they could easily pursue copycat strategies, leaving markets vulnerable to sharp reversals. Algorithmic tools have already been blamed for flash crashes, such as the yen's in 2019 and the British pound's in 2016.