Risk management is a key part of many endeavors, from space travel to investment management. In each case, achieving one’s goals requires awareness of what could go wrong, as well as careful attention to detail in often-changing conditions to ensure the smoothest possible journey. Tilak Lal, vice president, Performance Analysis & Investment Risk at Franklin Templeton Investments, describes what risk is within the investment space, and how risk management is essential to success.
Flying to the moon is hard. Flying to the moon using 1960s tools and technology? Seemingly impossible.
Consider the computers the US National Aeronautics and Space Administration (NASA) used during the Apollo program nearly 50 years ago. While cutting edge at the time, they were archaic by today’s standards. The Massachusetts Institute of Technology (MIT) designed the Apollo Guidance Computer, which was considered an engineering marvel in 1969, yet it possessed a mere 64 KB of memory. It contained 12,300 transistors running at a speed of 43 kHz, and it performed 41 instructions per second.
Sloth-like relative to current computer processing speeds.
For a frame of reference, let’s compare the Apollo computer to the smartphones in our pockets today. An iPhone contains approximately 1.6 billion transistors, runs at 1.4 GHz and performs roughly 3.36 billion instructions per second. That’s 3.36 billion for today’s pocket-sized phone versus 41 for the clunky Apollo!
The “computers” of the Apollo era—and this isn’t hyperbole in our view—were essentially on par with today’s key fobs or digital coffee makers in terms of power. And yet despite these limitations, NASA successfully landed men on the moon and returned them safely to earth, 48 years ago! How was this incredible feat accomplished?
Flight directors and engineers of the time have said that in addition to sheer will and relentless determination, perhaps the most significant attribute contributing to the success of Apollo was the culture of risk management that permeated the program. Risk management was central to every aspect of the mission.
As author Andrew Chaikin observed in his book A Man on the Moon (Penguin Books, 1994): “when putting humans on top of a 400-foot rocket that burns 15 tons of high explosive propellant a second … using slide rules to send them to a place where there is a 450-degree temperature difference between sunlight and shade… needless to say you better have some pretty solid risk-management protocols in place.”
Risk Management Is Central to Space Exploration and Investment Success
So what is the significance of space exploration to investing? Clearly, we’re not in the space exploration business, and there is no comparison in terms of magnitude or historical consequence between our endeavors and the feat of placing a man on the moon. That said, we humbly submit there are some parallels to be drawn, particularly in terms of the importance of robust risk management.
At Franklin Templeton we recognize, as did the engineers at NASA in the 1960s and 1970s, that risk management is central to any successful outcome. While we are not launching spacecraft (thankfully), our clients put their trust in us to help them meet their important objectives and desired investment outcomes. Risk management is central to that effort.
We believe that as investment managers we are risk managers. The two disciplines are linked by their nature—two sides of the same coin. Investing is about the relationship between risk and return. We can only seek to manage risk. Return is an outcome.
Viewed in this light, attractive returns are actually the result of effectively deploying risk.
So What Is Risk?
There are many ways we measure risk. No one methodology, model or statistical measure will reveal the full extent or “truth” of potential risks in a security or investment portfolio. Market risk is not easily characterized by a handful of quantitative metrics, and there are many elements that quantitative metrics don’t capture well at all. The industry teems with models. But we sometimes forget that models are only tools. It’s up to investment managers to provide the insight.
As Fisher Black—a well-known US economist who was one of the authors of the Black–Scholes equation—said after moving from MIT to Wall Street: “The markets look a lot less efficient from the banks of the Hudson than from the banks of the Charles.”
Conceptually, we think of risk as the exposures we take when making an investment. The challenge, however, is that the magnitude and nature of these exposures may vary significantly over time, and can change very quickly. Risk is never a static snapshot. In addition, intended exposures often come with unintended or even unknown exposures.
For example, if an investor likes Japanese auto manufacturing and purchases equity in a Japanese car company, the stock will likely come with exposure to the Japanese yen. If yen exposure is not desirable, the investor may shed that yen exposure through hedging. In addition, the investor may seek to multiply his or her auto manufacturing exposure by borrowing to apply leverage, and so on.
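To make that concrete, here is a minimal sketch in Python, using entirely hypothetical numbers, of how the intended equity exposure, the unintended yen exposure, a currency hedge and leverage interact in such a position.

```python
# A minimal sketch (hypothetical figures only) of how an unhedged, yen-denominated
# equity position combines an intended equity exposure with a currency exposure.

local_equity_return = 0.08   # hypothetical return of the stock in yen terms
yen_vs_usd_return = -0.05    # hypothetical move of the yen against the US dollar

# Unhedged: the US-dollar return compounds the local return with the currency move.
unhedged_usd_return = (1 + local_equity_return) * (1 + yen_vs_usd_return) - 1

# Hedged: selling yen forward roughly strips out the currency component,
# leaving mostly the intended equity exposure (hedging costs ignored here).
hedged_usd_return = local_equity_return

# Leverage: borrowing to double the position scales the exposure, and any loss,
# proportionally.
leverage = 2.0
levered_hedged_return = leverage * hedged_usd_return

print(f"Unhedged USD return:          {unhedged_usd_return:+.2%}")
print(f"Hedged USD return (approx.):  {hedged_usd_return:+.2%}")
print(f"2x levered, hedged (approx.): {levered_hedged_return:+.2%}")
```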
The point is that today’s portfolios can be highly tailored, complex and difficult to model. No longer can one approach, data point, or set of data points be deemed suitable to define risk, particularly when applied to the complex investment landscape of today. Given the limited financial engineering and the less global nature of investing 30 years ago, standard deviation, the Sharpe ratio and other basic statistical measures were perhaps sufficient. We’d argue they are no longer enough today.
The responsibility of modern risk management is to clarify the dynamic nature and behavior of a portfolio by providing meaningful transparency and actionable portfolio insight. How will it perform in different circumstances? In normal and stressed markets? What’s the likelihood of achieving targeted outcomes? The key is to capture actionable insight.
What Is Actionable Insight?
“Houston, we’ve had a problem.”
Astronaut James A. Lovell Jr., Commander Apollo 13
9:08 PM CST, April 13, 1970
In April 1970, the Apollo 13 spacecraft was headed for the moon, destined to be the third lunar landing in human history. Unfortunately, the mission was abruptly aborted mid-flight after a mechanical explosion crippled the ship.
For the three astronauts on board, their focus quickly shifted from preparing to land on the moon to survival. The capsule’s main life support systems were failing, its power was rapidly being drained, oxygen was venting into space, and the vehicle’s navigation and propulsion systems were severely damaged. Immediately after the explosion, it appeared that the crew would almost certainly be lost, as mission control estimated they had less than 15 minutes of life support left.
The story has over the years come to epitomize the effectiveness of NASA’s thorough, systematic, redundant and detailed risk management systems and practices. Despite the multitude of challenges faced, the agency’s teams systematically and methodically worked to solve each problem, often in parallel and in real time, eventually returning the astronauts safely to earth—and doing so with 1960s technology.
NASA engineering manager George Low described their approach as incorporating “a meticulous and painstaking attention to detail…where no change was too small to consider, no anomaly too little to understand.”
Considering that the Apollo spacecraft comprised more than 8 million parts incorporated into 500,000 systems, the demand for precision is easy to understand. If every part on the Apollo spacecraft functioned with 99.9% reliability (which is what NASA targeted), thousands of failures could still be expected; 0.1% of 8 million parts is 8,000.
Engineers who were involved with the program cite the detailed understanding they had of each spacecraft component, down to the wire and solder point, as contributing heavily to their success in managing the crisis.
In fact, when prospective controllers joined NASA, their first task was to visit the contractors responsible for manufacturing the ships and systems, collect blueprints and documents about those systems, and then digest the information.
During Apollo 13, for example, the controllers had to rely on their understanding of the wiring diagrams in the lunar module when determining how to power it up rapidly for its use as a “lifeboat.” Without that detailed knowledge, their efforts would have likely been futile.
The Portfolio as a Mosaic of Moving Parts
As risk managers, we think the same way.
A portfolio represents a mosaic of many moving and interrelated parts: asset classes, strategies, instruments, systems, technologies, counterparties, geographies, currencies, political systems, personalities, etc.
It’s not enough to statically see and understand the individual parts of the portfolio. While that is a start, we must understand the dynamic behavior of each in the context of the portfolio, in a variety of conditions, and against one another. Stress tests, scenario analysis and Value at Risk (VaR) assessments need to be core components of any rigorous risk program. This is actionable insight.
Actionable Insight… In Action
In the 1980s and 1990s financial innovation and engineering exploded on Wall Street. The capital markets began deconstructing traditional holdings like stocks and bonds and developing derivative instruments that could provide exposure to some factor driving the value of the traditional instrument.
The term derivative accurately describes the fact that the instrument derives its value from an underlying exposure. Advanced computer modeling and the willingness to underwrite exposures as “counterparties” contributed to the evolution of the industry, such that investments today can be highly tailored and outcome-oriented.
While such investment choice is generally positive, the advent of these highly complex portfolios presented a significant challenge to risk management. How does one capture the risk characteristics of a portfolio when the interrelationships of its instruments are not at all obvious from an accounting perspective? For example, something like an option or a swap might not “kick in” until an underlying exposure reaches a certain level.
Value at Risk
While traditional measures of risk serve as a solid foundation, alone they are not enough. To make sense of today’s complex portfolios, more sophisticated measures such as VaR were developed, intended to measure the actual loss potential of an investment. Rather than relying on analysis of historic return streams, these modeling techniques are built from the actual holdings within a portfolio, making VaR a forward-looking measure of potential loss. The power of VaR is that it captures the interrelationships of many instruments at the strategy or multi-portfolio level.
As a methodology, VaR requires two arduous modeling prerequisites. First, all holdings in a portfolio need to be tied to pricing models. Pricing models, in turn, are driven by factors that determine the price or “value” of a particular instrument.
Second, the factors need to be related to each other so that if one factor moves in a certain way then it is reasonable to expect another factor to move in a predicted direction. This requires the development of a covariance matrix, which uses correlations and volatilities to “tie” all the factors driving prices to one another.
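As a rough illustration of these two prerequisites, the sketch below (in Python, with hypothetical factors, volatilities and correlations that are not drawn from any real data) ties a toy holding to a single-factor pricing model and assembles a factor covariance matrix from volatilities and correlations.

```python
import numpy as np

# Prerequisite 1 (toy version): a pricing model that maps a driving factor to an
# instrument's value. Here, a zero-coupon bond priced off a single yield factor;
# in practice, every holding is tied to a model of this kind.
def zero_coupon_bond_price(face_value, yield_factor, years_to_maturity):
    return face_value / (1.0 + yield_factor) ** years_to_maturity

# Prerequisite 2: a covariance matrix that ties the pricing factors together.
# Hypothetical annualized volatilities and correlations for three illustrative
# factors (Japanese equities, USD/JPY, Japanese rates); none of these are real.
factors = ["JP_EQUITY", "USDJPY", "JP_RATES"]
vols = np.array([0.20, 0.10, 0.06])
corr = np.array([
    [ 1.0, 0.3, -0.2],
    [ 0.3, 1.0,  0.1],
    [-0.2, 0.1,  1.0],
])
cov = np.outer(vols, vols) * corr   # cov[i, j] = vol_i * vol_j * corr[i, j]

print(zero_coupon_bond_price(1_000_000, 0.02, 5))   # bond value at a 2% yield
print(cov)                                          # factor covariance matrix
```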
Once this process of tying holdings to pricing models and their factors is achieved, one can price a portfolio in a simulated market scenario.
With this capability, portfolio managers and risk managers then have a means of simulating market moves. They can see how a portfolio may behave in a hypothetical environment. This methodology is called a Monte Carlo simulation.
Using a Monte Carlo simulation, thousands of random market scenarios may be “run,” and the gain or loss of a given portfolio in each instance is plotted as a distribution. This distribution captures the non-linear behavior of the portfolio, such as certain derivatives “kicking in” only under certain market conditions.
This represents the final stage of calculating VaR. By simply choosing a point on the left side of the distribution—that is, those scenarios that show a loss—a statistically consistent measure of downside loss potential can be monitored.
Typically, VaR measures the one-in-20 or one-in-100 downside loss potential over a specific period (that is, at a 95% or 99% confidence level). In plain English, VaR answers the question, “On the worst day in 20, or the worst day in 100, how much could this portfolio be expected to lose?”
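The sketch below shows what such a calculation might look like under heavy simplifying assumptions: normally distributed factor moves, the hypothetical covariance matrix from the previous sketch, and a deliberately crude intrinsic-value repricing of one option. A production VaR model would use full pricing models for every holding; this is only meant to illustrate the mechanics of simulating scenarios and reading VaR off the left tail.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical annualized factor covariance matrix (same illustrative volatilities
# and correlations as the sketch above), scaled to a one-day horizon (~252 days).
vols = np.array([0.20, 0.10, 0.06])
corr = np.array([[ 1.0, 0.3, -0.2],
                 [ 0.3, 1.0,  0.1],
                 [-0.2, 0.1,  1.0]])
daily_cov = (np.outer(vols, vols) * corr) / 252.0

# Simulate correlated one-day factor moves (multivariate normal assumption).
n_scenarios = 100_000
factor_moves = rng.multivariate_normal(np.zeros(3), daily_cov, size=n_scenarios)
jp_eq_move = factor_moves[:, 0]     # Japanese equity factor move in each scenario

# Hypothetical portfolio: $10 million of Japanese equity plus a protective put
# struck 2% below today's level, valued very crudely at intrinsic value so that
# the option "kicking in" shows up as a kink in the simulated P&L distribution.
equity_value = 10_000_000
equity_pnl = equity_value * jp_eq_move
put_pnl = equity_value * np.maximum(0.98 - (1.0 + jp_eq_move), 0.0)
portfolio_pnl = equity_pnl + put_pnl

# VaR is read off the left tail of the distribution: the 5th and 1st percentiles
# correspond to the "worst day in 20" and the "worst day in 100."
var_95 = -np.percentile(portfolio_pnl, 5)
var_99 = -np.percentile(portfolio_pnl, 1)
print(f"One-day 95% VaR: ${var_95:,.0f}")
print(f"One-day 99% VaR: ${var_99:,.0f}")
```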
Stress Testing and Scenario Analysis
NASA also maintained an obsessive culture of testing. Every component of any spacecraft had the living daylights tested out of it.
Each ship part was reviewed to determine its potential modes of failure, the effect that failure would have on the component itself, the assembly by which it was attached, the system it supported, the role it played in the mission, and ultimately the impact it would have on the crew. With each analysis, possible spacecraft design changes would be considered which might eliminate the failure mode, reduce the frequency to an acceptably low level, or mitigate its consequence.
In the same way, a very significant by-product of building the statistical machinery behind a VaR calculation is that portfolio managers and risk managers can now stress test an investment against historic market events and hypothetical markets.
It’s possible to understand how a portfolio would perform in any number of scenarios by setting pricing factors at the levels seen during historically stressed market periods like the dot-com bubble in the late 1990s–early 2000s or the Global Financial Crisis of 2007–2009. Stress tests answer the question, “If a similar event were to occur in the future, how much could this fund lose?”
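Within the same machinery, a stress test simply replaces the random draws with a fixed set of factor shocks and reprices the portfolio once. The shocks and exposures in the sketch below are purely illustrative placeholders, not calibrated crisis-period data.

```python
# Purely illustrative factor shocks (NOT actual crisis-period data) standing in
# for stressed scenarios; a real stress test would source calibrated shocks for
# every pricing factor from the historical period being replayed.
scenarios = {
    "Severe equity sell-off (illustrative)": {"JP_EQUITY": -0.35, "USDJPY": -0.10},
    "Rates and currency shock (illustrative)": {"JP_EQUITY": -0.10, "USDJPY": 0.08},
}

# Hypothetical linear exposures: US-dollar P&L per unit move in each factor,
# a simplification of the full pricing models described above.
exposures = {"JP_EQUITY": 10_000_000, "USDJPY": 4_000_000}

for name, shocks in scenarios.items():
    pnl = sum(exposures.get(factor, 0.0) * move for factor, move in shocks.items())
    print(f"{name}: estimated P&L ${pnl:,.0f}")
```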
Scenario analysis is a critical tool in outcome-oriented solutions, as effective custom solutions depend on the robust modeling of portfolios in a number of potential market environments.
No Substitute for Judgment
Modern risk management requires extensive investment in data services and risk-modeling tools. It is highly technical and imbued with many metrics for understanding complex investments in a complex world. All the effort that is required to render technical risk analysis, however, can lead to an over-reliance on the numbers.
Perhaps the most important aspect of any successful risk framework, a trait that cannot be easily quantified and is learned from years of experience and wisdom, is intuition.
This is the “intangible art” of risk management. There is no substitute for experience and sound judgment. The best risk consultants possess solid judgment and technical skills. They are encouraged to ask questions and engage in open communications with portfolio managers. Having an independent view of investment risk through robust technical analysis—leavened with sound judgment—is the key to effective risk management.