I consider myself a child of the Global Financial Crisis (a.k.a. The Great Recession). As I wrote in my introduction, I started my career in early 2008, just as the US housing market was unwinding and shortly before financial markets imploded. To those unfamiliar with financial markets this sort of event would appear to be a rarity, a one-in-a-billion (I’m embellishing) sort of event. Looking back at financial history, however, paints a very different picture. Severe market movements, both positive and negative, have occurred far more frequently than I (or most others) realized.
Take, for instance, the historic annual returns of the S&P 500. The histogram below clearly illustrates that there have been some rather severe years. From 1926 through 2015 the S&P 500 had an arithmetic average return of 11.95% and a standard deviation of 19.99%. Using these numbers and assuming a normal distribution implies a 1.8% probability of the index declining by 30% or more in a given year. During this 90-year period, however, declines of that magnitude occurred 3 times (1931, 1937, and 2008), or 3.3% of the time. The difference between 1.8% and 3.3% sounds like small potatoes, but the real-world frequency was almost double what the widely accepted normal distribution would suggest. On a relative scale that’s pretty scary.
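If you want to reproduce that back-of-the-envelope comparison, here is a minimal sketch in Python using scipy (my tools of choice here, not anything from the book) that plugs the summary statistics above into a normal model and sets it next to the observed frequency:

```python
from scipy.stats import norm

# Arithmetic mean and standard deviation of annual S&P 500 returns, 1926-2015 (in %)
mean, sd = 11.95, 19.99

# P(annual return <= -30%) under a normal model
p_normal = norm.cdf(-30, loc=mean, scale=sd)

# Observed frequency: 1931, 1937, and 2008 out of 90 years
p_actual = 3 / 90

print(f"Normal model: {p_normal:.1%}")  # ~1.8%
print(f"Observed:     {p_actual:.1%}")  # ~3.3%
```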
If I were to torture the data I’m sure I could come up with more situations such as this (the crash of October 19, 1987 comes to mind). The idea that these “outlier events” occur not only with a greater frequency than we think, but also with extreme magnitudes is the central theme of The Black Swan. Taleb defines a Black Swan as having three main attributes:
- It lies outside of regular expectations
- It carries an extreme impact
- Explanations for its occurrence arrive after the fact
Taleb points out that there are undoubtedly extreme events that we know will occur at some point. We acknowledge them, they can be modeled, and they are therefore “Gray Swans.” Black Swans are in a completely different category: they are truly unknown unknowns. We’re very much blind to their presence, for a number of reasons.
Confirmation bias traps us into thinking that what has happened during a given period may be generalized and extrapolated into the future. A fantastic example of confirmation bias was the dot-com bubble of the late 1990s. “Everybody knew” that the stock market would only continue to go up, fueled by ever higher valuations of internet companies that promoted a new paradigm in technology. Well, we all know how that ended (more rational minds didn’t buy the hype and were scorned for their “outdated perspective”).
Along with confirmation bias we are subject to the narrative fallacy, a tendency to tell stories in an attempt to make sense of the world around us. Looking for patterns where none exist and linking unrelated events can trick us into thinking that we know or understand more than we really do. Stories prevent (or maybe relieve) us from facing the truth about these misunderstandings. The financial media is rife with such daily narratives:
- Treasuries Surge by Most Since September as Jobs Growth Stalls (CNBC)
- Oil slides on U.S. rig count rise, economy concerns (Reuters)
- Dollar Tumbles Most Since December as Jobs Data Dim Fed Outlook (Bloomberg)
We must be cautious of stories, and of history in general. Not for the information we are receiving, but for what we are not being told. We often fail to acknowledge silent evidence, the facts that never get reported.
One Diagoras, a nonbeliever in the gods, was shown painted tablets bearing the portraits of some worshipers who prayed, then survived a subsequent shipwreck. The implication was that praying protects you from drowning. Diagoras asked, “Where were the pictures of those who prayed, then drowned?” [1a]
Silent evidence presents a major problem in that our information about the past is often incomplete. It creates an illusion of stability by hiding the real risks that have been encountered. It glosses over previous Black Swans. As a consequence, it makes us think the world is much safer than it has historically been, and it distorts our ability to understand the prevalence of Black Swans.
Prediction, not narration, is the real test of our understanding of the world. [1b]
Telling stories about the past is one thing, but stories about the future are an entirely different game. Black Swans are surrounded by uncertainty, and focusing on uncertainty leads to prediction problems. Taleb argues that there is very little difference between guessing (what you don’t know, but someone else does) and predicting (what has not yet taken place). Some things are very predictable: casino games, or games of chance, have odds of success and failure that can be computed very accurately, so the probabilities of their outcomes are, for the most part, known. Real-life situations, on the other hand, are not so clean.
In real life you do not know the odds; you need to discover them, and the sources of uncertainty are not defined. [1c]
Taleb argues that our failure to predict is partly a consequence of our arrogance about what we do and do not know. There is a wide range of possible outcomes that we cannot foresee or comprehend. The world is complicated, much more complicated than we often think, and the real problem is that most of us don’t know how complicated it really is. There are some fields where experts can provide reasonable guidance, and others where the heralded “experts” aren’t really experts at all (economic forecasters, stockbrokers, etc.).
Experts have their own set of problems when it comes to prediction. Being narrowly focused, they tend to “tunnel,” or focus solely on their area of mastery. In the process they ignore exogenous events, things that have nothing to do with the subject at hand but can end up playing a critical role in future outcomes and lead to Black Swans.
Black Swans are not limited to unfavorable outcomes; they can also be good. Centuries ago, when European explorers were looking for an alternative passage to India, they stumbled onto a land mass that has developed into one of the most economically successful countries the world has ever seen. Accidental, unexpected discoveries that have changed our lives are all around us. They too are Black Swans. Serendipity, not prediction or skill, has been the driver of many of our greatest developments. We have been an incredibly lucky species.
At a quantitative level, Taleb strongly encourages us to avoid our traditional statistical models (he refers to the bell curve as an intellectual fraud). Gaussian statistics fail to capture Black Swans: extreme events can and will happen, not just in financial markets but in all areas of our lives. As I read this book I continually found myself coming back to Murphy’s Law: “Whatever can go wrong, will go wrong.” A coworker once reminded me that when something does go wrong, it will probably do so at the worst possible time and at a magnitude much greater than you or I can imagine.
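To make that point a little more concrete, here is a rough illustration of my own (Taleb doesn’t prescribe this particular alternative) of how much more weight a fat-tailed distribution puts on extreme outcomes than the bell curve. The Student-t with three degrees of freedom is just a common stand-in for “fat tails,” and the scale is borrowed loosely from the S&P 500 figures above rather than properly calibrated, so treat the numbers as illustrative only:

```python
from scipy.stats import norm, t

mean, scale = 11.95, 19.99      # S&P 500 summary stats from above (used loosely)
threshold = mean - 3 * scale    # a "three-sigma" down year, roughly -48%

# Probability of a year at least that bad under each model
p_gauss = norm.cdf(threshold, loc=mean, scale=scale)
p_fat = t.cdf(threshold, df=3, loc=mean, scale=scale)  # Student-t, 3 degrees of freedom

print(f"Gaussian:   {p_gauss:.2%}")  # about 0.13%
print(f"Fat-tailed: {p_fat:.2%}")    # about 2.9%, more than 20x the Gaussian figure
```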
This was an insightful read, and very much undull.
References
[1] Taleb, Nassim. The Black Swan. New York, NY: Random House, 2007.
(a) p. 100
(b) p. 133
(c) p. 127
What I’m Reading
- There is Never a Good Time to Invest (Nocturne Capital)
- podcast: Interview With Burt Malkiel: Masters in Business (Barry Ritholtz)
- Bill Gross & The 40 Year Black Swan (Ben Carlson)
- Alternative Definitions of Risk (Morgan Housel)
- “Keep Swimming” and Other Lessons From Intel’s Andy Grove (Isaac Presley)