Black Swans and the Geometry of Denial

Smart people concerned with the many crises that beset humanity and the earth deal with their apprehension in a number of ways. They read a book. Or they write something themselves. Or they go to a conference where they can listen to other smart people describe a difficult problem and what we might do to solve it.

When the smarties in question are corporate executives or their counselors, they bang on about risk management. They make plans. Or, most recently, they propose a “disrupter analysis” meant to help companies withstand black swans.

Another jargon-filled corporate marketing piece masquerading as something useful? “Who cares?” you might reasonably ask.

One reason is that big, unpredictable events (“black swans”) can directly affect our health—like when a tsunami jumps the seawall and radiation pours through the neighborhood. Another is that the pandemic of denial stalking the land lives not just outside of us but on the inside, too.

Denial is a failure to face up to reality. As Freud made clear, the human instinct is to turn away from “the suffering endemic to reality.” As brain science shows, all of us revert to instinct whenever we’re under stress. Put us under pressure and we go old school every time: we live in an idealized past, fight the last war, and think we’re one place when actually we’re somewhere else. All are forms of denial.

Denial is particularly strong in the corporate world. A Harvard Business School professor wrote a book about it. Sometimes the denial is blatant and purposeful. Other times it’s more difficult to detect because it serves a need that lives below the level of awareness—the need for consoling illusions. “How to Prepare for a Black Swan,” with its highfalutin “disrupter analysis,” fits into the latter category.

The geometry of denial works as follows. First, change or warp the data. In this case the author, Matthew Le Merle, maintains that companies can and should make a list of “possible catastrophic environmental, economic, political, societal, and technological events.” Yet he also admits that a black swan, by its very definition, is impossible to predict in advance (although they’re retrospectively easy to rationalize). The reality is that we can’t predict the unpredictable. The consoling illusion is that we can still make lists of possible catastrophes.

The corporate world is deeply wedded to the notion that it’s possible to predict what’s coming at it—from customer demand to catastrophic events. This fixation is driven by a still-deeper consoling illusion: that we still live in an industrial age in which events moved more slowly and predictably. (The fixation on predictability is characteristic of the outdated “push” philosophy that I wrote about in The Power of Pull.)

The second part of the denial geometry is to take the distorted, denial-based “evidence” and elaborate deductions from your false premise. In Le Merle’s case the false premise is that you can make a list of possible black swans. The deduction is the ensuing set of soothing contingency plans.

The third part of the geometry of denial is to sell it to all relevant parties—first to one’s self, then to the media, the client, the board of directors—to help everybody feel better about something that rightly causes them a great deal of anxiety and discomfort.

The less-consoling truth is that there is no control over black swan events. Imagine you had been an advisor to Muammar Qaddafi, the one assigned to “manage risk.” You could have made a long list of potential catastrophes but would the possibility of a Tunisian street vendor lighting himself on fire—and thereby setting in motion the string of events that would eventually topple your boss—have conceivably been on it?

When it comes to black swans there is no plan. There’s no data sheet. There’s no risk assessment or four things you should do to make everything okay. When it comes to a black swan, planning is a great self-deception. A comforting illusion we create for ourselves.

The cost of this refusal to face up to reality is high, because it blinds us to the preparations we really can make for when impossible-to-predict, high-impact events inevitably occur. This preparation has to do with training our nervous systems to tolerate anxiety, uncertainty, and volatility. That way we won’t end up like TEPCO CEO Masataka Shimizu, who landed in the hospital with dizziness and high blood pressure after the tsunami hit. Since then it’s been widely reported how the Japanese nuclear industry fell prey not just to corruption—but to the consoling illusion that it had properly planned for disaster. We needn’t do the same.