Sometimes synchronicity strikes in strange ways.
This morning I found myself thinking, in an uninterrupted flow, about how we all tend to float through life reactively, much aided and abetted by the rhythms of television news and newspapers, and about our near-obsession with making abstract and complex concepts and events seem simple and easily understood. Most of us just don’t want to put the effort into thinking about things, even when those things are likely to have a significant material impact on our lives.
So it was with synchronistic pleasure that I came across Dave Pollard’s post titled "Fooled By Randomness, Risk and Opportunity", in which he analyses some of the core concepts in Nassim Taleb’s 2001 book "Fooled By Randomness: The Hidden Role of Chance in the Markets and Life". Taleb also wrote the bestseller "The Black Swan – The Impact of the Highly Improbable", published in 2007.
Fooled By Randomness, Risk and Opportunity
[ Snip … ]
Taleb’s key argument fits well with the above ideas. His thesis for the book is:
We favor the visible, the embedded, the personal, the narrated, and the tangible; we scorn the abstract. Everything good (e.g. aesthetics, ethics) and wrong (e.g. being fooled by randomness) with us seems to flow from this.
We are wired, he argues, to deal with immediate emergencies (fight or flight, when being pursued by predators), and to optimize the likelihood of procreation. Because of this, our brains, emotions and instincts can be "fooled". Several of these types of foolishness are now getting our species into deep trouble:
Hindsight bias: Past events look more logical, causal and correlated than they really are. We therefore reward people (executives, politicians, celebrities, gamblers) for being in the right place at the right time, and don’t see that their "success" had nothing to do with them; it was almost entirely good luck.
Learned helplessness: As Gladwell has argued, because of bad and misinterpreted information, we fear the wrong things. We think terrorism is a bigger threat than flying, and think SUVs are safer than convertibles. And we act (inappropriately) accordingly.
Being swayed by simplifying rhetoric: We love (and believe) proverbs and terse oversimplifications. In an increasingly complex world, our minds crave simplification, and are suckers for it.
Vulnerability to "noise": We are such pattern-seekers that we look for sense in the firehose of information, and find it when it isn’t there. We would be better off to turn off the mainstream media, stop paying attention to them, and spend our time in original thought and reflection instead.
The induction problem: The past always seems deterministic. We expect the future to be a continuation of the recent past. We "forget" (or never learn) the longer-term historical context. Because we have only ever seen white swans (boom economic times) we don’t believe in the possibility of black swans (crashes). So we constantly overreact to the "latest" news (quarterly earnings reports). The key to resilience is to anticipate the possibility of black swans (unlikely but potentially catastrophic events) and to hedge against them, avoid them entirely (become a dentist, not a stockbroker) or be prepared for them. Taleb made his mark by hedging against a market collapse, but the challenge is that some such events are very difficult to hedge against or prepare for, so we tend to shrug them off until they actually occur.
The skewness problem: Despite the fact that rare, catastrophic events can more than undo decades of positives, we tend to perceive ‘blips’ as being of the same magnitude in both directions, so we continue to bet on the positives and, on net, lose.
The survivorship bias: We tend to see only the winners/survivors and lose track of the large numbers of losers. There are probably hundreds of people as bright and skilled as Bill Gates or Warren Buffett or Wayne Gretzky or (name your star) who followed exactly the same method or philosophy as these successes, but who, because they were not in the right place at the right time, failed and are now unknown. We are therefore foolish to follow the lead of such successes and expect the same result. When it comes to success in life and other complex situations, the cream actually rises to the top very rarely. This problem is worst at the top levels, because executives and politicians tend to make decisions that are more affected by external events, so the outcome, good or bad, is least likely to be the result of their decision.
The anchoring bias: Our perception of where we are is biased by where we have come from. If we have a million dollars after living our lives poor, we think we are rich. But if we once had two million, we feel poor, a failure.
Read the rest of Dave’s post here …