Sometimes prediction is useful. Geologists who predict a volcanic eruption can prepare a region to withstand a natural disaster. Projections on the damaging effects of the hole in the ozone layer have spurred politicians to action. A company executive who forecasts that sales will climb hopes to motivate his employees. And it is always a good idea to know when you might need an umbrella.
But the world is complex. For example, how “the economy” is doing depends on an infinite number of factors—and globalization isn’t making things any easier. Climatologists may feed increasing reams of information into their computer models, but every time, the unfathomable process of natural feedback manages to surprise us. Our brains contain billions of nerve cells that influence our behavior, but we influence each other as well. When we look into the future, we see an endless series of dots that could be joined together in so many ways that we don’t even have time to consider all the options; as soon as we try, countless new dots join the scene.
Hindsight, on the other hand, is always 20/20. That’s why history books so clearly explain how one event led almost inevitably to another. Suddenly the dots are joined together by lines. This is the knew-it-all-along effect, a phenomenon psychologists call “hindsight bias”: our tendency to use current knowledge to understand past events so well we think we could have predicted they would occur.
This mechanism has been thoroughly studied by Baruch Fischhoff, an expert on decision-making and risk perception. In the early 1970s, he asked several of his students at Hebrew University in Jerusalem about then-U.S. President Richard Nixon’s historic visit to China. Would Nixon call on Chinese Revolution leader Mao Zedong? Would he consider the visit a success? After the papers stopped covering the topic, Fischhoff asked the same students about the predictions they had made. He discovered that the students had altered their memories to suit reality, as if they had foreseen what was going to happen.
This is also why, within days, commentators begin to talk about unexpected events such as the protests in North Africa as if they had seen them coming. “We partially lose the feeling that we were taken by surprise,” Fischhoff says about the effect of analyses and interpretations. “We simply forget what we didn’t see.” That also makes it extremely difficult to nullify the effect: Your new understanding of the world has erased and replaced your old understanding. According to Fischhoff, it’s simply “a side effect of information processing.”
Fischhoff’s study demonstrates that we’re not aware of our erroneous predictions. A second conclusion is that we dislike admitting our mistakes. When we’re confronted with proof that overturns our predictions of the future, we have an arsenal of reactions to keep us from losing face. For example, we state, perhaps boldly, that our prediction changed the course of history. That millennium bug that was going to bring down every computer, and civilization with it? It never came to pass precisely because we sounded the alarm: the disaster was averted thanks to preventive measures.
Ehrlich once said that his book has led to effective policy programs and also convinced many people not to have children, thereby heroically preventing disaster. “Hardly a week goes by that someone doesn’t say to me, ‘I read your book in high school, and that’s why I only have one child now,’” Ehrlich says. Does that make him proud? Ehrlich ponders. “It does make me feel good.”
Or we shout, “Hold on! Our predictions may still come true.” After all, that friend’s relationship we were sure wouldn’t last six months may still run aground any day now. Our prediction isn’t wrong; the timing’s just off. And that war between China and Japan may not have happened, but tensions are still high. Ehrlich gladly emphasizes that terrible things are still waiting to happen. “Perhaps the most serious flaw in The Bomb was that it was much too optimistic about the future.”
It doesn’t matter how often experts err in their predictions; they maintain their authority and a multitude of fans who don’t want to hear about mistakes. Since his rise to fame in 1968, Ehrlich has won a series of prestigious awards, and he is still viewed as a hero by the progressive movement. Apologies are scarce after an erroneous prediction. Experts are rarely confronted with their mistakes. On the contrary, they are simply asked to make new predictions.
Would you like to improve your own predictions and learn how to better evaluate those of others? Read Ode’s five rules of thumb for phenomenal forecasters.
Welcome the unexpected
The title of his 2010 book is promising: The World in 2050. But everything geographer Laurence Smith writes is useless from the start; in the first chapter, he notes that he’s looking into a future based on the prospect that “things continue on as they are now.” The problem is that things never continue as they are. The future is not simply the sum of existing developments. Strange things occur constantly. When we are asked to imagine the future, we like to stick close to today’s situation. That’s why so many economists predict that China will be the next superpower within 20 years. They may be right. But that’s what they thought about Japan in 1990, too.
In his book The Black Swan, Nassim Nicholas Taleb argues that it’s in our nature to expand upon existing knowledge and to be surprised by “black swans,” or rare, unpredictable events that have an enormous impact: the sudden fall of the Berlin Wall in 1989, the rise of the Internet in the 1990s, the terrorist attacks of 2001, the recent economic crisis. According to Taleb, events like these shape the world we live in. If historians were a little more honest, schoolchildren would learn that the course of history is determined by a succession of random, unforeseen events. That would help train a new generation to take into account the existence of unknown data when making predictions about the future.
Five rules of thumb for phenomenal forecasters
1. Welcome the unexpected. Take unforeseen twists into serious account; history is riddled with them.
2. Prevent tunnel vision. Investigate your opponents’ viewpoints. Try to convince yourself you’re in the wrong.
3. Don’t know too much. Watch out for details; they muddy the main focus. Work on breadth, not depth.
4. Don’t be too optimistic. It’s no fun, but shit happens. Realize that it may disrupt your life and your plans, too.
5. Be modest. Exercise caution in your proclamations. Recognize the infinite complexity of the future.
That’s why we should never be too certain of things. We should leave room for unexpected twists. Two recent developments mean we can throw away all our predictions about the future of energy: fear of a nuclear reactor meltdown has caused Germany and other countries to scrap their plans for nuclear energy, and a recent technological innovation allows extraction of relatively clean and cheap natural gas from rock beneath a layer of clay—and there’s plenty to extract. These things cannot be predicted. They just happen to us, just like life does. A phenomenal forecast takes this into account. As Tetlock says, “For 30 years, experts did a very good job predicting the future of the Egyptian political leadership. They predicted no change, and they were right—until they were wrong.”