I was recently honored to speak at the Milken Institute London Summit on the future of Europe after Brexit, and the overwhelming takeaway I came home with is this: European politicians and pundits said that they regretted their ‘assumption’ that globalization would be the rising tide that lifts all boats. The assumption, they said, was that “better together” is actually better for everyone. Of course, we all now know that globalization has benefited only some and actually hurt others.
But is it fair of the politicians and pundits, who included Lord Peter Mandelson, a former European Trade Commissioner and British First Secretary of State, and Marieluise Beck, a member of the German Bundestag, to blame their mistaken policies and practices on an ‘assumption’? And was it a true assumption, or rather a prediction disguised as an assumption? Aren’t they guilty of projection bias, a sort of false consensus bias, which presumes that other people think the way we do? To believe that what is good for them must be good for others? And when is it necessary to challenge a worldview and seek actual evidence?
Niall Ferguson, Senior Fellow at the Hoover Institution at Stanford University, not only admitted at the conference that he’d been wrong to oppose Brexit but said that it was a mistake that he hadn’t listened to “people in pubs.” The admission was even tweeted after the event: “‘I admit that I was wrong’ about #Brexit, says @nfergus. Now it is a divorce and we’re negotiating how much #UK will have to pay. #MIGlobal”
Of course, we all bring our own lens to the world in which we operate, but that’s exactly why we need techniques and processes we can rely upon to challenge us and to differentiate between what we think we know as fact and what we can actually confirm to be objectively true. The politicians and pundits presumed a level of certainty about good versus bad, and helpful versus hurtful, that simply didn’t hold true for the majority. The result is that a lot of damage was done by confusing a forecast with the truth.
That’s in part why I developed the AREA Method. We all make assumptions, and predictions, every day. Some are trivial; others are potentially devastating. Almost every single one of them is faulty. That’s why AREA challenges us to identify and isolate what we really know. It teases out what is certain versus what is a calculation.
We don’t easily recognize just how much our inner world colors the way we see and understand our outer world, and how it distorts things for us. When making assumptions or predictions becomes a habit, we become less grounded in reality and more prone to creating problems for ourselves and others, in part by imposing our worldview on the information around us. Most of the time this faulty thinking doesn’t lead to catastrophic consequences. But sometimes the stakes of getting it wrong are too high.
AREA teaches that we are all prey to confirmation bias, in which our brains fit data points neatly into what we want to see. That’s why we need the Cheetah Pauses, the strategic stops in our work, where we build time and attention into our decision making to pry open cognitive space to spot disconfirming data.
When it comes to complex, high-stakes decisions, we don’t want to match evidence to our most likely hypothesis. Instead, we want to ensure that alternative hypotheses receive equal treatment and a fair shake. Yes, at times we may get lucky and an analytic shortcut works out well, but in consequential decisions what is valuable is to ferret out key connections between data and hypotheses that are not always obvious or intuitive. We don’t want to run roughshod over an unexpected insight. The most probable hypothesis is not the one with the most confirming evidence; it’s usually the one with the least evidence against it.
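That last selection rule can be made concrete with a small sketch. This is purely illustrative, not part of the AREA Method itself; the hypothesis names and evidence ratings are hypothetical, chosen to show why counting disconfirming evidence can flip the answer that counting confirming evidence would give.

```python
# Illustrative sketch: pick the hypothesis with the LEAST evidence
# against it, rather than the one with the most confirming evidence.
# Ratings are hypothetical: +1 = consistent, 0 = neutral, -1 = inconsistent.

def least_disconfirmed(ratings):
    """Return the hypothesis with the fewest inconsistent evidence items.

    `ratings` maps each hypothesis name to a list of evidence ratings.
    """
    # Count only the inconsistencies; confirming evidence is ignored.
    against = {h: sum(1 for r in rs if r < 0) for h, rs in ratings.items()}
    return min(against, key=against.get)

ratings = {
    "Hypothesis A": [+1, +1, +1, -1, -1],  # lots of support, contradicted twice
    "Hypothesis B": [0, +1, 0, 0, 0],      # little support, nothing against it
}
print(least_disconfirmed(ratings))  # -> Hypothesis B
```

Counting confirming evidence would favor Hypothesis A; counting evidence against each hypothesis favors Hypothesis B, the one nothing contradicts.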
Here’s a link to the video of my panel at the conference: http://www.milkeninstitute.org/events/conferences/summit/london-2016/panel-detail/6625