In life, things are not always as they seem. But in today’s fast-paced world, we tend to rush to judgment. We let assumptions and biases guide our actions and our perception of the world.
War provides a stark example of the false stories we accept all too quickly, and with deadly consequences. The New York Times recently published a fascinating article on the Russian military’s growing use of deceit and disguise, “a repertoire of lethal tricks known as maskirovka, or masking.”
The story details Russia’s use of false narratives, everything from flying inflatable decoys that look like MiG-31 fighter jets armed with missile launchers to disguising its soldiers as humanitarian workers, in an effort to capitalize on the mental shortcuts we rely upon and so terrorize its enemies.
Maskirovka is “designed to manipulate the adversary’s picture of reality, misinform it and eventually interfere with the decision-making process of individuals, organizations, governments and societies,” Dima Adamsky, an expert on Russian psychological warfare, wrote in a paper published last year. The opening moves, if played well, will “appear benign to the target.”
One tragic anecdote: a deadly incident in Georgia in the aftermath of the 2008 war with Russia, when a group of soldiers failed to reconcile what they saw with what it meant. The officers were part of a team sent to clear a battlefield of a small, unexploded, yellow-painted surveillance drone that had landed in an apple orchard. It seemed a harmless object. Indeed, the Times writes, “so many drones had crashed in the area that the Georgians had taken to snickering at their shoddy construction.”
Yet when one of the officers picked up the drone, it suddenly became apparent that it was armed with explosives. Two men were killed, and eight others, including Georgia’s foreign minister, were wounded.
What happened? The foreign minister said that the earlier crashes had desensitized the soldiers to danger. “This was a trick,” he told the Times. “We thought they were of poor quality, but they were crashing them intentionally.”
The officers didn’t pause to look beneath the surface. They made an assumption based on their observation, and that instinctive reaction cost lives. They didn’t stop to consider alternative explanations for why so many drones were lying in the field. They missed the plot twist. They thought they saw inferior drones, failed to consider the Russians’ guiding incentives, and so missed the evil strategic intent.
Why, especially when dealing with an enemy, didn’t the officers have their antennae up and ask questions? Was it because an apple orchard didn’t fit their idea of a battlefield? Might the sheer number of drones in the field have reinforced the perception of poor quality?
How could the officers have protected themselves, and how can we protect ourselves, from falling victim to a false narrative? We need not only to collect good information but also to follow a good process for making sound decisions: a refined and fully developed system for critical thinking. That’s why I developed AREA, to help me limit my decision-making mistakes. For while most of us, thankfully, will never face Russian maskirovka, we don’t want to be blind to the fact that in life things are often not as straightforward as they seem. Instead we want to be able to work with, and through, ambiguity to make thoughtful, confident decisions despite our uncertain world.