Facebook recently announced that it is adding 3,000 hires to its nearly 5,000-strong news department in order to streamline what users can share on the site and to curb incidents of “fake news.” By monitoring posts more carefully, the thinking goes, it may be able to limit stories, videos and posts that contain misinformation or other repugnant content. But Facebook’s attempt to catch and control questionable news may be setting it up to be the Electoral College of social media. After all, the Electoral College is an appointed group of unelected people meant to simplify, but not override, the intentions of the voting populace.
The Electoral College was originally established to make sure that there was a group of citizens who could make better informed decisions than the masses, in part because at the time of our nation’s founding many Americans were illiterate and ill-informed about politics. In a sense, Facebook has made a similarly well-intentioned decision; senior management believes that, with training and its technology, its employees will be able to better identify what is accurate information and appropriately newsworthy.
Facebook has embraced a laudable, if paternalistic, goal in that it wants, like the Electoral College, to simplify and upgrade the integrity and quality of its process and results. We, the masses, have neither the skill nor the inclination to vet the truthfulness of news on our own. But the execution is very tricky and the process needs to be well thought out. How close can Facebook come to succeeding and at what cost?
It’s a tall order, in part because while the goal may be a fine one (who would want to promote violence or be duped by misinformation?), the series of decisions that must be made carefully along the way may expose Facebook to flaws in its thinking and its data. There is a cost-benefit tradeoff to Facebook’s simplification: it’s easy to make mistakes when trying to simplify a complex process.
First, Facebook is going to have to be careful that it isn’t substituting, or elevating, its own assumptions and judgements above those of the rest of us. After all, exorcising the site of undesirable content is by nature a subjective task. How does one arbitrate decency without being subjective? Where, and how, is the limit set?
Second, high-stakes decisions take time and attention, and yet Facebook’s staff is necessarily in a hurry to remove hurtful content. In its rush to move quickly, the company will want to be wary of expediency and of its own shortcuts. When we think quickly, we fall back on our natural cognitive biases to guide our actions. Given that we make some 40,000 decisions each day, everything from when to wake up to whether to have a late-night snack, we need these mental shortcuts to help us dispatch easy decisions quickly. Sadly, these well-worn cognitive pathways don’t switch off when we’re solving complex problems. How can Facebook pry open cognitive space for thoughtful reflection and good data collection when time is of the essence?
Yes, the company wants to create an algorithm that can help, or even one day supplant, its staff, but that too is predicated on the assumption that a subjective investigation can be conducted with alacrity. This is a problem because when we rush to judgement we often narrow our thinking. We actually benefit from modulating our pace to suit different kinds of thinking tasks; different states of mind benefit different kinds of decisions. When we’re in a rush, we may have trouble expanding our thinking and spotting disconfirming data.
That’s why I build strategic stops into my AREA Method decision-making system. I call these stops ‘Cheetah Pauses’ because the cheetah’s prodigious hunting skill is not due to its speed. Rather, it’s the animal’s ability to decelerate quickly that makes it a fearsome hunter.
Cheetahs habitually run down their prey at speeds approaching 60 miles per hour, but they can cut their speed by nine miles per hour in a single stride, a greater hunting advantage than being able to accelerate like a racecar. That deceleration allows the cheetah to make sharp turns, sideways jumps and changes of direction. We need that same flexibility, agility and maneuverability when it comes to decision making.
Having a good idea isn’t enough; there is an important gap between having ideas and making good decisions about what to do with them. It’s easy to invest in ideas, but to make sound decisions you have to vet them, weed out the lemons, and find a way to develop conviction and maintain faith when it takes time to see a decision’s impact. For Facebook, that might mean training its employees to recognize cognitive biases and mental shortcuts, and putting practical tools in place to control for and counteract them. It could then implement this new initiative with higher confidence and greater conviction that it is accomplishing its intended mission.
About the author
Cheryl Einhorn is the creator of the AREA Method, a decision-making system for individuals and companies to solve complex problems. Cheryl is the founder of CSE Consulting and the author of the book Problem Solved, a Powerful System for Making Complex Decisions with Confidence & Conviction. Cheryl teaches as an adjunct professor at Columbia Business School and has won several journalism awards for her investigative stories about international political, business and economic topics. areamethod.com; Twitter: @cheryleinhorn
And try out Cheryl’s new Problem Solved web-based app to make your big decision better! Initial results show that while the average session duration for direct traffic is 1 minute and 50 seconds, the average for the Problem Solved app is about 10 minutes, with users averaging 6.5 pages per session, showing both strong longevity and engagement with the app. You can find it on the homepage at areamethod.com by clicking the big orange button that says Make Your Big Decision Better.