January 24, 2017

Oxford Dictionaries’ 2016 word of the year is “post-truth.” The expression has been around for a while, but usage spiked after Brexit and throughout the U.S. presidential election.

President Obama reminded us in his farewell speech that we now live in a post-truth world. “Increasingly we become so secure in our bubbles that we accept only information, whether true or not, that fits our opinions, instead of basing our opinions on the evidence that’s out there.” Referring to climate change, the president lamented that “without some common baseline of facts – without a willingness to admit new information and that your opponent might be making a fair point and that science and reason matter – we’re going to keep talking past each other.”

This problem is not new; it has just reached a tipping point. “Confirmation bias” is the tendency to perceive information in a way that confirms one’s preexisting beliefs. The term was coined by English psychologist Peter Wason in 1960, and the effect has been documented by many researchers since. Daniel Kahneman was the first psychologist to win the Nobel Prize in Economics, for work in decision-making and behavioral science conducted with his colleague Amos Tversky. Kahneman pretty much predicted our post-truth predicament and fake news obsession in his 2011 book, Thinking, Fast and Slow: “Contrary to the rules of science, people seek data that are likely to be compatible with the beliefs they currently hold.”

In other words, when we read an article or an email, watch the news, or hear a presentation, we unconsciously cherry-pick data and focus on information that agrees with our existing position. And we dismiss information that doesn’t fit that position. Dr. Kahneman also observed, “People generally look for a plausible scenario that conforms to their understanding of reality. We are unable to imagine…installing a third-party president.” It’s safe to say most of us did not imagine a Trump presidency a year ago.

The rise of unabashed “post-truth” thinking is more evident than ever. People effectively cease to see what’s in front of them. Even as global temperatures keep climbing, for example, many people deny the existence of climate change. And many simply could not imagine a Trump presidency, so they ignored the signs pointing to one.

Some in the news arena have taken advantage of this bias, intentionally publishing “fake news” and highly one-sided content, targeting eager audiences. This appeals to our confirmation biases – there’s no need to swat away information we don’t agree with; it simply isn’t there.

Most of us aren’t in a position to influence news organizations or political dynamics. But many of us are in business leadership positions, and we live in times of fast and fundamental transformation. During then-President-Elect Trump’s recent news conference, the stock market fluctuated wildly as he singled out the pharma, auto, and defense industries. Cutting defense spending, taxing infrastructure investment in foreign countries, and slashing pharmaceutical prices are all on the table.

Is your leadership team ready to manage in a post-truth world?

I’ve spent over a decade working with corporate leaders to apply behavioral economics to transformation. To survive, we will have to get employees to change the way they work, probably more than once. Managing big change is already a tall order, but understanding behavioral bias can help us navigate these choppy waters.

What can we do based on our understanding of confirmation bias?

  • We know employees will approach any change with a bias – for or against. We know that the bias will act like a filter – causing people to accept and believe our messages only if they fit that bias. Often their bias is based on history – they will look for past experiences similar to whatever it is we want them to do. The last system implementation? The last wave of policy changes? The last layoff? We should give them the right bias. Get ahead of it – offer employees the thing we WANT them to compare this change to. Compare the change to something that went well. Or tell a compelling story of a company that did the same thing and came out shining. “Remember BigBang 2010? We all pulled together and it was a great success – our company grew and everyone was pumped. Now get ready for BigBang 2018! We’re getting ready for a whole new level of excellence.”
  • We know employees will also have a bias based on the team involved. We should engineer the face of the change to use that bias. Did consultants lead the last painful transition? The bias: consultants are bad. No amount of talking will change that – your praise of this new consulting firm will bounce right off them. So put popular and influential employees out front. Did employees learn about that system that failed through all-hands PowerPoint presentations? The bias: big presentations mean bad news. You could put the best news in the world into a deck and they’d still be suspicious. So use a channel they already like – maybe their own small team meetings, or the celebratory company off-site.

Michael Lewis, in his new book, The Undoing Project, brings a fresh perspective to Kahneman and Tversky’s work. He writes that when people become attached to a theory, they fit the evidence to the theory rather than the theory to the evidence. The key to leading a post-truth workforce is not a barrage of facts and rationale. Don’t combat bias directly – use it strategically and everyone wins.

Christian is Vice President, Consulting for Emerson Human Capital. He has led enterprise-wide transformational retail consulting projects for Gallup and Accenture and worked in global development for Walmart and Metro AG.