
How the truth can win

An inviting narrative can help bridge political barriers, The Undercover Economist author Tim Harford tells Brunswick’s Carlton Wilkinson

The escalation of lies and the misuse of information in public life in recent years, particularly in the US and the UK, resembles a riot: old rules suddenly don’t apply, institutional authorities are powerless, normally law-abiding people are looting ideas for their own benefit. In that arena, fact checking is a water pistol, serving as a provocation to an angry few and a minor annoyance to the rest.

“Fact checking alone won’t be much good, and in fact might do some harm,” says Tim Harford, an economist, Financial Times journalist, host of the BBC Radio 4 program “More or Less” and author of the best-selling book The Undercover Economist.

In a recent interview, Harford acknowledges the critical need for fact checkers in the journalistic infrastructure. But, he adds, “It’s not enough.”

Ironically, what can quell the riot – or at least allow sane voices to get noticed – isn’t more instruction or enforcement, Harford believes, but an appeal to the public’s sense of curiosity and adventure. This is something that all good communicators know but few can quantify: a story that draws readers in and allows them to find their own emotional and intellectual connection is far better than the most compelling fact sheet.

As host of “More or Less,” Harford analyzes the numbers used in public debate. Beginning with the heavily polarized 2010 UK general election, which saw “a huge number of statistical claims on either side,” the economist frequently found himself in the role of fact checker. In a Financial Times article, “The problem with facts,” he reflects on how fact checking and political countermeasures went wrong.

Harford lists three major hurdles to countering a falsehood. First, a lie is often easier to understand and more attractive than the truth. Second, arguing against a lie means repeating it, giving it greater traction in the very effort to refute it.

And third, more facts can backfire, producing a defensive reaction that causes people to dig in on their original beliefs. Our personal identities are shaped by our shared beliefs and closely tied to tribal instincts, so challenges can feel deeply threatening.

“One might argue that the most potent of all is indifference,” Harford says, in our interview. “But once you get past that initial barrier – that a lot of people just don’t read the news at all – then tribalism is a very potent and worrying force. The idea that somebody might be working from a totally different set of presumptions and perceptions about what constitutes agreed knowledge, what constitutes a reliable news source – that’s profoundly unsettling. Partly because we don’t see it in ourselves.”

He describes this as the George Carlin effect.

“You know Carlin’s sketch, where you’re driving along and anybody who overtakes you is a maniac and anybody who’s going slower than you is an idiot – well, of course, because you are driving at what you think is the correct speed.”

People not only resist being corrected about their beliefs, but even increase their loyalty to a faulty idea when it is challenged.


“Somebody thinks the flu vaccine causes flu, for example,” he says. “You can show them the information on the Centers for Disease Control and Prevention website. And they may accept it. ‘OK, I get it. Now I realize the flu vaccine doesn’t cause flu.’ Yet they’re as resistant as they ever were to getting their flu vaccine – maybe more resistant. It’s because they feel threatened by the whole conversation.”

Worse, increased knowledge about a subject only seems to deepen existing divisions. “More information just gives you more ammunition to believe what you wanted to believe,” Harford says. In effect, the trap tightens the more we struggle to break free. But another innate aspect of human psychology may offer a way out: curiosity.

Harford points to a group led by Dan Kahan at Yale that is studying the role of scientific curiosity in communication.

“Scientifically literate Republicans and scientifically literate Democrats are even further apart on politicized issues like climate change,” Harford says. “But scientific curiosity doesn’t have that effect. You don’t see increasing polarization with increasing science curiosity.”

The opposite, in fact: those more curious about science are more inclined to accept information that challenges their current world view.

In a 2010 article in the Journal of Science Communication, Kahan found that disentangling people’s factual beliefs from their identities as members of a cultural group was the key to communicating challenging scientific findings. In follow-up research published in January of this year, his team compared groups with differing levels of science curiosity and found that curiosity seemed to counteract the tribal pull toward balkanized beliefs, allowing people to be more receptive to information.

"A lie is often easier to understand and more attractive than the truth"

Anecdotally, Harford felt these findings resonated with his work on “More or Less.” The more narrowly defined, fact-check-style programs his team produced didn’t feel very satisfying and didn’t provoke much of a response from listeners.

“We weren’t really getting them engaged with the world,” he says. “And yet there were other stories we would do where we would take a statistical claim, and we’d tell a story, we’d pose a puzzle – a little bit of a mystery – and then we’d take them on a journey, and show how the world worked. And it always seemed to be a much more satisfying form of journalism.”

The appeal to curiosity solves two problems at once, Harford says. First, if you can create a puzzle or a mystery to solve, it piques interest and helps solve the indifference problem. Second, as Kahan’s research has shown, “you might potentially also make some headway against the polarization problem – the idea that I’m just going to believe what my tribe believes.”

Kahan insists that his results are “provisional” and more research is required. Harford also emphasizes that as far as implementing and refining these insights go, “it’s still early days.”

Yet this approach offers a ray of light in a dark time: the possibility of making a discussion constructive rather than polarizing. If nothing else, it encourages the communicator to think more deeply about how the message is being received.

“Just this idea of trying to get people intrigued by the process of scientific exploration, not just to say, ‘Well, we talked to experts and experts say you were wrong,’” Harford says.

Rather, the message should be, “there’s this new research, and it’s puzzling, so come with me and let’s talk about why this is puzzling, and why this challenges our pre-existing views,” he says. “Just get people coming on for the ride. That’s exactly what we need.”

Carlton Wilkinson is Managing Editor of the Brunswick Review, based in New York. He is a former editor and award-winning columnist for TheStreet.

Illustration: Edmon de Haro
