An inviting narrative can help bridge political barriers, The Undercover Economist author Tim Harford tells Brunswick’s Carlton Wilkinson
In recent years, the escalation of lies and the misuse of information in public life, particularly in the US and the UK, has come to resemble a riot: old rules suddenly don’t apply, institutional authorities are powerless, and normally law-abiding people loot ideas for their own benefit. In that arena, fact checking is a water pistol, a provocation to an angry few and a minor annoyance to the rest.
“Fact checking alone won’t be much good, and in fact might do some harm,” says Tim Harford, an economist, host of the BBC Radio program “More or Less,” a Financial Times journalist and author of the best-selling book The Undercover Economist.
In a recent interview, Harford acknowledges the critical need for fact checkers in the journalistic infrastructure. But, he adds, “It’s not enough.”
Ironically, what can quell the riot – or at least allow sane voices to be heard – isn’t more instruction or enforcement, Harford believes, but an appeal to the public’s sense of curiosity and adventure. This is something all good communicators know but few can quantify: a story that draws readers in and lets them find their own emotional and intellectual connection is better by far than the most compelling fact sheet.
As host of “More or Less,” Harford analyzes the numbers used in public debate. Beginning with the heavily polarized 2010 UK general election, which produced “a huge number of statistical claims on either side,” the economist frequently found himself in the role of fact checker. In a Financial Times article, “The problem with facts,” he reflects on how fact checking and political countermeasures went wrong.
Harford lists three major hurdles on the way to countering a falsehood. First, a lie is often easier to understand and more attractive than the truth. Second, arguing means you repeat the lie, giving it greater traction in the very effort to refute it.
And third, more facts can backfire, producing a defensive reaction that causes people to dig in on their original beliefs. Our personal identities are shaped by our shared beliefs and closely tied to tribal instincts, so challenges can feel deeply threatening.
“One might argue that the most potent of all is indifference,” Harford says, in our interview. “But once you get past that initial barrier – that a lot of people just don’t read the news at all – then tribalism is a very potent and worrying force. The idea that somebody might be working from a totally different set of presumptions and perceptions about what constitutes agreed knowledge, what constitutes a reliable news source – that’s profoundly unsettling. Partly because we don’t see it in ourselves.”
He describes this as the George Carlin effect.
“You know Carlin’s sketch, where you’re driving along and anybody who overtakes you is a maniac and anybody who’s going slower than you is an idiot – well, of course, because you are driving at what you think is the correct speed.”
People not only resist being corrected; when a faulty belief is challenged, they may even deepen their loyalty to it.