Researchers need to understand how deep-seated mental biases shape people’s response to information, says Bobby Duffy.
Italians think that 26 per cent of their population are immigrants, when the reality is around 10 per cent. The French think 28 per cent of their population are Muslim, when it is around 9 per cent. Across more than 30 countries, only 15 per cent of people think the national murder rate has fallen since 2000, when it is actually down substantially in the vast majority of them.
Huge gaps between perception and reality are among the most pressing social challenges of our time. They are not driven only by the dodgy campaign messages and “alternative facts” brought to the fore by Brexit and United States president Donald Trump. In just about every country there has been an increase in tribal and polarised claims with little connection to the truth.
These gaps are the subject of my book, The Perils of Perception, which, based on more than 100,000 interviews in about 40 countries, outlines what we get wrong, why, and what we can do about it. They are also the subject of a debate that will be held at King’s College London’s Policy Institute on 21 February.
The temptation is to cry “fake news” and “post-truth”, blaming the media, social media and politicians. But that is not the whole story. It is partly about deep-seated biases in how we think.
One such bias is our attraction to negative information. There is an evolutionary element to this. Bad news tends to be more urgent: we needed to take heed when our fellow cave people raised the alarm about a lurking sabre-toothed tiger, and those who did not risked leaving the gene pool.
Many experiments show that our brains handle negative information differently and store it more accessibly. We react more strongly to negative images, such as mutilated faces or dead cats, than positive ones and process them with different intensity in different parts of the brain.
This does not mean the media never twists our perceptions, just that to some extent we get the media our brains demand. But it is also true that we are living in dangerous times for a reality-based view of the world.
The way we consume information has changed beyond recognition. We—and unseen algorithms—can filter and tailor what we see in a way never before known.
This plays on another psychological quirk: confirmation bias. We want “facts” that confirm pre-existing views, and avoid or discredit contrary information. If surveillance is the business model behind apparently free web services, confirmation bias is its currency.
So what can we do, as social scientists and individuals? The first point is to hold on to the importance and power of a shared understanding of facts. Some studies suggest that giving people the correct information reinforces their misperceptions. But just as many studies show that most people are willing to listen, and shift their views—particularly if we can get in first, and not just try to correct after the (fake) fact. We are not all automatons, slaves to our tribal beliefs.
As researchers, we focus on outlining clear facts; but we need to tell the story, too. We are storytelling animals; the emotional connections forged by narratives about individuals affect us deeply. Simple myth-busting will have a limited impact, because it misdiagnoses the issue: our misperceptions are often emotional rather than informational. But facts and stories are not opposites: both can move people, and we do not need to abandon the truth in order to tell the story as well.
At a more personal level, researchers’ stories should start from a basis that, on many social issues, reality is better than our instincts suggest. The late social scientist Hans Rosling and the Gapminder foundation, which he founded to promote a fact-based worldview, have made this point brilliantly on global issues such as poverty. The same applies to myriad domestic concerns.
There is criticism of this more positive perspective, questioning whether we should really be so pleased about what has been achieved. But misperception studies show that despair is a greater risk than complacency.
Evidence must help counter the sense that all is already lost, because a sense of hope and efficacy is important to encourage further action—and to defend against extremists who say things are so bad we need to rip it all up.
This is not the same as saying that everything is perfect, or that we could not have done more. But we need to get good news stories out as well as bad—and we should be suspicious of those playing on our biases to try to convince us that everything has gone wrong.
Bobby Duffy is professor of public policy and director of the Policy Institute at King’s College London. He is the author of The Perils of Perception: Why We’re Wrong About Nearly Everything (Atlantic Books, 2018). He will be speaking at a panel debate on this topic at King’s on 21 February.
This article also appeared in Research Fortnight