Recent polls indicate that 2% of Americans believe the earth is flat.
“My daughter is an aerospace engineer, and part of her job is to dock the Dragon vehicle with the International Space Station. So that means six million people believe her job doesn’t exist,” moderator Holly Fitch, UConn professor of behavioral neuroscience, told the 60-some researchers, scientists, communicators, and community members at Konover Auditorium.
The audience was kicking off its Memorial Day weekend with a panel discussion titled “Presenting Science to the Public in a Post-Truth Era,” grappling with the challenge of how, in this time of sweeping science denial, scientists can craft messages about their research that not only inform but also persuade.
One key is to establish commonality, agreed the panelists.
“If you are communicating science to the public, it’s a good idea to highlight your similarities to those in the room,” said Tali Sharot, professor of cognitive neuroscience at University College London, UK, whose TED talk on “optimism bias” has 2.3 million views and counting.
“There are always similarities, and then they are more likely to listen to what we have to say,” she noted. “If we want people to believe what we say, we must relate it to what they believe.”
When talking to anti-vaxxers, for instance, Sharot said, better inroads are made by stressing the fact that these vaccines protect children from measles, mumps, and rubella – a point of agreement with your audience – than by stressing the fact that there is no evidence of a link between autism and vaccines, a point of significant disagreement.
“Reframe the message to fit with the biases people already have,” she said.
Sharot’s research indicates that we humans tend to seek out and believe information from sources we identify with ideologically and politically to a far greater extent than we might guess – even when those sources are not “the experts in the room.” We tend to gravitate to sources who share our political beliefs, even in entirely apolitical situations.
She described a study confirming just that in a 2018 New York Times column titled, “Would You Go to a Republican Doctor?” The 1,000-plus comments further underscore the findings.
Sharot’s research also shows the extent to which we humans are susceptible to confirmation bias: no matter the source, we are more likely to seek out and believe information that supports what we already believe, or what we want to believe.
“If you perceive the scientist as not sharing your interest, you won’t pay attention,” agreed Åsa Wikforss, professor of theoretical philosophy at Stockholm University, Sweden, whose best-selling book Alternativa fakta. Om kunskapen och dess fiender (Alternative Facts. On Knowledge and its Enemies, Fri Tanke förlag, 2017) is being translated into English.
In this era of post-truth, “objective facts are less influential in shaping public opinion,” said Wikforss, who is an expert on, among other things, “knowledge resistance” – researching why humans are so likely to resist scientific evidence.
“Belief drives action, so knowledge resistance matters,” she said, noting that Europe saw a 300% increase in measles in 2018 and 75 people died.
Trust can be exploited to make us believe what is false and disbelieve what is true, sowing doubt in reliable sources, she said: “I often say that post-truth is post-trust.”
Wikforss turned to climate change and the ways scientific language is misunderstood.
“If you do science communication,” she said, “you should do this: Explain what consensus is – the convergence of data – and its evidential value; that it’s not just ‘everyone agrees.’ And explain how scientific institutions work – the internal mechanisms that counter bias and error.”
Human nature works against us, all the panelists agreed. “We are suckers for fake news,” said Wikforss.
UConn speaker Michael Lynch, Board of Trustees Distinguished Professor of Philosophy and director of the UConn Humanities Institute, delved into why fake news is so prevalent and why we are such suckers for it.
Lynch, who researches the role of new media, said it is important to understand what we are doing and what we are not doing when we post and re-post on social media.
The author of The Internet of Us: Knowing More and Understanding Less in the Age of Big Data (W.W. Norton & Co. Inc., 2017), he said the data suggest we are not reading what we re-post, and that we share the content that “gets us riled up.”
Social media platforms operate on emotion. We think we are sharing information. But, said Lynch, “it’s doubtful the primary function of the communicative act of sharing a post is to convey factual information if (a) we don’t really know what it is and (b) what predicts our sharing of it is emotional effect.”
This, he suggested, is what makes information polluters so effective. They know and understand the game that’s being played. Scientists need to as well.
As the presentations ended, a cartoon appeared on the screen. One panel read “Truth: I think therefore I am.” The other read “Post-Truth: I believe therefore I’m right.”
The event was sponsored jointly by UConn’s Science of Learning & Art of Communication (SLAC) program and the University of Connecticut Humanities Institute (UCHI).