It’s as if each side is living in a different country—and a different reality.
In fact, over the last few months, a handful of liberal-leaning sites have begun fixating on what they’ve dubbed the “reality gap”: the tendency of Donald Trump’s supporters to endorse misinformation about political and economic issues. Sixty-seven percent of Trump voters, for instance, believe that unemployment has gone up under President Obama’s administration. (It hasn’t.) Up to 52 percent believe that Trump won both the electoral college and the popular vote in the 2016 election. (He didn’t.) And 74 percent of Trump supporters believe that fewer people are insured now than before the implementation of the Affordable Care Act. (More are.)
But this unfairly casts conservatives as being blind to reality. In fact, people across the political spectrum are susceptible. Consider that 54 percent of Democrats believe that Russia either “definitely” or “probably” changed voting tallies in the United States to get Trump elected. Although investigations are ongoing, so far there has been no evidence of direct tampering with voter records.
Many are at a loss when trying to explain these findings and have blamed a combination of “fake news,” politicians, and slanted media.
Certainly misleading media reports and hyperpartisan social media users play a role in promoting misinformation, and politicians who repeat outright falsehoods don’t help. But research suggests something else may be going on, and it’s no less insidious just because it can’t be blamed on our partisan enemies. It’s called information avoidance.
"I Don’t Want to Hear It"
Social scientists have documented that all of us have a well-stocked mental toolkit to ward off any new information that makes us feel bad, obligates us to do something we don’t want to do, or challenges our worldview.
These mental gymnastics take place when we avoid looking at our bank account after paying the bills or put off scheduling that long-overdue doctor’s appointment. The same goes for our political affiliations and beliefs: if we’re confronted with news or information that challenges them, we’ll often ignore it.
One reason we avoid this sort of information is that it can make us feel bad, either about ourselves or more generally. For instance, one study found that people didn’t want to see the results of a test for implicit racial bias when they were told that they might subconsciously have racist views. Because these results challenged how they saw themselves—as not racist—they simply avoided them.
Another series of experiments suggested that we’re more likely to avoid threatening information when we feel like we don’t have the close relationships and support system in place to respond to new problems. Patients who felt like they lacked a supportive network were less likely to want to see medical test results that might reveal a bad diagnosis. Students who lacked a large friend group or strong family ties didn’t want to learn whether or not their peers disliked them. Feeling like we lack the support and resources to deal with bad things makes us retreat into our old, comforting worldviews.
No Problem? No Need for a Solution
In other cases, people don’t want to acknowledge a problem, whether it’s gun violence or climate change, because they don’t agree with the proposed solutions.
For instance, in a series of experiments, social psychology scholars Troy Campbell and Aaron Kay found that people are politically divided over scientific evidence on climate change, environmental degradation, crime, and attitudes toward guns because they dislike the potential solutions to these problems. Some don’t want to consider, say, government regulation of carbon dioxide, so they simply deny that climate change exists in the first place.
In the study, participants read a statement about climate change from experts paired with one of two policy solutions, either a market-based solution or a government regulatory scheme. Respondents were then asked how much they agreed with the scientific consensus that global temperatures are rising.
The researchers found that Republicans were more likely to agree that climate change is happening when presented with the market-based solution. Democrats tended to agree with the consensus regardless of the proposed solution. The researchers suspect that if the solution to climate change is framed in terms that don’t go against Republican free-market ideology, Republicans will be more willing to accept the science.
In other words, people are more willing to accept politically polarizing information if it’s discussed in a way that doesn’t challenge how they view the world or force them to do something they don’t want to do.
Doubling Down on a Worldview
To return to Trump’s supporters: many identify strongly with him and see themselves as part of a new political movement. For this reason, they probably want to avoid new findings suggesting that their movement isn’t as strong as it appears.
Remember those findings that many Trump supporters believe that he won the popular vote? Among Trump supporters, one poll suggests that 52 percent also believe that millions of votes were cast illegally in the 2016 election, a claim Trump himself made to explain his popular vote loss.
Accepting that their candidate lost the popular vote challenges deeply held beliefs that the nation has come together with a mandate for Trump’s presidency and policies. Information that conflicts with this view—that suggests a majority of Americans don’t support Trump, or that people protesting Trump are somehow either “fake” or paid agitators—poses a threat to these worldviews. As a result, his supporters avoid it.
Information avoidance alone doesn’t explain why different people believe different things, how misinformation spreads, or what can be done about it.
But ignoring the effects of information avoidance and discussing only ignorance and stubbornness do us all a disservice by framing the problem in partisan terms. When people on the left believe that only right-wingers are at risk of bending the facts to suit their opinions, they become less skeptical of their own beliefs and more vulnerable to their own side’s misconceptions and misinformation.
Research suggests there are three ways to combat information avoidance. First, affirmation, or making people feel good about themselves before asking them to confront threatening information, has proven effective. Next, it’s important to give people a sense of control over what they get to do with that information. And lastly, people are more open to information if it’s framed in a way that resonates with how they see the world, their values, and their identities.
It’s crucial to recognize the all-too-human tendency to put our fingers in our ears when we hear something we don’t like. Only then can we move away from a media and cultural environment in which everyone is entitled to not just their own opinions but also their own facts.
Lauren Griffin is Director of External Research for frank, College of Journalism and Communications, University of Florida. Annie Neimand is Research Director and Digital Strategist for frank, College of Journalism and Communications, University of Florida.