Why Are We Our Own Worst Enemy When It Comes To Misinformation?

When it comes to pinpointing the cause of misinformation, it’s easy to point the finger at traditional and new media alike, or even at malign governments deliberately attempting to distort the debate.  We don’t tend to apportion blame to ourselves quite as often, but a new study from Ohio State University suggests that may be a mistake.

It shows that even when we’re given accurate data on an inherently controversial topic, we often misremember those numbers and instead revert to our pre-existing beliefs.  What’s more, the numbers often become more and more distorted each time we pass them along.

“People can self-generate their own misinformation. It doesn’t all come from external sources,” the researchers say.  “They may not be doing it purposely, but their own biases can lead them astray. And the problem becomes larger when they share their self-generated misinformation with others.”

The spread of misinformation

The researchers conducted a pair of experiments.  In the first, participants were presented with short descriptions of a range of societal issues, each involving numerical information.

For some of these issues, such as support for same-sex marriage, the volunteers’ perceptions broadly matched reality; for others, such as the number of Mexican immigrants in the United States, perceptions were wide of the mark.

When quizzed on the numbers in each of the descriptions, the results showed that people were generally okay at remembering the stats on issues that chimed with their view of the world, but rubbish at remembering those that didn’t.

So why was this poor recall so common?  It wasn’t down to a lack of attention, as eye-tracking showed that people were reading the content closely enough.  What the recordings did reveal, however, was a difference in behavior depending on whether the stats confirmed participants’ views or not.

“We could tell when participants got to numbers that didn’t fit their expectations. Their eyes went back and forth between the numbers, as if they were asking ‘what’s going on.’ They generally didn’t do that when the numbers confirmed their expectations,” the researchers say.  “You would think that if they were paying more attention to the numbers that went against their expectations, they would have a better memory for them. But that’s not what we found.”

Viral misinformation

The second study then explored how these incorrect recollections become more distorted each time we share them.  It resembled the game of telephone, with volunteers passing a set of statistics down a chain.  The first person in the chain was given the correct information, had to write it down from memory, and then passed it on, with the next person doing likewise, and so on.

You might imagine the first person in the chain would be all but guaranteed to pass on the right information, but that wasn’t the case; they would usually alter the figures so that they fit their preconceptions.  This misinformation then became more and more distorted the further down the chain it spread.

“These memory errors tended to get bigger and bigger as they were transmitted between people,” the researchers say.

While the findings come from a purely experimental setting, and therefore may not apply strictly to more real-world environments, the researchers nonetheless believe they give us grounds for concern, especially in terms of the role we play in the spread of misinformation.

“We need to realize that internal sources of misinformation can possibly be as significant as or more significant than external sources,” they conclude.  “We live with our biases all day, but we only come into contact with false information occasionally.”
