
Psychology Finally Has an Explanation for Why False Stories Spread Like Fire on Twitter

The biggest threat to truth doesn’t come from technology. It comes from two natural human tendencies.

BY Wanda Thibodeaux - 09 Mar 2018

PHOTO CREDIT: Getty Images

If the past year or two has reaffirmed anything for us, it’s that you can’t believe everything you read, particularly on social media. Now experts say they’ve pinned down just how bad false stories are on Twitter, digging deep into the psychological elements fueling their rapid spread.

 

Research led by Soroush Vosoughi and published in the journal Science used six independent fact-checking organizations (e.g., Snopes, PolitiFact) to verify roughly 126,000 stories as either true or false between 2006 and 2017. When the researchers analyzed the Twitter archive for mentions of those stories, tracing how the information spread, they found that true news rarely reached more than 1,000 people, while false stories reached as many as 100,000. It wasn't just politics that had a problem, either: false stories spread faster than the truth in every category of information.

 

Two reasons we love to share the lies

 

The authors of the study hypothesize that false news stories spread further not because the people tweeting them had outsized influence, but because people are drawn to, and like to share, what's novel. This happens at the neurological level: the brain is constantly looking for what's different and unique, and when it finds it, you get a little hit of dopamine that prompts you to repeat the behavior. That's good in the sense that it keeps you curious and motivated to keep searching, but bad in the news context, because it makes you more likely to click on and promote whatever is most outrageous.

 

But there's another element that's pure psychology. The authors point out that we tend to see people with novel information as being in the know; that is, we see them as insiders. And we are desperate to be associated with insiders, and to be seen as insiders ourselves, because we don't want to be cut off from the larger group. We associate that isolation with pain, suffering and, at the extreme, with death and the inability to survive.

 

But couldn’t it be bots?

 

Actually, no. To test their novelty theory, the researchers homed in on the emotional responses people had to the stories that got shared. Replies to false stories showed fear, disgust and surprise; replies to true stories showed anticipation, sadness, joy and trust. But when the researchers used a bot-detection algorithm, they found that the bots didn't discriminate, spreading true and false stories equally. This suggests not only that people, rather than bots, are responsible for the minefield of news trash bombs we see, but also that our share rates connect to specific feelings. If you can scare, gross out or shock people, they're more likely to pass your information along.
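To make the emotion analysis concrete: studies like this typically map the words in replies to emotion categories using a word-emotion lexicon. The sketch below is a deliberately tiny, hypothetical stand-in for such a lexicon (the handful of words and their labels are invented for illustration, not taken from the study), showing how reply text can be reduced to an emotion profile that can then be compared between true and false stories.

```python
from collections import Counter

# Hypothetical miniature word-to-emotion lexicon, standing in for the
# much larger lexicons real studies use. These mappings are illustrative.
EMOTION_LEXICON = {
    "shocking": "surprise",
    "horrifying": "fear",
    "gross": "disgust",
    "hopeful": "anticipation",
    "heartbreaking": "sadness",
    "wonderful": "joy",
    "reliable": "trust",
}

def emotion_profile(replies):
    """Count how often each lexicon emotion appears across reply texts."""
    counts = Counter()
    for reply in replies:
        for word in reply.lower().split():
            emotion = EMOTION_LEXICON.get(word.strip(".,!?"))
            if emotion:
                counts[emotion] += 1
    return counts

# Comparing profiles for two (invented) sets of replies:
replies_to_false_story = ["Shocking news!", "This is gross and horrifying."]
replies_to_true_story = ["A wonderful, hopeful development.", "Seems reliable."]

print(emotion_profile(replies_to_false_story))  # skews surprise/disgust/fear
print(emotion_profile(replies_to_true_story))   # skews joy/anticipation/trust
```

In practice the lexicon would contain thousands of words, and the resulting emotion distributions for replies to verified-true versus verified-false stories are what get compared statistically.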

 

If bots aren’t the real issue, how should the fight shift?

 

In recent months, companies like Facebook have taken deliberate steps to reduce the number of false stories bots spread. The underlying assumption has been that if leaders can curb bots, they can get false stories under control. They want that because they know readers, customers and investors view companies as more trustworthy when their content is true. But this study suggests the problem lies more in biology and human nature than in any controllable systemic or technological issue.

 

The ethics questions for business leaders, then, are these: Should we go beyond looking primarily at data-source control and instead ensure that the content itself doesn't trigger powerful negative emotions or play to the desire for inclusion? Should we assume that any given story is false just because it makes us feel the way false stories so often do, understanding that reality is sometimes incredibly harsh? And if we do filter based on anticipated emotional response, how do we do that while still respecting free speech and taking cultural differences into account?

 

I don't have an answer. But I would argue that people take much more issue with not knowing they're being manipulated than with the manipulation itself. After all, take marketing: that's arguably manipulation at its finest, and we even teach others how to do it. The key is that people deserve to be aware. Transparency matters.
