Blame excess information for spread of fake news


A new study reveals the mathematics underlying this phenomenon, modelling how information overload can erode an individual's ability to distinguish high-quality information from its opposite, causing falsehoods to propagate

It's hard to tell the difference between fabricated and legitimate news as both freely mix in our feeds online

By Eoin O'Carroll (Virtual insanity)


Published: Wed 28 Jun 2017, 9:00 PM

Last updated: Wed 28 Jun 2017, 11:46 PM

A lie can travel halfway around the world, goes the well-known Mark Twain quote, before the truth can get its boots on.
Twain himself might have appreciated this quotation's self-reflexivity: There's no record of him ever having said or written it.
Today, with half of Americans now turning to social media for news, many of us are getting misinformation - for instance, that NASA has contacted intelligent extraterrestrials, that a "breatharian" couple can survive on a "food-free lifestyle" - mixed in with the legitimate news articles in our feeds. And, as the news cycle accelerates, it's becoming harder to tell the difference.
A new study reveals the mathematics underlying this phenomenon, modelling how information overload can erode an individual's ability to distinguish high-quality information from its opposite, causing falsehoods to propagate. But with a little effort, readers and social media platforms can cut the information surplus, perhaps sharpening our powers of discernment.
"On a daily basis," says Daniel Levitin, a professor of psychology and behavioural neuroscience at McGill University in Montreal, "the onslaught of information is preventing us from being evidence-based decision makers, at our own peril."
Misinformation is as old as culture itself, and the phenomenon uncovered in this study shows its spread is not limited to one kind of social media.
"Many arguments around gossip and rumours are really driven by the same social mechanisms," says Brian Uzzi, the co-director of Northwestern University's Institute on Complex Systems in Evanston, Ill. "The internet has essentially turbocharged the inclination of human beings to behave this way in regard to news and facts."
A paper published in the journal Nature Human Behaviour by an international team of researchers offers a mathematical model that demonstrates that, as information load increases, so do the odds that low-quality information will go viral.
"It was the first paper I've seen in this area that quantifies what many people thought was happening, and that's basically with limited attention we're unable to see the full range of potential arguments or sides of the story," says Dr Uzzi, who has studied how social media users isolate themselves into echo chambers.
Using mathematical modelling, a team led by Xiaoyan Qiu and Diego Oliveira of Indiana University's Center for Complex Networks and Systems Research statistically confirmed what many have suspected: When flooded with a steady stream of high- and low-quality information, even the most critical readers start to lose their ability to tell fact from fiction.
"Even when individual users can recognise and select quality information," says study co-author Filippo Menczer, a professor of informatics and computer science at Indiana University, "the social media market rarely allows the best information to win the popularity contest."
The researchers suggest that social networks could curb information overload by aggressively limiting content shared by so-called bot accounts, software agents that flood social networks with low-quality information.
"Deceptive bots can be quite sophisticated and hard to recognize even for humans. And huge numbers of them can be managed via software, so it is difficult for operators to keep up," says Dr Menczer.
The research reveals some of the math that drives what psychologists have long known: Information overload makes it harder to make decisions. "The key point of this article is that what neuroscientists have been showing on the biology side has very practical, real-world implications in our daily lives," says Dr Levitin, the author of Weaponized Lies: How to Think Critically in the Post-Truth Era.
Levitin notes that the average American is exposed to about five times as much information as in 1986. "In the old days, I'd get the newspaper in the morning and I'd read about what happened yesterday," he says. "Now, everyone seems to be addicted to what happened five minutes ago."
Levitin recommends unplugging from the internet for a couple hours each morning and again each afternoon. "If you're constantly checking your phone for the latest news, you're allowing your thoughts to become disrupted and fractionated and it becomes harder and harder to concentrate, and you get addicted to this constant stimulation," he says. "So I think what we can do is give ourselves a break."
But taking a break can be difficult for people who have become accustomed to steady social media contact with friends and family. "Trouble arises when we use the same networks to access news," says Menczer, who advises against defriending or unfollowing those with different opinions, since doing so creates echo chambers that make users more susceptible to misinformation.
"We hope that by now, citizens and policymakers from across the political spectrum recognise the need for research to study digital misinformation and how to make the web more reliable," says Menczer. "We are all vulnerable to manipulation irrespective of our political leanings."
- The Christian Science Monitor
 

