The World Wide Web and social media have opened up a wealth of avenues for information creation and consumption. But are people really exposing themselves to divergent information sources?
A study published in Proceedings of the National Academy of Sciences by Italian and U.S. researchers claims that despite the wide availability of information, users surround themselves with sources that align with and bolster their personal beliefs—just as they do with other forms of media—creating an information echo chamber.
“Using a massive quantitative analysis of Facebook, we show that information related to distinctive narratives—conspiracy theories and scientific news—generates homogenous and polarized communities having similar information consumption patterns,” the researchers write.
The study was produced by researchers from Boston University, Sapienza University of Rome, and the IMT School for Advanced Studies Lucca.
Using data derived from the Facebook Graph application programming interface, the researchers collected information from 67 public pages: 32 about conspiracy theories and 35 about science news. A second dataset comprised two troll pages, which intentionally spread “sarcastic false information” around the Web. All the posts and subsequent user interactions between 2010 and 2014 were downloaded and fed into analysis software.
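The study does not publish its collection code, but a data pull like the one described would typically mean requesting each page's posts from the Graph API over the 2010–2014 window. The sketch below (page name, token, and API version are illustrative assumptions, not details from the study) shows how such a request URL might be built:

```python
import urllib.parse

# Hypothetical Graph API base; the version current at the time of the study is assumed.
GRAPH_BASE = "https://graph.facebook.com/v2.0"

def posts_url(page_id: str, token: str, since: int, until: int, limit: int = 100) -> str:
    """Build a Graph API URL for a public page's posts within a time
    window, given as UNIX timestamps (since/until)."""
    params = urllib.parse.urlencode({
        "access_token": token,
        "since": since,
        "until": until,
        "limit": limit,
    })
    return f"{GRAPH_BASE}/{page_id}/posts?{params}"

# Illustrative window: 2010-01-01 to 2015-01-01 UTC, matching the study period.
url = posts_url("examplepage", "ACCESS_TOKEN", 1262304000, 1420070400)
```

In practice the API returns paginated JSON, so a real collector would follow each response's next-page link until the window is exhausted, then repeat for all 67 pages.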
“Our findings show that users mostly tend to select and share content related to a specific narrative and to ignore the rest,” they write. “In particular, we show that social homogeneity is the primary driver of content diffusion, and one frequent result is the formation of homogenous, polarized clusters.”
The researchers believe such behaviors can explain phenomena like the suspicion surrounding Jade Helm 15, among other misinformation-fueled stories.
The spread of misinformation online is becoming so pervasive that the World Economic Forum listed it among its global risks to society in 2013, alongside terrorism and cyberattacks.
According to the researchers, this trend has prompted companies like Google and Facebook to look into ways to rank the trustworthiness of stories and content. “This issue is controversial, however, because it raises fears that the free circulation of content may be threatened and that the proposed algorithms may not be accurate or effective,” the researchers write. “Often conspiracists will denounce attempts to debunk false information as acts of misinformation.”