
Study shows Facebook spreads nonsense more effectively than fact

http://www.alphr.com/science/1002377/study-shows-facebook-spreads-nonsense-more-effectively-than-fact

An interesting side-effect of the way Facebook relentlessly and amorally drives the growth of its network, whatever the cost: stupidity thrives at the expense of useful knowledge.

This study looks at how information and misinformation spread in a Facebook network, finding that the latter has far more long-term staying power and thus, thanks to EdgeRank and the reification of communication, continues to spread and grow while more ephemeral factual news items disappear from the stream. I suspect this is because actual news has a sell-by date, so people move on to the next story. Misinformation of the sort studied (conspiracy theories and the like) has a more timeless and mythic quality that is only loosely connected with facts or events, but it has high emotional impact and is innately interesting (if true, the world would be a much more surprising place), so it can persist without becoming any more or less relevant. It doesn't have to spread fast or even garner much interest at first, because it persists in the network. All it needs to do is wait around for a while: the Matthew Effect and Facebook's algorithms see to the rest.
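To make that dynamic concrete, here's a toy simulation, in the spirit of the argument above rather than of the study itself: the decay rates, the boost factor, and the sharing rule are all invented for illustration. A fast-decaying news item competes with a slow-decaying myth in a feed that, Matthew-Effect-style, gives extra exposure to whatever has already been shared:

```python
# Toy model (invented numbers, not the study's): "news" starts with high
# interest that fades fast (a sell-by date); "myth" starts with little
# interest that barely fades. Each hour, new shares are proportional to
# current interest, amplified by accumulated shares -- a crude stand-in
# for the Matthew Effect and EdgeRank-style reinforcement.

def simulate(initial_interest, decay_per_hour, hours=500, boost=0.01):
    """Return cumulative shares at each hour for one post."""
    interest = initial_interest
    total_shares = 0.0
    history = []
    for _ in range(hours):
        new_shares = interest * (1.0 + boost * total_shares)
        total_shares += new_shares
        interest *= decay_per_hour  # interest fades every hour
        history.append(total_shares)
    return history

news = simulate(initial_interest=10.0, decay_per_hour=0.90)   # fades fast
myth = simulate(initial_interest=1.0, decay_per_hour=0.999)   # lingers

for h in (24, 120, 480):
    print(f"hour {h:3d}: news = {news[h-1]:8.0f}   myth = {myth[h-1]:8.0f}")
```

With these made-up numbers, the news item dominates the first day but flattens out once interest decays, while the myth's near-constant trickle compounds: by around day five it has caught up, and after a few weeks it dwarfs the news item without ever having had a spike.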

There is not much difference between interest in scientific and anti-scientific articles at the start. There is a wave of activity for the first 120 minutes after posting, then a second one 20 hours later (a common pattern). But then the fun starts...

"It’s over the long term that serious differences were observed. While the science news had a relatively short tail, petering out quickly, conspiracy theories tended to grow momentum more slowly, but have a much longer tail. They stick around for a longer period of time, meaning they can reach far more people.

Then there’s another problem with the way Facebook works – the much-discussed echo-chamber effect. This effect is far more active in Facebook than in other networks, with algorithms favouring content from people and groups you regularly interact with. So if you share, Like or even click on conspiracy theories a lot, you’re more likely to be shown them in future, reinforcing the misinformation, rather than challenging it."