Facebook and YouTube’s vaccine misinformation problem is simpler than it seems
On Friday, President Biden said Facebook is “killing people” by spreading misinformation about the coronavirus vaccines. On Monday, he changed his tune. “Facebook isn’t killing people,” he amended, instead blaming a handful of disinformation merchants who use the platform.
Whether Facebook is or isn’t killing people depends on your definitions. What’s clear, regardless, is that Facebook, YouTube, and other social media platforms have played a major role in the anti-vaccine movement. And they continue to do so, despite some sincere efforts by the companies to combat the trend.
Untangling exactly who’s at fault, and to what degree, is nigh impossible, especially because the companies carefully guard the data that would help researchers understand the problem. But there is at least one critical element of social media’s misinformation problem that’s quite simple, once you grasp it — and that helps to explain why none of their interventions so far have solved it. It’s that the recommendation and ranking software that decides what people see on social platforms is inherently conducive to spreading falsehoods, propaganda, and conspiracy theories.
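To see why, consider how engagement-based ranking works in the abstract. The sketch below is purely illustrative, not any platform’s actual system: the post texts, engagement estimates, and weights are all hypothetical. What it shows is a feed ordered by predicted clicks, shares, and comments — an objective in which factual accuracy never appears, so a sensational falsehood can outrank a sober, accurate report.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_clicks: float    # model's estimated click rate (hypothetical)
    predicted_shares: float    # model's estimated share rate (hypothetical)
    predicted_comments: float  # model's estimated comment rate (hypothetical)

def engagement_score(post: Post) -> float:
    """Score a post purely by predicted engagement.

    Note what is missing: factual accuracy is not a feature,
    so the objective cannot distinguish truth from falsehood.
    """
    # Hypothetical weights; real systems tune many more signals.
    return (1.0 * post.predicted_clicks
            + 3.0 * post.predicted_shares
            + 2.0 * post.predicted_comments)

feed = [
    Post("Measured summary of vaccine trial results", 0.04, 0.01, 0.01),
    Post("SHOCKING: what they won't tell you about vaccines", 0.12, 0.08, 0.10),
]

# The outrage-bait post wins the ranking because it provokes
# more reactions, true or not.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):.2f}  {post.text}")
```

Real ranking systems are vastly more complex than this toy, but the core incentive is the same: content that provokes reactions rises to the top, whether or not it is true.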
Before social media, the dominant information sources — newspapers, magazines, TV news shows — had all sorts of flaws and biases. But they also shared, broadly speaking, a concern for the truth. That is, they subscribed to an expectation that factual accuracy was core to their mission, even if they didn’t always get it right. (There are exceptions, of course, including a certain cable news network that has broadcast more than its fair share of vaccine misinformation.)
Facebook and YouTube didn’t set out to become dominant information sources. They set out to connect and amuse people, and to make lots of money.