Far-Right Fake News Pundits Get Highest User Engagement On Facebook
Facebook may be canceling fake news, but misinformation is still extremely popular on the platform — especially if you’re a right-wing pundit spreading it.
Researchers at the Cybersecurity for Democracy project at New York University have confirmed what we all knew instinctively: Far-right, high-profile leaders get more engagement per follower than anyone else on Facebook — when they post incorrect information.
As Wired reports: "…while left-leaning and centrist publications get much less engagement if they publish misinformation, the relationship is reversed on the far right, where news organizations that regularly publish false material get up to 65 percent more engagement than ones that don't."
Drawing from 2,973 Facebook pages of U.S. news sources, analyzed for partisanship and accuracy by the independent organizations NewsGuard and Media Bias/Fact Check, the team found that the far-right category performed differently from every other group. While the far-left, slightly-left, center, and slightly-right categories all had higher engagement scores for sources deemed "credible" versus those that were not, far-right sources seemed to flourish once a "misinformation" label was applied. Credible far-right sources averaged 259 interactions per thousand followers in a typical week; far-right "misinformation" sources drew 426 interactions per thousand followers over the same period.
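As a quick sanity check, the reported per-follower rates roughly reproduce the "up to 65 percent" lift Wired cites (the study's exact figure may come from a more detailed model; this sketch just uses the two averages above, with illustrative variable names):

```python
# Reported average weekly interactions per 1,000 followers for far-right
# sources, as described in the NYU study summary above.
credible_far_right = 259   # sources rated credible
misinfo_far_right = 426    # sources flagged for misinformation

# Relative engagement lift of misinformation sources over credible ones
lift = (misinfo_far_right - credible_far_right) / credible_far_right
print(f"{lift:.0%}")  # → 64%
```

The raw ratio comes out just under the 65 percent figure, consistent with "up to 65 percent more engagement."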
This could set up a toxic mix with Facebook’s algorithm, which blindly guides members toward content with higher engagement. That means more eyes are seeing right-wing misinformation because it checks the technical boxes — even though it poisons the information pool.
In an email, a Facebook spokesperson said, “This report looks mostly at how people engage with content, which should not be confused with how many people actually see it on Facebook. When you look at the content that gets the most reach across Facebook, it’s not at all as partisan as this study suggests.”
Prior social media research matches this latest report, however. In a 2018 study published in Science, Sinan Aral, director of MIT's Initiative on the Digital Economy, and his colleagues found that falsehoods on Twitter spread "farther, faster, deeper, and more broadly than the truth."
Aral concedes, however, that Facebook doesn't, and won't, fully reveal how user engagement and its algorithmic recommendations are related.