
Right-wing misinformation on Facebook is more engaging than its left-wing counterpart, research finds


By now it’s well known that Facebook has a misinformation problem. The company has tried to address it in various ways, from labeling false claims to reducing their visibility in users’ feeds. But a new study from researchers at New York University finds that not all misinformation on Facebook is created equal when you factor in political ideology.

According to the research, accounts rated by outside media watchdogs as far-right and frequent spreaders of misinformation generate far more likes, shares and other forms of engagement on their Facebook pages than right-wing sources of reliable information, which in turn generate more engagement than left-wing sources of misinformation.

The results provide evidence that right-wing sources of misinformation are some of the most engaging content creators on Facebook, said Laura Edelson, a researcher at NYU’s Cybersecurity for Democracy initiative.

“My takeaway is that, one way or another, far-right misinformation sources are able to engage on Facebook with their audiences much, much more than any other category,” Edelson said. “That’s probably pretty dangerous on a system that uses engagement to determine what content to promote.”

Facebook didn’t immediately respond to a request for comment.

The researchers conducted the study by analyzing nearly 3,000 sources of news and information that operate large Facebook pages and have been reviewed by NewsGuard — a journalistic integrity rater used by the Pentagon and State Department, among others — as well as Media Bias / Fact Check, which provides similar assessments of media reliability and bias.

Using CrowdTangle, a Facebook-owned tool that displays engagement data for any given public Facebook post, the researchers looked at five months of post data from August 2020 to January 11, 2021.

The researchers found that right-wing sources that NewsGuard and Media Bias / Fact Check had flagged as repeat misinformation sharers had an average weekly engagement rate of 426 interactions for every 1,000 followers. That’s nearly 65% higher than what the researchers found for other right-wing sources not associated with misinformation, which had on average 259 weekly interactions per 1,000 followers.

By comparison, sources identified as being in the political center and not deemed consistent misinformation-sharers had an average of just 79 weekly interactions for every 1,000 followers. Sources identified as frequent sharers of left-wing misinformation had even lower engagement — clocking in at roughly 60 interactions per 1,000 followers over the five-month period.
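As a rough illustration of the metric the researchers used, here is a short Python sketch. The follower and interaction counts below are hypothetical, invented for the example; only the 426 and 259 figures quoted above come from the study.

def weekly_engagement_per_1000(total_interactions, followers, weeks):
    # Average weekly interactions per 1,000 page followers,
    # the engagement rate described in the study.
    return (total_interactions / weeks) / (followers / 1000)

# Made-up example: a page with 50,000 followers that drew 460,000
# interactions over roughly 23 weeks (mid-August 2020 to January 11, 2021).
print(round(weekly_engagement_per_1000(460_000, 50_000, 23)))  # 400

# The gap cited above: 426 vs. 259 weekly interactions per 1,000 followers.
print(f"{426 / 259 - 1:.1%}")  # 64.5%, the "nearly 65%" figure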

The report steered clear of trying to explain why right-wing misinformation is so highly engaging — it simply noted the existence of the trend. Facebook, the report said, doesn’t provide researchers enough access to its data to determine if, for example, the company’s algorithms may be favoring misinformation with a certain partisan bent.

The study warns that engagement isn’t the same as prevalence. Just because a Facebook post generates lots of comments or shares doesn’t necessarily mean it was viewed widely. Facebook doesn’t make that type of data available either, according to the report; if it did, the researchers said, the company could help the public understand the reach of misinformation. Nor does posting volume appear to explain the gap: Edelson said her team’s research showed that right-wing sources of misinformation saw more engagement on a per-post basis, not that the sources were “flooding the zone” with lots of posts that receive little engagement.

Facebook has said in the past that news content makes up only a small portion of what most users see in their feeds, and in any case, since 2018, CEO Mark Zuckerberg has pushed to shift the Facebook experience toward “personal” interactions and away from engagement with media posts.

But since 2018, the problem of misinformation has only become more intense for social media platforms. And Wednesday’s research suggests that when it comes to ideologically competing false claims, conservatives may have the upper hand.

“Any algorithm that prioritizes engagement in determining what to promote doesn’t just run the risk of giving priority to misinformation,” Edelson said. “It will, in our current media landscape, also have a partisan skew.”


CNN Newsource
