Facebook failed to put fact-check labels on 60% of the most viral posts containing Georgia election misinformation that its own fact-checkers had debunked, a new report says

  • Facebook is failing to add fact-check labels to the majority of the most viral posts containing fake news about elections in Georgia, according to a report from nonprofit research group Avaaz.
  • That could have major consequences for voters in the state, where the presidential election was decided by just 12,000 votes, Avaaz campaign director Fadi Quran told Business Insider.
  • Facebook didn't apply fact-check labels to 60% of the top-performing posts containing false election claims — or any label at all to 30% of posts — even though third-party fact checkers had debunked their claims, according to the report.
  • By not applying those labels, Facebook allowed its algorithm to continue to amplify the posts, which were liked, shared, and commented on more than 370,000 times, according to Avaaz.

Facebook has promised repeatedly that it would crack down on election-related misinformation.

But ahead of two pivotal runoff elections in Georgia that could decide whether Republicans or Democrats control the US Senate, Facebook is still falling far short of those promises, according to a report from the nonprofit research group Avaaz.

How far short?

Facebook correctly applied fact-check labels to just 40% of the most viral posts containing false claims about Georgia's elections that its own fact-checkers had debunked, according to Avaaz, which analyzed a sample of the top-performing posts.

For 30% of the posts Avaaz looked at, it found that Facebook simply labeled the posts as election-related content - a move that Facebook knows does little to limit their reach, according to BuzzFeed News. For the remaining 30%, Facebook did nothing at all to inform users that the posts contained claims that fact-checkers had already disproved.

And the misinformation included in Avaaz's report had significant reach as well.

Avaaz found that the 204 posts in its analysis received 643,406 interactions - likes, shares, comments, and clicks - meaning millions more likely saw them. The 60% without a fact-check label accounted for more than 370,000 of those interactions.

Facebook allowing misinformation to circulate so widely in a state like Georgia, where President-elect Joe Biden won by around 12,000 votes and two crucial elections are coming up in January, is "extremely dangerous," Avaaz campaign director Fadi Quran told Business Insider. "Those millions are not getting fact checks and it could impact how people vote or it could suppress people from going to vote."

"Every day Facebook fails to correct the record and demote election disinformers in its algorithm, it rips the fate of this pivotal election out of voters' hands and throws it to the wolves of disinformation and mass confusion that are continuing to put democracy at risk," Quran said.

"We share Avaaz's goal of limiting misinformation. We remain the only company to partner with more than 80 fact-checking organizations, using AI to scale their fact-checks to millions of duplicate posts, and we are working to improve our ability to action on similar posts," a Facebook spokesperson told Business Insider.

"There is no playbook for a program like ours, and we're constantly working to improve it," the spokesperson added.

Facebook AI vs. Avaaz 

Facebook does claim to have a playbook for detecting the "copycat" misinformation posts that Avaaz - an organization with a fraction of its resources - managed to identify.

"Once fact-checking partners have determined that a piece of content contains misinformation, we can use technology to identify near-identical versions across Facebook and Instagram. We then label and reduce the spread of this content automatically - enabling fact-checkers to focus on catching new instances of misinformation rather than variations of content they've already seen," Facebook says on its website.

But Avaaz's analysis suggests either that Facebook's AI systems aren't that good at detecting copycat misinformation or, more likely, that Facebook is refusing to enforce its policies consistently, Quran said, given that the posts weren't hard to find.

Avaaz's researchers rounded up all 12 false claims related to Georgia's elections that Facebook's independent fact-checkers had debunked in the two weeks following the November 3 election. Those fact checks included false stories about topics like dead people voting, mail-in voting, ballot curing, and voter intimidation.

The original misinformation came from high-profile sources including President Donald Trump, Trump's campaign, Tucker Carlson Tonight, and George Takei. In those cases, Facebook labeled the posts with fact checks and limited their reach via its algorithm.

But over the next three days, Avaaz identified the 204 top performing "copycat" posts echoing similar claims, including posts in English and Spanish and from pages as well as groups. Then they waited three more days and looked at whether and how Facebook had labeled those posts.

But 122 of those posts (60%), which together racked up more than 375,000 interactions, never received a fact-check label - meaning Facebook was likely letting its algorithm continue to amplify them.

Real consequences - and solutions

Facebook could actually be making a difference here, Quran said, adding that the 82 posts Facebook did label saw their reach decrease by 80%.


In its report, Avaaz recommended that Facebook label all variations of the same misinformation and downrank repeat offenders of its misinformation policy while being more transparent with users about how many violations trigger such actions. Avaaz also recommended that Facebook retroactively warn users who interacted with misinformation, something the company reportedly considered doing.

Facebook built and tested a "correct the record" feature that would notify users they had interacted with misinformation and show them a fact-check, but executives vetoed it fearing conservatives would see more notifications, The New York Times reported last week. (Facebook downplayed the effectiveness of the tool, according to The New York Times).

Facebook's failure to effectively detect and limit the spread of misinformation isn't without consequence, especially given the current political climate in Georgia, where misinformation about the presidential election has had serious impacts.

Earlier this month, Georgia's Republican secretary of state and his wife received texts telling them they deserve "to face a firing squad." This week, a top Georgia election official tore into Trump and Georgia Sens. David Perdue and Kelly Loeffler for not speaking out against the vitriol that election workers and their families are facing.

"Facebook could have played a significant role in reducing the amount of hate or threats that these individuals are facing by applying some of the solutions that we've talked about," Quran said. "They've just failed to do that, and that's neglect on their part."

