Facebook missed billions of opportunities to tamp down misinformation ahead of the 2020 presidential election. That’s the conclusion of a new report from Avaaz, an advocacy group that researches misinformation online.
Avaaz researchers analyzed 100 of the most popular Facebook pages that have repeatedly spread false claims. According to their analysis, posts shared by those pages were viewed more than 10 billion times between March and October 2020. The report also faults Facebook’s fact-checking policies, noting that “the top 100 false or misleading stories related to the 2020 elections” were viewed 162 million times in three months, even as Facebook’s fact-checkers debunked the claims.
“Although Facebook claims to slow down the dissemination of fake news once fact-checked and labeled, this finding clearly shows that its current policies are not sufficient to prevent false and misleading content from going viral and racking up millions of views,” the report says.
The report comes just days before Mark Zuckerberg will face questions from members of Congress on Facebook’s role in fueling election misinformation. Zuckerberg is scheduled to appear, along with Twitter CEO Jack Dorsey and Google CEO Sundar Pichai, at a House Energy and Commerce Committee hearing on March 25.
Facebook didn’t immediately respond to a request for comment, but a company spokesperson disputed Avaaz’s findings in a statement to Time, saying that the researchers’ methodology was “flawed.” The spokesperson also pointed to Facebook’s efforts to ban militarized social movements, as well as QAnon.
But the report says Facebook waited too long to implement many of its most important changes, including the “emergency” features that slowed down sharing immediately after the election. Likewise, the company’s crackdown on QAnon and other groups that glorified violence came too late, according to Avaaz: the most problematic groups had “already gained significant traction” by the time Facebook took action against them.
“Moreover, Facebook again prioritized piece-meal and whack-a-mole approaches – on individual content for example – over structural changes in its recommendation algorithm and organizational priorities, thus not applying more powerful tools available to truly protect its users and democracy,” Avaaz writes. Zuckerberg announced in January that he wanted to reduce the amount of political content in News Feed, and that Facebook would permanently end algorithmic recommendations for political groups.
Members of Congress are likely to raise many of the same issues highlighted in the report. The hearing, titled “Disinformation Nation: Social Media’s Role in Promoting Extremism and Misinformation,” is expected to cover how social media companies handled misinformation about the 2020 election and false claims about the coronavirus vaccines.