Leaked company documents are providing more insight into how Facebook moderates political content.


Facebook is under more scrutiny because of leaked internal research and documents.


Facebook’s executives reportedly resisted efforts to dial back features that help amplify false and inflammatory content ahead of the 2020 US election because they feared doing so could harm the platform’s usage and growth.

The Wall Street Journal, citing leaked internal documents, said Facebook employees suggested changes that could slow the spread of viral content for everyone, such as killing the reshare button or promoting reshared content only when it came from a user's friend. A proponent of making these types of changes has been Kang-Xing Jin, who heads Facebook's health initiatives, according to the report. But executives such as John Hegeman, Facebook's head of ads, raised concerns about stifling viral content.

"If we remove a small percentage of reshares from people's inventory," Hegeman wrote in internal communications cited by The Journal, "they decide to come back to Facebook less."

The report is the latest in a series of leaked internal documents and communications that The Journal says show Facebook has put its profits over the safety of its users. Frances Haugen, a former Facebook product manager, publicly identified herself as the whistleblower who gathered the leaked documents used by The Journal. The findings from these internal documents have reignited scrutiny from US and UK lawmakers. Haugen, who has already appeared before Congress, is scheduled to testify before the UK Parliament on Monday.

Facebook didn’t immediately respond to a request for comment, but the company has repeatedly said its internal research and correspondence is being mischaracterized. The moderation of political content, though, has been a hot-button issue for the company as it tries to balance safety with concerns about hindering free speech. Conservatives have also accused Facebook of intentionally censoring their content, allegations the company denies. 

Instead of making such changes, which would have been less likely to raise alarms about free speech, Facebook has moderated content from groups it considers dangerous in what The Journal describes as a game of whack-a-mole.

The New York Times, also citing internal documents, reported Friday that Facebook failed to address misinformation and inflammatory content before and after the 2020 US presidential election even though employees had raised red flags about the issue. 

Supporters of Donald Trump, who lost the presidential election to Joe Biden, were posting false claims that the election had been stolen. Facebook has suspended Trump from its platform until at least 2023 because of concerns his comments could incite violence following the deadly US Capitol riot in January.

One Facebook data scientist found that 10 percent of all US views of political content were of posts that alleged the vote was fraudulent, according to The Times.
