Antigone Davis, Facebook’s global head of safety, was appearing before US lawmakers after The Wall Street Journal published a story about how the social network knew that Instagram was “toxic” for teen girls, worsening body image issues and suicidal thoughts. The story, part of an investigative series called The Facebook Files, was partly based on documents leaked by former Facebook product manager turned whistleblower Frances Haugen.
“I want to be clear that this research is not a bombshell,” Davis told lawmakers. “It’s not causal research.”
Sen. Richard Blumenthal, a Democrat from Connecticut, would have none of it.
“It is powerful, gripping, riveting evidence that Facebook knows of the harmful effects of its site on children,” Blumenthal said of the research, adding that the company had “concealed those facts and findings.”
The testy exchange underscores the lack of trust that lawmakers, as well as society at large, have in the world’s largest social network. The company, which recently rebranded under the name Meta, has dealt with nonstop criticism that it doesn’t do enough to combat hate speech, misinformation and other offensive content, especially in developing countries. Fueled by the leaked documents, criticism that the social network puts profits over user safety grew louder in 2021. Meta, which has 3.6 billion users, says its research is being mischaracterized and points to the more than 40,000 people working on safety and security as evidence it takes these concerns seriously. The public uproar didn’t stop the company from pushing ahead with efforts to create the metaverse, a virtual environment where people can work, socialize and play games together.
Here are some of the company’s major moments in 2021:
Capitol Hill riot
The Jan. 6 riot, during which supporters of President Donald Trump stormed the US Capitol to disrupt Congress’ certification of Joe Biden’s presidential victory, put social media companies under a harsh light. Trump had been pushing unfounded claims on social networks that the 2020 election was stolen from him.
Citing concerns about inciting further violence after five people died in connection with the riot, Facebook, along with Twitter, Snapchat and Google-owned YouTube, made the unprecedented move to boot Trump from their platforms.
Facebook suspended Trump indefinitely, a decision the social network’s independent oversight board agreed with. Trump will remain suspended from Facebook until at least 2023, at which point the social network said it “will look to experts to assess whether the risk to public safety has receded.”
In March, US lawmakers pressed Facebook CEO Mark Zuckerberg; Jack Dorsey, who was Twitter’s CEO at the time; and Google CEO Sundar Pichai about the roles their platforms played in the January attack on the US Capitol.
The Trump ban fueled allegations that Facebook and other social networks were censoring conservative speech, which those companies continue to deny. In October, Trump said he’s launching a new social network called Truth Social to push back against Big Tech.
Facebook accused of ‘killing’ people
Tension between Facebook and Biden flared in July when the president accused tech platforms, including the social network, of “killing people” by allowing misinformation about COVID-19 vaccines to spread online.
Facebook pushed back against Biden’s accusations, noting that the company directed more than 2 billion people to reliable information about COVID-19 and vaccines. More than 3.3 million Americans have used Facebook’s vaccine finder tool, the company said at the time.
Biden backtracked and clarified that he meant it’s vaccine misinformation that’s “killing people,” not Facebook itself.
“My hope is that Facebook, instead of taking it personally, that somehow I’m saying Facebook is killing people, that they would do something about the misinformation, the outrageous misinformation about the vaccine,” Biden said.
A whistleblower sets off another wave of concern
Facebook’s PR nightmares continued to grow after Haugen revealed herself as the whistleblower who copied thousands of pages of documents before leaving the company.
“Facebook, over and over again, chose to optimize for its own interests, like making more money,” she told 60 Minutes’ Scott Pelley in an October interview. Haugen has also appeared before Congress twice and testified before the UK Parliament.
Following The Wall Street Journal’s coverage, a consortium of news organizations published a series of stories from the documents, which they called the Facebook Papers, about the social network’s failure to police content in developing countries, teens migrating from the platform and Facebook employees’ complaints that the social network hadn’t done enough ahead of the 2020 US elections.
Lawmakers aren’t done scrutinizing Facebook about its potential mental health effects on young people. Instagram CEO Adam Mosseri is scheduled to testify before Congress for the first time on Wednesday. A group of state attorneys general is also investigating whether the social network violated state consumer protection law by promoting its social media app Instagram to children and teens even though it knew of the service’s harms.
Facebook rebrands, looks toward the future
Facebook unveiled a big rebranding in October, renaming itself as Meta in a nod to its belief that the metaverse will be “the successor to the mobile internet.” The metaverse is a vaguely defined digital environment in which people will gather, some as 3D avatars, to play and work.
The company is moving forward with efforts to advance that vision. In 2021, it released its first pair of smart glasses, developed with Ray-Ban. The glasses let people shoot photos and 30-second videos with the press of a button, play music and podcasts, and make calls. Their release brought Facebook one step closer to building glasses with augmented reality effects.
In November, the company unveiled a haptic glove prototype that mimics the sense of touch. It has also been researching wrist-based technology that can sense your neural signals and track your intentions, even before you move.