Meta and Twitter purge web of accounts spreading pro-U.S. propaganda abroad
In the post-2016 U.S. election world, secretive online campaigns designed to influence global politics are nothing new. And while U.S. adversaries like Russia, Iran and China have been linked to government propaganda on major social media sites, there’s been little evidence to date of the U.S. and its allies leveraging the same techniques to shape international opinion, though it’s difficult to imagine they wouldn’t.
This week, for the first time, social networks in partnership with social analytics research group Graphika and the Stanford Internet Observatory have surfaced a pro-U.S. influence campaign, operating across Twitter, Facebook, Instagram and other social media apps. Both Meta and Twitter stopped short of attributing the activity to any specific government or organization, with Twitter listing the campaign’s “presumptive countries of origin” as the U.S. and the U.K. while Meta identified the U.S. as the “country of origin” for the activity.
The network of accounts sought to influence public opinion in Asia and the Middle East by promoting pro-Western viewpoints in those regions, including criticism of Russia’s invasion of Ukraine. The accounts sometimes linked to news stories from U.S.-funded media outlets Voice of America, Radio Free Europe and other websites with financial ties to the U.S. military.
“We believe this activity represents the most extensive case of covert pro-Western [influence operations] on social media to be reviewed and analyzed by open-source researchers to date,” Graphika and the Stanford Internet Observatory wrote in the paper analyzing the activity.
Like online influence campaigns with origins elsewhere, this operation relied on fake personas sporting AI-generated faces made with a technique known as GANs (generative adversarial networks). These artificially generated faces are meant to evade detection, but because they often contain small visual anomalies, they can sometimes be easy to spot. Using the fake personas, the influence campaign posed as independent journalists, employing short videos, memes, hashtag campaigns and petitions to drive its messaging.
In spite of its multi-pronged approach, the campaign wasn’t able to get much traction, with most posts and tweets only picking up a “handful” of likes or interactions. Fewer than a fifth of the fake accounts accrued more than 1,000 followers, though two of the more popular accounts linked to the campaign explicitly claimed connections to the U.S. military.
According to the researchers, most of the accounts began operating in 2019, though a small group targeting Afghanistan began posting on Twitter in late 2017. Some of the fake accounts remained active on Facebook and Instagram until July 2022.
The wing of the influence campaign focused on Central Asia operated across Facebook, Instagram, Telegram, YouTube and two Russia-specific social platforms. Accounts targeting Russian-speaking users in the region pushed narratives that praised U.S. aid and criticized Russian foreign policy and China’s domestic treatment of Muslim minority communities.
Another cluster of accounts and pages concentrated on spreading pro-Western ideas in the Persian language, at times promoting content connected to the U.S. military. Some of the content mocked the Iranian government, highlighted discrepancies around women’s rights in the country and criticized Iran’s foreign policy decisions. An adjacent set of accounts targeting social media users in Afghanistan also criticized Iran along with the Taliban and the Islamic State.
“With few exceptions, the study of modern [influence operations] has overwhelmingly focused on activity linked to authoritarian regimes in countries such as Russia, China and Iran, with recent growth in research on the integral role played by private entities,” the researchers wrote. “This report illustrates the wider range of actors engaged in active operations to influence online audiences.”