Facebook and Instagram’s parent company announced on Wednesday that it had taken down over 600 accounts, pages, and groups linked to a Chinese influence operation spreading COVID-19 misinformation, including an account posing as a fictitious Swiss biologist.
The China-based network was one of six that Meta, the company formerly known as Facebook, shut down in November for abusing its platforms, a reminder that bad actors around the world are using social media to spread false information and harass opponents.
Separately, Facebook is now disclosing how frequently users are exposed to bullying or harassing posts.
Other operations included one supporting Hamas and two others, based in Poland and Belarus, that focused on the migration crisis along those countries’ shared border.
Meta also took down a network linked to a European anti-vaccination conspiracy movement that harassed doctors, elected officials, and journalists on Facebook and other platforms, as well as a group of accounts in Vietnam that reported activists and government critics to Facebook in an attempt to get them banned.
The company was alerted to an account purporting to be a Swiss biologist named Wilson Edwards, which led to the discovery of the China-based operation; no such person exists. In July, the account claimed on Facebook and Twitter that the US was pressuring World Health Organization scientists to blame China for the COVID-19 virus. The posts alleging US intimidation were quickly picked up by Chinese state media.
“This campaign was a hall of mirrors, endlessly reflecting a single fake persona,” wrote Ben Nimmo, a Meta researcher who investigates influence operations, in the company’s report. Meta linked the operation to Chinese individuals and people “associated with Chinese state infrastructure companies located around the world,” he said.
The Chinese operation was an example of what Meta refers to as “coordinated inauthentic behavior,” in which adversaries use fake accounts for influence operations, as Russian operatives did in the run-up to the 2016 U.S. presidential election by impersonating Americans on Facebook.
However, Meta’s security team has recently shifted its focus to uncovering accounts of real people who are collaborating to harm others both on and off Facebook.
That was the justification for deactivating a network of accounts in Italy and France linked to the anti-vaccination movement known as V_V.
According to research firm Graphika, the group primarily coordinates on the messaging app Telegram, but “appears to primarily target Facebook, where its members display the group’s double V symbol in their profile pictures and swarm the comments sections of posts advocating for COVID-19 vaccines with hundreds of abusive messages.” The group has also defaced health facilities and attempted to hinder public vaccination programs, Graphika noted.
The people behind the network, per Meta, used real, duplicate, and fake accounts to comment on Facebook posts in large numbers and intimidate people.
Meta said it is not banning all V_V content but will take further action if it discovers more rule-breaking behavior. It did not specify how many accounts from the network were removed.
Even as it gets better at detecting and removing accounts that break its rules, the company admits it’s still playing a cat-and-mouse game.
“Adversarial networks don’t strive to neatly fit our policies or only violate one at a time,” Nathaniel Gleicher, Meta’s head of security policy, wrote on Wednesday in a blog post. “We build our defenses with the expectation that they will not stop, but rather adapt and try new tactics.”