Days before Germany’s federal elections, Facebook took what it called an unprecedented step: the removal of a series of accounts that worked together to spread COVID-19 misinformation and encourage violent responses to COVID restrictions.
The crackdown, announced Sept. 16, was the first use of Facebook’s new “coordinated social harm” policy aimed at stopping not state-sponsored disinformation campaigns but otherwise typical users who have mounted an increasingly sophisticated effort to sidestep rules on hate speech or misinformation.
In the case of the German network, the nearly 150 accounts, pages and groups were linked to the so-called Querdenken movement, a loose coalition that has protested lockdown measures in Germany and includes vaccine and mask opponents, conspiracy theorists and some far-right extremists.
Facebook touted the move as an innovative response to potentially harmful content; far-right commenters condemned it as censorship. But a review of the content that was removed — as well as the many more Querdenken posts that are still available — reveals Facebook’s action to be modest at best. At worst, critics say, it could have been a ploy to counter complaints that it doesn’t do enough to stop harmful content.
“This action appears r