‘The Big Delete:’ At the Heart of Germany’s Facebook Crackdown | Social Media News

Days before the German federal election, Facebook took what it called an unprecedented step: removing a series of accounts that worked together to spread COVID-19 disinformation and encourage violent responses to government restrictions.

The crackdown, announced Sept. 16, was the first use of Facebook’s new “coordinated social harm” policy, aimed at stopping not state-sponsored disinformation campaigns but ordinary users who have become increasingly sophisticated at circumventing rules on hate speech or disinformation.

In the case of the German network, the nearly 150 accounts, pages and groups were linked to the so-called “Querdenken” movement, a loose coalition that has protested against the lockdown measures in Germany and includes opponents of vaccines and masks, conspiracy theorists and some far-right activists.

Facebook touted the move as an innovative response to potentially harmful content; far-right commentators condemned it as censorship.

But a review of the content that was removed — along with the many other Querdenken posts still available — revealed that Facebook’s action was modest at best.

At worst, critics say, it could have been a ploy to counter complaints that it’s not doing enough to stop harmful content.

“Rather, this action appears to be motivated by Facebook’s desire to demonstrate action to policy makers in the days before an election, not an overall effort to serve the public,” concluded researchers from Reset, a UK-based nonprofit that has criticized social media’s role in democratic discourse.

“Coordinated Inauthentic Behavior”

Facebook routinely notifies journalists of accounts it removes under policies prohibiting “coordinated inauthentic behavior,” a term it created in 2018 to describe groups or individuals who work together to mislead others.

Since then, it has deleted thousands of accounts, mostly what it said were bad actors trying to interfere in elections and politics in countries around the world.

But that approach had limits, because not all harmful behavior on Facebook is “inauthentic”; plenty of perfectly genuine groups use social media to incite violence and spread misinformation and hate. So the company was constrained by its own policy on what it could delete.

But even with the new rule, one problem remains with the takedowns: they don’t reveal what harmful material stays up on Facebook, making it difficult to determine exactly what the social network is accomplishing.

Case in point: the Querdenken network. Reset had already been monitoring the accounts Facebook deleted and released a report concluding that only a small portion of Querdenken-related content had been removed, while many similar posts were allowed to remain online.

The dangers of COVID-19 “extremism” were underscored days after Facebook’s announcement, when a young German gas station employee was fatally shot by a man who had refused to wear a mask.

The suspect followed several far-right users on Twitter and had expressed negative views about immigrants and the government.

The so-called ‘Querdenken’ movement is a loose coalition that has protested against lockdown measures in Germany and includes vaccine and mask opponents, conspiracy theorists and some far-right activists [File: Christian Mang/Reuters]

‘A beginning’

Facebook initially declined to provide examples of the Querdenken content it removed, but eventually shared four posts with The Associated Press that were not markedly different from content still available on the platform.

They included a post falsely claiming that vaccines create new viral variants and another wishing death on police who broke up violent protests against COVID restrictions.

Reset’s analysis of the comments Facebook deleted found that many were actually written by people trying to rebut Querdenken’s arguments and did not contain misinformation.

Facebook defended its action, saying the account deletions were never meant to be a blanket ban on Querdenken, but rather a carefully measured response to users who worked together to violate its rules and spread harmful content.

Facebook plans to refine and expand its use of the new policy in the future, according to David Agranovich, director of global threat disruption at Facebook.

“It’s a start,” he told the AP on Monday. “We’re expanding our network disruption model to address new and emerging threats.”

The approach seeks to strike a balance, Agranovich said, between allowing for diverse opinions and preventing the spread of harmful content.

‘Test case’

The new policy could represent a significant shift in the platform’s ability to deal with harmful speech, according to Cliff Lampe, a University of Michigan information professor who studies social media.

“In the past they’ve tried to squash cockroaches, but there are always more,” he said. “You can spend all day stomping your feet and you won’t get anywhere. Tackling networks is a smart try.”

While the removal of the Querdenken network may have been justified, it raises questions about Facebook’s role in democratic debates, said Simon Hegelich, a political scientist at the Technical University of Munich.

Hegelich said Facebook appears to be using Germany as a “test case” for the new policy.

“Facebook is really intervening in German politics,” Hegelich said. “The COVID situation is one of the biggest issues in the election. They’re probably right that there’s a lot of misinformation on these sites, but nonetheless, it’s a highly political issue, and Facebook is stepping right into it.”

Members of the Querdenken movement reacted angrily to Facebook’s decision, but many also expressed a lack of surprise.

“The big delete continues,” a supporter posted in a still-active Querdenken Facebook group. “See you in the street.”

Members of the Querdenken movement reacted angrily to Facebook’s decision, but many also expressed a lack of surprise [File: Bloomberg]