“Major Removal:” Inside Facebook’s Action in Germany

Texas News Today

Days before Germany's federal election, Facebook took what it described as an unprecedented step: removing a series of accounts that worked together to spread COVID-19 misinformation and encourage violent responses to COVID restrictions.

The action, announced on September 16, was the first use of Facebook's new "coordinated social harm" policy, which targets not state-sponsored propaganda campaigns but otherwise ordinary users who have mounted increasingly sophisticated efforts to sidestep rules on hate speech and misinformation.

In the German case, Facebook removed about 150 accounts, pages and groups linked to the so-called Querdenken movement, a loose coalition that has protested lockdown measures in Germany and includes vaccine and mask opponents, conspiracy theorists and some far-right extremists.

Facebook touted the move as an innovative response to potentially harmful content; far-right commenters accused it of censorship. But a review of the removed content, along with the many Querdenken posts still available, suggests Facebook's action was modest at best. At worst, critics say, it may have been a ploy to counter complaints that it does not do enough to stop harmful content.

"This action appears to be driven by Facebook's desire to demonstrate action to policymakers in the days before an election, not a broader effort to serve the public," concluded researchers at Reset, a U.K.-based nonprofit that has criticized social media's role in democratic discourse.

Facebook regularly updates journalists about accounts it removes under a policy, enforced since 2018, that prohibits "coordinated inauthentic behavior," its term for groups and people who work together to mislead others. Most of those removals have involved bad actors attempting to interfere in elections and politics in countries around the world.

That policy had a limitation, however: not all harmful behavior on Facebook is "inauthentic." Plenty of entirely genuine groups use social media to incite violence, spread misinformation and spread hate.

Even under the new rule, a problem with removals remains: because it is unclear how much harmful content stays up on Facebook, it is hard to tell what the social network is actually accomplishing.

Case in point: the Querdenken network. Reset had already been monitoring the accounts Facebook removed, and it issued a report concluding that only a small portion of Querdenken-related content was taken down, while many similar posts were allowed to remain.

The dangers of COVID-19 extremism were underscored days after Facebook's announcement, when a young German gas station worker was fatally shot by a man who had refused to wear a mask. The suspect followed several far-right users on Twitter and had expressed negative views about immigrants and the government.

Facebook initially declined to provide examples of the removed Querdenken content, but ultimately shared four posts with the Associated Press that were not markedly different from content still available on the platform. They included a post falsely claiming that vaccines create new viral variants, and another that wished death on police who broke up violent protests against COVID restrictions.

Reset's analysis of the comments Facebook removed found that many were actually written by people trying to rebut Querdenken arguments and did not contain misinformation.

Facebook defended its action, saying the removals were not a blanket ban on Querdenken but a carefully measured response to users who were working together to violate its rules and spread harmful content.

Facebook plans to refine and expand its use of the new policy going forward, according to David Agranovich, Facebook's director of global threat disruption.

"This is a start," he told the AP on Monday. "This is us extending our network disruption model to address new and emerging threats."

The approach seeks to strike a balance between permitting diverse viewpoints and preventing the spread of harmful content, Agranovich said.

The new policy could mark a significant shift in the platform's ability to confront harmful speech, according to Cliff Lampe, a professor of information at the University of Michigan who studies social media.

"In the past they have tried to squash cockroaches, but there are always more," he said. "You can spend your whole day stomping your feet and you won't get anywhere. Going after the network is a sensible effort."

Simon Hegelich, a political scientist at the Technische Universität München, said that while removing the Querdenken network may have been warranted, it should prompt questions about Facebook's role in democratic debates.

Hegelich said Facebook appears to be using Germany as a "test case" for the new policy.

"Facebook is really intervening in German politics," Hegelich said. "The COVID situation is one of the biggest issues in the election. It is probably true that there is a lot of misinformation on these sites, but it is still a very political issue, and Facebook is intervening in it."

Members of the Querdenken movement reacted angrily to Facebook's decision, though many also expressed a lack of surprise.

"Major removal in progress," one supporter posted in a still-active Querdenken Facebook group.

Klepper reported from Providence, Rhode Island, with additional reporting contributed from Oakland, California.

Copyright 2021 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed without permission.
