Germany’s Battle Against Online Hate Speech

Regina Nagel has learned that caution should be her guiding principle on the internet. She keeps posts and comments to a minimum and pays attention to who her friends are on Facebook. But all these precautions were still not enough. Last September, a friend shared one of her posts, which ended up sparking shocking hate speech. Nagel is a parish officer in the Catholic Church. She is also a member of a reform movement within the Church known as the Synodal Path, which was created in response to the child abuse scandal that continues to rock the Catholic Church.

The author of the most vitriolic comment in response to Nagel’s post was a right-wing Catholic who posts under his own name online, including derogatory comments about women in pastoral office. He insulted Nagel personally and, in a later post, used misogynistic language to refer to her appearance. “It’s completely beyond the pale, of course. But I didn’t let it get to me,” she said. “After all, it wasn’t really a direct threat to my health and safety, as it has been for a number of politicians.” Two friends reported the abusive comment to Facebook. But so far nothing has been done, and the text is still there for all to read.

Social networks are mired in similar cases, and the tone online was often aggressive even before the COVID-19 pandemic. Women are the most affected, especially women with a high public profile. That is why the largest social media platforms, such as Facebook and YouTube, are now obliged to remove hate messages in Germany.

From February 1, a new, even stricter legal framework will apply. Networks with more than 2 million users will not only have to remove illegal content, but they will also have to register this content and the user’s IP address with the Federal Criminal Police Office (BKA) – in principle, at least. Abusive messages will be directed to a new central unit for reporting criminal content on the internet (ZMI). Some 200 officials will process the reports. But the fear is that Germany’s structures for combating and containing internet hate crimes remain almost entirely toothless.

Incitement to hatred, threats of murder

First, the type of insult or defamation directed at Nagel will not be affected by the changes. “In cases of defamation, for example, there will be no significant improvement. This is because, as the law currently stands, the will of the victim is crucial in deciding whether or not charges are brought over allegations of defamation,” said Josephine Ballon, senior legal adviser at the organization HateAid, which supports victims of hate crimes on the internet.

But the decision whether or not to report an incident – and the content involved – to the BKA police unit is made by the social media platforms themselves, not by the users. The BKA’s investigations should focus on illegal content such as incitement to hatred, in the form of anti-Semitic and racist comments, threats of murder and the use of unconstitutional symbols. In essence, Ballon said, this is the right kind of approach. “The basic idea behind this strategy is to ensure that the handling of hate speech reports is accelerated, leading to more lawsuits. And that can only be welcomed.”

The new BKA Reporting Unit is designed to ease the growing workload already faced by prosecutors in several states, as well as to facilitate the fastest possible access to IP addresses. This is particularly important since these addresses are only stored for a few days in Germany. But it is not the competent authorities who decide whether there is sufficient initial evidence to open an investigation. Instead, the social network in question decides whether a case should be brought to the attention of the BKA, which then decides whether an investigation should take place.

Normally, that decision is made by state attorneys — for good reason. After all, it is only after the prosecutors have made their assessment that the police begin their investigation. At present, it is not common for the police to decide on their own whether or not to open an investigation. However, this should be the case in the new reporting unit.

This partly explains why Facebook and Google are appealing the new procedure, which for now means they don’t have to report cases to BKA investigators. Ballon said she believed the appeal was not unwarranted. “The main fear is that the BKA will become a huge data swamp, with an incredible number of reports being processed without a court or a state prosecutor taking a look and deciding: is it a criminal offense or not?”

Marginal cases difficult to decide

Cases involving social media posts are often neither black nor white, and it can be difficult to come to a definitive decision, Ballon warned. To illustrate her concerns, she pointed to the area of unconstitutional symbols. “It’s important to differentiate. Was a swastika, for example, posted by someone with fanatical far-right beliefs, which is effectively a crime? Or is it perhaps a swastika posted in an educational or artistic context? Which, in this case, would be allowed,” she said. The worst-case scenario would be for artists and activists to decide not to get involved in a debate and for people who didn’t post anything illegal to “end up buried in BKA filing cabinets”, she added.

Speaking to DW, the BKA was keen to allay these concerns. The agency insisted that the vast majority of those employed at the ZMI would be ordinary law enforcement officers, whose duties would also include checking whether complaints received by the agency concerned illegal content. Overall, the agency said, the process developed so far ensures that at each phase of the assessment of reports there is intensive consultation with the judiciary, with the aim of recognizing and respecting the legally defined roles of the actors involved. For example, the BKA cooperates closely with the Central Cybercrime Registration Office (ZAC) and the prosecutors in Cologne.

There are no plans to create a central database, and reports of incidents “without criminal relevance are closed by judicial authorities and all data previously transmitted to the BKA are deleted as soon as possible”, the BKA said. However, HateAid said the flood of reports could mean this process takes up to a year – a year during which the BKA remains in possession of information about people who have not published anything that could be considered criminally relevant.

Social networks still largely unmonitored

The problem remains that it is still up to the big platforms like Facebook or Google to report potentially criminal content to the BKA. That’s why Ballon of HateAid doesn’t expect big improvements from February 1st. “If people ignore it, nothing will change,” she said.

Her pessimism is shared by Leonhardt Träumer, founder of ReportHate (Hassmelden), an organization that is also active on behalf of victims of online hate crimes. With the responsibility for flagging potentially incriminating content resting with the people operating the platforms, Träumer said it’s “as if all the thousands of security and surveillance cameras installed by the German authorities at airports, train stations and elsewhere were operated by an American company.”

Employees of this company would be monitoring the recordings, “and only when this entirely private US company reported that potentially criminal activity had been detected on the recorded material would it then allow a tiny snippet of the recordings to be transmitted to the German authorities. This is clearly not an effective system,” Träumer said. He said it’s likely that his small independent hate speech reporting unit will one day find itself performing a task “that is not really our responsibility”. It could, he believes, take years before the courts rule on the actions brought by Google and Facebook.
