The impact of Big Tech on democracy has never been more apparent, and regulators are starting to take notice. In recent months, the European Union has passed two of the most important laws yet aimed at regulating Big Tech companies and their impact on polarization. On the heels of this policy advance, Ashoka social entrepreneur Anna-Lena von Hodenberg, founder of HateAid, won a precedent-setting victory of her own: a German court ruled that Facebook (now Meta) is responsible for removing identical and essentially equivalent instances of illegal hate speech from its platform once it has been detected. Julia Kloiber from SuperrrLab spoke with Anna-Lena about the future of technology policy in Europe and what it could mean for democracy.
Anna-Lena von Hodenberg, founder of HateAid, a Berlin-based organization helping victims of digital violence.
Julia Kloiber: Anna-Lena, you started HateAid in 2018 to support victims of digital violence. Can you tell us a bit about the case of German politician Renate Künast and her quest for justice in the face of defamation on social media?
Anna-Lena von Hodenberg: For seven years, a popular meme circulated on Facebook: a photo of Renate Künast above a quote attributed to her. The problem was that the quote was fake. The meme was reported several times by various people, including the politician herself. In some cases, Meta even added the message, “This has been fact-checked and is fake.” So the platform acknowledged that this was defamation, a criminal offense, yet the meme stayed up. At the time, Meta’s notice-and-takedown procedure required the victim to search the platform and report each meme one at a time. But the meme was shared so often that it would literally take a lifetime to do so.
Kloiber: How did you get Meta to tackle the problem? And what does the outcome mean for all of us?
von Hodenberg: There are several tools you can use to regulate Big Tech, and one of them is litigation. So we filed a defamation charge against Meta. Above all, this case showed how successful we can be in defending user rights in court, but what we specifically achieved with this verdict is that Meta is now required to proactively find and remove all identical and essentially equivalent instances of this content. We proved that Meta has the technology to filter out all identical memes and remove them when they are not being used, for example, as journalistic content. The platform must also proactively search for and remove any such content that is clearly illegal. Remember that they themselves determined that this content was illegal. If they fail to do so, they face a fine of 250,000 euros, and they must pay Renate Künast 10,000 euros in damages.
More broadly, this verdict means that if you are a victim of digital violence, the burden on you to solve the problem is lightened. It is now the platform’s responsibility to remove illegal content. Meta will of course appeal the verdict, but we are ready to defend it at every stage.
Kloiber: On a personal note, how did you come to this work? What inspiration or life experience led to it?
von Hodenberg: Growing up in Germany, the legacy of the Holocaust was very much alive. A sense of responsibility to never let it happen again was ingrained in me from a young age. After the Trump election and Brexit, we saw people, especially right-wing extremists, learn how to manipulate discourse on the internet, using digital violence and algorithms to silence people, drive them out of public debate, and spread misinformation. Individuals with fake accounts could now actually change the course of elections. I was shocked when this first came to light, and I thought of Nazi Germany, where propaganda was the key to coming to power. Here we go again: propaganda used to silence the people who speak out against it. I’m passionate because this really is one of the biggest threats to our democracies, and it’s time to defend them.
Kloiber: Do you have any advice for everyone reading this on pushing back against Big Tech?
von Hodenberg: It’s so important to discuss the issue, to make people think, “Yes, it’s unfair that I, as an individual, carry the burden of dealing with harassment while this profitable platform, with all its resources, does nothing.” The fact that we made this case public, that it was picked up by many newspapers, and that people started talking about it was probably as damaging to Meta as the verdict itself.
Kloiber: So what was Meta’s argument as to why they shouldn’t be required to remove fraudulent or defamatory content?
von Hodenberg: They didn’t claim they weren’t obligated to take it down. They argued that it was technically too difficult, and also too expensive, for them to distinguish between harmful content and, for example, journalistic content. So we commissioned an expert assessment from a Berkeley professor who is very well known in the tech world, and he demonstrated that Meta is in fact quite capable of finding this content. And we argued in court that it is reasonable to expect them to moderate and remove it as well.
Kloiber: Are automated solutions, such as upload filters, of any use?
von Hodenberg: Platforms rely heavily on them. The result is a lot of arbitrary content decisions, because AI can only go so far. What our case showed is that hugely profitable platforms like Meta need to invest more in human content moderators.
Kloiber: The European Union recently passed the Digital Services Act (DSA), which the Financial Times called “groundbreaking rules for overseeing major technology platforms”. What’s groundbreaking about it? And where does the new law fall short?
von Hodenberg: What’s groundbreaking is that platforms are now required to provide transparency about their algorithms, which until now we have only glimpsed through leaked documents. The DSA also requires platforms to give researchers and NGOs access to conduct research. Another positive is that users now have the right to appeal content decisions. For example, in the past, if you flagged a rape threat to the platform and they refused to remove it, your only recourse was the courts. With this new DSA provision, it is much easier to appeal a decision that leaves unlawful content online.
Disappointingly, the regulation of porn platforms was excluded from the DSA, even though we are seeing more and more women fall victim to image-based abuse. Uploaders on these platforms don’t even need to create an account, so if a woman wants to press charges, the police cannot identify the culprit. We pushed for a requirement that uploaders on adult platforms verify their identity, but it didn’t make it into the final law.
Kloiber: How do you balance the American concept of free speech with these regulations?
von Hodenberg: The problem with a totally permissive attitude to free speech is that when the loudest voices are free to intimidate, a majority of other users are silenced. Here in Germany, for example, more than 50 percent of 18- to 35-year-olds say they are hesitant to express their political views online because of the threat of digital violence. What about their freedom of expression? We have to look at it from both sides.
Kloiber: What makes you optimistic when you look at the next five years?
von Hodenberg: In just three years we have won two important cases. There is more discussion of the subject, and the DSA is being implemented. People and governments are becoming aware that there are serious problems that need to be addressed. They are beginning to see that social media is not just something we use, but a digital public space that we can shape ourselves.
This is part of a series on the future of Technology & Humanity.