
New European center fights online hate speech

Stephanie Höppner
April 20, 2023

The EU is working to crack the algorithms of Big Tech companies. The European Center for Algorithmic Transparency in Seville is meant to aid with the task. Will the center help reduce online hate speech and propaganda?

The server of a supercomputer with wires and flashing lights
Who decides what we see online? A new EU center is working to decode Big Tech algorithms to find out. Image: Sebastian Gollnow/dpa/picture alliance

Racist and sexist insults, other forms of hate speech, and even calls for violence have long been a serious problem online. The tone on social media platforms like Facebook and Twitter can be very hostile and aggressive.

Social media have long provided users with much more than mere entertainment: They are also frequently misused for information warfare, spreading fake news, and even mobilizing extremist groups. On video platforms like TikTok, which is primarily used by younger people, content that encourages eating disorders and even self-harm is spread.

Social media is for more than just storing content

The operators of these social media platforms have so far been reluctant to take action, even though experts say they are at least partially responsible. "These platforms are not just places where user-generated content is stored, they also contain algorithmic systems that determine how the content is distributed," said Josephine Ballon. She's from the organization HateAid, which advocates for victims of online hate speech. 

Supporters of Jair Bolsonaro storming the government district in Brasília
Social media can be abused to mobilize extremists. In 2023, supporters of former Brazilian President Jair Bolsonaro stormed the government district in Brasília. Image: Ueslei Marcelino/REUTERS

That means that although Facebook and other social media platforms are not directly accountable for content, they are accountable for the way it is shared through their complex algorithms. These systems gather data to determine what content users can and cannot see on their feed. The posts that users like and comment on factor into what is displayed to them. But these algorithms also include processes that are difficult to understand from the outside.

"These algorithms are not public, so we're mostly in the dark regarding how they actually function," Josephine Ballon told DW. Whistleblowers like former Facebook employee Frances Haugen have released documents indicating that Facebook's algorithms give higher ratings to polarizing content that spreads hate — which in turn has devastating consequences for society as a whole.

Searching for the big picture

Now the EU is trying to tackle this seemingly intractable problem head-on. In November 2022, the Digital Services Act (DSA) came into effect. It regulates online services and is intended to clarify rules for the internet in the EU, especially for platforms with more than 45 million users. The law also requires service providers to investigate, analyze, and evaluate systemic risks in their services, including their algorithmic systems. 

Do algorithms make radicals?

In addition, the law grants researchers more access to the data of Big Tech. In the future, Big Tech companies will be required to present the EU Commission with an annual risk assessment report on harmful content, along with planned countermeasures.

"What distinguishes the Digital Services Act from other regulations we've had at the national level is the transparency aspect. The law allows a first glimpse into the technical functionality of the platforms," Josephine Ballon explained. "Of course, we already know some things from whistleblowers and analyses, but we still don't have the big picture, because the platforms have been permitted to act in secret."

Looking under the hood of large social media platforms

The European Center for Algorithmic Transparency (ECAT) was officially inaugurated on Tuesday. Soon, around 30 employees including AI experts, data scientists, and social scientists will advise the EU Commission on the implementation of the DSA, while remaining in contact with other experts. The center and the law aim to make the algorithms that large tech companies use to recommend content to their users more transparent.

A close-up image of Josephine Ballon
Josephine Ballon from HateAid advocates for victims of online hate speech. She says they've been fumbling in the dark for years due to a lack of information. Image: Andrea Heinsohn

According to EU Commission vice-president Margrethe Vestager, the center will "look under the hood of the very large online platforms and very large online search engines for the first time, to see how their algorithms function and contribute to the spread of illegal and harmful content, which too many Europeans have been exposed to."

No longer at the mercy of Big Tech

But are the EU's efforts sufficient? "We see the DSA as an important step, because it turns the narrative around a bit and says we're no longer at the mercy of Big Tech and technological developments. As a democratic society, we can decide what terms we want to set and which conditions the platforms must adhere to," Angela Müller from the organization AlgorithmWatch told DW. "I don't think the DSA is especially revolutionary, but it's certainly an important step in the right direction." EU countries whose platforms were largely unregulated stand to benefit most from the measure, she added.

Elon Musk gesturing with his hands
Soon after billionaire Elon Musk took over Twitter, he announced his intention to prohibit content on the platform as little as possible. Image: Jim Watson/La Nacion/Zumapress/picture alliance

With this new access to data, the actions of Facebook and other social media platforms can be monitored. "But one important question remains: Will the platforms try to wiggle out of the regulations? I'm convinced that they'll do everything in their power to water down these requirements," Angela Müller said.

Another problem she pointed out concerns the staffing of the new center and what's known as capacity building: gathering expert knowledge about how the platforms are run. "I doubt there are people there who will actually address these issues," Müller said.

This article was translated from German.