Our focus is on how the Digital Services Act will address online hate speech and disinformation.
These phenomena have grown increasingly widespread in recent years as individuals and disruptive actors exploit the reach of online platforms to spread hateful or false information. This harms the collective public interest: harmful content undermines respectful and honest public discourse, and it threatens public safety, given that online hate speech can incite real-world violence. Our research concentrates on the three biggest players – Facebook, YouTube, and Twitter.
Online platforms already moderate illegal and harmful content, but they do so with little transparency or accountability.
Social media platforms rely on non-transparent algorithms to identify, delete, or block illegal and harmful content. Content that the algorithm cannot easily classify falls to human content moderators, who exercise judgement according to rules set by the platform; their decisions in turn serve as training data that improves the algorithm's initial classifications. This combined process is known as content moderation.
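The division of labour described above can be sketched in pseudocode-like Python. This is purely illustrative and not any platform's actual system: the classifier, the confidence threshold, and the `human_review` placeholder are all assumptions made for the example. The key mechanism it shows is that confident automated decisions are applied directly, while uncertain content is escalated to a human whose label is also kept as future training data.

```python
def classify(post):
    """Hypothetical automated classifier: returns (label, confidence).
    A real system would be a trained model, not keyword matching."""
    banned = {"hateful-term"}  # stand-in for a learned hate-speech signal
    words = set(post.lower().split())
    if words & banned:
        return "remove", 0.95
    if "?" in post:  # stand-in for ambiguous, hard-to-classify content
        return "keep", 0.55
    return "keep", 0.90

def human_review(post):
    """Placeholder for a human moderator applying platform rules."""
    return "keep"

def moderate(posts, threshold=0.8):
    decisions, training_data = [], []
    for post in posts:
        label, confidence = classify(post)
        if confidence >= threshold:
            decisions.append((post, label))  # automated decision stands
        else:
            human_label = human_review(post)  # escalate to a moderator
            decisions.append((post, human_label))
            # human judgements are logged to retrain the classifier later
            training_data.append((post, human_label))
    return decisions, training_data
```

In this sketch, only the low-confidence item reaches a human, and only human-labelled items enter the retraining set – which is also why platform rule-setting and moderator judgement shape what the algorithm eventually learns.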
Content moderation is not a silver bullet in countering hate speech and disinformation.
Algorithms can fail to decipher the nuance and context of online speech, leading to wrongful removals and the blocking of legitimate speech. This is one reason regulators have been hesitant to impose a general monitoring obligation and strict liability on social media platforms. However, indications are that the new Digital Services Act may prescribe proactive measures for companies to strengthen content moderation, and may lay out stricter provisions governing algorithms to improve transparency and accountability.
The Digital Services Act may include additional regulatory oversight.
The EU Commission is also likely to strengthen regulatory oversight in order to improve speed and coordination across the EU, where platforms are currently subject to multiple national and local regulations. A public tender worth €2.5 million has already been advertised to create a European digital media observatory. How this regulator would relate to existing EU- and national-level institutions, and how it would operate in practice, remains unclear.