Tackling Hate Speech: A Multi-Stakeholder Responsibility

Report

Session: Workshop 150

[Read more session reports and updates from the 14th Internet Governance Forum]

Freedom of speech is one of the principal pillars of a modern democracy. But where do we draw the line online when hate speech enters the picture? As Mr Marc Jan Eumann (Director, Landeszentrale fuer Medien und Kommunikation (LMK) Rheinland-Pfalz) put it, this tension has always existed, but with the growth of the Internet and the emergence of new technologies it has become far more pervasive and widespread. This is especially problematic because, while the scale of online interaction is growing exponentially, there is still no clear legal framework establishing what constitutes harmful content. Law enforcement also faces technical difficulties in addressing online behaviour. So how can we tackle online hate speech, and who should be responsible for countering it?

Germany's network enforcement law arose from the realisation that social networks have to assume greater responsibility for the content published on their platforms, and that authorities have to take more responsibility for ensuring that victims are protected and that enforcement is effective. Mr Thomas Blönik (Head of Subdivision at the Federal Ministry of Justice and Consumer Protection) explained that so far, the new regulations focus on establishing the offences, in line with criminal law, and set the basis for the compliance systems that the networks have to put in place. Examples include incitement to crime and terrorism, incitement to hatred, abuse, and child pornography. This is only a first step towards striking a fine balance between freedom of speech and criminal prosecution; content that is criminal should not be part of that debate. The legal framework is now undergoing further development: for instance, the figure of a media director responsible for media-sharing platforms has been introduced.

The session also covered the issue from the perspective of the victims of hate speech. Mr Chan-jo Jun (Advocate for IT law), a lawyer who represents victims of online hate speech and has seen its consequences in victims' lives, stressed that regulations exist because they are needed. We have decided as a society that it is in the public interest to protect human dignity, and the dignity of minorities in particular. These interests may conflict with the interests of social networking companies, which, Jun remarked, is the main reason why platforms' community standards differ from the law. It is also why we should not leave it to the platforms themselves to regulate this area. The lack of transparency about how the most powerful platforms operate constitutes another concern.

Ms Ingrid Brodnig (Journalist and author) conceptualised hate speech as a tool to suppress the views, the visibility, and the opinions of other people. Anyone can be the target of abusive comments, but some groups are more likely to suffer severe consequences from online abuse. Brodnig quoted a Pew Research Center study showing that four in ten Americans say they have experienced online abuse; however, women were found to be twice as likely as men to report severe abuse.

From the perspective of large companies, Ms Sabine Frank (Google) explained that it is important to understand that platforms might give a voice to hate speech, but overall there is far more appropriate content, posted by people who might otherwise have no way of sharing their views. Frank said: ‘We have 500 hours of new video content every minute and we know that less than 1 percent of this content is either illegal or violating our community guidelines. That doesn't say that there is not an issue and we don't need to take responsibility and scale up our operations. ... I'm just saying that we have to put this into perspective’.

Finally, participants discussed online safety for young people. Teaching media literacy at school, to both children and their parents, was considered very important. Breakout groups also considered it a priority to promote critical thinking in general and to adopt a human rights perspective, rather than focusing only on a technical understanding of the online ecosystem.

By Paula Szewach
