Tackling online harms – A regulation minefield? Present and future


Report


Session ID
36

[Read more session reports and live updates from EuroDig 2019]

The session was opened and moderated by Ms Virginija Balciunaite (Communications and PR Officer, Sunium). She mentioned the variety of harms in the digital sphere and pointed to the importance of adopting forward-looking rules to tackle them.

Mr Michael Oghia (Advocacy and Engagement Manager, Global Forum for Media Development), the focal point of the workshop WS8: Fending off trolls – Journalists in defence of democracy, explained that some of the major discussion points involved online disinformation. Workshop participants found that rapid alert systems and support for quality journalism could be adequate responses to disinformation.

Ms Elena Perotti (Executive Director of Media Policy and Public Affairs, World Association of Newspapers and News Publishers), the focal point of the workshop WS 12: Play the villain – Learn to fight disinformation with news literacy, noted that the term ‘fake news’ should no longer be used when discussing disinformation, as it has lost its original meaning; disinformation, misinformation, or propaganda are the more appropriate terms. She said that ‘fake news’ is nowadays instrumentalised to attack reporters doing their legitimate work. She further mentioned that media literacy and news literacy need to go hand in hand.

Ms Lorna Woods (Professor of Internet Law, University of Essex) first spoke about the Carnegie UK Trust project, which identified a key problem in the debate around social media and online harms: the discussion had mostly been framed around the question of whether social media platforms are publishers or intermediaries. However, the project concluded that neither category aptly represents how social media works or how it is used. Instead, the Health and Safety at Work Act provided a better analogy, because it allowed parallels to be drawn between social media platforms and public space. This analysis shifted the focus away from content and towards the systems that social media companies and other intermediaries provide and, more particularly, how those systems influence user behaviour. From there, the project concluded that social media platforms should have a statutory duty of care and should think carefully about the systems they put in place.

This approach was also taken by the recently published UK Online Harms White Paper which, among other things, refers to safety by design. Woods noted that the white paper looks at codes of conduct and includes measures for taking down obviously harmful content, but puts less emphasis on issues such as cyber-bullying.

Woods pointed out that the government does not envisage this merely as self-regulation by platforms: compliance would be enforced through a regulator.

Mr Jan Kleijssen (Director, Information Society – Action against Crime, Council of Europe (CoE)) noted that for the CoE the European Convention on Human Rights is the starting point from which they view online harms. The Council also takes the position that rights and obligations that apply offline also apply online. Therefore, it is the responsibility of states to ensure freedom of expression and that no harm is done.

Kleijssen pointed out that companies have not always reacted as fast as they should have in fulfilling their responsibilities, and that their terms of reference are still unclear. He noted that this void is quickly filled by regulators when things go wrong, and mentioned the German Netzwerkdurchsetzungsgesetz (Network Enforcement Act), which forces platforms to take down hate speech within 24 hours under the threat of high fines on companies failing to comply.

According to Kleijssen, algorithms will play an increasing role in taking down harmful content, given that human moderators will not be able to filter all of it. He emphasised the importance of examining the ethics underlying these algorithms and their respect for human rights, and noted that the CoE has started to align ethics frameworks to create a more coherent framework across its member states.

Moreover, Kleijssen mentioned that the rights of human moderators must also be taken into account, and warned against outsourcing filtering work to countries with weaker human rights frameworks in order to save costs. In terms of digital literacy education, he highlighted that children usually receive some kind of ‘traffic education’ for the road, but that the equivalent is missing on the ‘digital highway’.

Mr Chris Buckridge (External Relations Manager, RIPE Network Coordination Centre (RIPE NCC)) explained that RIPE NCC manages the distribution of Internet Protocol (IP) addresses and focuses on the network infrastructure rather than on content. He noted that the regulation of these two layers of the Internet – content and infrastructure – is quite different, but that the discussion must address their intersection.

He mentioned the importance of identifying the scope of the problem and of creating a common understanding of it in order to strengthen co-operation between all stakeholders. This includes greater involvement of regulators, who need to better understand the technical issues that exist or could arise from a given regulation, and vice versa. According to Buckridge, this is particularly important given that certain online harms do not necessarily originate from ill intentions. Buckridge further noted that it is often more helpful to turn to existing laws than to adopt new measures.

Ms Meri Baghdasaryan (ARAC Law Office, Armenia) explained that tackling online harms is a multidimensional issue and that regulation alone will not be enough to solve it. She further emphasised the importance of avoiding over-regulation and explained that the crucial element is rather the extensive dialogue with the technical community to ensure the success of regulatory measures.

She also mentioned that in practice, cases are not resolved only by new rules; the existing ones also need to be enforced. However, in certain cases enforcement proves difficult, given the power imbalance between major platforms and small states, or the lack of responsiveness from either regulators or businesses. She noted that such situations create a sense of impunity that malicious actors readily exploit.

Finally, Baghdasaryan identified the low level of digital literacy as another issue in the fight against online harms given that this makes it challenging to educate people against the dangers of disinformation.
By Cedric Amon
