Dynamic Coalition on Platform Responsibility

Session: 19 Dec 2017, 09:00 to 10:00

Report


The session was organised by Dr Luca Belli, Head of Internet Governance at the Center for Technology and Society at FGV, and Dr Nicolo Zingales, Lecturer in Competition and Information Law at Sussex Law School, both coordinators of the Dynamic Coalition on Platform Responsibility (DCPR). The aim of the session was to debate the challenges that online platforms pose to human rights and data governance, as introduced in the book Platform Regulations: How Platforms are Regulated and How They Regulate Us. Zingales stressed three key aspects of online platforms: their duties are not clearly defined by law; public functions have been delegated to them, as in the case of the right to be forgotten, where the Court of Justice did not impose clear safeguards for freedom of expression; and there is legal uncertainty around some prohibited content, such as hate speech, where the European Commission has decided that platforms must react quickly to remove illegal content in order to prevent racism and hate speech.

Mr David Kaye, UN Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, emphasised that platform regulation has come to be perceived as an issue of democratic governance around the world, and that European regulators are taking steps to regulate this space. The volume will help policymakers and regulators to reflect on platform governance.

Ms Julia Reda, Member of the European Parliament and Vice-Chair of the Greens/European Free Alliance, noted that the European Union has been grappling with the issue of content removal. The European Commission currently encourages platforms to use automated content removal technology, notably in the proposed copyright directive. In addition, she affirmed that platforms' algorithms are not transparent, which can lead to discriminatory treatment.

Ms Emily Laidlaw, Associate Professor at the University of Calgary Faculty of Law, underlined three challenges encompassing platform responsibility and human rights:

1. Online abuse and freedom of expression: if a company defines freedom of expression more narrowly than the law does, it should be held responsible for the resulting censorship.

2. Human rights reporting: platforms should report annually on their human rights impacts, as companies outside the Internet sector already do.

3. Right to remedy: general principles should be developed and applied to platforms' dispute resolution systems.

Ms Maryant Fernández Pérez, Senior Policy Advisor at European Digital Rights (EDRi), affirmed that in Europe there is a trend towards pursuing public policy objectives through platforms. The European Commission advises platforms to develop an understanding of, and respect for, the fundamental rights framework.

Mr David Erdos, Lecturer in Law at the University of Cambridge, affirmed that the right to be forgotten is a broad concept that limits access to personal data. The decision of the Court of Justice recognised that search engines should be considered data controllers.

Mr Krzysztof Garstka, Information Governance Research Associate at the University of Cambridge, stressed the need for international consensus on the issue.

Ms Judith Herzog and Mr Lofred Madzou of the French Digital Council highlighted that platform accountability should be built on two main pillars: transparency and visibility. Companies must give users the means to verify what is done with their data.

Ms Krisztina Huszti-Orban, Senior Research Officer at the University of Essex, focused on the human rights implications of content regulation by social media platforms when dealing with terrorist and violent extremist content. There is no universally agreed definition of terrorism or violent extremism; definitions are found in a host of domestic laws, and the domestic standards relevant to this area are extremely diverse. Many domestic definitions of terrorism and violent extremism have drawn sustained criticism from human rights bodies, NGOs, and other stakeholders for encroaching on freedom of expression. Under some of these definitions, almost any view that deviates from the social norms accepted by the majority may be suppressed, and measures may target thoughts, beliefs, and opinions rather than actual conduct. When online platforms carry out quasi-enforcement and adjudication functions, states cannot leave it to companies to regulate these processes.

Ms Natasha Tusikov, Assistant Professor in Criminology at York University, stressed that the US government, the UK government, and the European Commission have employed strategies of encouraging or coercing Internet intermediaries to assume greater enforcement responsibilities, including through legislation and, in the case of Google in the United States, through threats of and actual criminal investigation.

By Ana Maria Corrêa