Council of Europe – Internet intermediaries: Shared commitments and corporate responsibility

18 Dec 2017 12:15h - 13:15h

Event report


This session, moderated by Mr Wolfgang Schulz, a professor at the University of Hamburg, featured discussions on the role and responsibilities of Internet intermediaries. Intermediaries are increasingly under scrutiny regarding the extent to which they provide adequate safeguards for privacy, freedom of expression, due process, and the right to information.

Ambassador Corina Călugăru, Committee of Ministers Thematic Coordinator on Information Policy, Council of Europe (CoE), opened by introducing the CoE's recent work on this issue, in particular the launch of a platform, in collaboration with Internet companies, to support the rule of law and human rights online, as well as the draft recommendation of the Committee of Ministers to member states on the roles and responsibilities of internet intermediaries.

Ms Karmen Turk, Attorney, University of Tartu, then detailed the content of the recommendations drafted by the CoE. They address the challenges that both states and private companies face in dealing with misinformation online, radicalisation, terrorism, and intellectual property. Each recommendation refers to specific functions that private companies may perform in the digital world, rather than offering only general guidance to intermediaries.

Mr Andy O’Connell, Manager of Global Policy Development at Facebook, presented Facebook’s approach and guidelines for meeting its corporate responsibilities. Facebook’s first mission is to build communities, which can advance human rights. Facebook follows a human rights approach consistent with the UN Guiding Principles on Business and Human Rights, and is a member of the Global Network Initiative (GNI), which requires an independent human rights audit every two years. In terms of content, Facebook relies on a set of community standards that are publicly available and informed by dialogue with civil society and academia. When it comes to transparency, there is still room for improvement in explaining how Facebook deals with hate speech and terrorist content online. This is why Facebook recently launched a news blog aiming to address and better explain its policies.

Mr Marco Pancini, Public Policy Counsel at Google, then presented Google’s perspective on these issues, starting by emphasising the need for a multistakeholder approach to thinking deeply about the roles and responsibilities of Internet actors. Pancini referred to Google’s recent announcements on how it deals with content online. Google’s approach to fighting problematic content online consists not of censoring content, but of disabling the engagement such content can generate. Through these new policies, Google returns to its pure hosting status and intends to prevent the spread of such content. Google generally relies on algorithms to flag content, which is then reviewed by human reviewers; for instance, 98% of the videos recently taken down were identified by automated systems. But Google does not rely on algorithms alone: as recently announced, its team working on controversial content will number 10,000 people by the end of 2018.

Mr Nicolo Zingales, a lecturer at Sussex Law School, first referred to the methodology he had designed with other researchers to score how well companies’ terms of service comply with human rights. Even on large platforms, individuals are not always in a position to make informed choices about their use of these services. Notification of content removal is often problematic, and dispute resolution mechanisms in the terms of service often prevent class actions and impose a mandatory jurisdiction (for instance, California). When it comes to tackling inappropriate content online, policymakers can consider a range of measures, from leaving it to the market to sort out, up to and including state regulation. The more market share is concentrated in a few companies, the more states should be concerned about leaving such decisions to private companies. Finally, in the era of algorithms, due process requires more than mere notification; there is an increasing need for a right to an explanation of automated decisions.

By Clément Perarnaud