Content policy

Updates

EURACTIV reports that a dramatic European Parliament vote has triggered upheaval of the divisive copyright bill, as the Parliament overturned the Legal Affairs Committee's decision last month to approve it. Natasha Lomas of TechCrunch said of the result: 'Crucially it means MEPs will have the chance to amend the controversial proposals'. The two articles generating the strongest debate are Article 11, the so-called 'snippet' provision, which could require companies such as Google and Microsoft to pay for the use of snippets and links, and Article 13, which would force online platforms to employ filters to prevent the upload of copyrighted material.

The Polish, Spanish, and Italian pages of Wikipedia closed down on Wednesday, 4th July, in protest at the European Parliament's Committee on Legal Affairs vote in favour of copyright Articles 13 and 11. Article 13 requires online platforms to filter content for copyright violations, while Article 11, sometimes called a 'link tax', could allow publishers to charge a fee when links to their content appear on other sites. A statement from Academics Against Publishers' Right warns that the proposal 'would likely impede the free flow of information that is of vital importance to democracy'. Cory Doctorow of the Electronic Frontier Foundation wrote 'The EU's Copyright Proposal is Extremely Bad News for Everyone, Even (Especially!) Wikipedia', an analysis that warns: 'While the directive fixes some longstanding problems with EU rules, it creates much, much larger ones: problems so big that they threaten to wreck the Internet itself'.

According to the Press Gazette in the UK, the European Court of Human Rights in Strasbourg has upheld the decision of Germany's Federal Court of Justice, allowing three media outlets to continue offering access to material relating to the murder convictions of two individuals. The two half-brothers, convicted in 1993 and released on probation in 2007 and 2008, had requested anonymity in coverage about them under the so-called 'right to be forgotten'. The decision noted that the right of the press to freedom of expression and of the public to be informed were important to this case. The court considered aspects such as the fact that these were 'not simply private individuals who were unknown to the public at the time they requested anonymity ... '. Rather, the information in question reported on court proceedings and was part of the body of information necessary to contribute to 'debate in a democratic society'.

Pages

One of the main sociocultural issues is content policy, often addressed from the standpoints of human rights (freedom of expression and the right to communicate), government (content control), and technology (tools for content control). Discussions usually focus on three groups of content:

  • Content that has a global consensus for its control. Included here are child pornography, justification of genocide, and incitement to or organisation of terrorist acts.
  • Content that is sensitive for particular countries, regions, or ethnic groups due to their particular religious and cultural values. Globalised online communication poses challenges for local, cultural, and religious values in many societies. Most content control in Middle Eastern and Asian countries, for example, is officially justified by the protection of specific cultural values. This often means that access to pornographic and gambling websites is blocked.
  • Political censorship on the Internet, often to silence political dissent and usually under the claim of protecting national security and stability.

How content policy is conducted

Governmental filtering of content

Governments that filter access to content usually create an index of websites blocked for citizen access. Technically, this is done with the help of router-based IP blocking, proxy servers, and DNS redirection. Content filtering occurs in a growing number of countries (see opennet.net).
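
As a rough illustration of the DNS-redirection technique mentioned above, the Python sketch below resolves listed domains to the address of a block page instead of their real address. It is a minimal sketch, not an actual resolver, and the domain names and addresses in it are hypothetical.

```python
# A minimal sketch of DNS redirection as used in government-level filtering.
# All domain names and IP addresses below are hypothetical examples.

BLOCKLIST = {
    "blocked-example.org",
    "another-example.net",
}
BLOCK_PAGE_IP = "10.0.0.1"  # address of a hypothetical 'this site is blocked' page

def resolve(domain: str, real_lookup) -> str:
    """Return the real IP for allowed domains, or the block-page IP otherwise."""
    if domain in BLOCKLIST:
        return BLOCK_PAGE_IP      # redirected: the user never reaches the real site
    return real_lookup(domain)    # normal DNS resolution

# A stand-in for a real DNS lookup:
print(resolve("blocked-example.org", lambda d: "93.184.216.34"))  # -> 10.0.0.1
```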

Private rating and filtering systems

Faced with the potential risk of the disintegration of the Internet through the development of various national barriers (filtering systems), the W3C and other like-minded institutions have proposed user-controlled rating and filtering systems. In these systems, filtering mechanisms can be implemented by software on personal computers or at the level of servers controlling Internet access. This method allows users to implement their own filtering without national intervention. It remains to be seen, however, whether governments will sufficiently trust their citizens to create their own filters.
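
What such a user-controlled filter might look like can be sketched as follows, assuming a simple labelling scheme in which pages carry content ratings and each user sets their own acceptable thresholds; the categories and limits below are hypothetical.

```python
# A minimal sketch of a user-controlled filter: the rules live on the user's
# own machine, not at a national level. Categories and limits are hypothetical.

USER_RULES = {"violence": 2, "gambling": 0}  # highest rating the user accepts

def allowed(page_labels: dict, rules: dict = USER_RULES) -> bool:
    """Show a page only if every rated category is within the user's limits."""
    return all(page_labels.get(category, 0) <= limit
               for category, limit in rules.items())

print(allowed({"violence": 1}))  # True: within the user's violence limit
print(allowed({"gambling": 3}))  # False: this user blocks all gambling content
```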

Content filtering based on geographical location

Another technical solution related to content is geo-location software, which filters access to particular web content according to the geographic or national origin of users. The Yahoo! case was important in this respect, since the group of experts involved, including Vint Cerf, indicated that in 70-90% of cases Yahoo! could determine whether sections of one of its websites hosting Nazi memorabilia were accessed from France. This assessment helped the court reach its final decision, which required Yahoo! to filter access from France to Nazi memorabilia. Since the 2000 Yahoo! case, geo-location software has become considerably more precise.
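
The sketch below illustrates the general idea behind geo-location filtering, assuming a simple IP-prefix-to-country table; real systems rely on large commercial IP-geolocation databases. The prefixes are reserved documentation ranges, and the restricted path is a hypothetical stand-in for the kind of content at issue in the Yahoo! case.

```python
import ipaddress

# A sketch of geo-location filtering: map the client's IP address to a country
# and restrict one section of a site for that country. The prefix table below
# is a hypothetical stand-in for a commercial IP-geolocation database.

PREFIX_TO_COUNTRY = {
    ipaddress.ip_network("192.0.2.0/24"): "FR",      # documentation ranges only
    ipaddress.ip_network("198.51.100.0/24"): "US",
}
RESTRICTED = {("FR", "/memorabilia")}  # (country, path) pairs that are blocked

def country_of(ip: str) -> str:
    addr = ipaddress.ip_address(ip)
    for network, country in PREFIX_TO_COUNTRY.items():
        if addr in network:
            return country
    return "??"  # unknown origin: the 70-90% accuracy figure leaves a real gap

def may_access(ip: str, path: str) -> bool:
    return (country_of(ip), path) not in RESTRICTED

print(may_access("192.0.2.7", "/memorabilia"))     # False: blocked for France
print(may_access("198.51.100.9", "/memorabilia"))  # True
```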

Content control through search engines

The bridge between the end-user and Web content is usually a search engine, and filtering search results is therefore often used as a tool to prevent access to specific content. The risk of filtering of search results, however, doesn’t come only from the governmental sphere; commercial interests may interfere as well, more or less obviously or pervasively. Commentators have started to question the role of search engines (particularly Google, considering its dominant position in users’ preferences) in mediating user access to information and to warn about their power of influencing users’ knowledge and preferences. This issue is increasingly attracting the attention of governments, which call for increased transparency from Internet companies regarding the algorithms they employ in their search engines. German Chancellor Angela Merkel spoke out about this risk, claiming that: 'Algorithms, when they are not transparent, can lead to a distortion of our perception, they can shrink our expanse of information.'

Web 2.0 challenge: users as contributors

With the development of Web 2.0 platforms – blogs, document‑sharing websites, forums, and virtual worlds – the difference between the user and the creator has blurred. Internet users can create large portions of web content, such as blog posts, videos, and photo galleries. Identifying, filtering, and labelling ‘improper’ websites is becoming a complex activity. While automatic filtering techniques for texts are well developed, automatic recognition, filtering, and labelling of visual content are still in the early development phase.

One approach, sometimes taken by governments in an attempt to manage user-generated content that they deem objectionable, is to completely block access to platforms such as YouTube and Twitter throughout the country, or even to cut Internet access entirely, hindering all communication on social network platforms (as was the case, for example, during some of the Arab Spring events). However, this can seriously infringe on the right to free speech and undermines the potential of the Internet in other areas (e.g. as an educational resource).

As the debate on what can and cannot be published online matures, social media platforms themselves have started to formalise their policies on where they draw the line between content that should and should not be tolerated. For example, Facebook's Statement of Rights and Responsibilities specifies: 'We can remove any content or information you post on Facebook if we believe that it violates this statement or our policies.' Yet the implementation of such policies sometimes leads to unintended consequences, with platforms removing legitimate content.

Automated content control

For Internet companies, it is often difficult to identify illegal content among the millions of items posted on their platforms. One possible solution lies in artificial intelligence mechanisms to detect hate speech, verbal abuse, or online harassment. However, relying on machine learning to decide what constitutes hate speech raises many questions, such as whether such systems can differentiate between hate speech and irony or sarcasm.
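
To make that limitation concrete, here is a toy sketch of such a machine-learning classifier, assuming the scikit-learn library is available; the handful of invented training examples is far too small for real use. Because the model only sees surface wording, text that reuses abusive vocabulary ironically can easily be misclassified.

```python
# A toy sketch of machine-learning content classification, assuming scikit-learn.
# The invented training set below is far too small for any real moderation use.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = ["I will hurt you", "you are worthless",
         "have a great day", "lovely weather today"]
labels = [1, 1, 0, 0]  # 1 = abusive, 0 = benign

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

# The classifier sees only surface wording, so irony or sarcasm that reuses
# abusive vocabulary can be misclassified - the exact problem raised above.
print(model.predict(["you are worthless, just kidding"]))
```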

The national legal framework

The legal vacuum in the field of content policy provides governments with high levels of discretion in deciding what content should be blocked. Since content policy is a sensitive issue for every society, the adoption of legal instruments is vital. National regulation in the field of content policy could bring a more predictable legal situation beneficial to the business sector, ensure better protection of human rights for citizens, and reduce the level of discretion that governments currently enjoy. However, as the border between justified content control and censorship is delicate and difficult to enshrine in legislation, this tension is increasingly being resolved in the courtroom, for example regarding the role of social media outlets in terrorist activities.

International initiatives

In response to the increased sophistication with which terrorists manage their activities and promote their ideologies online, multilateral forums have started addressing ways to limit harmful content (e.g. the G7, the UN Security Council, and the United Nations Office on Drugs and Crime). At the regional level, the main initiatives have arisen in European countries with strong legislation in the field of hate speech, including anti-racism and anti-Semitism. European regional institutions have attempted to impose these rules on cyberspace. The primary legal instrument addressing the issue of content is the Council of Europe Additional Protocol to the Convention on Cybercrime (2003), concerning the criminalisation of acts of a racist and xenophobic nature committed through computer systems. On a more practical level, the EU adopted the European Strategy to Make the Internet a Better Place for Children in 2012.

The Organization for Security and Co-operation in Europe (OSCE) is also active in this field. Since 2003, it has organised a number of conferences and meetings with a particular focus on freedom of expression and the potential misuses of the Internet (e.g. racist, xenophobic, and anti-Semitic propaganda, and content related to violent extremism and radicalisation).

The role of intermediaries 

The private sector is playing an increasingly important role in content policy. Internet Service Providers, as Internet gateways, are often held responsible for the implementation of content filtering. In addition, Internet companies (such as Facebook, Google, and Twitter) are becoming de facto content regulators. Google, for example, has had to decide on more than half a million requests for the removal of links from search results, based on the right to be forgotten. These companies are also increasingly involved in cooperative efforts with public authorities in an attempt to combat illegal online content. 

Events

Actors

ICT for Peace Foundation (ICT4Peace)

In the area of online content policy, the ICT for Peace Foundation is engaged in activities concerning the use of the Internet for terrorist purposes. The Foundation is organising events and producing publications on this issue, with the main aim of raising awareness and promoting a multistakeholder dialogue on possible solutions for countering terrorist use of the Internet. Together with the United Nations Counter-Terrorism Executive Directorate, the organisation runs a global engagement project working with other stakeholders to develop community standards around the prevention of violent extremism online, consistent with UN principles, including in the area of human rights.

UN Human Rights Council (UNHRC)

Privacy and data protection online have been the subject of many UNHRC resolutions. General resolutions on the promotion and protection of human rights on the Internet have underlined the need for states to ensure a balance between cybersecurity measures and the protection of privacy online. The Council has also adopted specific resolutions on the right to privacy in the digital age, emphasising that individuals should not be subjected to arbitrary or unlawful interference with their privacy, either online or offline. The UNHRC has also mandated the Special Rapporteur on the right to privacy to address the issue of online privacy in his reports.

Internet Watch Foundation (IWF)

The IWF removes child sexual abuse images and videos and other criminal content by taking down individual web pages. It works in partnership with major content delivery networks and Internet service providers, as well as governments and intergovernmental organisations. It has a reporting mechanism through which web users can anonymously report criminal content, and it also runs a hotline in the UK. Once content is reported, the Foundation traces the web page geographically and then contacts the host to have it removed. The Foundation also connects victims of online child abuse with help offline.

Office of the UN High Commissioner for Human Rights (OHCHR)

Challenges to the right to privacy in the digital age (such as surveillance and interception) are among the issues covered by the activities of the High Commissioner for Human Rights. At the request of the UN General Assembly, the Commissioner prepared a report on the right to privacy in the digital age, which was presented to the Assembly in December 2014. The Office of the Commissioner also organises discussions and seminars on the promotion and protection of the right to privacy in the online space, and collaborates on such issues with the UN Special Rapporteur on the right to privacy.

Court of Justice of the European Union (CJEU)

The CJEU has ruled on cases dealing with freedom online and the blocking and filtering of online content. In 2012, the Court ruled that the owner of a social network cannot be obliged to install a general filtering system, targeted at all its users, for the purpose of preventing the unlawful use of copyrighted material. A 2014 decision stated that an ISP may be ordered to block access to a copyright-infringing website, but that a fair balance must be ensured between the fundamental rights concerned. The decision on the right to be forgotten empowered EU citizens to request that search engines remove sensitive information from their search results.

Organization for Security and Co-operation in Europe (OSCE)

The OSCE has been focusing on the connections between freedom of expression and the potential misuses of the Internet (e.g. racist and xenophobic content, violent extremism online, and the use of the Internet for terrorist purposes). In these areas, it has organised several conferences and capacity development workshops, and issued multiple declarations and recommendations. On the issue of blocking and filtering of online content, several statements by the OSCE Representative on Freedom of the Media have underlined that the blocking of online content is an extreme measure that negatively affects users’ rights to freedom of expression and access to information.

Instruments

Conventions

Judgements

Resolutions & Declarations

Wuzhen World Internet Conference Declaration (2015)
Universal Declaration of Human Rights (1948)

Other Instruments

Resources

Articles

Eric Schmidt on How to Build a Better Web (2015)
The Digital Dictator's Dilemma: Internet Regulation and Political Control in Non-Democratic States (2014)
Internet Content Regulation in Liberal Democracies: A Literature Review (2013)
Trends in Transition from Classical Censorship to Internet Censorship: Selected Country Overviews (2012)
Policy and Regulatory Issues in the Mobile Internet (2011)
The Impact of Internet Content Regulation (2002)

Publications

Internet Governance Acronym Glossary (2015)
An Introduction to Internet Governance (2014)

Papers

Internet Fragmentation: An Overview (2016)

Reports

One Internet (2016)
Freedom of the Press 2016 (2016)
2016 Special 301 Report (2016)
2016 World Press Freedom Index (2016)
The 2016 National Trade Estimate Report on Foreign Trade Barriers (2016)
The Impact of Digital Content: Opportunities and Risks of Creating and Sharing Information Online (2016)
Content Removal Requests Report (2016)
Global Support for Principle of Free Expression, but Opposition to Some Forms of Speech (2015)
Freedom on the Net 2015 (2015)
Government Request Report (2015)

GIP event reports

Your Freedom of Expression vs. Mine? Who Is in Control? (2018)
Surveillance, Laws and Governments vs. Internet Rights (2018)
Information Disorder: Causes, Risks and Remedies (2018)
EBU Big Data Conference: The discussions during Day 2 (2018)
EBU Big Data Conference: The discussions during Day 1 (2018)
The Legal Framework for Countering Terrorist and Violent Extremist Content Online (2017)
Safer Internet for Children, Mitigation of Conflicts and Language and Communication for Peace (2017)
Realizing Rights Online: From Human Rights Discourses to Enforceable Stakeholder Responsibilities (2017)
Fake News: The Role of Confirmation Bias in a Post-truth World (2017)
Internet in the ‘Post-truth’ Era? (2017)
Report for Violent Extremism Online – A Challenge to Peace and Security (2017)

Other resources

The Twitter Rules (2016)

Processes

Session reports


WSIS Forum 2018

12th IGF 2017

WSIS Forum 2017

IGF 2016

WSIS Forum 2016

IGF 2015

 
