Content moderation on the Internet infrastructure level – Where does censorship begin? 

29 Jun 2021 14:30h - 15:30h

Event report

Can content moderation really happen at the infrastructure level?

It can be fraught with many problems, noted the moderator, Mr Sebastian Felix Schwemer (Associate Professor, Centre for Information and Innovation Law, University of Copenhagen). The European proposal for the Digital Services Act (DSA) addresses providers of services that establish and facilitate the underlying logical architecture and the proper functioning of the internet, including technical auxiliary functions. The workshop therefore discussed the current and future role of internet infrastructure providers in ‘content moderation’ at non-content layers.

The objectives of the DSA

Mr Denis Sparas (Legal Officer, European Commission) briefly outlined the objectives of the DSA. First, it modernises the rules under the existing e-Commerce Directive, especially in addressing illegal content and systemic risks in the online space. Second, it clarifies the rules on liability at every layer of the internet and gives legal certainty to service providers’ actions. A further objective is to increase the transparency of content moderation decisions, ensure accountability, and facilitate better oversight. Section 4 of the e-Commerce Directive deals with the conditional liability exemption; the concern is to clarify to what extent this exemption regime also applies to infrastructure-layer services.

The ccTLD’s view on the DSA

The DSA proposal recognises the limited but very important role of certain technical auxiliary functions, as they are called in the proposal – internet infrastructure services. The DSA confirms that domain name registries are intermediaries and should benefit from an exemption from liability for illegal content provided by end users, but only to the extent that their services qualify as one of the intermediary categories, such as mere conduit, caching, or hosting services.

However, the services registries offer cannot be considered to fall into any of these categories, cautioned Ms Polina Malaja (Policy Advisor, CENTR). This creates legal uncertainty and puts operators in a vulnerable position, because there is technically no way to target specific content at the registry level. ‘So, what registries can do is to only suspend the underlying infrastructure. That will, of course, have an effect on all services linked to it.’
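A minimal sketch of that all-or-nothing effect, using hypothetical hostnames and a deliberately naive helper: a registry can only act on the registered domain as a whole, so every service resolving under it goes down together.

# Illustration of why registry-level action is all or nothing.
# The domain and service hostnames below are hypothetical.

SUSPENDED_DOMAINS = {"example-shop.eu"}  # e.g. a domain suspended by the registry

# Distinct services that all happen to live under the same registered domain.
services = {
    "web shop":   "www.example-shop.eu",
    "user blog":  "blog.example-shop.eu",
    "email":      "mail.example-shop.eu",
    "mobile API": "api.example-shop.eu",
}

def registered_domain(hostname: str) -> str:
    """Naive helper: take the last two labels as the registrable domain.
    Real code would consult the Public Suffix List."""
    return ".".join(hostname.split(".")[-2:])

for name, host in services.items():
    blocked = registered_domain(host) in SUSPENDED_DOMAINS
    print(f"{name:10s} {host:25s} -> {'unreachable' if blocked else 'ok'}")

# Suspending the single registration takes every service with it; there is
# nothing at this level that could single out one infringing page.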

Content moderation from a digital anthropology perspective

Decisions to suspend services seem to happen quite ad hoc and at the whims of the CEOs of these infrastructure companies, noted Ms Corinne Cath-Speth (Oxford Internet Institute), who gave a digital anthropology perspective on content moderation. She stated that ‘For the most part, internet infrastructure companies are naturally hesitant to position themselves as explicit political players, but when they do intervene, they often attempt to do so without a strong policy framework around how they make these decisions and also with limited sort of follow-ups in terms of accountability measures afterwards.’ She noted that this is not a new phenomenon, but such cases have recently become more visible in policy debates and to the general public (8chan, Parler).

She concluded that political gatekeeping and content moderation will continue to happen at the infrastructure level, but we need to make sure that its main players have much more mature frameworks for responding to such situations, and that these frameworks are publicly available to ensure a certain level of predictability and accountability for internet users at large.

Ms Petra Arts (Senior Manager Public Policy, Europe, Cloudflare) noted that internet infrastructure services typically operate at a lower internet layer and do not interact directly with actual pieces of content. They therefore have only limited responses to that content: clearing cached copies and limiting public access to an entire website at the DNS level. In the context of the DSA, neither measure removes content from the internet.
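As an illustration of the first of those responses, here is a minimal sketch of a cache purge, assuming a Cloudflare-style purge endpoint; the zone ID and token are placeholders, and the exact endpoint and payload should be checked against current documentation.

# Sketch: purging a CDN's cached copies of a site. The zone ID and token are
# hypothetical placeholders; error handling is omitted for brevity.
import requests

ZONE_ID = "your-zone-id"      # placeholder
API_TOKEN = "your-api-token"  # placeholder

resp = requests.post(
    f"https://api.cloudflare.com/client/v4/zones/{ZONE_ID}/purge_cache",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    json={"purge_everything": True},  # drop all cached copies for the zone
)
print(resp.status_code, resp.json())

# Only the CDN's copies are removed; the content still exists at the origin
# server, which is why such a measure does not take it off the internet.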

Proportionality of measures to take down illegal content at the infrastructure level

Mr Fred Langford (Online Technology Principal, OFCOM) stated that ‘We have URL blocking and DNS blocking, keywords for registries to stop domains; we have CDNs clearing caches; we have filtering solutions on Wi-Fi and also in search results. But some of them are very blunt instruments, particularly when we’re talking about content moderation around particular pieces of content on large providers’ networks.’ Langford cited DNS blocking as an example: it takes down an entire site, but not a particular piece of content on it.
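A minimal sketch of that bluntness, using hypothetical URLs and a toy block list: a resolver only ever sees the hostname, never the path, so a DNS-level block cannot distinguish one page on a site from another.

# Why DNS blocking cannot target a single page: the resolver only sees the
# hostname, not the path. Hostnames and block list below are hypothetical.
from urllib.parse import urlsplit

dns_blocklist = {"forum.example.net"}  # what a resolver-level block can key on

urls = [
    "https://forum.example.net/threads/illegal-item-123",  # the actual target
    "https://forum.example.net/threads/gardening-tips",    # collateral damage
    "https://news.example.org/article/42",                  # unrelated site
]

for url in urls:
    host = urlsplit(url).hostname
    blocked = host in dns_blocklist
    print(f"{url:55s} -> {'blocked' if blocked else 'resolves'}")

# Both forum URLs map to the same DNS name, so blocking the infringing page
# inevitably blocks every other page on that host as well.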

Takedown of legitimate content, transparency, redress and appeal, and conduct moderation

Top-level issues in content moderation were brought forward by Ms Corynne McSherry (Legal Director, Electronic Frontier Foundation). At EFF, they have frequently witnessed perfectly legal, legitimate content being taken down by mistake. ‘Content moderation system is really broken even at the platform level. It’s really worrisome when we extend it to the infrastructure level.’

Users do not realise how many services are involved in enabling their presence and freedom of expression on the internet. For them, it is practically impossible to be familiar with the guidelines of DNS, CDN, and other providers. Companies can put out a lot of transparency reports, but the challenge is to make that transparency meaningful.

Arts also touched on the issue of transparency. ‘We believe that this is a very important element to make sure that decisions about digital services are more transparent about any actions that might be taken voluntarily, but we have to be able to expand about any orders or any requests or any kind of mandated action from courts or competent authorities. At the moment, there has not been a lot of push for transparency for judicial decisions, for example, which we think is very important because they’re such a blunt instrument.’

McSherry drew attention to the absence of redress and appeal options at the infrastructure level: in most cases, users have no direct contact with platform representatives, let alone with the infrastructure provider. The situation is made worse by the fact that people do not necessarily have alternative service providers to turn to.

Lastly, she highlighted a new trend – conduct moderation, by which she means: ‘we have seen platforms hiring law firms to investigate conduct of users and then kicking those users off their services based not on what they’ve done on the platforms, not what they’ve done online, but based on their offline conduct. If that starts getting extended to the infrastructure level, that could get even worse’.