WS #137 Combating Illegal Content With a Multistakeholder Approach
17 Dec 2024 10:15h - 11:45h
Session at a Glance
Summary
This discussion focused on combating illegal content online while preserving an open and free internet, exploring the roles and responsibilities of various stakeholders. Participants included regulators, platform representatives, and infrastructure providers who debated approaches to content moderation and regulation.
Key topics included the challenges of regulating content across jurisdictions, balancing freedom of expression with user safety, and the technical limitations of content removal at the infrastructure level. Participants emphasized the need for collaboration between industry and regulators, with some arguing for clearer responsibilities within the ecosystem.
The discussion highlighted tensions between self-regulation and government intervention. While some advocated for stronger legislation and enforcement, others cautioned against overly prescriptive approaches that might stifle innovation or lower standards. Participants agreed on the importance of addressing root causes of harmful content, not just symptoms.
Privacy concerns were raised, particularly around end-to-end encryption and age verification. The group explored ways to preserve user privacy while enabling action against bad actors. Cultural differences in defining harmful content were noted as a challenge for global platforms.
Infrastructure providers discussed their evolving role in addressing abuse, moving beyond a hands-off approach to content. However, they stressed the need for careful, targeted interventions to avoid collateral damage. The importance of multi-stakeholder dialogue was emphasized throughout, with participants agreeing that ongoing collaboration is crucial for effective and balanced approaches to online safety.
Key points
Major discussion points:
– The role of regulators, platforms, and infrastructure providers in combating illegal online content
– Balancing content moderation with preserving an open and free internet
– Challenges of global cooperation and enforcement across jurisdictions
– The need for clarity on responsibilities within the internet ecosystem
– The importance of multi-stakeholder dialogue and collaboration
The overall purpose of the discussion was to explore how different stakeholders can work together to address illegal and harmful online content while preserving internet freedoms.
The tone of the discussion was largely collaborative and solution-oriented. Participants acknowledged the complexity of the issues and the need for nuanced approaches. There was a sense of shared responsibility, even as different perspectives were expressed. The tone became more urgent when discussing specific harms like child exploitation, but remained constructive overall. Towards the end, there was increased emphasis on the importance of continued dialogue between regulators, industry, and civil society.
Speakers
– MODERATOR: Auke Pals (KPMG)
– Arda Gerkens: Authority of Terrorism Content and Child Pornography Material
– Deepali Tamhane: Trust and Safety team at Meta
– Tim Scott: Roblox
– Brian Cimbolic: Chief Legal and Policy Officer at Public Interest Registry (.org)
– Mozart Tenorio: Brazil Regulator (Anatel)
Additional speakers:
– Andrew Campling: Trustee for the Internet Watch Foundation
– Roelof Meijer: SIDN (.nl domain registry)
– Abhilash Narayan: University of Exeter, UK
– David McAuley: VeriSign
– Mauricio Hernandez: From Mexico (organization not specified)
Full session report
Expanded Summary of Discussion on Combating Illegal Online Content
Introduction:
This discussion focused on the complex challenge of combating illegal content online while preserving an open and free internet. Participants, including regulators, platform representatives, and infrastructure providers, explored the roles and responsibilities of various stakeholders in content moderation and regulation. The dialogue was characterised by a collaborative tone, with participants acknowledging the complexity of the issues and the need for nuanced approaches.
Key Themes and Discussion Points:
1. Regulating Online Content:
The discussion highlighted the challenges of regulating content across jurisdictions and at different levels of the internet ecosystem. Deepali Tamhane from Meta’s Trust and Safety team discussed content removal policies and practices on platforms, while Brian Cimbolic, representing the Public Interest Registry, addressed the challenges of regulating content at the DNS level.
There was a clear call from the audience for clearer responsibilities in content regulation. Tim Scott from Roblox emphasised the importance of a multi-stakeholder approach, cautioning against overly prescriptive regulation that might lower overall standards. This view was partially countered by another speaker who argued for regulation to set minimum standards, highlighting the tension between industry self-regulation and government intervention.
The Digital Services Act (DSA) was discussed as a significant regulatory development, with speakers exploring its potential impact on content moderation practices. Arda Gerkens from the Dutch authority for online terrorist content and child sexual abuse material highlighted the challenges of regulating smaller platforms and addressing the global nature of the internet.
2. Collaboration Between Regulators and Industry:
A significant portion of the discussion centred on the importance of collaboration between regulators and industry. Deepali Tamhane stressed the value of dialogue between regulators and companies, while Arda Gerkens emphasised the need for regulators to maintain independence.
Roelof Meijer of SIDN highlighted the value of public-private partnerships, though Tim Scott noted the challenges of maintaining collaborative relationships over time. The role of safety organisations in combating harmful content was also discussed, with Deepali Tamhane emphasising their importance in the ecosystem.
3. Addressing Illegal and Harmful Content:
The discussion explored various approaches to addressing illegal and harmful content, including child sexual abuse material (CSAM) and terrorist content. Deepali Tamhane discussed proactive content removal by platforms, while Andrew Campling, a trustee for the Internet Watch Foundation, raised challenges related to end-to-end encryption in content moderation.
Arda Gerkens emphasised the need to address root causes of harmful content, not just symptoms. The audience raised important points about preserving privacy while enabling content removal, and Brian Cimbolic discussed the role of DNS actors in combating illegal content, mentioning the Framework to Address Abuse.
4. Balancing Regulation and Open Internet:
A key theme throughout the discussion was the challenge of balancing effective regulation with preserving an open and free internet. Arda Gerkens highlighted the importance of protecting free speech while addressing harmful content, and discussed the role of human rights in determining what content should be considered illegal or harmful. Audience members called for more effective enforcement of existing laws rather than new legislation.
Roelof Meijer warned of the risk of breaking the open internet through overzealous regulation. Andrew Campling raised an interesting point about potential hypocrisy in selective compliance with regulations across different jurisdictions.
5. Age Assurance Technologies:
Abhilash Narayan from the University of Exeter brought up the topic of age assurance technologies, discussing their potential role in content moderation and online safety for minors. This sparked a conversation about the balance between protecting young users and preserving privacy.
6. Brazilian Approach to Content Regulation:
Mozart Tenorio shared insights into their country’s approach to content regulation, highlighting unique challenges and solutions in the Brazilian context. This provided valuable perspective on how different jurisdictions are tackling similar issues.
Areas of Agreement and Disagreement:
Participants broadly agreed on the need for collaboration between industry and regulators to effectively address online content issues. There was also consensus on the challenges involved in addressing illegal and harmful content, including technical limitations and the need for proactive measures.
Significant differences emerged around the role of end-to-end encryption in content moderation. Andrew Campling argued that it hinders moderation efforts, while Deepali Tamhane defended its use, stating that safety mitigations can still be implemented.
Key Takeaways and Unresolved Issues:
The discussion highlighted several key takeaways, including the crucial need for ongoing collaboration and dialogue between stakeholders, the importance of clear responsibilities and standards in content regulation, and the need for careful consideration of content moderation at the infrastructure level.
Unresolved issues included how to effectively regulate smaller platforms and ‘bad actors’, addressing challenges of end-to-end encryption for content moderation, and harmonising content regulation approaches globally given differing laws across jurisdictions.
Conclusion:
The discussion underscored the complexity of regulating online content while preserving internet freedoms. It highlighted the need for ongoing dialogue and collaboration between regulators, industry, and civil society to develop effective and balanced approaches to online safety. As the internet ecosystem continues to evolve, so too must the strategies for addressing illegal and harmful content, with a focus on shared responsibility, effective enforcement, and preserving the open nature of the internet. The upcoming DNS abuse workshop was mentioned as a continuation of these important discussions.
Session Transcript
Auke Pals: Good morning from Riyadh, Saudi Arabia, and also a great welcome to our online participants. My name is Auke Pals, working for KPMG and moderating this session, Combating Illegal Content with a Multistakeholder Approach. But I’m not doing that alone. We do have great speakers here in the room today and online. And that’s for the greater good. This is quite a sensitive topic and a complex challenge in the internet governance world, because we’re going to discuss today who decides what is allowed online, how to prevent censorship, and how to ensure openness and freedom of the internet while regulating content online. And so, first of all, I would like to welcome our speakers. In the room today, we have Arda Gerkens from the Authority for Online Terrorist Content and Child Pornography Material. Welcome. Mozart Tenorio from the Brazilian regulator, Anatel. Tim Scott from Roblox. Deepali Tamhane from Meta. And online, we’re joined by Brian Cimbolic, if I pronounce it all right, who is Chief Legal and Policy Officer at Public Interest Registry, the .org registry. Welcome. First of all, Arda, can I give you the floor to give a short introduction of what the Authority for Online Terrorist Content and Child Pornography Material does?
Arda Gerkens: Yeah, I’m sorry about that name. I didn’t think of it, but it’s awful. Anyway, we’re the newly established regulator, and what we do is execute the terrorist content online regulation from the European Commission and the national regulation to combat child pornography material. So we do basically everything, from detection of this kind of material to sending out removal orders, to giving fines, to handling any appeal that a hosting party or platform might make. And because we’re a regulator, I think we also look at the landscape as such, because, you know, we can send out as many removal orders as we want, but we actually don’t want the material to be on there at all, so we seek cooperation to find ways to diminish that kind of material online.
Auke Pals: Thank you. And I think you do that by having good collaboration with platforms, I hope?
Arda Gerkens: Yeah, indeed. So we have a sector council, which means that with the infrastructure parties in the Netherlands we have regular talks to see how we can cooperate, to know that we are doing the right thing technically, and also to debate with them whether things are possible or not. I think we talk about this later on: can we block at the DNS level, or geo-blocking, and stuff like that. And I have regular talks with the platforms to discuss what they are doing, but I don’t speak with the platforms in a council, as a group, because, you know, they are competitors, so they will not speak their mind when they’re next to each other. So I speak with them, not privately, but one-on-one.
Auke Pals: But now, in the room, you’re together with the platforms. So, Deepali, what are you doing regarding content online?
Deepali Tamhane: I work on the trust and safety team, and my job is primarily to help make sure that our users feel safe online. Our approach to safety is a multi-pillar approach, so our team looks at three things. One is: do we have the right policies in place to help keep our users safe, which includes policies for tackling, you know, illegal content, legal but harmful content, all those categories of content. Then there are tools and features that are really important, that users must have to exercise choice and control on the platform, whether it’s reporting, whether it’s tools like Limit or Restrict, to customize your experience on our platforms. And the last is, our team also works a lot on partnerships. When I started out at Meta a decade ago, we had maybe one or two safety partners that we worked with in India, for example. But today, globally, we have a network of over 500 safety experts that we work with, and we use this expertise to actually help inform a lot of the work that we do on the trust and safety side.
Auke Pals: Thank you very much. And Mozart, so from the Brazilian point of view, how are you involved in regulating content online?
Mozart Tenorio: Sure. First of all, thank you for the invitation. We are the telecoms regulator, so we don’t deal directly with content. But as there is no regulator for content in Brazil, we kind of do what the court orders tell us to do regarding these issues. So what we can do is, when we receive a court order, which could be from the Supreme Court or the Superior Electoral Court, or maybe a lower court sometimes, we tell the telecoms operators to take down those websites. But we cannot do that at the DNS level, obviously, so they do it at an IP level on the networks. So that’s more or less how we are trying to deal with that. We also have NIC.br and CGI.br in Brazil, which are responsible for DNS and IP, but they are not government; they are not state-owned or anything like this. I am a counselor there as well, so we also comply with court orders if they go to NIC.br for the DNS level. But mostly, talking from the perspective of Anatel, we tell the telecom operators to comply with that court order and take those IPs off the grid. That’s more or less what we’re doing in Brazil nowadays.
Auke Pals: Very interesting. I’m also really curious to hear the others’ opinion in the room about that. And let’s move to Tim. How is Roblox involved in content online?
Tim Scott: Very similar to Meta, we take a fairly multi-stakeholder approach to this, and I think it actually goes to the heart of the point of this panel, this session. Safety and safety of our consumers, our users on the platform is right at the heart of what we do and always has been from our inception 20 years ago. But we approach that with this idea of partnerships, we provide the tools and the features for people to use, we implement the policies, we adapt those policies as situations change and as we develop these dialogues with regulators, with governments, etc. So rather than we’ve come up with something top-down and drawn a line under it, we’re constantly in that dialogue and understanding where the threats and the risks are coming from so that we can continue to keep as safe as possible.
Auke Pals: Thank you very much. And now let’s move to Brian. Brian, welcome. Welcome online. Brian, from the point of the .org registry, how are you involved in this?
Brian Cimbolic: Yeah, hi, thank you very much and thank you for having me, particularly remotely. So yes, as you mentioned, I work for Public Interest Registry. We are the registry operator for the .org top-level domain as well as a few other mission-driven top-level domains like charity and foundation. As many of you know, dealing with content at the DNS level is difficult and is typically not the right place for it to be dealt with. We have a pretty robust anti-abuse program that focuses primarily on technical harms, things that within the ICANN world fall under the category of DNS abuse, and that’s things like phishing, malware, botnets, things like that. However, we recognize that there are sometimes the scale of harms when you’re dealing with online harms can be so great that if other actors aren’t stepping in, that we are prepared to step in. So we have partnerships with, for example, the Internet Watch Foundation to deal with child sexual abuse materials online. We’ve been working with them for five years, and so we have a process in place where we work downstream with the registrars, with the registrants, with the hosting providers to try and get that content removed so that we don’t have to suspend the domain name. Because the DNS is a really blunt tool to deal with website content. If there’s a piece of child sexual abuse material, one image on a file-sharing site, we at the registry level, we can’t remove that individual piece of content. The only thing we can do is suspend the entire domain name. And so while that might render that piece of content inaccessible by the domain name system, it also renders potentially the hundreds of thousands of legitimate or benign pieces of content inaccessible from the DNS. So it’s not impossible, but you have to be very careful and very deliberate when you’re dealing with online content abuses via the DNS.
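[Editor's note: to illustrate the granularity mismatch Brian describes, here is a minimal Python sketch. The URL, file name and the naive two-label heuristic are illustrative assumptions only; production code would consult the Public Suffix List (for example via a library such as tldextract) to derive the registrable domain.]

```python
# Minimal sketch of why DNS-level action is "blunt": from a reported URL, the only
# handle a registry or registrar has is the registrable domain, never the path.
from urllib.parse import urlparse

def registrable_domain(url: str) -> str:
    """Naive approximation: the last two DNS labels of the hostname."""
    host = urlparse(url).hostname or ""
    labels = host.split(".")
    return ".".join(labels[-2:]) if len(labels) >= 2 else host

# Hypothetical report of a single abusive file on an otherwise legitimate site.
reported_url = "https://files.example.org/user123/abusive-image.jpg"
print(registrable_domain(reported_url))  # -> example.org

# Suspending example.org at the registry stops DNS resolution for every page,
# subdomain and mailbox under it, which is why escalating to the hosting
# provider or platform (which can delete just the one file) is preferred.
```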
Auke Pals: Thank you very much. So we’ve heard some techniques for regulating and removing online content. I would also encourage the room to participate in this discussion, so it is not only us as a panel talking. If you do have an intervention, stand up and move to one of the aisles and we’ll give you a mic, so that we encourage all the interaction as well. If there is any intervention right now from the audience, please let us know; otherwise I would like to give the floor to our panel to give a little overview of regulations online, because there are quite some regulations dealing with online content, like the DSA, the UK’s Ofcom regime, and the terrorist content online regulation. Who can tell me something about those regulations?
Auke Pals: Who can I give the floor to?
Arda Gerkens: I can tell you about my regulation, yes. So the terrorist content online regulation is a regulation that basically gives us, or gives every competent authority in Europe, and in every country there should be one, the authority to send out a removal order to any platform or service that is aiming at customers in Europe. That goes very swiftly: it could be any language from the European Union, or maybe if you can pay for your services in euros, that’s already a European connection. Once we give you the removal order, the material needs to be taken down within one hour. So one hour take-down time, no questions asked; afterwards you can debate that. If you don’t take it down in one hour, we can fine you for that, and if you don’t take it down at all, we can fine you even more. If you are, for instance, based in the Netherlands, and we have some companies in the Netherlands which are really interesting, for instance Reddit and Snapchat, to name two, and Discord is based in the Netherlands, they have their legal representative in the Netherlands. So if they have had terrorist content more than twice in a year, we can then tell them that we have a special interest and go into talks with them to see how they can prevent material from being disseminated via their platform. So that’s my European regulation. For the Dutch regulation, I think it’s quite obvious that child sexual abuse material is illegal in itself; basically we shouldn’t even have to debate that, it just has to be taken down. And other regulations that are coming up: there is of course the Digital Services Act, we have e-evidence coming up, we have the video one, I don’t know that abbreviation.
Mozart Tenorio: We’re talking about the DSA; I can give a little introduction about the DSA, though maybe it would be better for a European to talk about it. But what I can say about Brazil is that European regulations as a whole are very inspiring for Brazil. We have some pieces of legislation being discussed, but we still don’t have any bill approved by both houses. What has been discussed in Congress now is pretty much inspired by the European experience, so the DSA, the DMA, how we can deal with that, but nothing has yet proved good enough for our congressmen to approve in a final stage. So maybe when we have something, it’s going to be inspired, as I said, by European legislation. What I see in those bills proposed in Brazil is that we are trying to aim at undoubtedly harmful content or illegal content. We’re not giving a lot of room for us, or for the regulator, or whoever it is, to have much judgment about that. So only what is really clearly illegal or harmful is going to be removed a priori, I would say. More or less like this.
Auke Pals: And we also have Ofcom, right? As a regulator.
Tim Scott: Yeah, this is working now. Yeah, so exactly to that point, I guess in the UK we’ve got the Online Safety Act, which is making its way through implementation at the moment. The illegal harms codes have just been published. I think there’s 2,500 pages of information and guidance, and that’s not an exaggeration. So they’re going through a quite forensic approach to this, which I liken to a DDoS attack in terms of trying to keep up with it sometimes. But what they’ve done is there’s clear illegal harms, but then there’s also content that is harmful for children, which is a far more gray area and a lot more open to interpretation in terms of how you react to it. I guess the focus from our regulator in the UK is about what is the evidence base, but also what is the mitigating factors that you’re doing on your platforms to keep users safe from harm. And that does give, again, to the opposite of what you’re saying in Brazil, it gives quite a lot of scope to work with the regulators to sort of say, look, these are the risks that we see on our platforms, but these are the measures that we’re taking, and this is the dialogue that we’re entering into to sort of demonstrate that we’re coming up to scratch on what you’re expecting. And I think it’s been a really collaborative experience thus far, which hopefully should aim towards meeting a shared goal. Colleagues and I have just come from a meeting with the Digital Cooperation Organization, and again, to that sort of point about what is it that you want to achieve as a regulator or as a government, what do we want to achieve for our users and what’s that shared ground, how do we reach that? And I think rather than top-down legislation, let’s ban X, and I don’t mean the company. Although, you know, let’s ban X, Y, and Z. Let’s have a dialogue about how we might sort of achieve that shared aim.
Auke Pals: So, Deepali?
Deepali Tamhane: Yeah, I thought that I could maybe give a little bit of overview on the issue of content takedown, how companies like us deal with it because we have to do it at a global level. We want to make sure that the internet is not splintered in any way. In terms of a lot of our policies and takedown of content actually predates a lot of this legislation. So, we have community standards that cover a wide variety of content, including content that is illegal, including content that may be harmful but not illegal. Child safety policies, for example, cover not just CSAM, but go well beyond that to cover non-sexual child abuse, for example. So, what we’ve done is we’ve always had an approach of removing this content, and we also have proactive technology that proactively removes it a lot of times before, even when somebody reports it to us. What we have to deal with at a very local level is each country’s regulatory or legislative regime dealing with specific content. So, we have geo-blocking policies, and those geo-blocking policies are designed, essentially, to deal with the laws of that particular country. So, as a company, we’re dealing with the laws of Brazil, for example, or the laws of India, relating to very specific content. Now, in each country, there is a very slight, I would say slight to broader gap between our community standards and our geo-blocking policies. So, I think the challenge really is for us to make sure that there is a category of content that we don’t allow on our platform, that we don’t think should be on our platform, plus the additional content of what country’s regulations require, and to try and bridge the gap between those two by having dialogues with the regulator. And it’s not necessarily that we agree all the time, but I think that these are really important discussions. to be had because content that is illegal in one country will be free speech in another country and we’ve dealt with a lot of a lot of those situations. So the question for us is what do we do with that piece of content?
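[Editor's note: as a rough illustration of the two-tier logic Deepali outlines, the sketch below separates a global community-standards decision from country-specific geo-blocking. The function shape, field names and country codes are assumptions for illustration, not Meta's actual systems or policies.]

```python
# Hedged sketch: content that violates platform-wide community standards is removed
# everywhere; content that is lawful under platform policy but illegal in specific
# countries is only restricted ("geo-blocked") in those countries.
from dataclasses import dataclass, field

@dataclass
class ModerationOutcome:
    remove_globally: bool = False
    restrict_in: set = field(default_factory=set)  # ISO country codes

def decide(violates_community_standards: bool, countries_where_illegal: set) -> ModerationOutcome:
    if violates_community_standards:
        return ModerationOutcome(remove_globally=True)
    return ModerationOutcome(restrict_in=set(countries_where_illegal))

print(decide(True, set()))          # removed for all users worldwide
print(decide(False, {"BR", "IN"}))  # stays up globally, hidden only in BR and IN
```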
Auke Pals: Arda?
Arda Gerkens: It’s really interesting what you’re saying, because you’re basically saying, we do a lot, and still we find Facebook challenging the removal orders for terrorist content because they said this actually didn’t happen from the Netherlands but from another country. Facebook stated that it’s glorifying terrorist actions, which is not illegal for Facebook, but it is illegal under European law. So I can still see, I mean, I sometimes find it difficult that we have these discussions, because you turn what for me is quite clear content into borderline content, because it’s different in different jurisdictions. And I do understand that sometimes that can be challenging. If you look, for instance, in Europe, whether it’s white supremacy content or whether it’s jihadist content, that’s quite clear, I think, to us; if it’s somebody from Catalonia speaking on something, then it might be viewed differently, right, by them. But to debate and to say that, you know, we do a lot, I still think if both of you, and I’m not just looking at Meta or at Roblox, but if the industry would do more, then we wouldn’t be having this conversation that we need regulation. Because I do feel that at some point we need to step up and say, hey, listen, here’s where we draw the line. And for me it was quite surprising to find that Meta, even when there was a removal order sent by a competent authority, didn’t even comply with that legislation.
Auke Pals: Gentle do you hear me yeah okay so there’s a gentleman in the room as well I want to reflect on that so I’ll walk over with the microphone
Andrew Campling: Hi, thank you. My name is Andrew Campling. Amongst other things, I’m a trustee for the Internet Watch Foundation, which is focused on the removal of CSAM from the Internet. Well, firstly, I say thank you to PIR for partnering with us and their efforts, entirely voluntary efforts, which are making a big impact. But to challenge, though, tech choices are often made by the platform operators, which, in my view, give them plausible deniability for not acting, to the point about not complying with domestic legislation. So, for example, switching on end-to-end encryption, which hides the existence of CSAM being distributed actively on the platforms. I’ll pick on Meta because I happen to know the numbers, but this is equally true of other platforms. So, Meta, on Facebook, about 30 million or so instances of CSAM being found a year, and then they switched on end-to-end encryption on Facebook Messenger. So, of course, you can’t see the content. It’s implausible that it’s gone away, but it’s a way of avoiding having to do the moderation. So, the terms of service that the platforms have are useful, but they’re not actually enforced most of the time. So, for example, you don’t have meaningful age verification, and, you know, surprise, children tell lies about how old they are. So, again, I’ll pick on Meta. Apologies for doing this twice, but, you know, the report account shows billions of dollars of profit made from children who are too young to use the platform, and that’s equally true of the other platforms. that don’t have appropriate age verification controls or even age estimation controls. So to get to the point, regulation is absolutely necessary but only if it’s attached to significant consequences for the platforms and preferably for their senior execs. And the example I’ll finish with, Telegram was an outlier in this space until their CEO was arrested. They immediately joined the IWF and are now actually actively searching and removing CSAM from their public channels. So that shows that when the senior execs are in jeopardy, they will comply with local legislation and maybe that’s where we need to look for greater consequences. Thank you. Thank you very much. I think I’ll give the floor to
Deepali Tamhane: Deepali. There you go. That’s fine, it’s a Meta panel. So, in the transparency report that we had for this quarter, just off the top of my head: for CSAM particularly, I think we would have removed more than 7 million pieces of content; for bullying and harassment, also 7-plus million pieces of content; for suicide and self-injury, more than 5 million pieces of content, and that’s just this quarter. And we proactively remove most of this content before somebody even reports it to us. For CSAM, we probably report the highest volume of numbers to NCMEC every year, and I don’t know what the number is this year, but I don’t think that suggests that we don’t do anything. I think that we do do a lot in terms of the content that we’ve removed. In terms of end-to-end encryption, again, I think the suggestion here is that end-to-end encryption means that there is no safety. We actually work to ensure that there are safety mitigations in place when there is end-to-end encryption. So from the safety side of things, we designed a prevent, detect, remove response to end-to-end encryption, which says, let’s think about preventing these interactions in the first place, and that’s why we’ve put in really strict messaging restrictions on Messenger and on Instagram, so that a young person is not able to be messaged by an adult or a teen who they’re not connected with or who they don’t follow. Also, end-to-end encrypted doesn’t mean that your content can’t be reported; it means your content can be reported and then we can take certain action. In terms of public surfaces, we can also use our technology across the public surfaces of end-to-end encrypted services, and we do that as well, and we’ve reported content that we found. If the conversation is about whether end-to-end encryption is bad and whether we should not have it, I think there are a lot of privacy advocates who would disagree. I’m curious if we do have them in the room. Do we have any in the room? But I think that that’s a different conversation, and that’s a debate that we’ve had in many countries, but I don’t think in any country they have decided to have a regulation that bans end-to-end encryption, because creating a backdoor for one means creating a backdoor for everybody, and there are a lot of considerations we need to take into account. But that’s a separate discussion and also a very important discussion. But just to my point: just because a service is end-to-end encrypted doesn’t mean that we can’t have safety mitigations. We can’t see the content, but we can do a lot more.
Auke Pals: Thank you very much. I also put a question on the screen, so I would encourage everyone in the room to participate and log in on the website menti.com using the code 7736669, to answer the question: in what way can we shape collaboration between regulators and industry without regulators losing their independence? And we also have some regulators in the room, so while we collect the responses, I’m also curious about the perspective of the regulators. Who can I give the floor to?
Mozart Tenorio: I think I can talk a little bit as a regulator, but I would also like to share the experience in Brazil. As I said, we don’t have a regulator for the DNS or IPs, but we have a multi-sectoral, multi-stakeholder entity which deals with that, and the regulator is a member of the board of that entity, so we can interact with them at any time and come to reasonable solutions together. In Brazil the regulator, as in the UK, and I know Ofcom a little bit better, has panels that include civil society, consumers, the private sector, other government branches, et cetera, so they can constantly give feedback about what we’re doing. And finally, they can even challenge our decisions in court if they think it’s appropriate. So that’s a good kind of checks and balances: involving all of society, all the sectors, while still keeping the regulator independent. So that’s something that I would like to share.
Arda Gerkens: Yes, I do find this very challenging. Because indeed, as a regulator, you should keep your eye on the ball, right? What we want to do in my organization is make sure that, in the end, the internet is cleansed of terrorist content and child sexual abuse material. But we also realize we cannot do this without collaboration with every party involved, because you will always have bad actors and people who deliberately put this material out there. So it’s a fight, and we need to fight together. But I do find it difficult. First of all, there’s a lot of debate in my country on the possibility of geo-blocking or doing anything at the DNS level, so you have to have a debate on that. And the other thing is that, in the end, if I talk to the platforms, I mean, let’s be honest, you’re big companies, right? You’re in here for profit, of course with a nice tool maybe, but you’re in it to make profit. So it’s also going to be challenging not to be led too much by the information given by the companies. But on the other hand, for instance, if you look at the debate on end-to-end encryption, if you’re not very well informed by either stakeholder on the internet, then you might say things like “break end-to-end encryption”, which I think is really not good for the safety of children online either. So yeah, it’s very hard to find a balance, but I do think it’s very much needed as a regulator to do this closely together. And so I would really be interested to see what the public has to say, what they see as a challenge for us as a regulator, but also maybe chances for us.
Auke Pals: Yes, indeed, and we collected some responses so I’m quite curious about Clarity of responsibilities within the ecosystem response. So who gave that answer in the room? No, no one in the room maybe online any shy people Yes Thank you
Audience: In the Netherlands, we are working on a public-private partnership on this theme of content moderation. And I think this is what we really need: we need clarity of responsibilities, because it’s a complex ecosystem with lots of responsibilities, so we need to talk about where the responsibilities lie, how to actually check on these responsibilities, and whether they work out the way they should. Thank you.
Auke Pals: Anyone want to reflect on that? Thank you very much. Otherwise I’ll pick another one.
Audience: Yeah. Hello everyone. From the Brazilian Association of Risk Providers. Is this working? It is working. Okay. I just want to reiterate that point on the clarity. I think for the case of Brazil as well, we have been dealing with a lot of issues from the lack of clarity and responsibilities, because the regulators sometimes ask to do things that are outside of its control, and sometimes the ISPs have to do some DNS blocking or block services like X, and sometimes they do not have the capacity to do that in the speedy manner which the regulator would like. So we have this back and forth. So it’s always a challenge, and clarity of responsibility I think would be the most important thing. Thank you
Deepali Tamhane: very much. Go ahead Deepali. Yeah, I just wanted to say that I think it’s really important for companies like us to have a continuing dialogue with regulators, especially as they’re in the process of legislating, or more importantly passing rules. So for example, in Ofcom I know that our teams have done a number of deep dives with the Ofcom team. We’ve met them a lot, and we’ve also consulted with them on a number of rules that they’re thinking on passing. I think those dialogues are really important, because companies like us can talk about the work that we are already doing, and also understand the intention of what those rules are, and ways we can make sure that the rules are being followed in a way that is consistent with the rules that we have already passed. we can get to the substance of the matter. So I think that that sort of dialogue is really important and I think that that should definitely be encouraged.
Mozart Tenorio: Thank you. Just to add a little to what Deepali just said: I agree with her, I think it’s really important and it’s good for the industry, but sometimes other sectors of society see that as some kind of loss and some kind of threat to the independence of the regulator. But I agree with her anyhow. And just to add a little bit to what was said earlier about X, and maybe bring Brian into the discussion as well: the X issue in Brazil was a very interesting one, because the court order came and it was about taking the service down for a certain time until they complied with the laws in Brazil, and the DNS name was outside Brazilian jurisdiction, outside Brazilian sovereignty. It was not a .br domain, so we had to go through the ISPs’ and telecom operators’ networks to comply with this court order. So it’s a little bit trickier, it’s harder to do; we cannot do that within our own sovereignty. So I believe we should be talking internationally about that and maybe establish some agreements, at least so that court orders are known to each other, and things like that. I know Brian can talk about .org; we can talk about .br, which we call, I don’t know if everybody is familiar with that, a ccTLD, a country code top-level domain. But if it’s another country’s code, we cannot do much in Brazil. So I believe we really have to discuss that internationally, like we’re doing here now.
Auke Pals: Yeah. Thank you. And Brian, do you think that content moderation should be a joint effort?
Brian Cimbolic: Yeah. You know, I come at things from the voluntary practice model more than the regulatory model, in that, again, we’re sort of different, especially from a Meta or a host, in that we don’t have content; we’re infrastructure, we point via the DNS. But that doesn’t mean there’s not a role for us. The gentleman from the Internet Watch Foundation mentioned a sponsorship, or really a partnership, that PIR put in place, and I just put it in the chat, that any registry, be it a ccTLD like .br or a gTLD like .org, can take advantage of. We’re a not-for-profit, PIR is a charity, which makes us slightly different from most gTLD registry operators, and so we’re sponsoring this as part of our nonprofit mission so that any registry operator can work with the IWF, the Internet Watch Foundation, to receive notifications when child sexual abuse material is found in their TLDs. They can also take advantage of programs to block or prevent the registration of domain names that are known to be used for CSAM in other TLDs. So I think there’s a lot of room for voluntary practices and collaboration between registries, regulators, and organizations known as trusted notifiers. That’s the term we use in the DNS space where you work with an expert organization like the Internet Watch Foundation, because we as registries don’t have the expertise or the tools or even the content to go out and look for and try to find CSAM; in fact, it’s illegal for us to do so. So it’s pretty essential that we work with organizations like the Internet Watch Foundation or other trusted notifiers; we work with the Red Cross, for example, on identifying fraudulent fundraising sites. So working between industry and NGOs is really, I think, where there’s a lot of room for improvement, across platforms, across registries, across registrars, and that’s something that we’re obviously actively exploring.
Auke Pals: And that also requires, I guess, shared resources. So I’ve seen the response from the audience on shared resources, non-material hash databases. Who has given that answer?
Audience: Yeah, John. Yeah, well, basically, in answering this question, I can imagine that technical resources that you can share in a safe way are maybe one of the easiest things to set up while keeping the independence of your regulators. So that’s why I answered that.
Auke Pals: Yeah. No, no, that’s clear. Arda, can I give you this?
Arda Gerkens: Because I think that we have a problem here. And it’s also, I see more remarks on sharing of information between public-private partnerships. I do know, for instance, that for the platforms, it’s difficult to accept hash databases coming from a governmental organization. Because that would basically indirectly mean that a government is telling them what to host and not to host on their platforms, right? So and on the other hand, I do think that we need to solve this problem. Because in the coming years, we will build up a lot of information or databases on either child sex abuse material or terrorist content. I would very much be interested in having the databases that you’re having, because I’m pretty sure you have some. And I think you would be very interested in our databases too. So how can we solve this problem?
Deepali Tamhane: I think that, you know, as a U.S. company we are required to report to NCMEC, which is where these considerations come from. And we’ve actually worked with NCMEC in the past to try and address some of these issues, not this one specifically, so I’m just broaching that as an idea of something that maybe we can talk to NCMEC about. One of the things that we’ve done is, as you know, we report to NCMEC and NCMEC works with law enforcement offices across the globe. We’ve worked with countries to develop something called an NCMEC downloader tool that actually helps to download this kind of content in a very safe way, because when you’re sending across these reports, it’s important that they be done in a secure and private way. We’ve also worked with the Tech Coalition, which is an alliance of tech companies, to go beyond just the content. The Tech Coalition is an organization where participating companies can share signals about accounts and behavior. So this goes beyond content, because we know that predators don’t stay on one platform, right? And I’m talking specifically about CSAM here. They’ll move from platform to platform. So what Project Lantern has done is enable us to share those signals, and other participating companies can receive them and do investigations. For example, in the pilot phase of Project Lantern, we received a number of links from Mega which violated their child safety policies.
Arda Gerkens: That’s still not between government organizations, no?
Deepali Tamhane: Exactly. But what I’m trying to say is that there are ways available for us to address CSAM, and to go beyond CSAM and address signals and behavior as well, and be able to address this issue at a holistic level. But specifically to your point, I don’t know if you’ve had any discussions with NCMEC, but we’re happy to…
Arda Gerkens: But I also think that NCMEC cannot solve this problem, because this is what really hampers us if we want to work together: that you cannot simply receive information from governmental organizations with the question of whether you want to take that content down. And I do acknowledge that Project Lantern is a perfect project. It has helped identify perpetrators sending this material on your platforms. It’s very good that you talk amongst each other; I’m very happy that you’re doing that. But still, this will really hamper us in sharing information about material that’s being spread so rapidly online, while we really want to stop that. And, you know, if we look at what we’re doing on terrorist content, where we have the incident response after the Christchurch shooting, we all know what happened: this shooting was disseminated online rapidly and in different kinds of formats. It would be so great if we could also do such a thing for child sexual abuse material. Because if we look at the figures, and that’s something that you know very well, we also know that a lot of these images are duplicates. They’re not unique. A lot of it is duplicates, some of them going viral, and we would like to cooperate to see how we can stop that. So as long as we cannot share this kind of information with each other, it’s not very helpful for the kids.
Auke Pals: And so, also a remark from the room.
Andrew Campling: Hi, sorry, Andrew Campling again, just two brief comments. Firstly, going back to end-to-end encryption: one of the easy solutions, which doesn’t break encryption and doesn’t breach privacy, is that any of the platforms, if they chose to, could scan content that’s being uploaded, and I’m speaking specifically of image files, to see whether it contains known CSAM before encrypting and sending the messages. It doesn’t break encryption, it doesn’t break privacy, and it would immediately impact the scale of the problem, which, to quantify, we’re looking at roughly three new victims of CSAM a second, which is over 300 million a year, which is a scary number. And then just a second very brief comment: it’s worth bearing in mind that the tech industry as a whole is actively changing some of the underlying internet standards, again arguably because of privacy, in ways that will bypass many of the existing protections. So it will be quite hard in the future to stop access to, for example, Kiwi Farms-type sites, or to know that you’ve got effective parental controls, because some of the changes to the internet standards will mean that parental controls don’t work anymore. So I think that’s again an area where maybe regulators need to challenge the behavior of some of the tech companies and really question whether their motivation is really being helpful, because it will break even some of the existing, fairly weak enforcement mechanisms. Thank you.
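[Editor's note: a minimal sketch of the "check before you encrypt" idea Andrew raises, assuming a client-side list of hashes of known material supplied by a hotline. The hash values, function names and the stand-in "encryption" step are purely illustrative; real deployments use perceptual hashing (PhotoDNA-style matching) so that near-duplicates also match, whereas the exact SHA-256 comparison below only catches byte-identical copies.]

```python
import hashlib
from typing import Optional

# Placeholder digests standing in for a hash list distributed by a hotline
# such as the IWF or NCMEC; real lists use perceptual hashes, not SHA-256.
KNOWN_MATERIAL_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}

def matches_known_material(image_bytes: bytes) -> bool:
    """True only for byte-identical copies of a listed item (a deliberate simplification)."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_MATERIAL_HASHES

def prepare_outgoing_image(image_bytes: bytes) -> Optional[bytes]:
    """Run the check on the client, before any encryption; return ciphertext or None."""
    if matches_known_material(image_bytes):
        return None  # block the upload (and, in a real client, report it)
    return bytes(b ^ 0x5A for b in image_bytes)  # stand-in for the real E2EE step

print(prepare_outgoing_image(b"holiday photo") is not None)  # True: sent as normal
```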
Auke Pals: And I also saw, next to you, someone; no, you don’t want to react? But you did have an opinion, I saw that. No? No, it’s okay. Okay.
Roelof Meijer: Thanks, Auke. Roelof Meijer from SIDN. I’m still chewing on that question a bit, because I would have understood it if the question was in what way we can shape collaboration in order for it to be successful, without losing independence. I think sometimes regulators claim that they cannot collaborate, because otherwise they lose their independence, so they sometimes prefer: you just do what we tell you to do. I think the industry then often reacts with: okay, we will do exactly that, at the latest possible moment. So, second to the point that we need clarity on responsibilities, I think it’s very important that we also have clarity on the ultimate purpose of the collaboration. And that is also where it sometimes goes wrong: the regulator wants you to do what they tell you, and the industry just wants to spend as little money on the effort as possible and to make as much profit as possible. I think there is a very strong interdependency. Regulators can never be successful if they don’t collaborate with the industry, because they will have laws, but they will not prevent what they forbid. And in the end, the industry is very dependent on regulators because they need a licence to operate. I think for quite some time they will think that they don’t need that, but in the end they will. So, yeah, that’s my two cents on this.
Auke Pals: Thank you very much, Roelof. I saw someone in the room as well. Please state your name and organization.
Mauricio Hernandez: Mauricio Hernandez from Mexico. I just want to share with you, sorry for my cough, that part of our duties as industry, academia and even regulators is to be aware of bad practices we can all fall into. From industry it’s very common, and you can try it right now with one of our domain providers: try to buy a domain with the words “child sex”, and they are available. There are no limits, just pay. So I think this could be a good beginning, just an example, of what we need to do as industry in creating guidelines or good practices, and in some countries in my region, LATAM, regarding cookies, and in countries like Brazil and Chile that are now developing their privacy laws, to create guidelines and good practices not to give these options to customers, because this is one of the back doors that the lady has mentioned. They are open worldwide, creating these areas of opportunity to upload illegal content. That’s my comment.
Auke Pals: Yeah, thank you. Brian, do you want to reflect on that?
Brian Cimbolic: Yeah, actually, I was about to raise my hand to do so. That’s one of the programs that we have in place. It doesn’t, for example, take a term like “child sexual something” and block that term. What it does is this: the IWF has identified domain names, registered across TLDs, that have been dedicated to child sexual abuse material. Let’s say, for the sake of an example, it’s bad-CSAM-domain dot something. The registry operator suspends that. When bad-CSAM-domain dot another TLD pops up, that domain name has hopped. Once that domain has hopped twice, the IWF adds it to a list it maintains, and now any registry can receive that list from the IWF, again under this sponsorship with us, so that it can prevent registration of that term in their TLD. So it really helps to protect the TLD from being abused, and it also helps to disrupt these commercial brands; unfortunately, there is a sort of brand recognition with known peddlers of CSAM. So if bad-CSAM-domain dot org is suspended and it hops to bad-CSAM-domain dot example, the consumers of that recognize the brand. It doesn’t solve CSAM online, but it helps introduce friction and makes it harder for the bad guys to continue their brand online. But coming back to the earlier point of, well, why don’t you just block anything that says CSAM or whatever: we’ve actually explored that in other discussions with regulators, around opioids and narcotics online. An interesting thing we ran into is that they wanted to block known terms for opioids or narcotics, and we were interested in that. Then they provided us the list and it included things like lemonade; they wanted anything with the word lemonade to be banned online, not recognizing that there might be some really legitimate uses of the word lemonade. So, to the core of the question, there has to be a sort of good-faith feedback loop between industry and regulators, to ensure that we give regulators the tools they need to have good and educated regulation. Having that feedback mechanism is key, because you don’t want to inadvertently block, for example, any registration with the word lemonade in it. There are lots of generic terms that are street names for drugs, and that’s bad, but those street names also have legitimate uses online, and you don’t want to inadvertently hamper speech online.
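[Editor's note: for illustration, the sketch below shows the shape of the registration check Brian describes: a registry consults an IWF-style list of second-level labels that have already "hopped" between TLDs for CSAM and refuses new registrations of those labels. The list entries and the function are assumptions; the real IWF feed and registry provisioning hooks are more involved.]

```python
# Hypothetical feed of labels that have hopped across TLDs (placeholder values).
HOPPED_CSAM_LABELS = {"bad-csam-brand", "another-listed-label"}

def registration_allowed(domain_name: str) -> bool:
    """Allow registration unless the second-level label is on the hopped list."""
    label, _, _tld = domain_name.lower().partition(".")
    return label not in HOPPED_CSAM_LABELS

print(registration_allowed("charity-run.org"))         # True
print(registration_allowed("bad-csam-brand.example"))  # False: blocked at creation
```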
Auke Pals: Mozart, I saw your hand as well.
Mozart Tenorio: Sure, I would just like to add two little pieces of information about those issues. In Brazil, we do have a regulation to avoid these kinds of names in domains, but it doesn’t always work, so it has its flaws. Recently, in a judgment in the Supreme Court in Brazil, the justice asked whether he could go online and register the domain name devtodemocracy.com.br, and he shouldn’t have been able to do that, but some guy saw it because it was on TV, and he did it, he registered that domain. Within our regulations, we could suspend this domain within a few hours, without a court order in this case, but not everything always works. And just to add to this point about the independence of the regulator, I would like to say that one very recent piece of legislation proposed in Brazil was talking about self-regulation of the industry: the industry could create an entity to self-regulate itself, and that entity would be overseen by the state regulator. So it’s possibly a kind of midway measure to deal with this lack of… possibilities of the regulator to predict everything, and so the industry
Auke Pals: could be a little bit more comfortable. Thank you very much. To all, I’ve put a new question on the screen so you can log on again at menti.com and answer the questions. And while you do that, I will give the floor to Brian to answer this
Brian Cimbolic: question. Yeah, thank you very much. So the question, what role can technical and infrastructure actors play in combating illegal material online? So I’ve sort of already covered the drawbacks or technical impediments to dealing with content online via the DNS. If you think of the domain name Craigslist, in the U.S. it’s based on .org. In a lot of around the world it’s based on different country codes. But if someone uploads a piece of CSAM to Craigslist, is the right solution for me as the registry to suspend craigslist.org, rendering all those millions of pieces of legitimate content inaccessible? I think we’d all agree that the answer is no. But that doesn’t mean we do nothing. So that’s why I think it’s incumbent on infrastructure actors to, when the scale of harms is so great, to have processes in place to work to notify downstream. Now I do also want to draw the distinction. If you have a site that’s dedicated to something like CSAM or threat to human safety, which we have come across, to me there’s no issue. Registries should step in right away and suspend that. These sort of principles are actually codified in a document that came out in 2019 called the Framework to Address Abuse. And this is a document that was originally it was 11 registries and registrars that signed on. Now it’s more than 50 GTLD, CCTLD registries and registrars. And it basically has two principles. The first is that registries must step in, registries and registrars must step in when faced with DNS abuse or technical abuses, those phishing malware botnets, et cetera, et cetera. But then it accepts the premise that generally speaking content is best dealt with at the level in which it can be directly removed, the hosting provider level, the platform, et cetera, et cetera because of these issues of collateral damage. But in instances of CSAM, threats to human safety, human trafficking, and opioids and narcotics online, the registries and registrars should step in and do something to try and disrupt those sites, whether that’s referring to the registrant and demanding that they remove the content, whether it’s threatening suspension of the domain name, that those categories when the scale of harm is so great, the burden sort of shifts. And I think that it does become appropriate for the registry or registrar to step in to do something to try and disrupt that content. Again, it doesn’t always mean suspending the domain name because of the issues of collateral damage, but to try and do something, I think that it does become incumbent upon the registry and registrar to do something.
Auke Pals: Thank you. Any reflections?
Arda Gerkens: I think it's very interesting. If you look at the terrorist content online regulation, we can only send out removal orders to the party that has published the information on behalf of another actor. Basically, 99% of the time that's platforms or social media, and 1% of the time it might be a hosting company for a website. Whereas under the child sexual abuse material legislation, we can actually go up the line. The problem that we encounter in the Netherlands is that we have some rogue hosters who apparently host a lot of this material, and they basically always say, yeah, there's nothing we can do about it, because we have unmanaged hosting, we don't know who our customer is, we cannot reach them, so there's no way that we can act. Hopefully the legislation, which is quite new, so I don't know yet how effective it will be, will allow us somewhere down the line to say: whether you know your customer or not should not be our problem, that's your problem, so you need to do something. If you cannot find your customer, I'm going to come to you, because you are the one giving the customer the possibility to host this kind of material. But it is something we are still struggling with, and therefore I'm also seeking collaboration with the infrastructure parties to see in what way we can be effective enough without taking down the whole of the internet, or, I don't know, 30 other websites, when you just want to take down that one image. So it's still difficult for us, and I'm really seeking cooperation here.
Auke Pals: I'm also curious about the questions that have been coming in from the audience. So I'm curious, who made the comment about the Clinton administration approach? Is that person online? No? No one, or shy people? No. Then let's move on. Then I'm curious about what you've been writing down.
Abhilash Narayan: If you must know, I was rudely texting somebody; I wasn't writing anything down. But I'm Abhilash Narayan from the University of Exeter in the UK. Much of what I wanted to say has been covered here, but I just wanted to go back to an earlier point. I had a little to add on age assurance, which Andrew mentioned earlier, and what regulators can do. We did a pan-European study a couple of years ago for the European Commission, where we looked at the state of play of age assurance technologies in terms of how they help children, and we found that the ground reality was that there was hardly anything out there. One of the reasons for that was that the industry really did not know what standards to use, what to use at all really, and regulators provided very little guidance; that came up quite a lot. So one of the recommendations we made, going back two years now, and Ofcom has done it recently in the UK, was that regulators need to help the industry by issuing guidance and standards so that companies can comply with the law. It's not enough for legislation to stipulate that something needs to be done; companies actually need a bit of help. I'm not talking about big companies like Meta, who have the resources for that, but there are also smaller companies who have obligations to comply with, not just the Audiovisual Media Services Directive, but also rules on the sale of online goods that are inappropriate for children, and everything else.
Arda Gerkens: Thank you. That's a really very good point. We know, for instance, that there are several initiatives out there. You were talking about the Technology Coalition, which is a coalition of platforms to combat child sexual abuse online, and there's also Tech Against Terrorism and GIFCT, which are both organizations that help the platforms, or cooperate with them, to combat terrorist content. And certainly for the smaller companies, all this legislation is really hard; there is so much legislation coming at them. So I really agree with you. What we do in our organization is to see what other initiatives are out there and help give companies guidance on where they can find them, because so much has already been done, but it needs to be implemented.
Auke Pals: Yeah, it's a very good point. Thank you. I also saw an intervention from the back of the room.
David McAuley: Thank you. Can you hear me? Is this on? Yes, yes. My name is David McAuley, and like Brian, I work for an internet registry, named VeriSign. I just want to answer your question, although I'm not on Menti right now. A couple of things technical companies can do is to reach out to two target groups. One is government regulators: to talk to them, to try and explain what we do, how esoteric it is, all the implications of taking action and what that means, and vice versa, to hear the same concerns from governments on terrorism and things like that. The other is to reach out to groups like this. I think participating in a session like this answers your first question, how we can share information and collaborate without regulators losing their independence. So I would encourage organizations that hold sessions like this to do more of them. I took part in one that had to do with DNS abuse about a month ago, and it was an eye-opening session where internet registrars spoke about the implications of taking sweeping actions and the need to be circumspect about the orders that might come. Very informative stuff. So I want to thank you for the session, and that's my way of trying to answer two questions at once. Thanks.
Auke Pals: Thank you very much. Any other comments from the room? If not, what does the industry think of the remark that was just made?
Tim Scott: I haven't said anything for a minute. So, on that last comment exactly: in the UK, I've worked in and around the games industry for about 20 years, in the UK government, then in the trade association, and now for Roblox. Some of the initiatives we've been involved with have taken exactly that multi-stakeholder approach, where you've had things like the NCA, the National Crime Agency in the UK, working with Ofcom, but also working with the trade bodies and then with the individual companies in a forum to share best practice, to tackle problems, to look at whether there's a white-label solution, particularly in the area of CSAM, excuse me, CSEA. And it's been incredibly effective. The problem is maintaining that impetus as people come and go in roles. You will have someone incredibly proactive within, say, a government organization, who is promoted or retires or goes somewhere else, and they take that knowledge and that enthusiasm and that willingness to collaborate with them out of the building. It's quite difficult for industry to establish those government relations in a really meaningful way. So I echo the sentiment of the gentleman in the corner there that this sort of forum, and the IGF, is a great way of doing that: we've got people from around the world here talking about shared issues. The other thing I'd say is that they truly are shared issues. I think we've almost got the wrong people in the room if you want to tackle some of these problems, because we do care, right? If our platform becomes synonymous with problems, people won't be on our platform, and it won't be economically viable for us to continue to run it. There are other people who are less concerned about these sorts of things, and getting at those people is the real challenge.
Auke Pals: Arda, go ahead.
Arda Gerkens: I think you do yourself short here, I don't know if that's an English saying, but the fact that you are here and stepping up means that the others need to comply more. If nobody who complies were here in the room, the job would be much harder. And for me as a regulator, being able to point out, if I ever have to go to a judge, that there are policies out there which are common within the industry really helps me to fight that battle. So I'm really happy that we're here in this room together. And I also think the discussion on DNS abuse, to me, that's really a new sound, because for many, many years we didn't want to interfere at the infrastructure level. We always said, no, no, no, that's content, we cannot touch content, and we cannot mix content and infrastructure. So I'm really happy that we're having this discussion now and that we're looking at the possibilities here, because we need to do more than just look at platforms.
Speaker: I'm sorry, I would just add a little piece of information once more, because of what Tim just said. If I understood it well, Tim, maybe part of the answer to your question is that we should have more stability in public service staff, something like that.
Tim Scott: If not necessarily in the staff, then in the policy approach. So when somebody leaves, they don't take the impetus and the proactive, collaborative approach with them; it is handed over with the role as opposed to staying with the individual.
Speaker: Sure, the culture of the regulator. And part of the answer to that is in the first question, that is, independence, because then we don't change when governments change. If a regulator is truly independent, the staff will carry on even though the president or prime minister has changed, so we can keep on doing our job in a consistent way over time. So the independence of the regulator is part of this answer, and it's really important for us.
Auke Pals: But you all mentioned that you do care. And in France, the owner of Telegram was arrested. Is that a way to deal with non-cooperative parties?
Arda Gerkens: No, you don’t want to answer.
Deepali Tamhane: I did, I actually wanted to come in on the last point, if I may. I do think that there's also a role here; I mean, working on the safety team, we work with a lot of safety organizations, and I do think there's a huge role that they play in helping combat a lot of harmful content online. There is Project Lantern, which is run by the Tech Coalition, an alliance of tech companies, but we also have equivalent things like StopNCII.org, run by the Revenge Porn Helpline, and Take It Down, run by NCMEC, which allow participating companies to receive hashes from them, so we're able to remove that content if it is uploaded. Safety organizations that we work with also help us design certain interventions and communications that may be helpful for users as resources. So I think that they're definitely a very important part of the ecosystem. And for the Telegram enforcement, should I hand it to you?
Speaker: Just to say something really quickly, because that question also carries something important: clearly, cooperation is uneven. That's why we need legislation. We need to set a bar for everybody, the lowest acceptable level of cooperation. We've got very cooperative companies and others that are not. So I believe that is one of the reasons why we really need regulation.
Tim Scott: Just to finish on that point, the risk, obviously, with doing that is that you lower the bar. By not making it mandatory, you get good actors who go above and beyond and actually do really good work. If you codify it and set it as the minimum standard, everyone says, well, that's all I need to do, so I'll just tick that box and move on. We then don't get the innovation. I would not happily change places with you, although Brazil probably has slightly nicer weather.
Speaker: No, but exactly. That's why there's no easy, straight answer. It's a very tricky thing to do, so we have to do it the best way possible, and you're right.
Auke Pals: Arda, did you want to answer the Telegram question?
Arda Gerkens: The interesting thing is that you said that Telegram is now complying. Well, they were already complying with the terrorist content online regulation. They were very compliant for our organization and for others in Europe too, but they just really don't do enough, because I can still see channels with names like Renton Terrace, Christchurch shooting, or other names which clearly point to the type of material that can probably be found in those channels. So even if the owner is arrested, they may take a little step; I don't know if that will solve the problem in the end. And again, here I do think it would be nice if the industry itself could also put more pressure on Telegram to be compliant. Because, you know, I sometimes feel that we have the good actors here at the table and the bad actors are not at the table, and I don't agree with that division. I think you are one group, and you should also talk amongst each other, and if you have bad apples in the group, you could help to get them out.
Roelof Meijer: Thanks, Auke. Roelof Meijer, SIDN again. SIDN runs the .nl country code top-level domain. My answer is the bottom-right one. I think technical and infrastructure actors can do a lot, but all those things cost money. We are not-for-profit, so for us it's relatively easy, although the costs still have to be justifiable. But for a commercial company to spend a really significant amount of money on combating crime, online or offline by the way, it has to be part of the strategy, the culture, the company values. And I think that is very often the problem: it is seen as a cost. And as soon as you arrest the CEO, suddenly these are justifiable costs, because by spending them you will prevent the second arrest of the CEO. I'm exaggerating a bit, but I think that's very often the point: shareholders will complain, because as long as these measures are not enforced, spending on them is a voluntary choice, so you can also choose not to spend the money. So I think our biggest challenge is making sure that, in the end, fighting abuse is part of the culture and the values of the very large commercial companies as well.
Auke Pals: You want to react? Yeah, thank you.
Andrew Campling: I completely agree with all of that. I think the point of arresting the CEO or other senior execs is that there have to be meaningful consequences for non-compliance, and increasingly, for some of the big players, the level of fines imposed is just not meaningful. Even a fine of $12 billion for Apple, a $3 trillion company, is still not really material. Whereas arresting the CEO focuses minds and does drive action. And then, just briefly, the other point: yes, there is a risk that legislative action to enforce compliance might lower the bar for some. That's a fair challenge. But of course the regulation doesn't have to be static. There's no reason why you can't raise the bar every year so that you keep pushing companies to do more and try harder, and I'm sure regulators would be entirely up for doing that. And having a general duty of care, for example, absolutely raises the bar, because you are then wide open to challenge if it can be shown that there are widespread abuses, irrespective of whatever measures you've taken.
Auke Pals: Thank you.
Tim Scott: I mean, we could keep trading this all day, Andrew, but I guess that then raises the bar to entry into the market if you continue to do that, which would then stifle innovation and consolidate the position of the big incumbents that are causing the problems. So this is going to go back and forth if we… but I take your point.
Deepali Tamhane: The only caution I have is when we talk about arresting CEOs. I think that there are a lot of countries where a lot of political speech is also illegal, and I think we have to be cognizant of those concerns, because a lot of companies, including ours, have dealt with pressure from governments to remove content which we would not consider illegal, but would consider fair political criticism.
Auke Pals: I see your comments from the room.
Audience: Hello again. I'm speaking personally, not for my organization, and I'm trying to play a little bit of devil's advocate here. Going back to the point of clear responsibilities: aren't we putting an oversized responsibility on the platforms to solve an issue we as a society haven't been able to solve in such a long time? I mean, are we doing enough as a society to solve the problem at its roots, and not only at the distribution level where it's spreading? Maybe we're putting too much on the platforms and not doing enough to help them achieve this shared goal, because the discourse has been a bit of us versus them, pushing back and forth, and perhaps not enough collaboration and multi-stakeholder work, with everybody sitting at the table and trying to actually find the root causes.
Auke Pals: Oh, thank you. And I did see some comments from the room on public-private partnerships as well. Does anyone want to give a comment about that?
Arda Gerkens: Well, I totally agree with you. I do believe that many times, for the problems that we have, people are looking at the end of the line: how can we look into encrypted environments to stop the spreading of child sexual abuse material? Well, maybe we should go back and address why this kind of material is even out there in the first place, because much of it doesn't even come from abuse; it's voluntarily leaked images, or AI-generated, or whatever, which is still a form of abuse. So I think there should be much more attention in policy to the beginning of the chain, to see how we can solve this as a society. But I also think that what we really need is more cooperation, and here I'm looking at the regulators as well. For instance, my organization is part of the Global Online Safety Regulators Network, which now has eight regulators in it, and it's very interesting to see that some of the platforms, which are not at the table today, basically think that we don't talk to each other, so they tell one regulator one thing and we then find out they told another something different. That helps, but I think we need to do more of it. And I'm also worried, because you're talking about the height of the fines. We can alter that; we can make it 10, 20, 30 percent, whatever, but those are the bigger companies, the big tech. We see a lot of problems with the smaller tech, the smaller platforms that we're not talking about here, it's Gab, it's… you name them, smaller platforms that really don't care that much about regulation. And for us it's going to be difficult to collect those fines, if we can even impose them. We're going to need cooperation between countries to make sure that if I cannot collect a fine, I can go to your country and you can help me collect it. The internet is global, so the solution needs to be global, and the cooperation needs to be global, to even be able to tackle the problems that we have.
Auke Pals: Thank you. And meanwhile, as you think about your answers, I've put a new question on the screen, so please all grab your phones, join menti.com and use the code. And Deepali, I can give you the floor.
Deepali Tamhane: I just wanted to agree with the point that you made. I think that's really, really important, and when we make it, I don't think it's taken as seriously, but it's really important to have that conversation. One of the examples I will give is that, as a platform, issuing rape threats is something that we remove as a violation of our community standards, and I think multiple platforms do that, but we still see rape threats, and people think that it's normal discourse to issue rape threats online. What can we do, more as a society and less as a platform, to address that? What is the role of schools, educators, parents, as well as platforms? I think that's a conversation that's not always had. The second point is that we report millions of pieces of content to NCMEC, which goes to law enforcement authorities, but we don't have visibility into what prosecutions are taking place based on the reports that we are giving. I think that's also really important, because it's the last part of the chain: understanding whether the reports we're giving are useful, or whether we're just going in circles where we remove the content but, at the end of the day, no criminal action is taken against actual predators. Thank you.
Auke Pals: Okay, let's move on. In the last 15 minutes, we want to answer the question: how do we prevent legislation that threatens the open and free internet whilst addressing illegal and harmful content? I do see some people actively typing in the room as well, so I'll just give the floor to someone I see quite actively typing. Sir, can I give you the floor?
Audience: Thank you. In fact, yes, that's my comment, the third one at the top. I think we need to preserve privacy and so on, but at the same time we need clear procedures to be able to reach whoever put up that harmful content. So we need some kind of legislation and, of course, procedures for how to be able to reach them, whether through official channels or otherwise. So I think this is needed.
Auke Pals: Thank you. The gentleman next to you, did you also put down a comment? No, I didn't. No, okay. Did you? No, same thing. Oh, same, same. Any reflections on the comment that was just made? Or on the question?
Speaker: Well, just to mention that this latest piece of legislation in Brazil, on privacy and anonymity online, proposes that people can keep their real identity hidden from the public, but the platforms should know who that person is, so that if necessary, for a court order or something, the platform would know who each profile on their platform really is. So it kind of puts the responsibility on the digital platforms. I don't know if that's the best way to do it, but it's one way that has been proposed in Brazil, and we have to discuss it. Thank you.
Auke Pals: Thank you. And I'll walk over to someone else I saw actively typing.
Roelof Meijer: Okay, Roelof Meijer again. In fact, it's something I feel very strongly about. I have two answers there. One is: by realizing that we share responsibility, and acting on it. I think that's the summary of it all. Just speaking for my industry, I think the domain name industry has for far too long been saying, no, no, we can't do anything, because we will break the internet, we will face financial claims, it will be a slippery slope, because now it's illegal content, then it will be unwanted, then unpleasant, then political, or something like that. So there were all kinds of excuses to do nothing, I think out of fear of being held responsible for something. Slowly the sector is overcoming that position, and Arda is glad that we now talk about DNS abuse. I think that's a first step, because it's still kind of saying, we don't do content, we cannot interfere in content, but we can do something if the DNS is being abused, which I think is a somewhat artificial distinction in online abuse. But anyway, I think the most important thing is that we feel we share a responsibility, and of course regulators also realize that we are not the responsible party, but we do have a responsibility. And again, as in my previous reaction, that feeling of responsibility should stem from company values. Thank you.
Auke Pals: Brian, I saw you actively nodding.
Brian Cimbolic: Yeah, I agree 100% with Roelof. I think there has to be the recognition that, while registries and registrars shouldn't be the natural first place to approach issues relating to website content, there is a role for us to play, particularly when you're talking not about file-sharing sites but about sites that are dedicated to a specific purpose, whether that's child sexual abuse material, stolen credit cards, or you name it. There are instances of clear, patently illegal issues where it's appropriate for registries and registrars to step in and do something. So I agree with Roelof entirely.
Auke Pals: Arda?
Arda Gerkens: Yes, I think the way the industry took this task upon itself with the DNS abuse framework is very interesting, because they established a framework in which they said: if it falls under these and these categories, that's something we should address at that moment. And of course, this is a very important question for us as well. We are assessing terrorist content online, we have regulations, but how do you interpret that regulation? For us it's a day-to-day job to evaluate the posts that we review. But, you know, we have universal human rights, and for me, if content hampers one of these universal human rights, then I think that's something that can be considered illegal and harmful content. For me, everybody can say whatever they want, freedom of speech is okay, as long as you are not interfering with these human rights; then it's fine by me. But once you do, then I think that's something we should act upon. Because if we don't, and this is something we tend to forget, we think we cannot interfere with anything online unless it's really illegal content, because it would hamper the open, free internet and free speech. But we are now in a world online where especially women, but I think many of us, don't even dare to speak or be ourselves anymore, because if they do, that might have serious consequences, not only online but also offline, as we have seen. For instance, like I said, women, LGBTQ+ people; we know women don't go into politics for that reason; rape threats, we all see them. And that's hampering freedom of speech, and that should end. We should have a world online where we can all discuss whatever we want without those threats. Thank you.
Auke Pals: I saw a comment, or hands raised.
Audience: Just a small comment, I'll put it on the screen, which is to sort of turn this question around. Oftentimes there's pushback against doing certain things because it will break the free and open internet. I observe, though, that some companies, I'm not sure if this is true of any in the room, do precisely that in order to get market access to certain countries. So they absolutely change their products, change their approach, and adapt things to comply with exactly the same rules that they're complaining about, in order to get access to certain autocratic countries, but they refuse to do that in democracies. So it's almost as though democracies are being punished, while on the other hand they will happily comply with exactly the same things in autocratic places. So perhaps we ought to just call out where there's hypocrisy like that and say, well, if you can do that in that country, do it here as well, please. Thank you very much.
Auke Pals: Anyone who wants to comment on that in the last five minutes of this session? I saw you. No? No? Okay. My sound is off. Shall we wrap up? Oh, yeah. No, one last comment.
Audience: Thank you. I'll be very quick. The question made me think and took me back to the 1990s, when we started talking about how to regulate the Internet, or not regulate it at all. There was a clear school of thought that argued that the Internet should be left alone, without any laws. But we've actually moved on quite far from that position. There are sufficient laws; in fact, there is a myriad of legislation addressing illegal and harmful content. The solution is not more legislation. What hasn't worked is effective enforcement of the laws in place, and that's where we have actually failed. So essentially what we need is more effective enforcement of the laws, by regulators and other mechanisms through which laws can be enforced, rather than thinking about more legislation. Thank you very much. And I don't want to delay you any more, and I'm sorry I couldn't be with you from the beginning of this very important session, but because DNS abuse was mentioned: the next workshop, by the way, is in room 2 at 3:15, and it will be about DNS abuse, how it is defined and experienced. So if anyone is interested, join us there; we'll be happy to do that.
Auke Pals: Thank you very much. And as we have reached the closing part of the session, I would like to give room to anyone who would like to react to or reflect on the session.
Deepali Tamhane: Just quickly, it's not a reflection on any of the substance, but I think it's really important to continue having these conversations, and we're really grateful to be invited, because we have regulators in the room and other companies in the room, and I think this is exactly the kind of information sharing, and also the sharing of what our positions are, even where we disagree, that matters. I think it's really important, so thank you.
Arda Gerkens: Yes, and I just wanted to add to what you said: indeed, we have had legislation for a very long time, but I think it's very hard for law enforcement to enforce it, and I do believe that we regulators can play a very good role, because for us it's also easier to talk to the companies, the platforms and the infrastructure companies, and we need to do so to get our regulation right.
Speaker: Just to add a final remark, I would like to say that I'm very pleased with what happened in this session, because we started with very technical topics about DNS and things like that, and we ended with, I believe, the straightforward answer to this question, which is democracy: democratic values, democratic processes, protest if you think something is not being done properly. Society is free to participate in all of that, and thank God we are in democratic countries, free to talk like this and come to good and meaningful results. That's what I believe, that's what I think must happen in Brazil, and I'm very glad to participate in that.
Tim Scott: I guess, to echo sentiments on the panel: from a private sector perspective, talking to regulators, talking to policy makers, and having them talk to us means that we can understand each other better. It doesn't have to be adversarial, and we don't have to have conflicting and competing aims; in my experience, in my career on both sides of the fence, that is rarely the case. Understanding each other and talking to each other is the way in which we can achieve better regulation. So this is to be welcomed and continued.
Auke Pals: Brian, do you want to make a final remark?
Brian Cimbolic: No, just to, one, thank you very much for accommodating me joining remotely, and two, to reiterate that I think there is room for responsible and thoughtful action at the infrastructure level, but again, it has to be carefully tailored to avoid collateral damage.
Auke Pals: Thank you very much. Now I would also like to thank the rapporteur, Marjolein. Do we have a final conclusion?
Marjolein: Well, I just want to say to you, there is a simple conclusion: we need the dialogue, and shared responsibility. And thank you for organizing this session, and thank you all very much.
Auke Pals: And with this, I would also like to thank all the panelists, the audience for your active participation, the online participants, and the remote moderator, Doreen. We really would like to continue this discussion, and we hope to do so locally and also at the next IGF in Norway. So if we do get our session submitted and accepted, hopefully we will see you there. Thank you very much. That was a good session. Very good, guys. So, we're going to have a nice dinner for you. Oh, that's nice.
Deepali Tamhane
Speech speed
171 words per minute
Speech length
2108 words
Speech time
736 seconds
Content removal policies and practices
Explanation
Deepali Tamhane explained Meta’s approach to content moderation, including community standards that cover illegal and harmful content. She highlighted the use of proactive technology to remove content before it’s reported.
Evidence
Meta removes millions of pieces of content quarterly for various violations, including CSAM, bullying, and suicide-related content.
Major Discussion Point
Regulating online content
Proactive content removal by platforms
Explanation
Deepali Tamhane emphasized Meta’s proactive approach to content removal. She highlighted that they remove a significant amount of content before it’s even reported by users.
Evidence
Meta reported removing over 7 million pieces of content for bullying and harassment, and over 5 million for suicide and self-injury in a single quarter.
Major Discussion Point
Addressing illegal and harmful content
Agreed with
Brian Cimbolic
Arda Gerkens
Agreed on
Challenges in addressing illegal and harmful content
Differed with
Andrew Campling
Differed on
Role of end-to-end encryption in content moderation
Brian Cimbolic
Speech speed
159 words per minute
Speech length
1748 words
Speech time
659 seconds
Challenges of regulating content at DNS level
Explanation
Brian Cimbolic explained the difficulties of dealing with content at the DNS level. He highlighted that suspending a domain name is a blunt tool that can render legitimate content inaccessible along with the problematic content.
Evidence
Example of a file-sharing site where suspending the entire domain due to one piece of illegal content would also make hundreds of thousands of legitimate content pieces inaccessible.
Major Discussion Point
Regulating online content
Agreed with
Deepali Tamhane
Arda Gerkens
Agreed on
Challenges in addressing illegal and harmful content
Role of DNS actors in combating illegal content
Explanation
Brian Cimbolic acknowledged that while DNS actors shouldn’t be the first approach for content issues, they do have a role to play in certain cases. He emphasized the need for responsible and thoughtful action at the infrastructure level.
Evidence
Mentioned the Framework to Address Abuse, which outlines principles for when registries and registrars should step in to address abuse.
Major Discussion Point
Addressing illegal and harmful content
Audience
Speech speed
156 words per minute
Speech length
846 words
Speech time
323 seconds
Need for clear responsibilities in content regulation
Explanation
An audience member emphasized the importance of clarity in responsibilities within the ecosystem of content moderation. They suggested that this clarity is crucial for effective collaboration and enforcement.
Evidence
Mentioned ongoing work in the Netherlands on a public-private partnership for content moderation.
Major Discussion Point
Regulating online content
Importance of preserving privacy while enabling content removal
Explanation
An audience member stressed the need to balance privacy preservation with the ability to identify those who post harmful content. They suggested that clear procedures are needed to reach those responsible for harmful content.
Major Discussion Point
Addressing illegal and harmful content
Need for effective enforcement of existing laws
Explanation
An audience member argued that the solution to addressing illegal and harmful content is not more legislation, but more effective enforcement of existing laws. They suggested that the focus should be on improving enforcement mechanisms.
Evidence
Pointed out that there are already sufficient laws addressing illegal and harmful content, but enforcement has been lacking.
Major Discussion Point
Balancing regulation and open internet
Differed with
Arda Gerkens
Differed on
Effectiveness of current legislation
Tim Scott
Speech speed
172 words per minute
Speech length
1112 words
Speech time
386 seconds
Importance of multi-stakeholder approach
Explanation
Tim Scott emphasized the value of a multi-stakeholder approach to online safety. He highlighted the importance of dialogue with regulators and governments to understand and address risks and threats.
Evidence
Mentioned Roblox’s ongoing dialogue with regulators to demonstrate their efforts in meeting safety expectations.
Major Discussion Point
Regulating online content
Agreed with
Deepali Tamhane
Arda Gerkens
Roelof Meijer
Agreed on
Need for collaboration between industry and regulators
Risk of lowering standards through regulation
Explanation
Tim Scott cautioned that setting mandatory minimum standards through regulation might lead to companies only meeting those minimums. He argued that this could stifle innovation and higher standards set by good actors.
Major Discussion Point
Regulating online content
Challenges of maintaining collaborative relationships over time
Explanation
Tim Scott pointed out the difficulty of maintaining collaborative relationships between industry and regulators over time. He noted that changes in personnel can disrupt established relationships and knowledge sharing.
Evidence
Mentioned experiences in the UK where proactive individuals leaving roles have impacted collaborative efforts.
Major Discussion Point
Collaboration between regulators and industry
Speaker
Speech speed
129 words per minute
Speech length
1549 words
Speech time
718 seconds
Regulation needed to set minimum standards
Explanation
The speaker argued that regulation is necessary to establish a minimum acceptable level of cooperation from all companies. They pointed out that cooperation is currently uneven among different companies.
Major Discussion Point
Regulating online content
Importance of democratic values and processes
Explanation
The speaker emphasized the importance of democratic values and processes in addressing issues of online content regulation. They suggested that societal participation and democratic freedoms are crucial in finding meaningful solutions.
Major Discussion Point
Balancing regulation and open internet
Arda Gerkens
Speech speed
175 words per minute
Speech length
3177 words
Speech time
1086 seconds
Need for regulators to maintain independence
Explanation
Arda Gerkens discussed the challenge of maintaining regulator independence while collaborating with industry. She emphasized the importance of keeping focus on the ultimate goal of cleansing the internet of harmful content.
Major Discussion Point
Collaboration between regulators and industry
Agreed with
Deepali Tamhane
Tim Scott
Roelof Meijer
Agreed on
Need for collaboration between industry and regulators
Need to address root causes of harmful content
Explanation
Arda Gerkens emphasized the importance of addressing the root causes of harmful content, rather than just focusing on removal. She suggested that more attention should be given to policies addressing why such content exists in the first place.
Evidence
Mentioned various forms of harmful content, including voluntarily leaked images and AI-generated content.
Major Discussion Point
Addressing illegal and harmful content
Agreed with
Brian Cimbolic
Deepali Tamhane
Agreed on
Challenges in addressing illegal and harmful content
Differed with
Audience
Differed on
Effectiveness of current legislation
Protecting free speech while addressing harmful content
Explanation
Arda Gerkens argued for the need to balance free speech with addressing harmful content online. She suggested that content interfering with universal human rights should be considered illegal and harmful.
Evidence
Mentioned the impact of online threats on women, LGBTQ+ individuals, and politicians’ willingness to engage in public discourse.
Major Discussion Point
Balancing regulation and open internet
Roelof Meijer
Speech speed
148 words per minute
Speech length
704 words
Speech time
285 seconds
Value of public-private partnerships
Explanation
Roelof Meijer emphasized the importance of public-private partnerships in addressing online content issues. He argued that both regulators and industry need to recognize their shared responsibility in tackling these problems.
Major Discussion Point
Collaboration between regulators and industry
Agreed with
Deepali Tamhane
Tim Scott
Arda Gerkens
Agreed on
Need for collaboration between industry and regulators
Risk of breaking open internet through regulation
Explanation
Roelof Meijer cautioned against the risk of breaking the open internet through overzealous regulation. He noted that the domain name industry has historically been reluctant to take action due to fears of breaking the internet or facing financial claims.
Evidence
Mentioned the industry’s gradual shift towards addressing DNS abuse as a first step.
Major Discussion Point
Balancing regulation and open internet
Andrew Campling
Speech speed
137 words per minute
Speech length
802 words
Speech time
349 seconds
Challenges of end-to-end encryption for content moderation
Explanation
Andrew Campling highlighted the challenges posed by end-to-end encryption for content moderation. He argued that some tech choices, like encryption, give platforms plausible deniability for not acting on harmful content.
Evidence
Cited Meta’s implementation of end-to-end encryption on Facebook Messenger, which prevents visibility of potential CSAM content.
Major Discussion Point
Addressing illegal and harmful content
Differed with
Deepali Tamhane
Differed on
Role of end-to-end encryption in content moderation
Potential hypocrisy in selective compliance with regulations
Explanation
Andrew Campling pointed out potential hypocrisy in how some companies selectively comply with regulations. He suggested that companies sometimes adapt their products to comply with rules in autocratic countries while resisting similar changes in democracies.
Major Discussion Point
Balancing regulation and open internet
Agreements
Agreement Points
Need for collaboration between industry and regulators
Deepali Tamhane
Tim Scott
Arda Gerkens
Roelof Meijer
Importance of multi-stakeholder approach
Need for regulators to maintain independence
Value of public-private partnerships
Speakers agreed on the importance of collaboration between industry and regulators to effectively address online content issues while maintaining regulatory independence.
Challenges in addressing illegal and harmful content
Brian Cimbolic
Deepali Tamhane
Arda Gerkens
Challenges of regulating content at DNS level
Proactive content removal by platforms
Need to address root causes of harmful content
Speakers acknowledged the complexities involved in addressing illegal and harmful content, including technical limitations and the need for proactive measures.
Similar Viewpoints
Both speakers emphasized the need for careful consideration when involving DNS-level actors in content regulation to avoid unintended consequences on the open internet.
Brian Cimbolic
Roelof Meijer
Role of DNS actors in combating illegal content
Risk of breaking open internet through regulation
Both argued that simply creating more regulations might not be the solution, and instead focused on the importance of effective implementation and enforcement of existing standards.
Tim Scott
Audience
Risk of lowering standards through regulation
Need for effective enforcement of existing laws
Unexpected Consensus
Recognition of shared responsibility in content moderation
Deepali Tamhane
Brian Cimbolic
Roelof Meijer
Arda Gerkens
Proactive content removal by platforms
Role of DNS actors in combating illegal content
Value of public-private partnerships
Need to address root causes of harmful content
Despite representing different sectors (platforms, DNS operators, regulators), there was an unexpected consensus on the shared responsibility in addressing online content issues, moving beyond traditional sector-specific roles.
Overall Assessment
Summary
The main areas of agreement centered around the need for collaboration between industry and regulators, the challenges in addressing illegal and harmful content, and the recognition of shared responsibility across different sectors.
Consensus level
Moderate consensus was observed, with implications for fostering more collaborative approaches to content regulation. However, differences remained in specific implementation strategies and the balance between regulation and maintaining an open internet.
Differences
Different Viewpoints
Role of end-to-end encryption in content moderation
Andrew Campling
Deepali Tamhane
Challenges of end-to-end encryption for content moderation
Proactive content removal by platforms
Andrew Campling argued that end-to-end encryption hinders content moderation efforts, while Deepali Tamhane defended the use of encryption, stating that safety mitigations can still be implemented.
Effectiveness of current legislation
Audience
Arda Gerkens
Need for effective enforcement of existing laws
Need to address root causes of harmful content
An audience member argued that existing laws are sufficient but lack enforcement, while Arda Gerkens emphasized the need to address root causes of harmful content.
Unexpected Differences
Approach to regulation in different countries
Andrew Campling
Deepali Tamhane
Potential hypocrisy in selective compliance with regulations
Content removal policies and practices
Andrew Campling unexpectedly pointed out that some companies might be more willing to comply with stricter regulations in autocratic countries than in democracies, which contrasts with the general narrative of protecting free speech and open internet principles.
Overall Assessment
Summary
The main areas of disagreement revolved around the balance between content moderation and privacy/encryption, the effectiveness of current legislation versus new regulations, and the role of different actors in the internet ecosystem in addressing illegal and harmful content.
Difference level
The level of disagreement was moderate, with speakers generally agreeing on the need to address illegal and harmful content but differing on methods and responsibilities. These differences highlight the complexity of regulating online content while preserving an open and free internet, suggesting that ongoing dialogue and collaboration between stakeholders is crucial for developing effective solutions.
Partial Agreements
Both speakers agreed on the need for regulation, but Tim Scott cautioned that setting mandatory minimums might lower overall standards, while the other speaker argued for regulation to establish a baseline level of cooperation.
Tim Scott
Speaker
Risk of lowering standards through regulation
Regulation needed to set minimum standards
Both speakers acknowledged the potential role of DNS actors in addressing illegal content, but emphasized the need for careful, targeted action to avoid unintended consequences.
Brian Cimbolic
Roelof Meijer
Role of DNS actors in combating illegal content
Risk of breaking open internet through regulation
Takeaways
Key Takeaways
Collaboration and dialogue between regulators, industry, and other stakeholders is crucial for effectively addressing illegal and harmful online content
There is a need for clear responsibilities and standards in content regulation, while maintaining the independence of regulators
Content moderation at the infrastructure/DNS level should be carefully considered to avoid collateral damage
Proactive content removal by platforms and technological solutions play an important role, but societal approaches are also needed to address root causes
Balancing regulation of harmful content with preserving an open and free internet remains a key challenge
Resolutions and Action Items
Continue multi-stakeholder dialogues and collaborations on content regulation issues
Explore ways for regulators and industry to share information and cooperate while maintaining regulatory independence
Consider developing clearer frameworks for when infrastructure/DNS-level actors should take action on illegal content
Unresolved Issues
How to effectively regulate smaller platforms and ‘bad actors’ that may not engage in collaborative efforts
Addressing challenges of end-to-end encryption for content moderation
Determining appropriate consequences and enforcement mechanisms for non-compliance with regulations
How to harmonize content regulation approaches globally given differing laws across jurisdictions
Suggested Compromises
Focusing regulation on clearly illegal content while allowing more flexibility for borderline cases
Having platforms know real identities of users but allow public anonymity
Developing ‘shared responsibility’ frameworks between regulators and industry rather than top-down approaches
Thought Provoking Comments
We have a problem here… For instance, that for the platforms, it’s difficult to accept hash databases coming from a governmental organization. Because that would basically indirectly mean that a government is telling them what to host and not to host on their platforms, right?
speaker
Arda Gerkens
reason
This comment highlights a key tension between government regulation and platform autonomy in content moderation.
impact
It sparked discussion about potential solutions for information sharing between regulators and platforms while maintaining independence.
Tech choices are often made by the platform operators, which, in my view, give them plausible deniability for not acting, to the point about not complying with domestic legislation. So, for example, switching on end-to-end encryption, which hides the existence of CSAM being distributed actively on the platforms.
speaker
Andrew Campling
reason
This comment challenged the platforms’ narrative about privacy protection and highlighted potential negative consequences of certain technical choices.
impact
It led to a debate about the balance between privacy and safety, particularly around end-to-end encryption.
Aren’t we putting an oversized responsibility on the platforms to solve an issue we as a society haven’t been able to solve in such a long time? I mean, are we doing enough as a society to solve the problem at its roots and not necessarily at the distribution where it’s spreading itself?
speaker
Audience member
reason
This comment shifted the focus from platform responsibility to broader societal responsibility in addressing online harms.
impact
It prompted reflection on the need for a more holistic approach to online safety, involving education, law enforcement, and societal change.
The solution is not more legislation. What hasn’t worked is effective enforcement of the laws in place and that’s where we have actually failed. So essentially what we need is more effective enforcement of the laws by regulators and other mechanisms whereby laws can be enforced rather than thinking about more legislation.
speaker
Audience member
reason
This comment challenged the assumption that more legislation is needed and instead focused on improving enforcement of existing laws.
impact
It redirected the conversation towards discussing ways to improve enforcement and cooperation between regulators and platforms.
Overall Assessment
These key comments shaped the discussion by highlighting complex tensions between regulation, platform autonomy, privacy, and safety. They pushed the conversation beyond surface-level solutions towards more nuanced considerations of shared responsibility, effective enforcement, and the need for ongoing dialogue between all stakeholders. The discussion evolved from technical specifics to broader questions of democratic values and societal responsibility in addressing online harms.
Follow-up Questions
How can platforms and regulators collaborate effectively while maintaining regulatory independence?
speaker
Auke Pals
explanation
This was a central theme of the discussion, with participants exploring ways to balance cooperation and independence.
What role can technical and infrastructure actors play in combating illegal material online?
speaker
Auke Pals
explanation
This question was posed to the panel and audience to explore the responsibilities and capabilities of different stakeholders in addressing online harms.
How can age verification and age estimation controls be improved on platforms?
speaker
Andrew Campling
explanation
The speaker highlighted inadequate age verification as a significant issue, suggesting a need for further research and development in this area.
What are effective ways to address the root causes of harmful content, rather than just focusing on removal?
speaker
Audience member
explanation
This question challenges the current approach of content moderation and suggests a need to explore more fundamental societal solutions.
How can smaller platforms be better engaged in content moderation efforts?
speaker
Arda Gerkens
explanation
The speaker noted that smaller platforms often don’t care as much about regulation, indicating a need to explore strategies for their inclusion.
What are the implications of end-to-end encryption for content moderation, and how can safety be ensured in encrypted environments?
speaker
Andrew Campling and Deepali Tamhane
explanation
This was a point of contention, with differing views on the balance between privacy and safety in encrypted communications.
How can regulators and industry collaborate to develop clear standards for age assurance technologies?
speaker
Abhilash Narayan
explanation
The speaker highlighted a lack of guidance for industry on age assurance standards, suggesting a need for collaborative development of such standards.
How can the effectiveness of content removal reports to law enforcement be measured and improved?
speaker
Deepali Tamhane
explanation
The speaker noted a lack of visibility into prosecutions resulting from platform reports, indicating a need for better feedback loops and effectiveness measures.
How can legislation be crafted to address illegal and harmful content without threatening the open and free internet?
speaker
Auke Pals
explanation
This question was posed to the panel and audience, highlighting the ongoing challenge of balancing content regulation with internet freedoms.
How can the hypocrisy of companies complying with restrictive regulations in some countries but not others be addressed?
speaker
Audience member
explanation
This question raises issues of consistency in platform policies across different jurisdictions and the potential for double standards.
Disclaimer: This is not an official record of the session. The DiploAI system automatically generates these resources from the audiovisual recording. Resources are presented in their original format, as provided by the AI (e.g. including any spelling mistakes). The accuracy of these resources cannot be guaranteed.
Related event
Internet Governance Forum 2024
15 Dec 2024 06:30h - 19 Dec 2024 13:30h
Riyadh, Saudi Arabia and online