WS #241 Balancing Acts 2.0: Can Encryption and Safety Co-Exist?
27 Jun 2025 10:15h - 11:30h
Session at a glance
Summary
This workshop discussion focused on the complex challenge of balancing encryption, privacy, and public safety, particularly regarding child protection online. The panel, moderated by David Wright from SWGFL, included representatives from the FBI, South African telecommunications regulator ICASA, the Canadian Center for Child Protection, and the Internet Watch Foundation. The central question explored whether encryption and safety can truly coexist and how to move beyond polarized positions toward collaborative solutions.
Katie Noyes from the FBI emphasized that encryption and safety must coexist, citing over 17,000 cases impacted by encryption challenges, particularly involving child sexual abuse material and sextortion cases. She highlighted how end-to-end encryption has created “warrant-proof” communications that hinder investigations even when law enforcement has proper judicial authorization. Andrew Campling discussed how privacy arguments can be weaponized in standards development forums, noting that technical solutions like client-side scanning could address known child abuse material without weakening encryption or creating backdoors.
Honey Makola addressed regional diversity challenges, particularly in the Global South, emphasizing that encryption policies must reflect local contexts while maintaining judicial oversight and clear legal thresholds. Lloyd Richardson from the Canadian Center for Child Protection described the “weaponization of encryption” by platforms that turn a blind eye to abuse, comparing it to financial institutions’ regulated approach to preventing money laundering.
The discussion revealed significant technical and political complexities, with audience members raising concerns about government overreach and the potential for abuse of lawful access systems. Participants acknowledged that this ongoing debate requires continued multi-stakeholder engagement to develop rights-respecting solutions that protect both privacy and vulnerable populations, particularly children online.
Keypoints
## Major Discussion Points:
– **Technical Solutions for Lawful Access**: The panel explored various technical approaches to enable law enforcement access to encrypted communications without weakening encryption itself, including client-side scanning, homomorphic encryption, and prospective solutions that would only affect specific users under judicial orders.
– **Real-World Impact on Child Protection**: Extensive discussion of how end-to-end encryption impacts investigations of crimes against children, with specific examples of sextortion cases and the FBI reporting over 17,000 cases impacted by encryption barriers across all field offices.
– **Weaponization of Privacy Arguments**: Examination of how privacy concerns are sometimes used to shut down legitimate safety discussions in standards development organizations, and the need for more nuanced technical discussions rather than blanket rejections.
– **Global South and Regional Diversity**: Discussion of how encryption policies must account for different regional contexts, technical capabilities, and governance structures, particularly in developing countries with limited infrastructure and varying regulatory frameworks.
– **Standards Development and Multi-Stakeholder Participation**: The importance of diverse representation in technical standards bodies like IETF, including better participation from civil society, regulators, and child protection organizations, along with addressing barriers to participation.
## Overall Purpose:
The workshop aimed to move beyond polarized positions on encryption versus safety to explore collaborative solutions through multi-stakeholder engagement. Building on a previous IGF workshop, the goal was to examine whether encryption and safety can truly coexist and identify technical models, governance frameworks, and balanced approaches that respect both privacy rights and public safety needs.
## Overall Tone:
The discussion maintained a constructive and collaborative tone throughout, despite addressing a highly contentious topic. Panelists consistently emphasized finding middle ground rather than taking absolute positions. While some audience questions introduced more skeptical perspectives about government access and potential abuse, the overall atmosphere remained professional and solution-oriented. The moderator successfully maintained civility by reminding participants of the code of conduct, and panelists frequently acknowledged the validity of opposing concerns while advocating for their positions.
Speakers
**Speakers from the provided list:**
– **David Wright** – CEO of UK charity SWGFL, Director of the UK Safe Internet Centre, Workshop moderator
– **Katie Noyes** – FBI representative working in next generation technologies and lawful access, manages FBI’s patent program and technology standards participation
– **Andrew Campling** – Works in tech space, participates in standards organizations like IETF, Trustee of the Internet Watch Foundation focused on finding and removing child sex abuse material online
– **Makola Honey** – Works for ICT regulator ICASA in South Africa, Vice chair of Study Group 17, responsible for security, at the International Telecommunication Union (ITU)
– **Richardson Lloyd** – Director of technology at the Canadian Center for Child Protection in Winnipeg, Manitoba, Canada, operates Canada’s national tip line for reporting online crimes against children and global tool Project Arachnid
– **Boris Radanovic** – Workshop assistant monitoring online comments and questions
– **Audience** – Multiple audience members who asked questions during the session
**Additional speakers:**
– **Warren Kamari** – IETF participant who spoke about the organization’s openness and upcoming meeting in Madrid
– **Simon** – Student from Norway who asked about government access to encrypted communications
– **Tapani Tajvainen** – Representative of Electronic Frontier Finland, mathematician and computer scientist with university background
– **Vinicius Fortuna** – Works on internet access resilience and privacy at Jigsaw
– **Vittorio Bertola** – Experienced participant in IGF and IETF discussions (role/title not specified)
Full session report
# Workshop Report: Balancing Encryption, Privacy, and Public Safety
## Executive Summary
This workshop discussion, moderated by David Wright (CEO of UK charity SWGFL and Director of the UK Safe Internet Centre), built upon a previous IGF 2024 workshop to address the contentious issue of balancing strong encryption with public safety needs, particularly child protection online. The panel brought together diverse perspectives from law enforcement, civil society, technical standards organisations, and regulatory bodies to explore practical pathways forward through multi-stakeholder engagement and technical innovation.
The discussion featured a structured Q&A format with significant audience participation both in-person and online, revealing both areas of potential consensus and persistent disagreements about technical solutions and policy approaches.
## Panel Composition
**David Wright** (Moderator) – CEO of SWGFL and Director of the UK Safe Internet Centre
**Katie Noyes** – FBI representative working in next-generation technologies and lawful access, managing FBI’s participation in technology standards organisations
**Andrew Campling** – Technology sector professional participating in standards organisations like IETF, and Trustee of the Internet Watch Foundation
**Makola Honey** – Official at South Africa’s ICT regulator ICASA and Vice Chair of ITU Study Group 17, responsible for security
**Richardson Lloyd** – Director of technology at the Canadian Center for Child Protection, operating Canada’s national tip line and Project Arachnid
**Boris Radanovic** – Workshop assistant monitoring online comments and questions
## Key Perspectives and Arguments
### Law Enforcement Challenges
Katie Noyes established that “encryption and safety have to coexist” and provided concrete data showing over 17,000 FBI cases across all 56 field offices are impacted by encryption barriers at various levels. She detailed the case of Jordan DeMay, a 17-year-old who died by suicide after sextortion by Nigerian perpetrators, noting that because the communications in that case were not encrypted, investigators could access crucial evidence that aided prosecution and sentencing and revealed over 100 other victims, access that end-to-end encryption would have placed out of reach.
Noyes highlighted that the FBI holds 26 patents and participates in standards organisations like ITU and ICANN to collaborate on public safety from a technical perspective. She proposed exploring technical approaches including homomorphic encryption for narrowing searches, though acknowledged significant challenges with mobile device implementation.
### Technical Standards and Child Protection
Andrew Campling argued that privacy arguments are sometimes “weaponised” in standards development forums to shut down legitimate discussions. He proposed client-side scanning as a viable solution, explaining that it would work by “turning images into numbers” and comparing them against hash lists of known illegal material.
Campling cited WhatsApp’s URL preview feature as an example of client-side scanning already in use, arguing that similar technology could detect known child sexual abuse material without weakening encryption. He provided stark context with research indicating approximately 300 million child victims annually—roughly 14% of children worldwide.
He also advocated for greater diversity in standards organisations, particularly increased participation from civil society and underrepresented groups.
### Global South and Regulatory Perspectives
Makola Honey emphasised that Global South countries face unique challenges with limited infrastructure and often rely on external resources for data processing. She provided examples of different approaches between South Africa and Mozambique, noting that regulation need not hamper innovation if designed with flexible frameworks.
Honey stressed the importance of judicial authorisation and clear legal criteria such as necessity and proportionality in any framework, while highlighting the need for meaningful Global South participation in technical standard-setting.
### Child Protection Operations
Richardson Lloyd described what he termed the “weaponisation of encryption” by platforms that use end-to-end encryption to avoid content moderation responsibilities. He argued that some companies deploy encryption not primarily for user protection but as a cost-saving measure to avoid expensive content moderation.
Lloyd drew parallels to financial institutions, which are regulated to prevent money laundering despite using encryption, suggesting similar regulatory approaches could apply to child protection.
## Technical Solutions Discussed
### Client-Side Scanning
The most extensively discussed technical approach was client-side scanning. Campling explained it would involve comparing images to hash lists of known illegal material before encryption, with potential safeguards including cryptographic security measures and trusted flaggers.
He referenced Apple’s Neural Hash system, which was “audited by external folks” before being discontinued due to public pressure, as an example of how such systems could be implemented with external oversight.
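To make the mechanism concrete, the sketch below illustrates the kind of client-side hash check described above, run on the device before an attachment is encrypted and sent. It is a minimal illustration under stated assumptions, not any platform’s actual implementation: the hash value, function names and use of SHA-256 are invented for readability, whereas deployed systems rely on perceptual hashes (such as PhotoDNA or NeuralHash) so that near-duplicate images also match. The point is simply that the comparison happens before encryption, so the strength of the encryption itself is untouched.

```python
import hashlib
from pathlib import Path

# Hypothetical hash list of known illegal material supplied by a trusted
# flagger. (Illustrative value only; real lists are distributed and
# cryptographically protected, and use perceptual rather than plain
# cryptographic hashes.)
KNOWN_ILLEGAL_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def attachment_digest(path: Path) -> str:
    """Turn the attached image into a number (here, a SHA-256 hex digest)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def allowed_to_send(path: Path) -> bool:
    """Client-side check run before the attachment is encrypted and sent.

    The message text is never inspected; only the attachment's digest is
    compared against the local list. Returns False when the digest matches
    a known entry.
    """
    return attachment_digest(path) not in KNOWN_ILLEGAL_HASHES
```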
### Advanced Cryptographic Approaches
Noyes proposed exploring homomorphic encryption, though acknowledged it faces significant implementation challenges at scale, particularly on mobile devices. She suggested it might be more useful for characterising and narrowing searches rather than full decryption.
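As a rough illustration of what “computing on encrypted data” means, the toy sketch below implements additive homomorphism in the style of the Paillier cryptosystem: ciphertexts can be combined so that only the key holder learns the aggregate, never the individual values. This is a conceptual sketch with deliberately tiny, insecure parameters; it is not the scheme Noyes referred to, and, as noted in the discussion, running such schemes at scale on mobile devices remains an open challenge.

```python
import math
import random

# Toy Paillier-style setup (tiny primes for illustration only; real
# deployments use 2048-bit moduli and vetted cryptographic libraries).
p, q = 293, 433
n = p * q
n2 = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)

def _L(x: int) -> int:
    return (x - 1) // n

mu = pow(_L(pow(g, lam, n2)), -1, n)   # decryption constant

def encrypt(m: int) -> int:
    """Encrypt a small non-negative integer m < n."""
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return (_L(pow(c, lam, n2)) * mu) % n

def add_encrypted(c1: int, c2: int) -> int:
    """Homomorphic addition: multiplying ciphertexts adds the plaintexts."""
    return (c1 * c2) % n2

# Example: a server could tally encrypted values without seeing them;
# only the key holder can decrypt the total.
total = add_encrypted(encrypt(3), encrypt(4))
assert decrypt(total) == 7
```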
## Audience Engagement and Counterpoints
### Privacy and Security Concerns
Audience members provided crucial counterpoints to panel proposals. Vinicius Fortuna from Jigsaw highlighted the Salt Typhoon hack, where lawful intercept information was accessed by adversaries, demonstrating real-world risks of lawful access systems.
Tapani Tajvainen, representing Electronic Frontier Finland, challenged technical proposals by asking for concrete implementations and questioning security implications, particularly regarding homomorphic encryption’s practical applications.
### Alternative Approaches
Audience members suggested focusing on endpoint access rather than breaking encryption channels, improving traditional investigative methods, and developing privacy-preserving technologies. Warren Kamari from IETF noted that remote participation in IETF is completely free, encouraging broader participation in standards development.
## Areas of Consensus
Despite disagreements on implementation, participants agreed on several fundamental principles:
– Encryption and safety must coexist rather than being mutually exclusive
– The scale of child sexual abuse material online represents an urgent crisis
– Judicial oversight and legal frameworks are necessary for any lawful access
– International collaboration and multi-stakeholder engagement are essential
– Technical innovation is necessary to address current challenges
## Persistent Disagreements
### Technical Feasibility
Fundamental disagreements persist about whether proposed solutions like client-side scanning can be implemented securely without creating exploitable vulnerabilities. Privacy advocates pointed to real-world examples of lawful access infrastructure being compromised, while law enforcement emphasised potential safeguards and oversight mechanisms.
### Risk Assessment
Participants demonstrated different risk assessments regarding lawful access infrastructure, with law enforcement focusing on judicial oversight and privacy advocates highlighting potential for abuse and exploitation by malicious actors.
## Cross-Border and Fragmentation Concerns
Vittorio Bertola suggested focusing on technical ways to prevent negative spillover effects between countries with different approaches to lawful access, acknowledging that different nations will make different political decisions based on their circumstances.
The discussion recognised that managing cross-border implications while maintaining internet interoperability remains a significant challenge requiring flexible international cooperation mechanisms.
## Future Directions
The workshop identified several areas requiring continued development:
– Refining technical approaches with appropriate safeguards
– Developing flexible policy frameworks that accommodate different national approaches
– Enhancing multi-stakeholder engagement with greater diversity in standards development
– Continuing dialogue in appropriate technical forums
Boris Radanovic noted that future panels need more diversity to better represent different perspectives in the encryption debate, while the extensive online participation demonstrated significant interest in these issues beyond the immediate participants.
## Conclusion
The workshop demonstrated both the complexity of balancing encryption with public safety and the potential for constructive engagement when participants move beyond polarised positions. While fundamental disagreements persist about technical solutions and risk assessments, the discussion revealed significant common ground on principles and the urgent need for action to protect children online.
The recognition that different countries will make different choices about lawful interception, combined with the need to prevent negative spillover effects, suggests that future solutions may need to be more flexible and diverse than current approaches anticipate. Continued multi-stakeholder dialogue and technical innovation offer the best prospects for developing workable solutions that address both privacy rights and child protection needs.
Session transcript
David Wright: Okay, good morning everybody and a warm welcome to this workshop this morning which we’re going to run through. So balancing acts, encryption, privacy, sorry, balancing acts 2.0, can encryption and safety co-exist, and it’s great to see the speakers join us as well this morning. I’m David Wright, I’m CEO of a UK charity SWGFL and director of the UK Safe Internet Centre and it just really falls to me to moderate this particular session. Before I do give the floor to the speakers just to introduce themselves and add some opening remarks, just to give a little context around this workshop. So we’re building on the success of the IGF 2024’s workshop which was balancing acts, encryption, privacy and public safety and so this workshop advances that conversation from conflict to collaboration solutions. Last year we explored how the weaponization of privacy can obstruct safety measures particularly child protection and this year we ask whether encryption and safety can truly co-exist and how. So we will as a panel be exploring some of the technical models enabling lawful access without undermining encryption. Look at global developments in encryption governance particularly regulatory and standards frameworks. Tensions between platform accountability, user privacy and state responsibilities to protect vulnerable groups. So we aim to move beyond these polarized positions and explore how multi-stakeholder engagement can help develop rights-respecting, balanced approaches to encryption policy. That’s the subject that we aim to cover. So we will be just running through, I’ll allow the panelists to introduce themselves shortly and add some general opening remarks before throwing a series of questions that we have to them. Then there’ll be an open floor. We’ll be asking, enabling you to ask particular questions of the panelists based on what it is that they’ve clearly contributed. Obviously the code of conduct applies to all of us. We have seen some very emotional or emotive subjects that this instigates but please I ask that the code of conduct that the IGF upholds is indeed upheld. So I’m just going to start off by allowing the panelists to introduce themselves. I’m going to start off with Andrew.
Andrew Campling: Hi, good morning everyone. My name is Andrew Campling. Congratulations for surviving today, the final day of our event. I work in the sort of tech space amongst other things, spend my time at some of the SDOs such as the IETF and also I’m a trustee of the Internet Watch Foundation which is focused on finding and removing child sex abuse material online.
David Wright: Andrew, thank you. I’m next going to throw it to Katie who’s joining us online. Katie, if you can introduce yourself and as well some opening remarks.
Katie Noyes: Yeah, absolutely. Good morning everyone. I’m very sorry to not be there with you in person. I’m Katie Noyes. I’m from the FBI and I have been working in next generation technologies and lawful access for the better part of five years. And really what that means is that the focus is on assessing emerging and disruptive technologies and really preparing our environment. I have some really fun portfolios. One of those, of course, is lawful access and we’re going to talk about that today. But I also have the FBI’s patent program which is super fun. And we have 26 patents which most people don’t know about our organization. But I also have the group that is involved in our technology standards. So we have membership and even leadership roles in many of the international standards development organizations so that we can be shoulder to shoulder with all of our multi-stakeholder partners from a public safety point of view. So it’s a great collaborative space. So very happy
David Wright: to be here at least virtually. Thanks. Thank you, Katie. Next, if I can ask Honey
Makola Honey: to introduce and also opening remarks. Honey? Yes. Good morning. I’m Makola Honey. I currently work for the ICT regulator in South Africa, ICASA. And I am also vice chair of Study Group 17, responsible for security, of the International Telecommunication Union. So I am fairly new to technical work in that respect. But I am very passionate about regulation, of course, as I work for the regulator. But it’s important to also highlight that regulation need not hamper innovation. But at the same time, so in this discussion, mainly we want to look at that balance of regulation for protection of privacy, but at the same time, ensuring that there is access, especially for the protection of children’s online participation.
David Wright: Thank you, Honey. And finally, if I can just throw it to Lloyd, to an introduction and
Richardson Lloyd: opening remarks. Lloyd? Good morning. Thank you for the introduction. So my name is Lloyd Richardson. I’m the director of technology at the Canadian Center for Child Protection here in Winnipeg, Manitoba, Canada. I’ve been doing this work for about 20 years now. Our organization operates Canada’s national tip line for reporting online crimes against children. We also operate a global tool called Project Arachnid, which deals with the removal of child sexual abuse material globally. We see the front lines of harms to children. I’ve been working in the space a long time. Previously, I worked in the technology sector before intersecting with children in this space. And I’m very passionate about this issue. Pleasure to be here today. Thank you, Lloyd. And it would be remiss if I didn’t remark really on the time of the day that both you and also Katie join us, so being 4.30 in the morning. So I appreciate the commitment that you’ve shown. It’s going to be a long day for you. Thank you very much. So I’m actually
David Wright: going to start off with throwing the question to Katie, or actually a couple of questions to Katie. So Katie, just to give you these questions. So what are the real world consequences of encryption on criminal investigations, particularly those involving children? And also, how can law enforcement seek access in a manner that aligns with human rights and avoids systematic vulnerabilities?
Katie Noyes: Thank you, Katie. Yeah, thanks. These are great questions. First, let me remark, honey, I really loved what you just said, which is regulation need not hamper innovation. And that’s probably a great place to start, which is I couldn’t agree more. And it’s been only the collaboration, honestly, with industries that are involved, even in this space where encryption is now becoming a challenge, that we’ve been able to gain, I’ll even call it superiority. I mean, it’s only against adversaries and criminals and been able to bring them to justice, been able to conclude successful investigations because of the technology partnerships and being able to stay ahead, or at least have some parity with understanding the criminal tactics, techniques, and procedures and the methods they’re using to turn our citizens into victims of crime. And again, take that very seriously. I think just, again, a quick general statement is that encryption and safety have to coexist. We’re here today because they have to coexist. We have to find a way. The challenge is real. And let me kind of explain why. I think sometimes this gets missed in a little bit of understanding. So for those of you who may be new to this conversation, first, welcome. And we very much welcome your perspective. So we’re a very open group here, both the panelists who I’ve known for some time, but really all of us in this space. There are two parts to the process. The first is where we in law enforcement, we actually need to have predication. And we need to be able to show and, you know, offer that predication to get an order. And so that’s the first step in a rule-of-law country like the United States. It’s the first step, is we have to prove to a judge that we have predication to be able to request the information in question or request access to digital evidence. It’s the second part of the process where the focus of this conversation and others ends up going. Once we have that order, then what happens? And in the case of encryption, it’s really kind of an interesting one, and particularly end-to-end encryption, which is, in many cases, when an application or a technology was rolled out, it wasn’t end-to-end encrypted. So for sometimes, you know, up to seven or eight years, law enforcement had this process with providers where we would retain, you know, we would garner the order through the process, we would present the order, and then we would have the technical access needed so that two-part process had parity. Now in the case of deployments, and I think this is the challenge for us, is now we in law enforcement, we know what we’re missing. And I can give you some examples. First, let me just state, we’re always asked to quantify the scope and the scale of the impacts. We do try to now have a little bit of an understanding of the quantifiable side of this. So we’re trying to, and we ask questions of our investigators across all of our field offices on a quarterly basis, are they facing these impacts and challenges in the cases? And I was able to share that figure, and I’m able to, again, hear that we’re over 17,000 cases impacted. And that’s of all manner. It’s in all 56 field offices are reporting these impacts. 
And it’s everything, when I say impacts, it’s everything from, you know, needing to approach a challenge or approach an investigation using alternate methods, which may or may not be as successful as giving the order to the provider and requesting that technical assistance or access to the evidence that way, all the way to really stymieing or stalling a case because we just don’t have the evidence to continue a successful investigation. And one of the biggest impacts that we see year after year is violent crime against children and child sexual abuse material. Let me bring that home to a case. Again, what I told you is, you know, in many cases where we are able to see and have access to the evidence, we now know what the cost is when it is then in what we call warrant proof, meaning it is behind the end-to-end encryption with no way to access the content. Here’s how critical that is. Many of you have either experienced this in your own countries or you’re, you know, you’ve heard about our cases on sexual extortion or what we call sextortion. We had a very, very prominent case of a young man named Jordan DeMay in Michigan out of our Detroit field office. That individual unfortunately took his own life because he was being extorted by criminals who were perpetrating their crime from Nigeria. Now, we were lucky in this case that the application in question and the communications were available to us. They were not encrypted. And so we were able to see how they groomed Jordan, how they approached him, how they got him to share information. And then ultimately, and this is so key, because it went to sentencing, one of these individuals actually goaded Jordan to take his own life and was telling him, you should take your own life. That featured prominently in that individual’s sentencing. So it is crucial for seeking justice for these victims, for the parents of Jordan DeMay in particular in this case, but it was crucial for us to understand that there were also over 100 other victims that this crime was being perpetrated against. And that’s maybe one of the key challenges on what we’re missing now is, is that we can’t stop the harm when we can’t actually follow to conclusion a successful investigation in this space. I’ll very quickly touch, because I think I’m probably way over some time that I’m allotted here, but I’m very excited for the conversation of what is in the realm of the technically possible. How can we actually find a method, a technical solution where encryption and safety can coexist? We’ve been very encouraged by some of the research and white papers and throwing things out like homomorphic encryption, looking even at prospective solutions where it’s a little bit like a wiretap, if you’re familiar, which is from this point forward with the order, now we’d only be affecting the architecture of the single subject for who we have predication. So I think there’s a lot to discuss in this space where we could find comfort, we could find observability, an ability to audit, ability to really look at what we’re doing in law enforcement and be able to view those processes, but also ensuring that we’re enabling the access that we want to make sure that we can stop the harms, conclude successful investigations. I’ll stop there, knowing that hopefully we’re going to have a little bit more of this conversation later. Thank you so much again for the opportunity.
David Wright: Thank you, Katie. Yes, we shall now move it on to, I’m going to throw the question next to Andrew on my left. So Andrew, in actual fact, there’s two, perhaps three questions here. So I’d ask you, how do privacy arguments become weaponized in standards development forums, which is the first question. What are the barriers to greater diversity and child protection representation in standard setting bodies? And then perhaps finally, as just a supplementary question, are there any examples that you could share that might demonstrate good practice to us? Andrew.
Andrew Campling: Thank you for the questions, David. Yes, so firstly, in terms of how do privacy arguments become weaponized, and I’ll broaden it to say not just in standards development organizations, but sometimes in other fora. As David said in his opening remarks, it is an emotive topic, whether we’re talking about privacy or child safety, both of those are emotive topics. So sometimes that in itself makes a useful discussion challenging. Though we do have examples where the phrase privacy is used to close down a meaningful conversation and just used as the reason to not do something. And it may be a good reason, I hasten to add, but I think it needs better explanation than just we can’t do that because privacy. I think you need to unpack that and say specifically why, and then if it’s in an SDO, for example, dig into the specific technical reasons and understand whether they are indeed valid or whether there are solutions. So there may be a technical solution, and there often is, that will give you both privacy and child safety. In reality, they don’t need to be in opposition. In fact, in some cases, if you talk specifically about encryption, inadvertently the use of encryption to improve privacy can sometimes both weaken privacy and security. There are some current examples where metadata is being encrypted to improve privacy, but some of that metadata includes, from a technical point of view, what are called important indicators of compromise that give signals to your firewall or your enterprise security software that a system has been compromised by malware. And by encrypting that metadata, you may actually be open to attack. So ironically, you may be losing your privacy because of that encryption. So I think we have to be careful sometimes that we don’t blindly do things for good reasons without fully understanding the consequences of those reasons. I’ll give you a different example. We know from research that the big so-called end-to-end encrypted messaging platforms are widely used by offenders to share child sex abuse material. There’s extensive research with an enormous sample size. Last time I checked, it was over 60,000 pedophiles in the sample, explaining what tools they use, and these are the tools they use to share their material. And again, often the conversation on can we do anything about that is closed down straight away because we can’t do anything with end-to-end encryption because of privacy, whereas in fact there are techniques you can use to block the sharing of at least known material without the need to weaken encryption, without the need to create backdoors. Something called client-side scanning can be used, which doesn’t affect, as I say, the strength of the encryption. There’s absolutely no need for a backdoor. And ironically, many of the messaging platforms already use that technique for user convenience features. So it’s already being employed in the software. So people that object to it in principle possibly don’t realize that it’s already in the software that many of us use. So you can do that without, as I say, affecting encryption or privacy in any way. The only time you impact privacy is if it turns out someone is trying to share known illegal images, but then to be frank, and this audience will understand this, you’ll understand that privacy is a qualified right. If you are doing an illegal act, then I think it’s reasonable that you might at that point surrender your right to privacy.
Your second question, David, what are the barriers to greater diversity and the representation of child protection groups in standards bodies? In fairness, to be more specific, it rather depends on the standards body. So for example, there’s a number of us here this week from the Internet Engineering Task Force, or that participate in that. There are no barriers. Anyone can attend. You don’t have to pay to attend remotely. You obviously have to pay to attend in person. And by the way, the remote tools for attendance at the IETF are fantastic. So you attend on an equal basis to the in-person participants with access to the mic queues, the conversations and so on. That said, in some of the other SDOs, there are other barriers like you have to be part of a national group or pay to participate and so on. So it depends on the SDO whether there are specific intentional barriers. There are other barriers though. So the cost, there is a real cost to attend. The IETF, for example, to try and diversify attendance, we meet in different parts of the world, rotate each time. So there are pretty extensive travel costs, hotel costs and so on. There’s also the cost of people’s time. Not only do you have to commit to attend the meeting, so if you attend in person, you’re committing to attend for a week. With any of the SDOs, you typically have to attend for a while to understand how the system works and to engage with it effectively. So you’re usually looking at a multi-year commitment to effectively engage. And you obviously need to bring relevant knowledge. So just wandering in off the street, as it were, would be quite challenging, I think probably in any of the SDOs. Having said that, it’s absolutely an important thing to do because any of the SDOs can always benefit from better diversity in their attendance. So again, I’ll refer to the IETF as my reference point. We have fantastic representation from the tech community. In my view, we are appalling on gender diversity and need to do significantly better there. And we absolutely would benefit from better representation from groups with experience outside of the tech sector, whether that’s network operators, whether that’s civil society groups, regulators, governments. And if we had that broader representation, we would have better standards. And I believe that’s true of all of the SDOs. So please, if you’ve got the right sort of knowledge level to be able to engage, please do so. It’d be very worthwhile. And then final answer, why would it be worthwhile? Well, as an example, parental controls are probably the one tool that parents have to try and manage, if that’s the right word, the user experience that their children have. But we know that there’s so many of them, and even technical experts struggle with the parental controls because of the sheer number of them on any given device, you’ve got the sort of the operating system level controls, the application controls, maybe your ISP’s controls. They’re all different. They don’t interwork. They use sometimes the same terminology to mean different things. And I have had a CTO of a technical organization tell me that he struggles with the parental controls, and he certainly has enormous technical competence. So it’s a problem. Where we can use SDOs, some of us are working on a proposal to develop a protocol so those parental controls can interwork, so they can exchange signals with each other, so that you don’t have to invest time in every single one of them to put reasonable controls on your device.
And if anyone’s interested in that, let’s have a conversation afterwards. But that might be a useful thing that we can do where the civil society can actually input useful knowledge that isn’t currently present in some of the SDOs.
David Wright: Andrew, thank you. Thank you very much. We’re next going to turn to Honey. And so, Honey, your two questions. So how should encryption policy address regional diversity, particularly in the Global South, where technical solutions may be harder to implement? And what’s the regulatory models that show promise in balancing safety and privacy? Honey.
Makola Honey: Thank you so much for the question. And I just want to pick up a little bit on what Katie said. And I completely agree that encryption and safety are not mutually exclusive. And Katie’s feedback also highlights the importance of international collaboration with the example that she mentioned from Nigeria. But to make encryption policies work together, we need to think about how diverse our starting points really are, especially in the Global South. As you said, we have the example of South Africa, where I come from, where cryptography providers have to register with the state. And then you also have Mozambique, where encrypted documents must be decrypted by experts. So those examples show that countries are really trying to balance safety, privacy, and capacity, but just in the way that fits their own local context, whether this is wrong or this is right. But the main point is that they are trying to have a certain balance. And then in the Global South, we must also take into consideration that we have limited local infrastructure, including data centers. And this also poses a challenge, forcing some of the countries to rely on external resources for data storage and processing. So when we talk about global encryption policy, for me, it boils down to three things. Flexible policy frameworks support diverse paths of implementation, and at the same time, also respecting the sovereignties of the specific countries. There’s a need for investment in local capacity, especially for regulators, law enforcement, and the technical community as well. Boris mentioned the technical work involved, and I believe that meaningful participation of the Global South in technical standard-setting is very important. So we’re not just adapting to decisions, but we are helping shape them. That provides a real opportunity while we’re trying to build capacity and technical skills and capabilities at the same time. But in the end, the encryption policy will not work if it doesn’t reflect the diversity of the different regions. And in terms of the second question, what regulatory models show promise in balancing safety and privacy? I’m assuming that is still with regard to the Global South. So I think the most promising regulatory models are the ones that enable law enforcement access without weakening encryption. Frameworks in my country, South Africa, as well as Nigeria and Brazil, require that access to encrypted data be granted only through judicial authorization and based also on clear legal criteria such as necessity and proportionality. So this ensures a balance between law enforcement capability and the protection of fundamental rights. We need greater clarity in terms of the obligations on cryptography providers, and alignment with evolving international norms also. But I do believe that we are on the right path. But there is still room for improvement.
David Wright: Honey, thank you very much. Just by way of a final contribution, I’m going to throw it to Lloyd. So Lloyd, from your perspective, how does strong encryption impact child protection? And how do you ensure child protection is prioritized in encryption debates? Lloyd?
Richardson Lloyd: Yeah, thank you. Firstly, I would agree with all of the previous colleagues. It aligns with everything that I was going to say. I like to encapsulate it by saying it’s not necessarily about banning encryption, banning this type of encryption. It’s more about the weaponization of encryption. I think we can point to real examples of that in the world where we see, for example, Meta used to have just client-server encryption in place related to Facebook Messenger. There was an enormous amount of child sexual abuse material being traded in that way. They were able to prevent it from being distributed and they were able to prevent child sexual abuse material being used in that way. There’s a lot of cases in this weaponization of encryption where you’re seeing a lot of harm being created and you’re essentially turning a blind eye to it. In a lot of ways, when we’re talking about media distribution in adult sort of contexts, I think there needs to be methods by which we can identify child sexual abuse material specifically in a media sense. It touches on some of the examples Katie gave earlier about where adults have access to children on the Internet by way of a platform not delineating children users or turning a blind eye to what is obviously a child user and an adult user. They intersect with terrible consequences. I can echo what Katie said about we’ve seen a handful of suicides here in Canada for the exact same reasons. Our tip line deals with dozens of sextortion requests every single day. The idea that we would turn down the... that’s sort of the perverse weaponization. If I’m industry in this space, that’s a cost centre for me. If I can turn on end-to-end encryption and avoid any sense of moderation, it’s cheaper for me. It makes more sense to approach the situation that way. I really see problems with that where we need to look at where end-to-end encryption can be applied in a responsible way. I would use examples that we see in the brick and mortar world in terms of what we see with the financial institutions. Certainly if banks decided they wanted to have financial transactions completely end-to-end encrypted from point-to-point, that would be a great scenario for people to perform money laundering. The reason they’re not allowed to do that is because we have regulation in place saying, no, you may not. You must inspect these communications here. That’s what we’ve developed. We’ve learned lessons over time that if you don’t do that, you’re going to have problems with money laundering. It’s the exact same thing we see on the internet, yet we’ve been incredibly slow to regulate in this space and apply the appropriate technologies where there’s a balance of safety and privacy.
David Wright: Lloyd, thank you very much. Now, what we’re going to do is clearly we’re going to open the floor for particular questions. Before I do, I’m just going to turn to my colleague, Boris, who’s compiling or is watching the online comments. So, Boris, if I can invite you to the microphone if there’s any particular questions that we’ve seen online that can be addressed to the four panelists. Thank you, thank you very much.
Boris Radanovic: Hello, everybody. There has been a lovely discussion in the comments, and thank you, everybody, for your contributions. I’m going to ask, try to convey this the best I can, so please help me. First one was, is it possible to create an encryption protocol that balances personal privacy and national security without compromise? The second part as well was a question, did not know about the FBI standards work. Is FBI, which organization is it working with? There has been a couple of comments on IETF, so I’m not sure whether these are correct or not. Please correct me, but IETF waives fees for NGOs participants, one contributor said, and decisions are made remotely and physical participation is not required. Thank you so much for that, if that is correct. There was one question in regards to parental controls, which I think might be interesting. Perhaps maybe government’s suggestion from a contributor could fund better development of parent and child-friendly protective tools for free deployment on widely used systems. I think it was an interesting perspective. As well, there were questions and comments depending on the contributions and Andrew’s contributions of examples of privacy concerns. Is there any that you could share, dear moderator? I hope that is enough. If you would care to ask the panel. Thank you so much.
David Wright: Boris, thank you very much. We’ll have a go at answering some of those questions before we just open the mic. I think there was an easy one as well. Katie, if there was any information around the organizations that you work with that you referred to?
Katie Noyes: Yeah. Thanks. I’m glad to hear the interest in that. Yes. I’ll just give a couple of examples. The first is in the International Telecommunication Union. Honey, I love seeing you at Study Group 17. We’re involved in the study groups there, of all manner. In fact, we’re just getting into the artificial intelligence study groups there and trying to keep pace from a public safety point of view. So we’re in the ITU. And ICANN, the Internet Corporation for Assigned Names and Numbers that manages the domain name system, if you’re not following that group, we actually have the co-chair of the public safety working group from our organization. And that’s really heavily focused on stemming and making sure we all collectively understand DNS or domain name system abuse, and then also accesses to IP addresses to be able to make those connections and victim notifications.
David Wright: How’s that process working? Again, is there an observable law enforcement access system in place?
Andrew Campling: One of the things about the new second phase of the protocol is that there are concerns that there might be unilateral access to the encrypted content. I think it would be perhaps understating to say that I think there’s some skepticism as to the appropriateness of that approach. One of the proposals was called the ghost protocol, but it has been criticized as effectively providing a back door that could be exploited by sort of bad actors. That’s what we are looking at. So I think there’s a need to provide more information on that online. I think on the fees for the IETF, I wanted to mention on that, you can participate online with literally no cost, no financial cost at all. There are, no question, fee waivers available as well, but ordinarily you’ve also got airfare costs and hotel costs to pay and there’s no funding for that, so you’d need to clarify why you needed a fee waiver for the registration fee if you can afford to be present anyway. But remote access is totally free. Somebody also asked if I had any examples of weaponized discussions. I do, but I don’t think it would be appropriate to share those because we want to keep this as a constructive discussion, sorry. On parental controls and government contribution, I think that would be a good conversation to have. Equally, it might be a good conversation to have governments reviewing the operation of parental controls, or maybe at least getting involved in the discussion about the usefulness of having a protocol so there could be some interconnectivity between those controls. Some standardization there, I think, would be hugely welcome.
David Wright: Okay, thank you very much. Okay, if I can now invite you, if I can just ask you to introduce yourself as well, and if I could ask you to keep it to a couple of minutes, that would be good.
Audience: Thank you. Hi, I’m Warren Kamari. I spend most of my time in the IETF and I’d just like to thank Andrew for noting how open it is. I also want to amplify his message that we need greater gender diversity, but also participation from regulators, civil society, etc. As people have noted, it is free to participate remotely, and our next meeting is July 19th to the 25th. If you’d like to participate, come chat with me. I’m happy to help you figure out how to get started. It’s in Madrid, by the way, if anyone in Europe wants to attend in person. Thank you, Warren. Hello, so my name is Simon. I’m a student here in Norway and I want to ask the panel their perspective on this. My view is that if we start doing these things, allowing governments to access encrypted communications, and yeah, you can say we do client-side scanning and only on an alert of child sexual abuse or anything related to that, I don’t see how this won’t harm things such as journalism. If a government turns corrupt, how can journalism keep communicating and stay safe for communicating and criticizing the government?
David Wright: Simon, thank you very much. Okay, I think we’ll take each question in turn. So Andrew, do you want? Yeah.
Andrew Campling: Maybe a thought on that, and hopefully some of the other panelists will as well. So thank you, Simon, for the question. Specifically on client-side scanning for the sharing of known child sex abuse material, there are a number of ways that you can insert controls. So for example, you can only accept the hashes, to get to the specifics, from a trusted flagger. You can cryptographically secure the hash file that’s being shared to protect it from being interfered with by a badly intentioned third party, including a government. Ultimately, though, I think we have to be honest that if there is a government with bad intentions, there are perhaps other tools, other levers, they can use anyway, and there are maybe not technical solutions to that; there are other solutions needed. But I think you can put some safeguards in to protect against that abuse, and again, happy to have a longer discussion offline in the interest of getting more people.
David Wright: Great question. Okay. Can I extend if there’s any response to that Katie honey or Lloyd?
Richardson Lloyd: I’d weigh in by saying there’s a little bit of confusion in throwing in the government automatically; there seems to be the automatic distrust. But in the examples we’re talking about, it’s more the man in the middle being the person in industry who’s doing the intervention there in terms of data, which doesn’t automatically mean the government is decrypting and looking at anything. It’s incumbent upon the company to share that information with the government, likely based on a judicial order, depending on what country you’re in. So it’s sort of a hypothetical based on a particular government in a particular country, so it’s hard to answer it really specifically. But what I could say in the context here: I don’t see the government as the interceptor of anything there. It’s industry who’s doing it.
David Wright: You even got a round of applause there, Lloyd, as well.
Katie Noyes: I was going to weigh in real quickly on that one too. You know, I think these are good points, and that’s, yeah, I think part of our reason why we’re asking for this technical assistance. As I said, there’s like a two-part process: one is for us getting the order and then serving that process right to a provider. But I think that’s why it’s so important to have the provider in the mix of these, you know, processes, because there then is a check and balance. I think one of the challenges for journalists worldwide is that, like I think Andrew was saying, there are all manner of tools used against them. And by the way, that’s part of our protective mission, to ensure that we’re protecting, right. You know, one of the things about privacy that I think is interesting is, when you define privacy, if you look at a lot of the terms of service for a lot of the providers who are providing these services, it wants you to be free from harassment. So that free from harassment, in your example where you’re saying there’s harassment coming from, you know, potentially an authoritarian regime or any other manner of exploitation there, it’s how do we protect against that. And that is one of the reasons why, for us in the FBI, we’re so keen to, I think, have the partnership and collaboration with the process of the orders that are provided to a provider, and there’s another party that can offer some checks and balances and make sure that they’re only delivering what is actually covered in the order. When we start getting to other techniques like computer network exploitation or hacking tools or things that are used against journalists regularly, that’s where I think you get into this ungoverned space where you no longer have any of those checks and balances. So it is a really good question to ask, a really good point to make, that, you know, there are structured processes versus unstructured processes, and why we favor finding the solutions here in a collaborative manner with the providers, to ensure that there is the ability for auditing and for some controls around this.
David Wright: Thank you, Katie. Okay, if I can turn to the next question, please. Thank you. Thank you
Audience: I’m Tapani Tajvainen, here representing Electronic Frontier Finland, but my background is at the university; I’m a mathematician and computer scientist, so I think I understand these techniques a bit. First, a quick comment on client-side scanning. I have studied the idea a fair amount and I really don’t think it can be done safely. There are lots of problems; I’ll just mention one that hasn’t come up: it basically means installing a piece of software on everybody’s phone, and that would be an irresistible honeypot for all kinds of bad actors, notably including intelligence agencies in countries we really don’t want to have listening to our talks. But I was intrigued by the mention of homomorphic encryption. I think I understand how that works, I know what homomorphism is, and I don’t see how it could be effectively used here. All the proposals I have seen seem to be based on either misunderstandings that simply don’t work, or else in effect amount to breaking or circumventing encryption. And like in client-side scanning, you are not technically breaking encryption, you are circumventing it, but the effect is the same. So in particular, I’d like to ask if you have a concrete implementation, or even a theoretical paper, that opens up how homomorphic encryption could be used here, that actually has, you know, testable code or even a theory paper, because it’s really too heavy to be run on a phone at the moment, and the solutions I see effectively amount to sharing the key, breaking the encryption. And to save time, it’s the same question about differential privacy, but I won’t go more deep on that. Thank you.
David Wright: Thank you. So just before I open it to the panelists, we do have apologies from one of the panelists who was actually going to be talking about some of the technical solutions, so Arnold is not able to join us today. So I suspect, although I am clearly going to open it to the panelists, that the question about homomorphic encryption was one that Arnold would probably have taken, but equally I do open that floor. Andrew, you?
Andrew Campling: Yeah, I'll pick up the question on client-side scanning, and that is a great question, because I think perhaps we need to have clarity on specifically what we mean and how it could be done. So what I'm talking about is, let's say, WhatsApp; I'll pick on that purely as an example. The actual code in WhatsApp does the client-side scanning, so not a, let's say, government-supplied piece of software. Having third-party software would be, as the questioner quite rightly said, problematic. So it would be the application provider that installed the functionality. And by the way, if you think that's just a problem in principle, as I mentioned in my opening remarks, it's already in your phone. An example of that, again I use WhatsApp, but it applies to other messaging apps: if you type a URL into WhatsApp, unless you've disabled the functionality, it will by default produce an image of the landing page within the message when it's sent. That's client-side scanning. It's looked at your message, the content of the message, worked out that there's a URL there, and actually fetched that from the relevant site. I believe that with WhatsApp, in doing that, it has actually shared your IP address, by the way. So if you're concerned about privacy, you might want to think about that. Whereas if you do client-side scanning of images, to be clear, what I envisage is not scanning your message at all, only scanning any images, any files that you may wish to include with it. All you're doing on the client is turning the image into a number. So you don't need to know what the image is; you turn it into a number. All you're doing is comparing that number to a list of known CSAM material. So my view, and again, we can debate this afterwards, is that there are no privacy implications with that unless there's a match, at which point, if you are sharing known CSAM, I think you've surrendered your right to privacy anyway. But yeah, we can continue that discussion; I think in that specific implementation, there isn't a concern about a third-party piece of software. I don't know if anyone else wants to comment on that or on the homomorphic encryption point.
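A minimal sketch, in Python, of the hash-list matching Andrew describes, assuming a hypothetical hash list supplied by a hotline or clearinghouse; real deployments use perceptual hashes (for example PhotoDNA, PDQ or Apple's NeuralHash) rather than the plain SHA-256 digest used here for simplicity, and the names below are illustrative only. The point the sketch captures is that only attachments are fingerprinted on the device, and nothing leaves the device unless there is a match.

import hashlib

# Hypothetical list of fingerprints of known CSAM; in practice this would
# be a perceptual-hash set distributed by a clearinghouse, not SHA-256.
KNOWN_HASH_LIST = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_fingerprint(image_bytes: bytes) -> str:
    """Turn the image into a number (here a SHA-256 digest) without
    interpreting or uploading the image content itself."""
    return hashlib.sha256(image_bytes).hexdigest()

def should_block_attachment(image_bytes: bytes) -> bool:
    """Client-side check run before the attachment is encrypted and sent:
    only a match against the known list triggers any action."""
    return image_fingerprint(image_bytes) in KNOWN_HASH_LIST

if __name__ == "__main__":
    sample = b"holiday photo bytes"
    # A non-matching image is sent untouched; nothing about it leaves the device.
    print("block?", should_block_attachment(sample))  # False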
David Wright: I was going to open it exactly there. So Lloyd, Katie, Honey, any response to those questions?
Katie Noyes: Yeah, I definitely want to weigh in. I've been watching the chat, by the way, and thank you for putting your comments in there and your feedback as we're having this discussion. And I really want to take up: how do we find some compromise here? I know that the people on the panel, the people in the chat, and the people in the audience do not want children to be exploited. They don't want child sexual abuse material, and all of us want to ensure there's safety there. So let me open this up, and please, you can throw it in the chat, and I hope I'm not breaking any protocol here with the panel, but I'm just interested. We've been really trying to figure out what's a workable way forward here. We've been interested in reading papers about abuse-resistant law enforcement access systems, for instance, and things that are trying to build trust, which has to be a key here. I understand the criticisms of the FBI. We all understand them, and we want to earn the trust. And we can only really do that, I think, if there are these processes, where there are orders involved, some manner of auditing, some manner of transparency. So I wanted to put forward this idea that we had heard before, which is looking at a prospective solution. Yes, I'm the first to say it's not going to resolve all of the challenges we have with access to digital evidence that we have authorities for. However, maybe it's a starting point for these conversations. What I mean by a prospective solution is this: we give the order to a provider who is managing the encryption already, because that's what these companies do, and they're doing all manner of things around that encryption, like patches and other things, and I know some of that is different. But if we looked at a prospective solution where we were then only affecting the architecture of a single user, with predication, is this something that could be perceived as workable by folks on all sides of this debate? Is this a starting point for a true technical solution that provides the assurances that folks are looking for, can build some trust, but then also at least restores some of the access that we already had? And I do think that gets missed in this discussion: for many, many years, some of these platforms did not have a full end-to-end encryption solution. That got added later. So that access has been impacted without a solution in mind, even though there is, again, global recognition of the authority. So I'm just hopeful that this conversation can come out of the back and forth of "this is why we should have none" and "this is why we should have all", and ask where we can meet in the middle. I've really enjoyed listening to the other panelists here talk about some of these proposals, so I'm very interested in hearing more discussion around that. Let's maybe move past, as I think Andrew put it, the noes and the weaponizing of "it can never happen", and find that space in the middle where some of this can and should happen. And I'd love to take up fully the discussion of homomorphic encryption. I agree there are challenges at scale. I also think the use case there is more for narrowing based on characterization than for decryption as such. So I'm with you on that.
But I think there's some promise in some of the opportunities, if you established a homomorphic model up front. But again, we would very much like to engage in some discussions with more technical folks on that.
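As a reference point for the homomorphic encryption exchange above, the sketch below illustrates the underlying property being debated: computing on ciphertexts without decrypting them. It is a toy, using textbook ElGamal's multiplicative homomorphism with deliberately tiny, insecure parameters purely for illustration; it says nothing about whether such schemes are practical at phone scale, which is the questioner's concern.

import random

# Toy, insecure parameters purely to illustrate the homomorphic property;
# real schemes use large groups (or lattice-based FHE) and proper encoding.
P = 467          # small prime modulus
G = 2            # generator
x = random.randrange(2, P - 2)   # private key
h = pow(G, x, P)                 # public key component

def encrypt(m: int):
    r = random.randrange(2, P - 2)
    return pow(G, r, P), (m * pow(h, r, P)) % P

def decrypt(c) -> int:
    c1, c2 = c
    return (c2 * pow(pow(c1, x, P), -1, P)) % P

# Multiplicative homomorphism: multiplying ciphertexts component-wise
# yields an encryption of the product of the plaintexts, computed without
# ever decrypting the inputs.
c_a, c_b = encrypt(12), encrypt(3)
c_prod = ((c_a[0] * c_b[0]) % P, (c_a[1] * c_b[1]) % P)
assert decrypt(c_prod) == (12 * 3) % P
print("decrypted product:", decrypt(c_prod))  # 36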
David Wright: Thank you, Katie. Lloyd or Honey, any remarks?
Richardson Lloyd: I was going to touch on the example that Apple pretty much fully built with NeuralHash a couple of years back, and then quite quickly threw in the garbage. It was audited by external folks. It was a complete example of how you can do client-side scanning. I'm not saying that I particularly favor such a solution, but it is a potential solution to addressing child sexual abuse material in that environment. And I would argue that it actually sided with privacy more than child protection. But again, better than nothing, which is where it is now.
David Wright: Thank you, Lloyd. Perhaps that is a starter for the conversation that Katie called for, too. Okay, sorry, Honey, if you would like to. Yeah, please.
Makola Honey: It's not necessarily on that specific comment. There are some comments in the chat that I would like to respond to a little bit, if I'm allowed.
David Wright: Just a minute, because we can't actually see the comments. Boris, I guess it's difficult to know what the comments are. Honey, can you actually introduce the comment that you would like to respond to specifically, given that we can't see those comments?
Makola Honey: So maybe just weighing in first on the one by one of the audience members on government involvement, which we all heard. I think it also boils down to what we discussed last year in this very panel, on the issue of geopolitics when it comes to international law and how it can pose a challenge. That question basically speaks to the different governance structures that are there, and we really need to find a way of working around it. But for me, it boils down to the importance of judicial authorization when granting government access, the clear legal thresholds that need to be defined in legislation or policy, and the transparency mechanisms that also need to be embedded in such policies. That's the first one. And there was a comment by Ian saying that South Africa's Electronic Communications and Transactions Act has drawn a lot of criticism. Yes, I do agree that he is right, it has drawn criticism, particularly regarding clarity and alignment with global norms and standards, but I think it's also important to make sure that we have clear legal thresholds for those practices. That said, I'd just like to mention that I believe South Africa is on the right path and, as I mentioned, judicial oversight and provider registration are already embedded in the framework, which is an important step in balancing rights and security, which is what this workshop really is about. Yes, there is definitely room for improvement: clearer obligations for providers, like I said, stronger procedural safeguards, and alignment with evolving international norms. And that's the beauty of policy in this space: it is constantly evolving, and regulators also need to be aware of that. Those are the questions that I wanted to respond to. Thank you.
David Wright: Thank you, Honey. We may well come to you, Boris, just after this next question, so I'll line you up for that one. Please, and thank you for waiting so patiently as well.
Audience: No problem. My name is Vinicius Fortuna and I work on internet access resilience and privacy at Jigsaw, and thanks for this important conversation. I think unfortunately it was a little bit one-sided, so I wanted to add another perspective. Katie mentioned predication, but with the big Salt Typhoon hack of the US telecoms, the attackers leveraged the wiretapping infrastructure that is being promoted here, and that put citizens at risk, including governments and children. And predication didn't help us there, didn't protect us. Andrew mentioned the weakening of security due to hidden indicators of compromise, so I want to point out that you can only use indicators of compromise after you've been compromised already, and maybe you were compromised because you're not using encryption; but it can help in some cases. But let me turn to end-to-end encryption. I want to actually be constructive and offer some ideas. First, we need to understand that end-to-end encryption protects the channel. You can still see the data on the endpoints, and it's been brought up here, so you can run malware protection or parental controls. I actually use encrypted DNS with malware protection, and parental controls, I think, are a good solution, but they need to be improved, so that is work that needs to be done. I think that's a good suggestion. And on investigation, the victim has access to the data, right, so you can work with the victim to try to get the criminal, because they can give you the data; that's another idea. And then there are some new encryption standards that are helpful. For example, there is Privacy Pass, which was actually developed at the IETF, which allows you to make an attestation and prove that you have some property in a privacy-preserving way, so you can do age verification in a privacy-preserving way, which I think will be really helpful. And the last thing is that even when we use HTTPS, the domain names are still in plain text. With AI, you can actually tap into your child's school Wi-Fi, collect the domain names, create a profile of your child, and then use that for targeting and extortion, which is scary. So we need to adopt standards like encrypted DNS and Encrypted Client Hello that will hide that information and keep people safe.
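A minimal sketch of the encrypted DNS point Vinicius raises, assuming Cloudflare's public DNS-over-HTTPS JSON endpoint as the resolver (an assumed choice; any DoH resolver would do): the query is carried inside TLS rather than as a plaintext UDP packet that an observer on, say, a school Wi-Fi network could log. As Andrew notes below, the resolver operator still sees the query, so this shifts trust rather than eliminating it.

import requests

def doh_lookup(name: str, record_type: str = "A") -> list:
    """Resolve a name over DNS-over-HTTPS so the query travels inside TLS
    instead of as a plaintext UDP packet visible to an on-path observer.
    The resolver operator still sees the query (Andrew's caveat)."""
    resp = requests.get(
        "https://cloudflare-dns.com/dns-query",   # assumed resolver choice
        params={"name": name, "type": record_type},
        headers={"accept": "application/dns-json"},
        timeout=5,
    )
    resp.raise_for_status()
    return [answer["data"] for answer in resp.json().get("Answer", [])]

if __name__ == "__main__":
    print(doh_lookup("example.com"))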
David Wright: Vinicius, thank you very much for those questions, and I'm just going to open that up. Andrew?
Andrew Campling: Maybe go to Katie first, because I think she might have something to say about Salt Typhoon. I think she's better informed on that than any of the rest of us, if she wishes to comment.
Katie Noyes: Yeah, I mean, there's a lot of discussion around that. The lawful intercept wasn't actually the vector for the attack, but the information was accessed, so it is absolutely something we're concerned about. And another person in the chat, sorry for us to keep going back to the chat, but there is some great discussion going on there, asked whether the FBI has concerns around encryption and what we're asking for. Absolutely, and that's why I'm standing here before you today saying we are not against encryption, we're not against end-to-end encryption, but we also recognize there is a balance here. That's one of the things where we're really trying to have this conversation with all of you and with each other, because we haven't had these conversations before: what are those solutions for balance? Again, I know all of us are committed to doing better at stemming child sexual abuse material, so what are those solutions, what are the ways? You're absolutely right about making sure that victims have an easy way to report and can preserve the evidence to be able to report it, but the challenge for us is that the victims are sometimes not even in the mix. With sextortion, of course, there are a lot of challenges, let's just say, because if someone takes their own life, or again if it is perpetrator to perpetrator, we want to make sure that we're identifying other folks. And preservation is a tough one too: asking people to preserve their own evidence puts the sole responsibility on the victim and user. So your points are right, don't mistake me; they're great ideas and I want to hear more of those. I've been very excited to hear the interventions here, so keep them coming. But yeah, with Salt Typhoon, we are very concerned; that's one of the reasons we're very big proponents of making use of encryption. But I think it is worth noting that it was actually not the attack vector. Again, that may be neither here nor there for you, because the lawful intercept information was accessed, and we are all about protecting that in a better way ourselves as well.
Andrew Campling: Thank you, Katie. Two very quick points, and they'll be a bit superficial for time, so we can maybe speak afterwards on the indicators of compromise. One is: yes, encrypted DNS is useful to protect you from a third-party observer, but you are still sharing your data with the resolver operator, and some of those resolver operators may be operating under regimes where there's not very good privacy regulation. So assuming that just by putting in encrypted DNS you've fixed privacy, which isn't what you said, to be fair, but some people might assume that if you do that you're done, that's not true. You've got to consider who's operating the resolver as well, otherwise you may have bigger issues. And then also on the indicators of compromise: the issue is that with things like ECH you encrypt the SNI data, and that's absolutely used by firewalls and other software to identify when an exploit has happened and when data may be being exfiltrated. So I think we need to be very careful, and some of the people working on that were maybe a bit focused on encryption and not so much on the broader security ecosystem, if you will. But that's probably a longer conversation that we can usefully have in the next seven minutes and 21 seconds.
David Wright: Honey, did you want to add anything on that? Okay, we can move on. I'm now torn. Sir, please do ask your question; Boris, I will come to you next. We have only seven minutes left, so we probably will run out of time for further questions, but please.
Audience: I'll try to be brief. Vittorio Bertola. I think I know most of the people in this conversation, and how many times have we been having this conversation, at the IGF, at the IETF, over the last 10 years? I think like 20 or 30. And every time someone comes and says, rightfully, if you do this it will be abused; whatever system you put in place to give some lawful interception possibilities may be abused or will be abused. And this is not the point, unfortunately. There is a part of our society, in politics, in civil society, in public institutions, that thinks that even if it will be abused, it's still better than not having lawful interception. So the problem with this discussion is that it is really a political discussion of balances, of compromise, and I think the compromise might be very different depending on which country you're from, where you live, how big crime is in your country, whether you have organized crime or not. Allowing lawful interception in Norway is going to be very different from allowing it in Iran. So this is why I'm starting to think that maybe we could focus the development of this in another direction, which is: okay, every country will decide this at the political level, but if a country decides to do this, can we find ways to prevent the bad effects from spilling over to other countries that decide they don't want to do it? And what's the best way of deciding and managing this? Do we have any advice for governments, when their parliaments still decide that they want to do this, on the technical ways of doing it? So maybe that could be a workstream that is more productive than just restating the arguments every time. Thank you.
David Wright: I admired the question as well, but I guess if it were an easy thing, then we wouldn't keep recycling it; it would have been fixed. So I think that recognizes the point, Andrew. Yeah, great point, Vittorio.
Andrew Campling: Can I just maybe build on that slightly without addressing it, because it's a complex question. If I extrapolate that a little, I think it's worth stating that, in my opinion, the status quo is just not acceptable. And the reason for that, and maybe a statistic we probably ought to remind ourselves of because it's really important: if we look at child sexual abuse material, or tech-facilitated child sexual abuse and exploitation, the current estimates are that there are roughly 300 million child victims annually. That's roughly 14% of the children across the world, which means in every country, probably for every one of us, we might not know them, but there are probably people close to us who are victims. So it's the sheer scale of the problem, which unfortunately has been enabled by the internet, which is why I feel that we need to find a solution, but, to Vittorio's point, accepting that there are trade-offs. We have to do this accepting there are trade-offs, and maybe try to minimize the risks of those trade-offs, but they will exist.
David Wright: Thank you, Andrew. Lloyd, Katie, any response to that?
Richardson Lloyd: I think it was a really good point, actually. And I think that if we continue down this road, you're going to see a sort of split as nation states declare their sovereignty on the internet. It won't be that one unified thing anymore. You're going to see: well, if you're not going to come up to this sort of standard, because this is a country of law and order, then you're going to start seeing fracturing happen, I think. So it's a really good point that you can't compare; it's not fair to assume that what's going to happen in one country will happen in another. We absolutely need these protections in place. We can't just stick our heads in the sand and say, well, there's nothing we can do about it. We absolutely need to do something. And I think the risk here is the fracturing that we're going to see after the fact if we don't address this issue to some satisfactory level.
David Wright: Okay. So in the last couple of minutes, Boris, if we can: I'm now intrigued by all the conversation that has apparently been going on online. If you can summarize that in less than two and a half minutes, that would be much appreciated.
Boris Radanovic: That's not going to be possible. I just want to say thank you so much to everybody online as well for contributing. It's really difficult to follow some of the discussions because there are points going left and right, so please read them afterwards; I hope there's going to be a recording. An interesting point was raised that I think just showcases the value of this discussion: a call to have more diversity in our future panel discussions. I think that's an important point to make, and many of the contributors online, I feel, would be great additions. One of the questions that I don't know whether it was missed was the panelists' perspective on the civic-space impact of weaker privacy standards; there are good reasons, privacy reasons, to use one sort of application or another. Then there was a comment as well that high privacy technical standards do not prevent police investigations through more traditional means and other technical investigative techniques. There was a diverse discussion online, and I just want to say one more thank you to everybody who contributed, and please keep on sending them. Thank you.
David Wright: Okay. Well, in the final minute, I think it falls to me to pull this to a close. And Boris, I will start with a thank you to you for trying to summarize all of that. I think we've heard a lot over the course of the last hour and a quarter, and there were some notable quotes. Katie, I think it was actually one of your first comments: encryption and safety have to coexist. We have to find solutions to this, despite, as the question noted, the fact that we keep coming around to this subject. It is clearly quite a difficult and complex issue, but one that, Lloyd, as you just reminded us, we have to resolve in some manner. And I really liked the suggestion to focus on one particular place, and that's perhaps where we will start. I thank all of you for joining us this morning, as Andrew said, on the last and final day of the IGF. But if I can just close, please show your appreciation for what has been an extraordinary panel. Thank you all so very much. Thank you.
Katie Noyes
Speech speed: 169 words per minute
Speech length: 3070 words
Speech time: 1089 seconds
Encryption and safety must coexist and technical solutions need to be found
Explanation
Katie argues that encryption and safety are not mutually exclusive and that collaborative solutions must be developed. She emphasizes that the challenge is real and requires partnership between law enforcement and industry to find technical solutions that maintain both security and access for legitimate investigations.
Evidence
Katie mentions FBI’s collaboration with industry partners has enabled superiority against adversaries and successful investigations through technology partnerships
Major discussion point
Balancing Encryption and Safety/Child Protection
Topics
Cybersecurity | Human rights | Legal and regulatory
Agreed with
– Andrew Campling
– Richardson Lloyd
– Makola Honey
Agreed on
Encryption and safety must coexist rather than being mutually exclusive
Over 17,000 FBI cases are impacted by encryption barriers, particularly involving child sexual abuse material
Explanation
Katie provides quantitative data showing the scope of encryption’s impact on law enforcement investigations. She explains that this includes cases requiring alternate methods or being completely stalled due to lack of access to encrypted evidence.
Evidence
FBI asks questions of investigators across all 56 field offices on a quarterly basis about encryption impacts; all field offices report these impacts; violent crime against children and child sexual abuse material are among the biggest impacts
Major discussion point
Balancing Encryption and Safety/Child Protection
Topics
Cybersecurity | Human rights | Legal and regulatory
Agreed with
– Andrew Campling
– Richardson Lloyd
– David Wright
Agreed on
The scale of child sexual abuse material problem requires urgent action
Disagreed with
– Richardson Lloyd
– Audience
Disagreed on
Alternative investigative approaches versus technical access
Homomorphic encryption and prospective solutions targeting single users with predication could provide compromise approaches
Explanation
Katie suggests exploring technical solutions like homomorphic encryption and prospective access that would only affect the architecture of a single subject for whom law enforcement has predication and a court order. This approach aims to restore previously available access while maintaining oversight and auditing capabilities.
Evidence
Katie mentions research and white papers on homomorphic encryption and compares prospective solutions to wiretaps that only affect specific subjects with proper authorization
Major discussion point
Technical Solutions and Standards Development
Topics
Cybersecurity | Legal and regulatory | Infrastructure
Disagreed with
– Audience
Disagreed on
Effectiveness of homomorphic encryption as a solution
Two-part process requires predication and judicial orders before technical access, with providers serving as checks and balances
Explanation
Katie explains that law enforcement must first prove predication to a judge to get an order, then present that order to providers for technical assistance. She emphasizes that having providers in the process creates important checks and balances rather than unstructured access methods.
Evidence
Katie describes the legal process in rule-of-law countries like the United States where judges must approve orders; contrasts this with ungoverned spaces like computer network exploitation where there are no checks and balances
Major discussion point
Law Enforcement Access and Oversight
Topics
Legal and regulatory | Human rights | Cybersecurity
Agreed with
– Makola Honey
– Richardson Lloyd
Agreed on
Judicial oversight and legal frameworks are necessary for lawful access
Disagreed with
– Audience
Disagreed on
Risk assessment of lawful access infrastructure
FBI participates in international standards organizations like ITU and ICANN to collaborate on public safety from a technical perspective
Explanation
Katie describes the FBI’s involvement in various international technical standards organizations to ensure public safety perspectives are included in technology development. This includes leadership roles and collaborative work with multi-stakeholder partners.
Evidence
FBI is involved in ITU study groups including artificial intelligence study groups; co-chairs ICANN’s public safety working group focused on DNS abuse and IP address access for victim notifications; has 26 patents which most people don’t know about
Major discussion point
Technical Solutions and Standards Development
Topics
Infrastructure | Legal and regulatory | Cybersecurity
Agreed with
– Andrew Campling
– Makola Honey
– David Wright
Agreed on
International collaboration and multi-stakeholder engagement are essential
Andrew Campling
Speech speed: 147 words per minute
Speech length: 2704 words
Speech time: 1099 seconds
Privacy arguments are sometimes weaponized to shut down meaningful discussions in standards bodies without proper technical justification
Explanation
Andrew argues that the phrase ‘privacy’ is sometimes used to close down conversations without adequate explanation of specific technical reasons. He suggests that technical solutions often exist that can provide both privacy and child safety without them being in opposition.
Evidence
Andrew provides an example where metadata encryption intended to improve privacy can actually weaken security by hiding indicators of compromise from firewalls and security software, potentially opening systems to malware attacks
Major discussion point
Technical Solutions and Standards Development
Topics
Cybersecurity | Human rights | Infrastructure
Client-side scanning can block sharing of known illegal material without weakening encryption or creating backdoors
Explanation
Andrew argues that client-side scanning technology can prevent the sharing of known child sexual abuse material without affecting encryption strength or requiring backdoors. He emphasizes that this technology is already implemented in messaging platforms for user convenience features.
Evidence
Research with over 60,000 pedophiles shows they use end-to-end encrypted messaging platforms to share material; WhatsApp already uses client-side scanning to generate URL previews by analyzing message content and fetching images
Major discussion point
Balancing Encryption and Safety/Child Protection
Topics
Cybersecurity | Human rights | Infrastructure
Disagreed with
– Audience
Disagreed on
Feasibility and safety of client-side scanning implementation
Client-side scanning already exists in messaging apps for user convenience features, so the technology is already implemented
Explanation
Andrew points out that client-side scanning is not a new concept but is already being used in applications like WhatsApp for features like URL preview generation. He argues that extending this existing technology to scan for illegal content would not introduce new privacy concerns.
Evidence
WhatsApp analyzes message content to identify URLs and fetches preview images, which constitutes client-side scanning; this process also shares IP addresses with external sites
Major discussion point
Technical Solutions and Standards Development
Topics
Cybersecurity | Infrastructure | Human rights
Current parental controls are fragmented and need standardized protocols for better interoperability
Explanation
Andrew describes how the multitude of different parental control systems across devices, applications, and ISPs creates confusion and difficulty even for technical experts. He advocates for developing protocols that would allow these systems to interwork and exchange signals.
Evidence
A CTO of a technical organization told Andrew that he struggles with parental controls despite having enormous technical competence; controls exist at operating system, application, and ISP levels with different terminologies and no interoperability
Major discussion point
Technical Solutions and Standards Development
Topics
Infrastructure | Human rights | Sociocultural
IETF and other standards bodies need greater gender diversity and participation from civil society, regulators, and governments
Explanation
Andrew acknowledges that while technical standards organizations like IETF have good representation from the tech community, they lack diversity in gender and participation from other sectors. He argues that broader representation would lead to better standards.
Evidence
IETF has fantastic tech community representation but is ‘appalling on gender diversity’; would benefit from better representation from network operators, civil society groups, regulators, and governments
Major discussion point
Participation and Diversity in Standards Bodies
Topics
Infrastructure | Human rights | Legal and regulatory
Agreed with
– Katie Noyes
– Makola Honey
– David Wright
Agreed on
International collaboration and multi-stakeholder engagement are essential
Remote participation in IETF is completely free with no barriers, though multi-year commitment and relevant knowledge are needed
Explanation
Andrew clarifies that while IETF remote participation has no financial barriers, effective engagement requires significant time commitment and relevant technical knowledge. He encourages participation while acknowledging the practical challenges.
Evidence
No cost for remote IETF participation; remote tools provide equal access to mic queues and conversations; meetings rotate globally; typically requires multi-year commitment to understand and engage effectively
Major discussion point
Participation and Diversity in Standards Bodies
Topics
Infrastructure | Development | Human rights
Encryption of metadata can inadvertently weaken security by hiding indicators of compromise from security software
Explanation
Andrew warns that encrypting metadata for privacy purposes can have unintended security consequences by preventing firewalls and security software from detecting malware indicators. This creates a situation where attempts to improve privacy actually reduce both privacy and security.
Evidence
Metadata encryption can hide indicators of compromise that firewalls and enterprise security software use to detect system compromises; this can leave systems open to malware attacks, ironically losing privacy through security breaches
Major discussion point
Privacy and Security Concerns
Topics
Cybersecurity | Human rights | Infrastructure
Richardson Lloyd
Speech speed: 205 words per minute
Speech length: 1089 words
Speech time: 317 seconds
Strong encryption impacts child protection by enabling criminals to operate without detection, similar to weaponization
Explanation
Lloyd argues that the issue is not about banning encryption but about its weaponization by criminals. He contends that strong encryption allows child sexual abuse material to be distributed and enables harmful interactions between adults and children without platform oversight.
Evidence
Meta previously had client-server encryption on Facebook Messenger with enormous amounts of child sexual abuse material being traded, but they were able to prevent distribution; Lloyd’s organization deals with dozens of sextortion requests daily; several suicides in Canada from sextortion similar to Katie’s examples
Major discussion point
Balancing Encryption and Safety/Child Protection
Topics
Cybersecurity | Human rights | Sociocultural
Agreed with
– Katie Noyes
– Andrew Campling
– Makola Honey
Agreed on
Encryption and safety must coexist rather than being mutually exclusive
Disagreed with
– Katie Noyes
– Audience
Disagreed on
Alternative investigative approaches versus technical access
End-to-end encryption prevents platforms from moderating content and creates cost incentives to avoid responsibility
Explanation
Lloyd argues that end-to-end encryption allows industry to avoid the costs of content moderation by turning a blind eye to harmful content. He suggests this creates a perverse economic incentive where encryption is used to reduce operational costs rather than protect privacy.
Evidence
Content moderation is a cost center for industry; turning on end-to-end encryption allows companies to avoid moderation expenses; this creates cheaper operational models
Major discussion point
Balancing Encryption and Safety/Child Protection
Topics
Cybersecurity | Economic | Legal and regulatory
Financial institutions provide a model where complete end-to-end encryption is regulated to prevent money laundering
Explanation
Lloyd draws a parallel between financial regulation and internet communication, arguing that banks are not allowed to have completely end-to-end encrypted transactions because it would enable money laundering. He suggests similar regulatory approaches should apply to internet communications.
Evidence
Banks are regulated and must inspect communications to prevent money laundering; this regulation exists because lessons were learned about the problems that occur without such oversight
Major discussion point
Law Enforcement Access and Oversight
Topics
Legal and regulatory | Economic | Cybersecurity
Agreed with
– Katie Noyes
– Makola Honey
Agreed on
Judicial oversight and legal frameworks are necessary for lawful access
Makola Honey
Speech speed: 141 words per minute
Speech length: 971 words
Speech time: 412 seconds
Regulation need not hamper innovation, and flexible policy frameworks must respect national sovereignty while enabling diverse implementation paths
Explanation
Honey argues that regulation and innovation can coexist, and that encryption policies must accommodate different regional approaches while respecting national sovereignty. She emphasizes the need for flexible frameworks that allow countries to implement solutions fitting their local contexts.
Evidence
Examples from South Africa where cryptography providers register with the state, and Mozambique where encrypted documents must be decrypted by experts, showing different countries trying to balance safety, privacy, and capacity in ways that fit their local context
Major discussion point
Regulatory Approaches and Global Perspectives
Topics
Legal and regulatory | Development | Human rights
Agreed with
– Katie Noyes
– Andrew Campling
– Richardson Lloyd
Agreed on
Encryption and safety must coexist rather than being mutually exclusive
Global South countries face unique challenges with limited infrastructure and must rely on external resources for data processing
Explanation
Honey highlights that Global South countries have limited local infrastructure including data centers, forcing reliance on external resources for data storage and processing. This creates additional challenges when implementing encryption policies and technical solutions.
Evidence
Limited local infrastructure including data centers in Global South countries forces reliance on external resources for data storage and processing
Major discussion point
Regulatory Approaches and Global Perspectives
Topics
Development | Infrastructure | Legal and regulatory
Judicial authorization and clear legal criteria like necessity and proportionality are essential for balanced frameworks
Explanation
Honey advocates for regulatory models that require judicial oversight and clear legal standards for accessing encrypted data. She emphasizes that frameworks in countries like South Africa, Nigeria, and Brazil require court authorization based on specific legal criteria.
Evidence
Frameworks in South Africa, Nigeria, and Brazil require judicial authorization for access to encrypted data based on clear legal criteria such as necessity and proportionality; South Africa has judicial oversight and provider registration embedded in their framework
Major discussion point
Regulatory Approaches and Global Perspectives
Topics
Legal and regulatory | Human rights | Cybersecurity
Agreed with
– Katie Noyes
– Richardson Lloyd
Agreed on
Judicial oversight and legal frameworks are necessary for lawful access
Meaningful participation from Global South in technical standard-setting is important for shaping rather than just adapting to decisions
Explanation
Honey emphasizes that Global South countries need to participate meaningfully in technical standards development rather than simply adapting to decisions made elsewhere. She sees this as an opportunity to build capacity while ensuring diverse perspectives shape technical standards.
Evidence
Honey mentions the importance of not just adapting to decisions but helping shape them, providing opportunities while building capacity and technical skills
Major discussion point
Participation and Diversity in Standards Bodies
Topics
Development | Infrastructure | Legal and regulatory
Agreed with
– Katie Noyes
– Andrew Campling
– David Wright
Agreed on
International collaboration and multi-stakeholder engagement are essential
Audience
Speech speed: 178 words per minute
Speech length: 1220 words
Speech time: 410 seconds
Client-side scanning poses risks as it would create software on everyone’s phone that could be exploited by bad actors
Explanation
An audience member with mathematical and computer science background argues that client-side scanning cannot be implemented safely because it requires installing software on everyone’s device, creating an attractive target for malicious actors including intelligence agencies from hostile countries.
Evidence
The speaker identifies as a mathematician and computer scientist who has studied client-side scanning extensively and mentions it would be an ‘irresistible honeypot for all kinds of bad actors notably including intelligence agencies’
Major discussion point
Privacy and Security Concerns
Topics
Cybersecurity | Human rights | Infrastructure
Disagreed with
– Andrew Campling
Disagreed on
Feasibility and safety of client-side scanning implementation
Salt Typhoon hack leveraged wiretapping infrastructure, demonstrating risks of lawful access systems being compromised
Explanation
An audience member points out that the Salt Typhoon hack of US telecoms leveraged existing wiretapping infrastructure, showing how lawful access systems can be exploited by adversaries. This demonstrates that predication and legal processes don’t protect against infrastructure compromise.
Evidence
Reference to the Salt Typhoon hack of US telecommunications systems that used wiretapping infrastructure; the hack put citizens, governments, and children at risk
Major discussion point
Privacy and Security Concerns
Topics
Cybersecurity | Legal and regulatory | Infrastructure
Disagreed with
– Katie Noyes
Disagreed on
Risk assessment of lawful access infrastructure
Technical solutions like privacy pass can enable age verification while preserving privacy
Explanation
An audience member suggests that new encryption standards like privacy pass, developed at IETF, can provide constructive solutions by allowing privacy-preserving attestation and age verification without compromising user privacy.
Evidence
Privacy pass was developed at IETF and allows making attestations and proving properties in a privacy-preserving way, specifically enabling age verification while maintaining privacy
Major discussion point
Technical Solutions and Standards Development
Topics
Cybersecurity | Human rights | Infrastructure
Disagreed with
– Katie Noyes
Disagreed on
Effectiveness of homomorphic encryption as a solution
End-to-end encryption protects channels but data remains accessible at endpoints for investigation and protection measures
Explanation
An audience member argues that end-to-end encryption protects communication channels while still allowing access to data at the endpoints, suggesting that investigation and protection can work with victims who have access to the data rather than breaking encryption.
Evidence
The speaker mentions that victims have access to the data and can work with law enforcement, and that malware protection and parental controls can still function at endpoints
Major discussion point
Privacy and Security Concerns
Topics
Cybersecurity | Human rights | Legal and regulatory
Disagreed with
– Katie Noyes
– Richardson Lloyd
Disagreed on
Alternative investigative approaches versus technical access
Different countries will make different political decisions about lawful interception based on their specific circumstances and crime levels
Explanation
An audience member argues that this is fundamentally a political discussion about balances and compromises that will vary by country based on factors like crime levels, presence of organized crime, and political systems. They suggest focusing on preventing bad effects from spilling over between countries with different approaches.
Evidence
The speaker notes that allowing lawful interception in Norway would be very different than allowing it in Iran, and mentions factors like organized crime presence and country-specific circumstances
Major discussion point
Regulatory Approaches and Global Perspectives
Topics
Legal and regulatory | Human rights | Sociocultural
David Wright
Speech speed: 148 words per minute
Speech length: 1688 words
Speech time: 681 seconds
The workshop aims to move beyond polarized positions and explore multi-stakeholder engagement for balanced encryption policy
Explanation
David Wright frames the workshop as building on previous discussions to advance from conflict to collaboration solutions. He emphasizes the need to explore how multi-stakeholder engagement can help develop rights-respecting balanced approaches to encryption policy rather than maintaining polarized positions.
Evidence
Building on IGF 2024’s workshop ‘balancing acts, encryption, privacy and public safety’; last year explored weaponization of privacy obstructing safety measures; this year asks whether encryption safety can truly co-exist
Major discussion point
Balancing Encryption and Safety/Child Protection
Topics
Human rights | Legal and regulatory | Cybersecurity
Agreed with
– Katie Noyes
– Andrew Campling
– Makola Honey
Agreed on
International collaboration and multi-stakeholder engagement are essential
The discussion covers technical models for lawful access, global encryption governance developments, and tensions between platform accountability and user privacy
Explanation
David Wright outlines the scope of the workshop to include exploring technical models that enable lawful access without undermining encryption, examining global regulatory frameworks, and addressing tensions between different stakeholder responsibilities. The workshop aims to cover comprehensive aspects of the encryption debate from technical to policy perspectives.
Evidence
Panel will explore technical models enabling lawful access without undermining encryption; look at global developments in encryption governance particularly regulatory and standards frameworks; examine tensions between platform accountability user privacy and state responsibilities
Major discussion point
Technical Solutions and Standards Development
Topics
Legal and regulatory | Cybersecurity | Infrastructure
The status quo regarding child sexual abuse material is unacceptable given the scale of victimization
Explanation
David Wright emphasizes the urgent need for solutions by highlighting the massive scale of the problem affecting children globally. He argues that the current situation cannot be maintained given the extent of harm being caused to children through technology-facilitated abuse.
Evidence
Current estimates show roughly 300 million child victims annually, which is roughly 14% of children across the world, meaning every country and probably people close to everyone are affected
Major discussion point
Balancing Encryption and Safety/Child Protection
Topics
Human rights | Cybersecurity | Sociocultural
Boris Radanovic
Speech speed: 206 words per minute
Speech length: 423 words
Speech time: 122 seconds
Future panel discussions need more diversity to better represent different perspectives in the encryption debate
Explanation
Boris Radanovic observes from the online discussion that there’s a need for more diverse representation in panels discussing encryption and safety issues. He suggests that many online contributors would bring valuable perspectives to future discussions on these topics.
Evidence
Online contributors made points showcasing the need for more diversity in future panel discussions; many online contributors would be great additions to panels
Major discussion point
Participation and Diversity in Standards Bodies
Topics
Human rights | Legal and regulatory | Sociocultural
High privacy technical standards do not prevent traditional police investigative techniques
Explanation
Boris Radanovic conveys from online discussion that strong encryption and privacy protections don’t eliminate law enforcement’s ability to conduct investigations through conventional methods. This suggests that encryption doesn’t completely block all investigative avenues available to authorities.
Evidence
Online comment noted that high privacy technical standards are not banning police investigations in more traditional ways and other technical investigative techniques
Major discussion point
Law Enforcement Access and Oversight
Topics
Legal and regulatory | Cybersecurity | Human rights
There are legitimate civic space concerns about the impact of weaker privacy standards
Explanation
Boris Radanovic highlights from online discussion that weakening privacy standards could have broader implications for civic space and civil society activities. This represents concerns about how encryption policy changes might affect legitimate privacy needs beyond criminal investigations.
Evidence
Online question raised about panelists’ perspective on civic space impact of weaker privacy standards; good reasons and privacy reasons to use certain applications
Major discussion point
Privacy and Security Concerns
Topics
Human rights | Legal and regulatory | Sociocultural
Agreements
Agreement points
Encryption and safety must coexist rather than being mutually exclusive
Speakers
– Katie Noyes
– Andrew Campling
– Richardson Lloyd
– Makola Honey
Arguments
Encryption and safety must coexist and technical solutions need to be found
Technical solutions often exist that can provide both privacy and child safety without them being in opposition
Strong encryption impacts child protection by enabling criminals to operate without detection, similar to weaponization
Regulation need not hamper innovation, and flexible policy frameworks must respect national sovereignty while enabling diverse implementation paths
Summary
All panelists agree that encryption and safety are not inherently opposing forces and that technical and policy solutions can be developed to achieve both objectives simultaneously
Topics
Cybersecurity | Human rights | Legal and regulatory
The scale of child sexual abuse material problem requires urgent action
Speakers
– Katie Noyes
– Andrew Campling
– Richardson Lloyd
– David Wright
Arguments
Over 17,000 FBI cases are impacted by encryption barriers, particularly involving child sexual abuse material
Research with over 60,000 pedophiles shows they use end-to-end encrypted messaging platforms to share material
Meta previously had client-server encryption on Facebook Messenger with enormous amounts of child sexual abuse material being traded
Current estimates show roughly 300 million child victims annually, which is roughly 14% of children across the world
Summary
There is strong consensus that the current scale of child sexual abuse material distribution represents an urgent crisis requiring immediate technical and policy responses
Topics
Cybersecurity | Human rights | Sociocultural
International collaboration and multi-stakeholder engagement are essential
Speakers
– Katie Noyes
– Andrew Campling
– Makola Honey
– David Wright
Arguments
FBI participates in international standards organizations like ITU and ICANN to collaborate on public safety from a technical perspective
IETF and other standards bodies need greater gender diversity and participation from civil society, regulators, and governments
Meaningful participation from Global South in technical standard-setting is important for shaping rather than just adapting to decisions
The workshop aims to move beyond polarized positions and explore multi-stakeholder engagement for balanced encryption policy
Summary
All speakers emphasize the importance of collaborative approaches involving multiple stakeholders across different sectors and regions to develop effective solutions
Topics
Infrastructure | Legal and regulatory | Development
Judicial oversight and legal frameworks are necessary for lawful access
Speakers
– Katie Noyes
– Makola Honey
– Richardson Lloyd
Arguments
Two-part process requires predication and judicial orders before technical access, with providers serving as checks and balances
Judicial authorization and clear legal criteria like necessity and proportionality are essential for balanced frameworks
Financial institutions provide a model where complete end-to-end encryption is regulated to prevent money laundering
Summary
There is agreement that any lawful access to encrypted communications must be subject to proper judicial oversight and clear legal standards to prevent abuse
Topics
Legal and regulatory | Human rights | Cybersecurity
Similar viewpoints
Both speakers advocate for specific technical solutions that can provide law enforcement access while maintaining encryption integrity, focusing on targeted approaches rather than broad surveillance capabilities
Speakers
– Katie Noyes
– Andrew Campling
Arguments
Homomorphic encryption and prospective solutions targeting single users with predication could provide compromise approaches
Client-side scanning can block sharing of known illegal material without weakening encryption or creating backdoors
Topics
Cybersecurity | Infrastructure | Human rights
Both speakers argue that privacy concerns are sometimes used inappropriately to avoid addressing legitimate safety issues, whether in technical standards or platform operations
Speakers
– Andrew Campling
– Richardson Lloyd
Arguments
Privacy arguments are sometimes weaponized to shut down meaningful discussions in standards bodies without proper technical justification
End-to-end encryption prevents platforms from moderating content and creates cost incentives to avoid responsibility
Topics
Cybersecurity | Legal and regulatory | Economic
Both speakers emphasize the importance of diverse participation in international technical standards development to ensure all perspectives are represented in shaping technology policies
Speakers
– Katie Noyes
– Makola Honey
Arguments
FBI participates in international standards organizations like ITU and ICANN to collaborate on public safety from a technical perspective
Meaningful participation from Global South in technical standard-setting is important for shaping rather than just adapting to decisions
Topics
Infrastructure | Development | Legal and regulatory
Unexpected consensus
Client-side scanning technology already exists in consumer applications
Speakers
– Andrew Campling
– Richardson Lloyd
Arguments
Client-side scanning already exists in messaging apps for user convenience features, so the technology is already implemented
Apple pretty much fully built a client-side scanning system with NeuralHash a couple of years back and then quite quickly threw it in the garbage; it was audited by external parties and was a complete example of how client-side scanning can be done
Explanation
There was unexpected agreement that client-side scanning is not a theoretical concept but already implemented technology, with Lloyd even acknowledging Apple’s neural hash implementation despite generally supporting child protection measures
Topics
Cybersecurity | Infrastructure | Human rights
Need for better diversity in technical standards participation
Speakers
– Andrew Campling
– Boris Radanovic
Arguments
IETF and other standards bodies need greater gender diversity and participation from civil society, regulators, and governments
Future panel discussions need more diversity to better represent different perspectives in the encryption debate
Explanation
Unexpected consensus emerged between a technical standards participant and the online discussion moderator about the need for more diverse representation, suggesting broad recognition of this issue across different stakeholder groups
Topics
Infrastructure | Human rights | Sociocultural
Recognition of legitimate privacy concerns even while advocating for access
Speakers
– Katie Noyes
– Andrew Campling
– Audience
Arguments
The FBI is not against encryption and not against end-to-end encryption, but recognizes there is a balance to be found and is asking for a conversation about workable solutions
Privacy is a qualified right. If you are doing an illegal act, then I think it’s reasonable that you might at that point surrender your right to privacy
There are legitimate civic space concerns about the impact of weaker privacy standards
Explanation
Despite being advocates for law enforcement access, there was unexpected consensus acknowledging legitimate privacy concerns and the need to balance competing interests rather than dismissing privacy arguments entirely
Topics
Human rights | Legal and regulatory | Cybersecurity
Overall assessment
Summary
The discussion revealed significant consensus on fundamental principles: encryption and safety can coexist, the child abuse problem requires urgent action, international collaboration is essential, and judicial oversight is necessary for lawful access. There was also agreement on the need for technical solutions and diverse stakeholder participation.
Consensus level
High level of consensus on core principles and problem recognition, with moderate consensus on specific technical approaches. The implications suggest a foundation exists for collaborative policy development, though implementation details remain contentious. The agreement on fundamental issues provides a basis for moving beyond polarized positions toward practical solutions.
Differences
Different viewpoints
Feasibility and safety of client-side scanning implementation
Speakers
– Andrew Campling
– Audience
Arguments
Client-side scanning can block sharing of known illegal material without weakening encryption or creating backdoors
Client-side scanning poses risks as it would create software on everyone’s phone that could be exploited by bad actors
Summary
Andrew argues that client-side scanning is technically feasible and already exists in messaging apps, while audience members with technical expertise argue it creates unacceptable security risks by installing software that could be exploited by malicious actors including intelligence agencies.
Topics
Cybersecurity | Human rights | Infrastructure
Effectiveness of homomorphic encryption as a solution
Speakers
– Katie Noyes
– Audience
Arguments
Homomorphic encryption and prospective solutions targeting single users with predication could provide compromise approaches
Technical solutions like Privacy Pass can enable age verification while preserving privacy
Summary
Katie suggests homomorphic encryption as a potential technical solution for lawful access, while audience members question its practical implementation and suggest alternative privacy-preserving technologies like Privacy Pass (a toy illustration of the homomorphic property follows this item).
Topics
Cybersecurity | Infrastructure | Human rights
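For readers unfamiliar with the term, the toy snippet below illustrates only the basic property under discussion: some cryptosystems permit limited computation on ciphertexts without decrypting them. It uses the well-known multiplicative homomorphism of textbook (unpadded) RSA with deliberately tiny parameters; it is not secure, not scalable, and not a description of any specific proposal made in the session.

```python
# Toy parameters only: a real key would be 2048+ bits and use proper padding.
p, q = 61, 53
n = p * q                     # public modulus
phi = (p - 1) * (q - 1)
e = 17                        # public exponent
d = pow(e, -1, phi)           # private exponent (Python 3.8+ modular inverse)

def encrypt(m: int) -> int:
    return pow(m, e, n)

def decrypt(c: int) -> int:
    return pow(c, d, n)

m1, m2 = 12, 7
c1, c2 = encrypt(m1), encrypt(m2)

# A third party can combine the ciphertexts without ever seeing m1 or m2 ...
c_product = (c1 * c2) % n

# ... and only the key holder learns the result of that computation.
assert decrypt(c_product) == (m1 * m2) % n
```

Fully homomorphic schemes extend this idea to arbitrary computations, which is where the scalability concerns raised by the audience come in.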
Risk assessment of lawful access infrastructure
Speakers
– Katie Noyes
– Audience
Arguments
Two-part process requires predication and judicial orders before technical access, with providers serving as checks and balances
Salt Typhoon hack leveraged wiretapping infrastructure, demonstrating risks of lawful access systems being compromised
Summary
Katie emphasizes the safeguards and oversight in lawful access processes, while audience members point to real-world examples like Salt Typhoon where lawful access infrastructure was compromised by adversaries, demonstrating inherent vulnerabilities.
Topics
Cybersecurity | Legal and regulatory | Infrastructure
Alternative investigative approaches versus technical access
Speakers
– Katie Noyes
– Lloyd Richardson
– Audience
Arguments
Over 17,000 FBI cases are impacted by encryption barriers, particularly involving child sexual abuse material
Strong encryption impacts child protection by enabling criminals to operate without detection, amounting to what Lloyd called a weaponization of encryption
End-to-end encryption protects channels but data remains accessible at endpoints for investigation and protection measures
Summary
Law enforcement representatives argue that encryption creates significant barriers requiring technical solutions, while audience members suggest that traditional investigative methods and endpoint access can still be effective without compromising encryption.
Topics
Cybersecurity | Legal and regulatory | Human rights
Unexpected differences
Economic incentives driving encryption adoption
Speakers
– Lloyd Richardson
– Katie Noyes
Arguments
End-to-end encryption prevents platforms from moderating content and creates cost incentives to avoid responsibility
Encryption and safety must coexist and technical solutions need to be found
Explanation
Lloyd's argument that companies adopt encryption primarily to avoid moderation costs rather than for legitimate privacy reasons was unexpected, as it frames encryption adoption as economically motivated rather than privacy-motivated. This contrasts with Katie's more collaborative approach toward industry partnerships.
Topics
Economic | Cybersecurity | Legal and regulatory
Metadata encryption creating security vulnerabilities
Speakers
– Andrew Campling
– Audience
Arguments
Encryption of metadata can inadvertently weaken security by hiding indicators of compromise from security software
Technical solutions like Privacy Pass can enable age verification while preserving privacy
Explanation
Andrew's argument that privacy-enhancing encryption can actually reduce security was unexpected, as it challenges the common assumption that more encryption always means better security. This technical nuance was not anticipated in a discussion typically framed as law enforcement access versus privacy (a brief sketch of the hidden indicators-of-compromise point follows this item).
Topics
Cybersecurity | Infrastructure | Human rights
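As a small illustration of the metadata argument, the hypothetical sketch below shows a defender matching plaintext DNS query names against a threat-intelligence blocklist; all hostnames and addresses are invented. When query names are encrypted, for example via DNS-over-HTTPS to an external resolver, this class of indicator-of-compromise check simply has nothing left to inspect.

```python
# Hypothetical threat-intelligence blocklist of known-bad domains.
BLOCKLIST = {"malware-c2.example", "phishing.example"}

# What a defender sees with plaintext DNS on the local network.
plaintext_dns_events = [
    {"src": "10.0.0.12", "qname": "intranet.example"},
    {"src": "10.0.0.45", "qname": "malware-c2.example"},
]

# With encrypted DNS, only an opaque HTTPS flow to the resolver is visible;
# the queried name never appears on the wire for security tooling to check.
encrypted_dns_events = [
    {"src": "10.0.0.45", "dst": "resolver.example", "qname": None},
]

def flag_compromised_hosts(events):
    """Return hosts whose DNS queries match known-bad domains."""
    return {event["src"] for event in events if event.get("qname") in BLOCKLIST}

assert flag_compromised_hosts(plaintext_dns_events) == {"10.0.0.45"}
assert flag_compromised_hosts(encrypted_dns_events) == set()  # indicator lost
```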
Overall assessment
Summary
The main areas of disagreement center on technical implementation approaches (client-side scanning vs. alternative methods), risk assessment of lawful access infrastructure, and the balance between traditional investigative methods versus requiring technical access to encrypted communications.
Disagreement level
Moderate to high disagreement on technical solutions with significant implications for policy development. While there’s consensus on the importance of child protection, the fundamental disagreement on whether technical access to encryption is necessary or safe creates a substantial policy impasse that requires continued multi-stakeholder dialogue and potentially different regional approaches.
Partial agreements
Similar viewpoints
Both speakers advocate for specific technical solutions that can provide law enforcement access while maintaining encryption integrity, focusing on targeted approaches rather than broad surveillance capabilities
Speakers
– Katie Noyes
– Andrew Campling
Arguments
Homomorphic encryption and prospective solutions targeting single users with predication could provide compromise approaches
Client-side scanning can block sharing of known illegal material without weakening encryption or creating backdoors
Topics
Cybersecurity | Infrastructure | Human rights
Both speakers argue that privacy concerns are sometimes used inappropriately to avoid addressing legitimate safety issues, whether in technical standards or platform operations
Speakers
– Andrew Campling
– Richardson Lloyd
Arguments
Privacy arguments are sometimes weaponized to shut down meaningful discussions in standards bodies without proper technical justification
End-to-end encryption prevents platforms from moderating content and creates cost incentives to avoid responsibility
Topics
Cybersecurity | Legal and regulatory | Economic
Both speakers emphasize the importance of diverse participation in international technical standards development to ensure all perspectives are represented in shaping technology policies
Speakers
– Katie Noyes
– Honey Makola
Arguments
FBI participates in international standards organizations like ITU and ICANN to collaborate on public safety from a technical perspective
Meaningful participation from Global South in technical standard-setting is important for shaping rather than just adapting to decisions
Topics
Infrastructure | Development | Legal and regulatory
Takeaways
Key takeaways
Encryption and safety must coexist – technical solutions need to be developed rather than viewing them as mutually exclusive
The scale of child sexual abuse material online is massive, with approximately 300 million child victims annually (14% of children worldwide)
Over 17,000 FBI cases are currently impacted by encryption barriers, particularly involving crimes against children
Client-side scanning technology already exists in messaging applications for user convenience features and could be adapted for detecting known illegal material without weakening encryption
The status quo is unacceptable – continuing to have the same debates without solutions while children remain at risk
Different countries will make different political decisions about lawful interception based on their specific circumstances, crime levels, and governance structures
Multi-stakeholder collaboration in standards development organizations is essential, with need for greater diversity including civil society, regulators, and Global South participation
Regulation need not hamper innovation if designed with flexible frameworks that respect national sovereignty
Resolutions and action items
Continue technical discussions on homomorphic encryption and prospective solutions that target single users with judicial predication
Develop standardized protocols for parental controls to enable better interoperability across devices and platforms
Increase participation in standards bodies like IETF, particularly encouraging gender diversity and civil society engagement
Explore client-side scanning implementations that include safeguards like cryptographic security and trusted flaggers
Focus future discussions on preventing negative effects from spilling over between countries with different lawful interception policies
Unresolved issues
Technical feasibility and safety concerns around client-side scanning, with experts disagreeing on whether it can be implemented securely
How to balance law enforcement access needs with protection against authoritarian government abuse
Scalability issues with proposed technical solutions like homomorphic encryption on mobile devices
Lack of concrete, testable implementations for many proposed technical solutions
How to handle the fragmentation of internet governance as different countries adopt different approaches
Concerns about lawful access infrastructure being exploited by bad actors (as demonstrated by Salt Typhoon hack)
How to ensure meaningful checks and balances in law enforcement access systems across different legal frameworks
Suggested compromises
Prospective solutions that only affect the encryption architecture of single users who are subject to judicial orders with predication
Client-side scanning limited to known child sexual abuse material with cryptographic protections and trusted flaggers
Provider-mediated access where companies serve as intermediaries and checks/balances in the law enforcement access process
Focus on endpoint access and victim cooperation rather than breaking encryption channels
Judicial authorization requirements with clear legal criteria like necessity and proportionality
Country-specific approaches that allow different nations to balance privacy and safety according to their circumstances while minimizing cross-border impacts
Investment in alternative technical solutions like privacy-preserving age verification and improved parental controls
Thought provoking comments
Encryption and safety have to coexist. We have to find a way… Now in the case of deployments, and I think this is the challenge for us, we in law enforcement know what we're missing.
Speaker
Katie Noyes (FBI)
Reason
This comment reframed the entire debate from an either/or proposition to a collaborative problem-solving challenge. Rather than positioning encryption and safety as inherently conflicting, it established the foundational premise that both must be preserved, shifting focus to ‘how’ rather than ‘whether.’
Impact
This set the constructive tone for the entire discussion and influenced other panelists to adopt solution-oriented language. It moved the conversation away from polarized positions toward exploring technical and policy compromises.
Privacy arguments become weaponized… sometimes the phrase 'privacy' is used to close down a meaningful conversation and just used as the reason to not do something… you need to unpack that and say specifically why.
Speaker
Andrew Campling
Reason
This observation identified a critical dysfunction in policy debates – the use of legitimate concepts as conversation-stoppers rather than starting points for nuanced discussion. It challenged participants to move beyond rhetorical positions to substantive technical analysis.
Impact
This comment elevated the discussion’s analytical rigor and encouraged more specific, technical responses from subsequent speakers. It also validated concerns about bad-faith arguments while maintaining space for legitimate privacy concerns.
Regulation need not hamper innovation. But at the same time… we want to look at that balance of regulation for protection of privacy, but at the same time, ensuring that there is access, especially for the protection of children online.
Speaker
Honey Makola
Reason
This comment introduced crucial Global South perspectives and challenged the assumption that regulation inherently stifles innovation. It highlighted how different regions approach the balance differently based on local contexts and capacities.
Impact
This broadened the discussion beyond Western-centric viewpoints and introduced considerations of regional diversity, technical capacity limitations, and different regulatory approaches. It influenced later discussions about country-specific solutions.
It’s not necessarily about banning encryption… It’s more about the weaponization of encryption… If I’m industry in this space, that’s a cost centre for me. If I can turn on end-to-end encryption and avoid any sense of moderation, it’s cheaper for me.
Speaker
Lloyd Richardson
Reason
This comment introduced economic incentives as a driving factor in encryption deployment decisions, suggesting that some companies may use encryption not primarily for user protection but to avoid content moderation costs. This added a crucial business analysis dimension to the technical and legal discussion.
Impact
This reframed part of the debate from purely technical/rights-based arguments to include economic motivations, leading to more nuanced discussion about industry responsibilities and the need for regulatory frameworks that account for business incentives.
I think unfortunately it was a little bit one-sided, so I wanted to add another perspective… with the big Salt Typhoon hack of the US telecoms, they leveraged the wiretapping infrastructure that is being promoted here, and that put citizens at risk, including governments and children.
Speaker
Vinicius Fortuna (Jigsaw)
Reason
This intervention provided crucial balance by highlighting real-world security vulnerabilities in lawful access systems, demonstrating that the theoretical risks privacy advocates warn about have materialized in practice. It also offered constructive technical alternatives.
Impact
This comment forced the discussion to grapple with concrete evidence of lawful access infrastructure being exploited by bad actors, adding credibility to privacy concerns while maintaining a constructive tone by offering alternative solutions.
How many times have we been having this conversation at the IGF, at the IETF, in the last 10 years… this is really a political discussion of balances of compromise… maybe we could focus… can we find ways to prevent the bad effects from spilling over to other countries that decide they don't want to do this.
Speaker
Vittorio Bertola
Reason
This meta-observation about the cyclical nature of the debate was particularly insightful because it acknowledged the fundamental political nature of the issue while proposing a pragmatic path forward that accepts different countries will make different choices.
Impact
This comment shifted the discussion toward more practical, implementable solutions and influenced the final exchanges about potential internet fragmentation and the need for country-specific approaches while minimizing cross-border negative effects.
The current estimates are that there are roughly 300 million child victims annually. That’s roughly 14% of the children across the world… So it’s the sheer scale of the problem that unfortunately has been enabled by the internet, which is why I feel that we need to find a solution.
Speaker
Andrew Campling
Reason
This stark statistical reality check provided crucial context for why the technical community cannot simply dismiss law enforcement concerns. The scale of harm described made abstract technical debates feel inadequate to the human cost.
Impact
This data point appeared to influence the tone of subsequent interventions, with even privacy advocates acknowledging the need for solutions rather than simply defending the status quo. It reinforced the urgency behind finding workable compromises (a brief arithmetic check of the quoted figure follows this item).
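As a quick sanity check of the 14% figure quoted above, assuming the commonly cited estimate of roughly 2.2 billion people under 18 worldwide:

```python
# Assumption: roughly 2.2 billion children (people under 18) worldwide.
children_worldwide = 2.2e9
estimated_victims = 300e6   # figure quoted in the session
print(f"{estimated_victims / children_worldwide:.0%}")  # prints "14%"
```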
Overall assessment
These key comments fundamentally shaped the discussion by establishing it as a collaborative problem-solving exercise rather than a zero-sum debate. Katie’s opening reframing set a constructive tone that influenced all subsequent contributions. Andrew’s observation about weaponized privacy arguments elevated the analytical rigor, while Honey’s Global South perspective and Lloyd’s economic analysis added crucial dimensions often missing from these debates. The technical interventions from Vinicius and others forced engagement with real-world security concerns, while Vittorio’s meta-commentary provided a pragmatic path forward. Together, these comments moved the discussion from ideological positions toward nuanced technical and policy solutions, though the fundamental tensions remain unresolved. The conversation demonstrated both the complexity of the challenge and the possibility of more productive engagement when participants move beyond rhetorical positions to substantive analysis.
Follow-up questions
How can homomorphic encryption be effectively used for lawful access without breaking encryption?
Speaker
Katie Noyes
Explanation
Katie mentioned homomorphic encryption as a potential technical solution but acknowledged challenges at scale and noted it's more for narrowing based on characterization than decryption. A participant (Tapani Tajvainen) challenged this, asking for a concrete implementation or theoretical papers showing how it could work.
What are the specific technical details and safeguards for client-side scanning implementation?
Speaker
Andrew Campling and Tapani Tajvainen
Explanation
Andrew proposed client-side scanning as a solution, but Tapani raised concerns about safety and implementation. Andrew suggested continuing the discussion offline to address technical specifics and safeguards against abuse.
How can a prospective lawful access solution work where only a single user’s architecture is affected?
Speaker
Katie Noyes
Explanation
Katie proposed exploring a prospective solution that would only affect the encryption architecture of a single user with predication, similar to a wiretap, and asked if this could be a workable starting point for compromise.
How can parental controls be standardized and made interoperable across different platforms and devices?
Speaker
Andrew Campling
Explanation
Andrew mentioned working on a proposal to develop a protocol so parental controls can interwork and exchange signals, making them easier for parents to use effectively.
What are the best practices for countries that decide to implement lawful interception while minimizing negative spillover effects to other countries?
Speaker
Vittorio Bertola
Explanation
Vittorio suggested focusing on technical ways of managing lawful interception and preventing bad effects from spilling over to countries that choose not to implement such measures.
How can greater diversity and civil society representation be achieved in technical standards development organizations?
Speaker
Andrew Campling and Warren Kamari
Explanation
Both speakers emphasized the need for better gender diversity and participation from regulators, civil society, and other non-tech sectors in standards bodies like IETF to improve the quality of standards.
What are the civic space impacts of weaker privacy standards?
Speaker
Online participants (via Boris Radanovic)
Explanation
This question was raised in the online chat but not fully addressed during the panel discussion, highlighting the need to examine broader societal implications.
How can abuse-resistant law enforcement access systems be designed to build trust and provide transparency?
Speaker
Katie Noyes
Explanation
Katie mentioned reading papers about abuse-resistant law enforcement access systems and emphasized the need for processes that include orders, auditing, and transparency to build trust.
What are concrete examples of privacy arguments being weaponized in standards development forums?
Speaker
David Wright (moderator) and Andrew Campling
Explanation
Andrew mentioned having examples but declined to share them to keep the discussion constructive, suggesting this could be explored in future discussions.
How can government funding support the development of better parental control and child protection tools?
Speaker
Online participants (via Boris Radanovic)
Explanation
This suggestion from online participants about government funding for free deployment of protective tools on widely used systems was noted but not fully explored.
Disclaimer: This is not an official session record. DiploAI generates these resources from audiovisual recordings, and they are presented as-is, including potential errors. Due to logistical challenges, such as discrepancies in audio/video or transcripts, names may be misspelled. We strive for accuracy to the best of our ability.