WS #199 Ensuring the online coexistence of human rights & child safety
Session at a Glance
Summary
This panel discussion focused on the complex interplay between technology, privacy rights, and child protection in the digital space, particularly concerning encryption and lawful access. Experts from various fields explored how to foster an online environment that respects human rights while prioritizing child safety.
The debate centered on the implementation of end-to-end encryption by major tech companies and its impact on law enforcement’s ability to access communications for child protection purposes. Panelists discussed legislation in countries like Australia and the UK that aims to provide lawful access to encrypted communications, while also acknowledging the need for robust safeguards and oversight.
There was disagreement on whether backdoors or client-side scanning could be implemented without compromising overall security and privacy. Some argued that these measures are necessary to combat child exploitation, while others warned of potential abuse and the risk of driving users to less regulated platforms.
The discussion highlighted the need for a multi-stakeholder approach to find solutions that balance security, privacy, and child safety. Panelists explored technical innovations, such as abuse-resistant lawful access mechanisms and homomorphic encryption, as potential ways to address some of these concerns.
The conversation also touched on the global nature of the problem, the importance of considering victims’ perspectives, and the challenges of implementing universal solutions across different jurisdictions and cultures. There was a general consensus that continued dialogue and collaboration between governments, industry, civil society, and academia is crucial to addressing these complex issues.
In conclusion, while no definitive solutions were reached, the discussion emphasized the ongoing need for innovative approaches and balanced regulation to protect children online while preserving privacy and security for all users.
Keypoints
Major discussion points:
– The debate over encryption and lawful access, including recent legislation in various countries
– Technical challenges and potential solutions for balancing security, privacy, and child safety
– The role of different stakeholders (government, industry, civil society) in addressing these issues
– The impact of end-to-end encryption on law enforcement and child protection efforts
– Potential compromises or alternative approaches to enable some lawful access while preserving security
The overall purpose of the discussion was to explore the complex interplay between technology, privacy rights, and efforts to protect children in the digital space. The panel aimed to bring together diverse perspectives to discuss how to foster an online environment that respects human rights while prioritizing child safety.
The tone of the discussion was generally collaborative and solution-oriented, with panelists acknowledging the complexity of the issues and the need for multi-stakeholder cooperation. While there were some disagreements, particularly around the feasibility and desirability of backdoors in encryption, the tone remained respectful. Towards the end, there was a shift towards more urgency in finding practical solutions and determining who should take the lead in bringing stakeholders together.
Speakers
– Stewart Baker: Moderator, Washington D.C.-based attorney specializing in homeland security, cybersecurity, and data protection
– Dan Suter: Principal advisor to the Department of the Prime Minister and Cabinet in New Zealand on lawful access and cross-border data policy
– Mallory Knodel: Executive director of the Social Web Foundation, technology and human rights expert specializing in internet governance and digital policy
– Katie Noyes: Section chief for the FBI Science and Technology Branch’s Next Generation Technology and Lawful Access Section
– Gabriel Kaptchuk: Assistant professor in the computer science department at the University of Maryland, College Park, focuses on cryptographic systems
– Mia McAllister: Introducer/facilitator of the panel discussion
Additional speakers:
– Andrew Campling: Trustee for the Internet Watch Foundation
Full session report
Encryption, Child Safety, and Lawful Access: Balancing Priorities in the Digital Age
This panel discussion brought together experts from various fields to explore the complex interplay between technology, privacy rights, and child protection in the digital space. The conversation centered on the challenges posed by end-to-end encryption and the need to balance security, privacy, and child safety in an increasingly interconnected world.
Key Themes and Debates
1. Legislation and Regulation of Encryption
The discussion highlighted recent legislative efforts in countries like Australia and the UK to address encryption and lawful access. Dan Suter, speaking about Australia and the UK’s approaches, advocated for an incremental approach with safeguards, referencing Australia’s TOLA legislation and the UK’s Investigatory Powers Act. He emphasized the need for consistent engagement between governments and tech firms to develop effective solutions.
Mallory Knodel, from the Social Web Foundation, warned that overly restrictive laws could force companies to leave certain jurisdictions, citing the example of the Session app leaving Australia due to concerns about TOLA legislation. This exodus could potentially undermine both privacy and child protection efforts. Katie Noyes of the FBI noted that the EU is also exploring ways to provide law enforcement with access to data while respecting privacy rights, mentioning the G7 Roma-Lyon Group’s lawful access working group.
Gabriel Kaptchuk, an academic expert in cryptography, cautioned against mandating backdoors or weakening encryption, highlighting the potential security risks associated with such approaches. He explained the crucial differences between software update keys and encryption keys for messaging, emphasizing the importance of understanding these distinctions in policy discussions.
2. Balancing Security, Privacy, and Child Safety
A central tension in the discussion revolved around the implementation of end-to-end encryption by major tech companies and its impact on law enforcement’s ability to access communications for child protection purposes. Katie Noyes emphasized that while end-to-end encryption protects users, it also hinders investigations into child exploitation. She provided concrete data from the National Center for Missing and Exploited Children to ground the discussion in real-world impacts.
Noyes illustrated the importance of content access in investigations by discussing the Jordan DeMay case, where access to message content was crucial in understanding the circumstances leading to a teenager’s suicide and identifying other potential victims.
Mallory Knodel stressed the importance of protecting children while respecting privacy rights, arguing for a nuanced approach that considers the diverse needs of users and platforms. Andrew Campling, representing the Internet Watch Foundation, highlighted the scale of the child sexual abuse problem and criticized the tech sector’s approach, suggesting exploring client-side scanning of known child sexual abuse material (CSAM) as a potential compromise.
3. Technical Approaches and Innovations
The discussion explored various technical approaches to address the encryption debate. Gabriel Kaptchuk suggested investigating “abuse-resistant” lawful access mechanisms that could provide limited access without compromising overall security. He introduced the concept of “hotness” of keys, proposing systems that would make key theft detectable and thus deter abuse.
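To make the deterrence idea concrete, here is a minimal illustrative sketch, not a design presented by the panel, of the transparency-log pattern that “detectable use” proposals typically lean on: every exercise of a lawful-access capability must be appended to a hash-chained, append-only log, so covert use of a stolen key either skips the log (which verifiers can detect) or leaves permanent evidence. All names and structures below are hypothetical.

```python
# Illustrative sketch only: making uses of a lawful-access capability
# detectable by forcing every use through a hash-chained, append-only
# audit log (the same pattern used by Certificate Transparency).
# Hypothetical names throughout; not from any proposal cited above.

import hashlib
import json
import time


class AuditLog:
    """Append-only log in which each entry commits to the previous one."""

    def __init__(self):
        self.entries = []

    def append(self, record: dict) -> str:
        prev = self.entries[-1]["hash"] if self.entries else "0" * 64
        body = json.dumps(record, sort_keys=True)
        digest = hashlib.sha256((prev + body).encode()).hexdigest()
        self.entries.append({"record": record, "prev": prev, "hash": digest})
        return digest

    def verify_chain(self) -> bool:
        """Any tampering with a past entry breaks every later hash."""
        prev = "0" * 64
        for e in self.entries:
            body = json.dumps(e["record"], sort_keys=True)
            expected = hashlib.sha256((prev + body).encode()).hexdigest()
            if e["prev"] != prev or e["hash"] != expected:
                return False
            prev = e["hash"]
        return True


def lawful_access_request(log: AuditLog, case_id: str, target: str) -> str:
    # Policy, not cryptography, requires every access to pass through here,
    # so each use of the capability leaves an auditable trace.
    return log.append({"case": case_id, "target": target, "time": time.time()})


log = AuditLog()
lawful_access_request(log, "case-123", "device-abc")
print(log.verify_chain())  # True; altering any past entry makes this False
```

The deterrent here is accountability rather than secrecy: the log does not prevent misuse, it makes misuse visible, which is the property such proposals aim for.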
Katie Noyes mentioned the potential of homomorphic encryption and other emerging technologies to enable data analysis without fully decrypting information. She also proposed exploring prospective data-in-motion solutions that could target specific subjects rather than affecting all users.
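As a rough illustration of what computing on data “without fully decrypting” can mean, the toy sketch below implements Paillier’s additively homomorphic encryption scheme: multiplying two ciphertexts yields an encryption of the sum of the plaintexts, so a party holding only the public key can do arithmetic on data it cannot read. This is a sketch with toy-sized parameters, not a description of any system the panel referenced; real deployments need large primes and a vetted library.

```python
# Toy Paillier encryption: additively homomorphic, for illustration only.
from math import gcd
import random


def keygen(p: int = 293, q: int = 433):
    n = p * q
    lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)  # lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)                          # lambda^-1 mod n (using g = n + 1)
    return n, (lam, mu, n)


def encrypt(n: int, m: int) -> int:
    n2 = n * n
    r = random.randrange(1, n)
    while gcd(r, n) != 1:                         # r must be invertible mod n
        r = random.randrange(1, n)
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2


def decrypt(sk, c: int) -> int:
    lam, mu, n = sk
    l = (pow(c, lam, n * n) - 1) // n             # L(x) = (x - 1) / n
    return (l * mu) % n


n, sk = keygen()
c1, c2 = encrypt(n, 17), encrypt(n, 25)
c_sum = (c1 * c2) % (n * n)   # multiplying ciphertexts adds the plaintexts
print(decrypt(sk, c_sum))     # 42, computed without decrypting c1 or c2
```

Fully homomorphic schemes extend this idea to arbitrary computation, at a substantial performance cost, which is one reason they remain a research direction rather than a deployed answer to the lawful-access question.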
However, Gabriel Kaptchuk also highlighted challenges with perceptual hashing and content-matching techniques, emphasizing the need for robust safeguards and acknowledging the technical limitations of these approaches. The discussion touched on the potential of client-side scanning and age verification as possible solutions, though their effectiveness and privacy implications were debated.
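To see the appeal of perceptual hashing in miniature, the sketch below implements a toy “difference hash” (dHash): instead of hashing raw bytes, it records which of each pair of neighboring pixels is brighter, so benign edits such as a small brightness shift leave the hash unchanged. Images here are plain 2D lists of grayscale values; deployed systems such as PhotoDNA or NeuralHash are far more sophisticated, and this sketch is illustrative only. A companion sketch later in the transcript section shows how easily such a hash can be forced to collide.

```python
# Toy perceptual hash (dHash): hash pixel *relationships*, not raw bytes.

def dhash(pixels) -> int:
    """One bit per neighboring pixel pair: is the left pixel brighter?"""
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits


def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


# An 8x9 synthetic "image" (values kept under 200 so the edit below never
# clips at 255) and a lightly edited copy with every pixel brightened by 10.
original = [[(r * c * 37 + r * 11 + c * 5) % 200 for c in range(9)] for r in range(8)]
edited = [[min(v + 10, 255) for v in row] for row in original]

h1, h2 = dhash(original), dhash(edited)
print(hamming(h1, h2))  # 0: the edit changed every byte but no pixel
                        # relationship, so the two images still "match"
```

Matching then reduces to a Hamming-distance threshold, and choosing that threshold trades missed matches against exactly the overbroad matches Kaptchuk warns about.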
4. Multi-stakeholder Approach to Solutions
There was broad agreement on the need for a collaborative approach involving government, industry, civil society, and academia to address these complex issues. Katie Noyes emphasized the importance of bringing together diverse stakeholders to find balanced solutions. Dan Suter called for leadership in bringing stakeholders together, while Andrew Campling expressed frustration with the lack of progress on implementing child protection measures.
Areas of Agreement and Disagreement
The panelists generally agreed on the need for multi-stakeholder collaboration and the importance of balancing security, privacy, and child safety. There was also consensus on the complexity of the issue and the need for innovative solutions.
However, significant disagreements emerged regarding the effectiveness and risks of encryption backdoors. The interpretation of recent security incidents, such as the “Salt Typhoon” hacks, revealed differing perspectives. Mallory Knodel suggested these incidents demonstrated the risks of built-in lawful access mechanisms, while Katie Noyes disputed this interpretation based on FBI investigations, emphasizing the need for accurate information when discussing such cases.
Unresolved Issues and Future Questions
The discussion left several critical questions unanswered, including:
1. How to implement effective age verification or limits on encrypted platforms without compromising privacy?
2. How to address the global nature of platforms and crimes while respecting jurisdictional differences?
3. What technical solutions can provide lawful access while maintaining strong security and privacy protections?
Potential areas for further exploration include:
1. Investigating prospective data-in-motion solutions that don’t affect all users
2. Developing abuse-resistant lawful access mechanisms
3. Improving content-based detection methods while addressing technical limitations
4. Allowing for diversity in platform approaches rather than mandating one solution for all
5. Focusing on known CSAM detection and user reporting tools as interim measures
In conclusion, while no definitive solutions were reached, the discussion emphasized the ongoing need for innovative approaches and balanced regulation to protect children online while preserving privacy and security for all users. The complexity of the issue underscores the importance of continued dialogue, technical innovation, and collaboration between all stakeholders in the digital ecosystem.
Session Transcript
Mia McAllister: Today’s session aims to provide meaningful insights into the complex interplay between technology, privacy rights, and efforts to protect children in the digital space. As digital technologies continue to evolve, they offer both opportunities and challenges. Today’s panel brings together experts from diverse fields to explore how we can foster an online environment that respects human rights while prioritizing child safety. I’ll go ahead and introduce our panel today. Online, we have our moderator, Stewart Baker. I don’t think he’s joined just yet, but Stewart Baker is a Washington, D.C.-based attorney specializing in homeland security, cybersecurity, and data protection. He has held notable government roles, including serving as the first assistant secretary for policy at the Department of Homeland Security and general counsel of the National Security Agency. Stewart is also an author and host of the weekly Cyberlaw Podcast. His vast experience in cybersecurity law and policy adds depth to this discussion on human rights and child safety in the digital age. Next we have Dan Suter. He is a principal advisor to the Department of the Prime Minister and Cabinet in New Zealand on lawful access and cross-border data policy. With a background as a criminal defense lawyer and a prosecutor specializing in serious organized crime, Dan has also served in international roles, including as the UK liaison prosecutor to the United States. He is a contributor to the UNODC Practical Guide for Requesting Electronic Evidence Across Borders and brings significant expertise in sustainable capacity building and cybercrime policy development. Next we have Mallory Knodel. She is the executive director of the Social Web Foundation. She is a technology and human rights expert specializing in internet governance and digital policy. Mallory is active in internet and emerging technical standards at the IETF, IEEE, and the UN. Her background in computer science and civil society brings a unique perspective to the intersection of technology, policy, and human rights. Next in the room we have Katie Noyes. She is the section chief for the FBI Science and Technology Branch’s Next Generation Technology and Lawful Access Section. Thank you for joining us. She serves as the organization’s lead on 5G, internet governance, and technology standards development. Katie is a senior strategic advisor for technology policy at the FBI with over 20 years of experience in the intelligence community, including service as an Army military intelligence officer and various roles with the Defense Intelligence Agency. Katie brings extensive expertise in security and policy development. Lastly, but certainly not least, we have Dr. Gabriel Kaptchuk. Gabe is an assistant professor in the computer science department at the University of Maryland, College Park. Gabe’s work focuses on cryptographic systems with an emphasis on making provably secure systems practical and accessible. His expertise spans academia, industry, and policy, including work with Intel Labs and the United States Senate. Gabe’s insights bridge the technical and policy realms, contributing to the development of secure online environments. So, as you all can see, today we have a wide range of experts, and I’m really excited for today’s discussion. We’ll have about 60, oh, perfect, okay. We’ll have about 60 to 65 minutes of moderated discussion, and then I’ll leave room for questions from both the audience and online.
So, without further ado, and since Stewart is online now, I’ll turn it over to you, Stewart.
Stewart Baker: Okay, that’s great. Hopefully, I can turn on my camera as well. Yes, there we go. All right. Thanks, Mia. That was terrific and a great way to get started. I thought it would be useful to try to begin this by talking a little bit about where we are today, what’s happened over the last year or so, that would give us a feel for the environment in which this discussion is occurring. Particularly because there’s been a lot of movement in Western democracies on this question, I thought it would be useful to ask Dan to start by giving us a feel for what some of the British Commonwealth countries have been debating, with respect to protection of children and the worries about undermining strong encryption. Dan, do you want to kick us off?
Dan Suter: Hey, thanks Stewart. Thanks to everybody over in Riyadh. Great to be part of such an esteemed panel today, all the way from New Zealand in the small hours over here. It’s only about one o’clock in the morning. So look, it’s really important at this point to highlight that I’m not, so I’m going to speak about Australia and the UK. I’m not a representative from those jurisdictions, but you are right, Stewart, the legislation in both countries, it really has been a point of discussion on how it can be used. But look, I really want to say there are practical implications on what can be achieved by regulation in this space. And a more meaningful strategy would be to consider how governments consistently engage with tech firms on the issue of child safety and lawful access. It’s really not enough simply to recognise the risk, as probably we have done as five countries. So I’m looking there at the UK, Australia, Canada, New Zealand and the US. We need to really raise our ambition and develop a collective approach, engaging with each other and working towards a safety by design ethos, including designed-in lawful access that does not undercut cyber security or privacy. And certainly that’s exactly where those five countries are moving towards in relation to, and I may speak to this a bit later and others on the panel, in relation to 2023 and 2024 five country ministerial communiques. But look, one of the primary duties of a government is to keep their citizens safe from serious harm. And here we’re talking obviously about child safety as well. And carefully governed and exceptional lawful access should be a crucial part of those efforts to protect the public from harm. So when I speak about the legislation that follows, this primary duty is reflected there with very much an incremental approach, either through consultation or voluntariness. So in relation to Australia, so here we’re going to get a little bit more technical, but Australia has their Telecommunications and Other Legislation Amendment (Assistance and Access) Act 2018, shortened to TOLA. And that introduced a framework for Australian agencies to make both voluntary and mandatory requests for industry assistance to gain access to encrypted data. Part 15 is the really important aspect of TOLA. And to emphasize this, it establishes a graduated approach for agencies in Australia to receive assistance by establishing three main powers. So one is a technical assistance request, or TAR, where agencies can request this voluntary help from designated communications providers, so industry, where they are willing and able to give assistance. Secondly, technical assistance notices or TANs, where agencies can compel designated communications providers to give assistance where they have already the technical ability to do so. And then a technical capability notice or a TCN, where agencies can require providers to build limited capabilities to help law enforcement and intelligence authorities. So these three powers, they can be used in secret with penalties for disclosing their existence. Therefore, customers of those platforms may not know if data has been requested or even accessed under TOLA. There is independent oversight that’s already there in Australia in relation to actions conducted by intelligence agencies by an Inspector General of Intelligence and Security and equivalent oversight for law enforcement agencies as well.
And the operation of the act is subject to ongoing review by Australia’s Parliamentary Joint Committee on Intelligence and Security, and that actually reports on how many orders have been issued. I can tell you in 2019-20, there were 11 TARs issued; in 2021-22, there were 30 technical assistance requests ordered. So let’s move on to the UK, and the UK passed its Investigatory Powers Act in 2016, and that includes an obligation on communication service providers to remove encryption or electronic protections on communications for investigatory purposes after service of a technical capability notice. And to really emphasize again, this is an incremental approach. So there’s a consultation phase in relation to those communication service providers before a technical capability notice is issued. So again, really robust safeguards. There’s a double lock mechanism, for example, with independent oversight by the Investigatory Powers Commissioner. And just very quickly in terms of my own jurisdiction as well, New Zealand, obviously most of the major communication service providers are based offshore. The main issue in relation to New Zealand, therefore, is extraterritoriality and enforcement. There are a couple of important provisions within our legislation, which is the Telecommunications (Interception Capability and Security) Act 2013. Commonwealth jurisdictions, we often have really convoluted Act names, but there is a duty for service providers to assist with, and we’re talking about encryption here, decrypting telecommunications when there’s an execution of an interception warrant. So that legislation, TICSA to use its shorter name, has as its overriding objective ensuring that communication service providers are ready to be able to intercept communications. And there is a provision for a minister to direct those communication service providers to be able to say, look, you need to be in a position to be intercept ready and accessible. And part of that duty will be in relation to decrypting communications. I’m not aware of that ever having been done, but the simple fact is that it’s really difficult to enforce any of those provisions through a ministerial direction for those over the top providers, such as Meta with WhatsApp and Facebook Messenger, for example. But look, again, just to complete this, legislation is one aspect. And there have been those points of discussion with the use of these particular pieces of legislation and the provisions that they provide. But the real emphasis should be here on what we can all agree and moving the debate on to ensure that we reach a position where we understand in terms of that safety by design ethos and progressing towards where we have commonalities in the debate. So passing back to you, Stewart.
Stewart Baker: Yeah, Dan, just one follow-up question. When the Investigatory Powers Act and the Australian bill were moving through parliament, there was considerable anxiety expressed by industry and civil society that these acts enabled the government to insist that encryption on a particular service be redesigned so as to allow for lawful access, and in a way that might impact the security of communications. Do you think that’s a reasonable reading of those bills? And as far as we know, has there ever been an order or capability notice that required that kind of modification?
Dan Suter: Look, of course, in terms of the debate, there’s always gonna be focus on what the most extreme point can be in relation to legislation. But I think it’s really important to, again, re-emphasize that there is that incremental approach in working towards a point where, and ultimately, it is for governments to determine in terms of the protection and the safety of their citizens. But built in within that legislation, of course, when we might have a debate about this, we’re not gonna focus on the intricacies because we haven’t seen this work through in terms of how it practically applies. But there are robust safeguards, safeguards that have been well-established for a long time as well. We’re not talking about safeguards that are just being plucked out for the benefit of government. These are safeguards that have been there and we know that work, the double lock mechanism, the IGIS in terms of having oversight in relation to intelligence agencies. It has to be, in terms of any legislation, to ensure that there is that social license that these safeguards are built in. But there also has to be that balance in relation to ensuring that where the public do need to be protected, that power is available. But I can tell you from a New Zealand perspective, in terms of the legislation that I’ve referred to, there has to be a balance with cybersecurity and also privacy in relation to preventing any collateral intrusion in relation to individuals. And these have to be specific and targeted to ensure that there isn’t that collateral intrusion. And I think it’s really important that when we talk about this debate, we’re understanding that we are talking about the protection of citizens. We are talking about that being at a very last stage. But there has to be the power and the capability there, if needed, with those safeguards built in. But back to you, Stewart.
Stewart Baker: That’s great. Mallory, do you see this the same way that the English-speaking countries, other than the United States, have given themselves the authority to have a pretty dramatic impact on the encryption services that are provided? But have, for a variety of reasons, not gone to the full extent of their authority and have built in a number of protections for privacy and security?
Mallory Knodel: Right. So because we are in such a late stage of these debates and not a lot has changed on the regulatory side for a while, I’ll have to say no. And I think that’s not a surprise. I think we’ve obviously had a similar debate now for a very long time. I do actually think a lot of other externalities have changed besides government positions on this. And I’ll only mention, because we’re, of course, really short on time by now, that what’s relevant to what Dan was just saying is, Australia, because of TOLA, you now have one less privacy startup. So there’s an application folks were using called Session. Session is an end-to-end encrypted app. It’s interesting because it doesn’t use phone numbers or usernames in a persistent way, so it provides a little bit more pseudonymity when using the application. So that’s what kind of differentiates Session from maybe other apps. They’ve announced very recently that they have to leave. They’re going to Switzerland because they have been visited by the authorities and they’re quite worried that they’re going to be asked to backdoor it or to provide user information to the police. And that is exactly what private sector companies have said about the UK Online Safety Act. It’s unfortunate that Ofcom, the regulator, has been somewhat silent on how they would handle orders to backdoor, whether they would do it under a gag order, whether they would be transparent about that. But we have heard from Signal, at least, and certainly WhatsApp has not been shy about expressing Meta’s position on this, but that they would leave the UK before backdooring the software, for sure. And already, right, and this gets into more of the solution space, already there is data that can be obtained that can be provided and that is provided based on leaks from a few years ago and sort of, I don’t know, it was like a slide deck that the law enforcement community was using to explain which of these encrypted services have which metadata and how you can get it. These sort of already exist, right? So once an application decides to completely leave a jurisdiction or to completely not comply with requests like a backdoor. then you also lose access to that metadata as well. You also lose access to the helpful service data that you could have potentially used. So it’s not a great move for anyone, right, when this happens, but it will continue to happen because what is being provisioned in these laws amounts to mandated backdoors that change the software for everyone all over the world, not just for that jurisdiction, and it changes it in a persistent way so that that backdoor or that capability is always there, and it changes it for everyone who’s ever used the application, irrespective of whether or not they are suspected in a crime, and it’s just a much too overbroad piece of legislation. And yeah, and so that what you’re talking about, Dan, where we would rather complement regulation with the ability to work together and find solutions, you take that off the table when applications start leaving jurisdictions over your risky laws.
Stewart Baker: One question, Mallory, Session left Australia as its corporate headquarters. Maybe they also plan never to sell in Australia. I’m not sure we know that. Yeah, that’s potentially. But quite significantly, nobody else who provides end-to-end encryption has said we’re leaving. That suggests that maybe Session’s concern is over a capability notice that might have affected their efforts to make pseudonymous accounts.
Mallory Knodel: No, just to interrupt you, Stewart, because I know where you’re going with this question, it’s just because the law is about companies in Australia having to comply. So as Session leaves the jurisdiction, they are no longer swept up in this regulation, and also I’ll note that as far as I can tell, staff members have also had to relocate physically because they’re-
Stewart Baker: Because obviously they’re subject to jurisdiction, yes, okay. So this is a question of the Australians having limited their authority to people who are located in their jurisdiction as opposed to people who sell services in their jurisdiction, because it wouldn’t be hard to extend jurisdiction to people who are offering those services.
Mallory Knodel: I think it’s hard. I think it’s definitely hard. I think that’s what the UK wound up doing eventually, but TOLA was some years ago. And if you, I wanted to also mention that I think it’s interesting, we’re just basically talking about the Five Eyes countries because there is an obvious concerted and coordinated effort to work on legislation as a bloc. So you had Australia sort of doing the impossible, getting any kind of backdoor law on the books first, taking that hit, but kind of with some measured approach so that it wasn’t like every end-to-end encryption app on the planet. It’s just the ones within Australia’s jurisdiction. Now you have the UK coming in some years later, managed to put a backdoor on the books, but it’s again, like limited powers. Anyway, so I, and you see, you know, we’ve, we’ve all these countries, Canada has also managed to do something and just follows from there, but this is certainly an effort done. I think that Australia doing something more measured was a tactic to get something that people could live with. They probably would have rejected something a little stronger. So yeah,
Stewart Baker: I, you’re, you’re absolutely right. It feels as though people, the attackers are circling and taking occasional nips from their target without actually launching an attack. Why don’t we, why don’t we move just to focus on what’s happening in Europe as well? So we have a complete picture. Katie, can you give us a sense of where the debate is in Brussels?
Katie Noyes: Yeah. So first of all, let me just extend my gratitude. I wish you all were here in the room. We’ve got an awesome audience of folks here. You can’t see half of them, but you’re all missed here. We wish you were here. But I think really, let me just hit a tick before I get there, if you’re okay with it, Stewart, which is the whole goal of bringing this to the Internet Governance Forum was because we’re multi-stakeholder, we’re representative of that on this panel, and I’m really grateful for that. I will now bring this home, which is that’s what’s going to solve this problem. Candidly, I don’t think it’s going to be government, certainly not alone. It’s not going to be the private sector and the companies alone. It’s not going to be just civil society. It’s also going to include people at their kitchen tables. I absolutely want to make sure we bring this home for the audience in the room, who are very interested in policy. But I think we all want to know what does this mean in tangible terms? Going back to Brussels for a minute, and how this actually even affects the FBI, is that these are global companies with global capabilities. We have global public safety challenges. There are global terrorism organizations. There are global fentanyl traffickers. There are global trafficking and child sexual abuse material networks that work across the globe. I think it’s key to highlight that first, that it’s not a European problem, it’s not an Asian problem, it’s not an African problem, it’s an all of us problem. Why do I say that? Because we all are trying really hard to learn from each other. I think that idea of trying to harness best practices is key. On this, the European Commission actually just put out a report, so it’s very timely, in November. They had commissioned a high-level group, and the group was specifically to look at, and I want to make sure I get the title right because it’s key here, it was Access to Data for Effective Law Enforcement. And if you get a chance to read the report, I highly recommend it, because I think what it goes to is some of the things we’ve been talking about. I guess I will take a slightly different approach and say, I think things are very different, and I think they’re very different around this conversation, because I was sitting in Berlin at the Internet Governance Forum a few, you know, right before, like, I think a year before COVID. And the conversation was very different. It was, I’d say, very strict on the privacy side. There seemed to be, and please don’t take this as a pejorative comment, but there was a lot of trust in technology companies, and that they were solving civil society’s problems. And that sort of idea that public safety might come in and sort of mess that up or, you know, be a chilling effect. I have found the last two days I’ve been sitting in on multiple panels, it is a wildly different conversation. And the conversation is coming down to responsibility. What roles and responsibilities do each of us have? And again, I want to go right into the face of this narrative that somehow safety, security, and privacy are diametrically opposed. I think it’s a false narrative. If you go back to the UN rights, we’re at a UN organization, there’s a right to safety, a right to security, a right to justice, a right to privacy. There is an expectation that this all coexists, thus the name of the panel. So I think when you read the, you know, what they’re doing in the European Commission, it really does look to us.
And it’s something we’re also trying to emulate with a partnership we newly have with UC Berkeley, where we had a summit to kind of have some of the same conversations, the major themes around responsibility. So it talks to what are the expectations of government in this realm? Is there an idea around incentivization? So it’s putting a more active role and a more active responsibility on governments as well to meet industry, to meet civil society, to meet the needs, because again, we do need to achieve that. And then take it one step further. Again, it is not up to government, and we all understand that, to prescribe a technical solution. And that’s not what we’re trying to do. But we do recognize it probably does take some government incentivization, some communication of priorities and needs. And I think there’s a lot of space there to achieve that. And again, going back to that report, it actually details out some of these approaches and fresh off the presses from November.
Stewart Baker: I understand all of this and there’s no doubt that the European Commission has proposed legislation that would clearly incentivize better access and more law enforcement insight into what’s going on on some of these services. But that proposal has really been stuck for a few years now due to strong opposition from some members of the European Union. Do you think that’s changing?
Katie Noyes: Yeah, you know, I can’t speak to how the process works there or take any bets on that, Stu, but let me kind of get to some of what we’re hearing. And we heard it out of the G7, by the way. I don’t know if folks are aware, but the G7 Roma-Lyon Group actually commissioned a lawful access working group and it ran last year and they voted to renew it for this upcoming year as well with the Canadian presidency. I think it’s key and it’s key because I actually think the landscape has changed. And I’ll give you maybe two areas where I think it’s the combination of these two issues kind of intersecting. One is the threat landscape. We have solid data and it’s solid data not coming from law enforcement this time, it’s coming from outside non-government organizations. So many of you are familiar with the National Center for Missing and Exploited Children, my colleagues around here in the room. It’s a U.S.-based non-profit that really takes tips and leads from the electronic service providers. Last year was the highest number of tips ever received from the electronic service providers, like Meta for Facebook and Instagram, if you’re wondering what an ESP is, but it was 36 million reports. Then, and NCMEC is very public about this, they take those tips and leads and they provide them as tips and leads to law enforcement all over the globe. So we in the FBI, we get a number of those and we start an investigation, at least looking into assessing whether there’s an active threat or not. So the threat environment is booming. Why is it booming? Because the technology industry is booming. You know, sitting around the table years ago as a teenager, I wasn’t talking, we weren’t talking about social media and gaming platforms where you are connecting to others. But that tech boom sort of comes with a little bit of a cost or a tax, which is the tech industry is moving at such a fast clip. And this is where I think some of the difference is. I think the multi-stakeholder environment, particularly civil society, as I’ve heard here, but I also heard it even from a few company representatives. They’re taking a slight pause to say, okay, and this is a good one to talk about, and it goes to sort of, I think, what Mallory was getting to as well, which is the focus, like when something has been deployed, well, we know Meta, Apple, all these companies have gone back now and they’re instituting technical remedies or ability for reporting. So a user can report when something has been assessed as harmful to them or a potential criminal activity. They’ve all now gone back and created reporting mechanisms. That’s very new, and a lot of that was announced at the Senate Judiciary Committee hearing in January. So I do think this landscape is changing, where more questions are being asked by legislators, and again, I’m using a US example, because I apologize, I haven’t quite followed the process in Europe as closely, although we’re seeing a lot more reporting and I think real push for some of these changes to bring industry and governments together to solve these challenges. So again, just a quick summary, I think the threat environment has changed. We see digital evidence in almost every single one of our cases. If you had asked me that question, even five, six years ago, I would have given you a very different figure, and then we’re seeing just the ubiquitousness of tech deployments, and now we’re seeing the ubiquitousness of adding that layer of end-to-end encryption that can’t be pierced.
And so I think we’re seeing, and by default, by the way, so a user doesn’t get to decide for themselves anymore, now the company is deciding. And again, let me just, last point, and I’ll turn it back over to you. I think that’s the key point here. Maybe what we’re seeing is maybe this issue is really finally going to a multi-stakeholder conversation. I think with very prominent cases like sexual extortion hitting, actually ending up with 17-year-olds and teenagers in the U.S. committing suicide, people wanna have this conversation because they’re seeing it in their neighborhoods and at their kitchen tables. Back to you.
Stewart Baker: Mallory, do you see this the same way, that despite or maybe because of the fact that legislation hasn’t really gone to the ultimate point of mandating lawful access, there is a better opportunity for more voluntary cooperation?
Mallory Knodel: Yeah, so I guess from my perspective, again, we’ve been having the same public debate for a while. It’s been a couple of years now that I’ve been on a stage with NCMEC and FBI talking about the same thing. It was an IGF, but it was a USA IGF. The conversation here has been the same. The externalities have changed. So around that same time, my then employer, Center for Democracy and Technology, put out a report suggesting that reporting and user agency features in end-to-end encrypted apps would be a good way forward. We also suggested metadata. And now companies are doing that. So civil society suggests it. Companies do it. Companies also have now expanded very significantly trust and safety as a whole area of work that all of them are concerned about. Because as we know, this problem of child safety exists far beyond the boundaries of end-to-end encryption. It is all over social media, in the clear, and it’s still a problem. And so working to clean that up has been a huge effort. And probably there’s a lot of explanations for why those numbers have been changing. We don’t know what those numbers mean. It doesn’t necessarily mean that there’s more threat or risk. It may mean that there’s a lot more reporting and there’s a lot more awareness of it. And we don’t even know how much of that is new versus old content, et cetera. So I think that, yeah, there’s a lot of really interesting solutions that are cropping up. I think the tragedy is that there’s a lot of us still stuck in this backdoor conversation that’s not really going anywhere, and it hasn’t for a long time. And it would be great to truly actually engage with the solutions. But I think that requires, which is what civil society and industry have done, a sort of acceptance of end-to-end encryption as a feature that users all over the world have required and wanted and requested and begged for because they want that protection. We didn’t see such a demand for end-to-end encryption until it was revealed that the Five Eyes countries were spying on everyone back in 2013. So there’s also that part of the story. But I think, again, if we can accept this as a sort of minimum requirement of secure communications for a lot of different reasons, right? Because encryption protects kids, because encryption protects businesses, et cetera, et cetera, then we can really build some cool stuff on top of it and try to fix this issue. So I’d love to see us get into that space. And then I’ll just add one more thing, which is we’ve also seen externally that backdoors don’t work too. So another thing that’s happened very recently is that for some communications there have been built-in lawful access backdoors, I’m talking mostly the network layer. So this is where telecommunication services have encryption ostensibly, but it’s been by law backdoored. The law in the US is called CALEA. That was exploited just as sort of folks like Civil Society and other security and safety professionals were saying it would be. And that was the sort of Salt Typhoon hack. So I wanted to bring that into the conversation because we’ve seen both major successes in figuring out how to do trust and safety, child safety work on top of end-to-end encryption. We’ve also seen major fails where we’ve had insecure communications and how that’s been negatively affecting businesses and the security of all people using those networks.
Stewart Baker: Let me ask Katie if she wants to address that, because I’m not sure everybody agrees that that’s what happened with the Salt Typhoon hacks.
Katie Noyes: Yeah, we certainly don’t agree to that. We actually, you know, the media, quite frankly, got that one a little bit wrong. Can you all hear me? Yes. Going through? Okay, great. I can’t hear it on my own, so. But yeah, we’ve gone out publicly, by the way, to try to dispel this myth and correct the record. What we’re actually finding, because we are investigating, so Mallory, if you have direct access to the information, certainly would like to talk to you more, but from the investigation, what we’re actually learning, again, not to say that when we get through all of the investigations, because there are multiple here, that we won’t find there was some vector or something, but I can tell you right now, the investigation has not yielded that: the CALEA lawful intercept capability did not appear to be the target. And actually, what we’ve seen in two specific targets is that the perpetrators of Salt Typhoon, the Chinese-backed Salt Typhoon group, actually had access to the network well before they actually accessed the CALEA capability. So that tells us it wasn’t the vector and it wasn’t the main target. We already do know, too, and we put this out very publicly, so if anyone is interested, we do have published awareness on our website, FBI.gov, you can find, but we certainly do not want that to be used or leveraged in this debate when it is erroneous. Again, does not mean that there shouldn’t be strong security. It doesn’t mean that there actually shouldn’t even be encryption. We are very supportive of encryption technologies. We just want them to be managed in a way, much like in the telecommunications. And again, I’m with everyone who says there should be stronger security and stronger security even around CALEA. Absolutely, we join those calls, but certainly wanna make sure the record reflects accuracy here that does not appear to be the target or the vector, but we did see access, so that is an actual truism.
Mallory Knodel: Yeah, I think, yeah, target versus vector versus leveraged, those are maybe three different things, but the fact that widespread communications had this capability is still significant.
Katie Noyes: But it also matters that I think the general population is probably, I would say, as a citizen myself, I’m way more interested about what also else did they have? Because to everyone’s argument, most people were law abiding citizens. You don’t want any of the security to change for that. Well, those law abiding citizens wouldn’t have been in the CALEA data anyway. This is for individuals where we have some sort of predication or authorized access. Again, I’m not arguing that it’s not a terrible security problem. Don’t misunderstand me. It’s a terrible security problem. And it should be enhanced security. And again, go back to encryption is one of those, but also multi-factor authentication, strong passwords. I mean, all of that was a factor in a lot of what we’re seeing here. So it’s not, I don’t think isolating this to this one issue makes very much sense for this.
Mallory Knodel: No. So I was just going to say that I think there might be a lot of elements to it, but we are talking about encryption right now. And so of course, we’re going to only talk about the things that are impacting on the debate around encryption. I think that’s totally fair game.
Stewart Baker: So let me ask, then, about the encryption. Katie, one of the things that the FBI suggested people do if they’re concerned about the Salt Typhoon hacks, which are certainly a major security threat, was that they use strong encryption and I assume end-to-end encryption. And a lot of people in civil society have said, well, there you go. Why even the FBI thinks you ought to have strong encryption? And isn’t there some inconsistency between wanting to have lawful access and wanting people to use strong encryption to protect against very real threats?
Katie Noyes: So absolutely not. We, again, we’re back to, we think that we can achieve all of these things. Will there be trade-offs to some degree? Certainly. Will there maybe be differences for the way we approach the entirety of a population of a user base against perhaps looking at, you know, a scaled solution only for individuals where we actually have court authorization and our authorities warrant some type of access to content data? We’re very open to the conversation, but yes, please let me say for the record, the FBI supports encryption. This is the part of the debate that I think is also not new, and I’m very surprised that we continue to have to answer this question, but happy to do it again is that we are very supportive of that, particularly from a cybersecurity perspective. And the FBI is a user of encryption. But what we don’t do is willfully blind ourselves to all of the activities because there is a responsibility. Again, we are all responsible, and I think this is where the debate, I do feel it’s changed. Again, I go back to, I understand, I feel like we’re here today to talk more of an action plan. At least that’s what I’m here to do. I think the FBI’s point of view in this debate today, I’m hoping we’re going to get to that conversation of something that could be achievable because, agree with the UN, got to achieve all four of those. I think the discussion now needs to stop being, should we? And it now needs to be not that we accept we can’t and we just stop trying, but that we’re the best innovators in the world. We all represent countries and institutions that are the best innovators of the world. We didn’t say, oh, cancer’s a hard problem. Let’s not try to solve it. We don’t do that. So let’s-
Stewart Baker: No, that’s quite right, Katie. Gabe has been quiet this whole time. He is going to have to bear the burden of representing the innovators of the world because he’s our technical expert cryptographer. And there have been some interesting suggestions about how to square or at least accommodate both security and lawful access, including the idea of scanning for objectionable material on the phones of the sender before it gets sent so that none of the private communications are compromised unless there’s a very, very good reason to believe a particular communication has objectionable material in it. Gabriel, if you could talk a little bit about both that proposal, which came up in the EU debate, and any other technical approaches that you think are promising to get us out of what’s a pretty old debate?
Gabriel Kaptchuk: Yeah, thanks. It’s an interesting place to be in that some really core parts of the technical puzzle here have not meaningfully changed for a long time. And at the same time, we have cryptographic and we have computing capabilities that are a little bit different than they were before. And this allows for different types of processing to happen on endpoints. And so to pick up, I think, on something that Mallory was saying earlier, we’ve seen a lot of changes that are happening on what is available to users on their endpoints. So this is not shifting the, is there a backdoor? Is there not a backdoor in the actual encryption layer? But rather saying, can we put some kind of processing on the client’s device that locally processes and gives them some more information? And so one thing that came up a couple of years ago that was proposed by Apple was a proposal in which they were going to blur certain images, particularly for youth accounts. And then there was a kind of mechanism by which if the youth wanted to look at the thing, look at the actual image, it would notify an adult. And there were kind of two different things that were happening there, one of which kind of showcased the ability to do some kind of powerful stuff on the endpoint, and one of which showed the kind of brittleness of this type of approach. So on the one hand, we now have the ability to actually kind of process images on somebody’s phone and say, well, maybe we should blur this thing. Maybe this isn’t the type of thing that we should just show to people no matter what. And I think there’s actually a fair amount of consensus that this is not a radical idea. Maybe if I blurred every image that I got or ones that kind of locally were determined to be not something great, that would not be that problematic. Where there was a lot of pushback from the community, it was the fact that then there was kind of an automatic trigger of some kind of information pushed to another device or pushed to a server or pushed to something like that. That is to say, kind of breaking out this model of end-to-end encryption in order to alert somebody else of an event. And that’s actually where that proposal was found to be most objectionable. And so we have various different kind of like ways of thinking about this, right? If we’re able to process on the device itself and able to identify this is content that we’re concerned about, we can kind of give users a little bit more, maybe you could say push or maybe a little bit more usable type of ways to kind of control the information that they see or report the information they see. And that’s something we really know how to do. When it comes to kind of active scanning that then kind of pushes information off the device itself, this is where things start to get a lot more complicated and a lot more controversial and a lot more difficult to do. In particular, as you kind of brought up, in the EU we’ve kind of seen a push, a concerted push, to move away from kind of an old paradigm, particularly around child abuse material, to kind of flag the known instances of child abuse material. So this is an image which matches another image that NCMEC has. And therefore we kind of with high confidence can say that this image is a problem image and kind of with confidence, I’m going to return to that in a moment, but with some degree of confidence that there’s a match there.
And there’s been a push to shift away from that paradigm and towards detecting new images or new content or the solicitation of images or solicitation of content. And this is a much trickier problem. As a technologist, I don’t know how to write down a program that can on the client side with 100% certainty actually differentiate between this is a problem conversation, this is not a problem conversation. And when the ramifications of getting that wrong is that people’s information is going to get pushed to a server and it’s going to get kind of opened, that’s a really high risk environment to write that kind of program. That’s not a low risk kind of choice and it’s not the kind of thing that you want to get wrong. And this is kind of where it’s important to start making technical differentiations between the types of access that are being requested. If it’s detecting new content, that’s really, really difficult. And I don’t think we have the technical capabilities to actually meaningfully.
Stewart Baker: What about detecting old content that’s been tweaked in order to evade the algorithm?
Gabriel Kaptchuk: Right. So this is kind of this older paradigm, which is one that, again, there’s still more things to pull apart here, and it’s not just kind of one thing, right? So we have seen some work in doing what’s called perceptual hashing. This is where you take two images and you kind of run them both through an algorithmic function to determine whether or not they’re what’s called a semantic match, where this kind of semantics somehow capture what’s in the image, as opposed to the details of the image. And on the one hand, this seems like a promising way forward, right? Because this means that you could match two different images which have had minor edits made to them, but are still kind of fundamentally the same. Unfortunately, the reality of it is that our modern perceptual hashing technologies do not live up to their task. In particular, in the aftermath of Apple’s announcement that they were going to be doing some amount of client-side scanning, they also released this NeuralHash, a particular hash function that was supposed to do this. And it took people, I don’t know, about a week and a half to reverse engineer it and start to find kind of ridiculous what are called collisions, or two images that match kind of according to the function, but are actually semantically wildly different from one another. And this is because, you know, this is a really hard computer vision problem to determine whether or not two images are the same. And, you know, you can kind of think about this going kind of out of the context of child stuff and thinking just back to kind of the way that the US thinks about pornography, right? I can’t define pornography, but I know what it is when I see it. That kind of says that people are the ones who are able to determine whether or not content is a match. And that’s even probably, there’s edge cases where they won’t agree. To get a computer to do that when humans actually have a difficult time doing that, that’s a problem. That means that you’re going to inevitably build functions that are going to do scanning of some variety, which are going to be overbroad. And they’re going to kind of have really obvious fail cases or really obvious ways to abuse them. Something like I can kind of manufacture images that look like, according to this kind of hash function, that they are child abuse, and send them to somebody else when in fact they’re not child abuse. It’s just that I’ve kind of exploited kind of relatively easy ways of modifying images so that they kind of look according to the algorithm like the same, but not to our eyes. And so that’s where we, that’s kind of where we are today. There is kind of a push for scanning on endpoints. In my opinion, there are ways in which this could potentially empower users to do, to have an easier time moderating the content that they see or making better decisions for themselves. At the point where that data then gets pushed off device, that starts to open up kind of a different type of like rights impact assessment that needs to happen. And we have to have a different kind of confidence level in the technology than we have it today.
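Kaptchuk’s collision point can be made concrete with the toy “difference hash” idea sketched in the report section above. The self-contained example below is illustrative only: the real attacks on Apple’s NeuralHash required optimization against the actual network, but they exploited the same underlying looseness, that a perceptual hash keeps only coarse features, so anything preserving those features collides.

```python
# Toy demonstration of a perceptual-hash collision (illustrative only).

def dhash(pixels) -> int:
    """One bit per neighboring pixel pair: is the left pixel brighter?"""
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits


def forge_collision(pixels):
    """Build a visually unrelated image whose dhash is identical."""
    forged = []
    for row in pixels:
        new_row = [100]                  # arbitrary starting brightness
        for left, right in zip(row, row[1:]):
            # Reproduce only the comparison the hash looks at: step down
            # where the original was brighter on the left, step up otherwise.
            new_row.append(new_row[-1] + (-1 if left > right else 1))
        forged.append(new_row)
    return forged


original = [[(r * c * 37 + r * 11 + c * 5) % 200 for c in range(9)] for r in range(8)]
forged = forge_collision(original)
print(dhash(original) == dhash(forged))  # True: a perfect "match" between
                                         # two images that look nothing alike
```

The forged image is near-uniform gray while the original is textured, yet the hashes are bitwise identical, which is the kind of failure that makes pushing automated match results off the device so risky.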
Stewart Baker: But let me ask you from a technical point of view. We’ve heard a lot of talk about how valuable it would be to have more conversations and to find common ground. But I wonder: with Signal having long offered end-to-end encryption by default, Apple and WhatsApp having done the same, and now Facebook adopting the technology for its other services, isn’t this debate really over as a practical matter? The big companies that offer these services have all moved to default end-to-end encryption, and they’re showing no signs of saying, well, maybe we should look for common ground here. They’ve done it. And maybe I’m misunderstanding the impact in the market, but what’s the incentive to look for some mechanism to satisfy child safety and law enforcement, given what has happened in the market?
Gabriel Kaptchuk: Yeah, I mean, I guess if the conversation’s over, we can all go home and go on with our day. I don’t think it’s quite that simple. I think what we’re seeing, the deployment of end-to-end encryption on many, many communication platforms, is a very clear signal that this is what users want. If nothing else, companies are trying to fill a market need or a market want. And importantly, I want to pick up on a thread that popped up a couple of times in what Mallory and Dan and Katie all said: this question about defaults, and what is the value or the risk of having encryption on by default. From a technical perspective, I like to think that by-defaultness is the only reasonable way forward, because you want end-to-end encryption to protect the people who are not going out of their way to evade surveillance of any kind. Those are the people you want to protect, and if you don’t protect them, your system is not getting you very much. The ability to build encrypted communication platforms is something we’ve seen criminals do for quite some time, and obviously there’s a lot of conversation around the ways that international law enforcement has tried to approach those systems; putting those aside, we know that people who are trying to evade surveillance are going to build these systems and use encryption. You want encryption by default to make sure that it’s you, your spouse, your kids who are protected against somebody inside a tech company stalking them. And this isn’t a wild, crazy scenario. We’ve seen this happen before, where people elevate or abuse the powers they have within a tech company, or a company is breached, maybe by a foreign government that wasn’t supposed to have access to that system. So we really do want encryption by default in order to protect the people you’re trying to protect. That’s an important part of the puzzle here. In terms of whether we’re done with this conversation simply because end-to-end encryption is being deployed everywhere: I think that would be giving up on trust and safety, which doesn’t make any sense, and trust and safety is obviously going to be part of tech platforms’ responsibilities going forward. The question is, what are the tools they’re going to use, and what are the capabilities they’re going to build into their systems to ensure that users have the ability to protect themselves? Now, we get into some tricky waters in terms of exactly what the right thing to do there is. I’ve advocated, to some extent, through what I’ve been saying, for user reporting and user control over the information people are seeing as a powerful mechanism, as we’ve seen deployed over the last couple of years. I’ll offer one more piece of this puzzle, and maybe this moves us towards a different part of the conversation: one of the pieces that’s new is trying to understand whether there is any way beyond an all-or-nothing capability.
And this is something that I’m technically interested in, and I think it is an important part of the conversation. In particular, lawful access or backdoors as a paradigm is fundamentally an all-or-nothing type of trade-off from a technical perspective. Either there is a key somewhere that lets everybody into the communications, with a bunch of protections, maybe social protections, about who gets access to that key; that’s one paradigm. Or there is no key; that key does not exist, and therefore can never be materialized. And I want to offer that, from a regulatory perspective, this pushes us towards a worst-case scenario. If we mandate that there must be a backdoor, that means this key exists now, and that key is very, very dangerous, a very high-value target, and somebody is going to go after it and get it. And whether or not Salt Typhoon as a particular instance is evidence of one thing or another, it is evidence of a paradigm in which international governments are willing to put a lot of resources into going after these capabilities. The minute there’s a key, that key is going to be a high-value target. And one thing that I think is interesting in this conversation is wondering if there is a way to create a key that only works for certain types of content. That’s something in the cryptographic world that may or may not exist, and there’s ongoing research. But as a paradigm, I think it is a different part of the conversation, which starts to shift us away from “we have to accept that there’s never going to be any backdoor” or “we have to accept that there is going to be a backdoor,” and towards asking: well, what is this backdoor for? If we want a backdoor, and we want to just talk about kids, can we talk about a specific, limited backdoor that doesn’t make everybody else vulnerable at the same time, just because the mere key’s existence is a vulnerability? This is a difficult paradigm to work with. It’s a hard design space. We don’t know much about it. But I think it is one potential way that we can start thinking about avoiding this worst-case scenario of keys actually being created and software actually being made that’s really, really vulnerable.
Stewart Baker: OK, that’s the first suggestion I’ve heard that there might be a way out of the all-or-nothing aspect of this debate. But let me ask Katie and Mallory to weigh in on whether a content-based lawful access mechanism is available. I suspect Katie’s going to say yes, and it’s a warrant. But so, having previewed what I suspect Katie’s argument is, let me start with Mallory.
Mallory Knodel: Thanks. No, it’s OK. I’ll be really quick. I also wanted to connect what Gabriel was just describing to what Katie said earlier. Because I think this idea that what I’m putting forward, where we accept the constraints of end-to-end encryption, is somehow giving up suggests that the goal is the backdoor, right? For technologists like Gabriel and myself and others, public interest technologists in civil society, academia, and industry, the problem space, the requirements, are: we need to keep people safe, and that includes kids, and we need to make sure our communications are secure. That is a wider frame. You list the requirements and then you build the thing that meets the requirements. Maybe that’s a backdoor, but maybe it’s a whole lot of other things. So when we say we are giving up on backdoors, I suspect the backdoor has been the goal all along. The UK Safety Tech Challenge a few years ago was the same. They said it was about finding solutions to child safety, but they created a brief for it that said it needs to be about scanning images in end-to-end encryption. It was a presupposed goal, and that really narrows the field in terms of what kinds of innovations you can get. So you got five different projects that all did the same thing to varying degrees of success, and the best one was not very good, because perceptual hashing is hard. So I want to just say, I think what Gabriel is describing, these are really interesting ideas. I have more of a technical background in internet networking and encryption. I have less of a technical background in AI, but I’ve had to learn it in the context of this work; it’s similar to a paper I’m working on that’s coming out very soon, because there’s a lot of imagination around what you can do with this data. I think some of it could be very interesting and fun. Let’s think about how these secure online platforms are being used, a lot more like social media platforms, et cetera, and that’s great. That’s what people want. That’s where they feel safe expressing themselves, increasingly, in a world that seems kind of scary. These platforms will still have some of these features, and can you do cool things with content that also allow users to protect themselves and allow the platforms to make sure the experience is enjoyable? That’s another incentive: nobody wants to use a platform that has all kinds of unwanted or gross content on it. Then, yeah, we get into more of a solution space. So let’s continue this conversation and live in that sort of innovation space. I think that’s a good idea.
Stewart Baker: Katie?
Katie Noyes: Couldn’t agree more. I think that’s what we’ve been trying to do: get to the table and discuss. And I like that this panel has gone this way. I think we’ve all moved off of the absolutist points of view. Look, there is going to be compromise, and a lot of innovation is needed, on how this all can actually be achieved and coexist. And for our part, we’re very willing to come to the table. This giving-up idea, I want to come back to, and I’ll give a case example, because we haven’t talked through a case example and I think it’s worthy for these types of conversations. I’m going to talk about a quick success, but this is what we’re afraid of, right? So take sextortion. Many people are suffering this challenge, which is why I picked this case: it’s universal, it’s in many different countries, and by the way, the actual subjects were Nigerian. If you haven’t followed this case, we have a case out of our Detroit field office, a young gentleman named Jordan DeMay. And again, I’m going to pierce through all of the preventative measures, even though we haven’t gone there; they’re wonderful, and please let me give a plus one to all the companies doing that great work. But here’s the challenge. The hacking thing, you’re right: if things are available and a criminal thinks they can benefit from them, they’re going to target them. So they targeted dormant Instagram accounts that were being sold on the dark web, hijacked one of them, just changed the pictures, used the same name, and enticed an individual who thought he was talking to a 16- or 17-year-old girl, created a relationship, and pretty soon an explicit image was shared. And that’s when the extortion starts. Our young 17-year-old Jordan paid the first ransom and couldn’t pay the second. And here’s what we mean by why content matters, because I know we on the panel understand this, but I’m not sure everyone has been following this, particularly at IGF, the way that we are. If the only information we had was metadata showing that Jordan’s Instagram account was talking to this fake Instagram account, there’s no real prosecution there. There’s victimization, and we could see it, because unfortunately Jordan took his own life. And here’s the important part of this. The mother has gone out very publicly, and I only use this case because she has gone out publicly, and she has told law enforcement she never would have known why her son committed suicide if the FBI had not been able to gain access to the content, which showed the communications and showed this subject goading Jordan to take his own life. That added to the sentencing; it added to the prosecution. This is what is at stake, and I really will push back, too: we talk about this very academically, and I do it too, so I’m castigating myself as well. I think people really do need to understand the design choices and the way they are affecting them. I also resist this idea of a backdoor. I can’t stand the definition. I tried to find the universal definition of a backdoor, and if you go back and look at it, what it was, at least five or six years ago, was the FBI and law enforcement having direct backdoor access to communications. I don’t want anyone to think that is what law enforcement is asking for. We’re asking for the technical assistance side.
The other thing I also resist a little bit is this idea that you’re somehow in your own home and it’s a backdoor to your own home. But you’re not in your own home; you’re in a provider’s home, and there are all kinds of backdoors. And yes, I’m wise to the fact that all of these so-called backdoors are not created equal, but there are a lot of access points. There are. And all of those access points could be vulnerable, and vulnerable for different reasons. I see Gabe laughing because we had this conversation. Again, I’m not saying that all of these accesses are equal, but they are there, and they’re there for a reason, and not a bad reason: providers need to be able to patch any vulnerability they find, or one that we might find from seeing other victims and share with them, a vulnerability identified as a tactic, technique, or procedure used by a criminal, to prevent further victimization. So sorry, long-winded, Stewart, and all over the place. But my quick answer is yes, there are absolutely solutions. We are willing. We know there has to be an active negotiation. We know it’s not going to be absolute access. And by the way, there have been some really interesting discussions around this, and I’ll just throw them out there, because we had great conversations at UC Berkeley, an academic institution with a lot of cryptographers, and we would love to talk more to Gabriel, too: thinking about things like homomorphic encryption and some promise around, again, like you’re saying, Mallory, identifying additional categorizations of the data and what that offers. But also this idea that someone raised to us: how about a prospective data-in-motion solution, where you’re not affecting all of the users, but perhaps only a specific subject, by design or architecture? I raise it because it’s been raised publicly; it’s in articles. In fact, I think, Gabe, it was in your article, and I think you even called it abuse-resistant lawful access. And we’re talking about the way that a prospective solution, meaning oriented from today forward, would also offer additional oversight, and we agree to that as well. So anyway, a resounding yes from us, Stewart. We stand at the ready to start working on an action plan to get together and start talking: taking our law enforcement operational needs and what we’re seeing from our cases, and bringing that into the conversation with folks like this. And again, let me go back to one quick hit on the multi-stakeholder approach: this is the best way we solve these problems. Thanks.
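For readers unfamiliar with the homomorphic encryption Noyes mentions, the core idea is computing on data while it stays encrypted. Below is a minimal sketch using the multiplicative homomorphism of textbook RSA, with deliberately tiny, insecure toy parameters; nothing here implies such schemes are ready for the lawful-access setting being discussed, where proposals involve far more advanced constructions.

```python
# Toy demonstration of a homomorphic property: textbook RSA satisfies
# Enc(a) * Enc(b) mod n = Enc(a * b), so a server can multiply values
# it cannot read. Parameters are the classic tiny textbook example
# (p=61, q=53) and are utterly insecure; illustration only.
n, e, d = 3233, 17, 2753

def enc(m: int) -> int:
    return pow(m, e, n)

def dec(c: int) -> int:
    return pow(c, d, n)

a, b = 7, 6
product_ciphertext = (enc(a) * enc(b)) % n    # computed without decrypting
assert dec(product_ciphertext) == (a * b) % n  # decrypts to 42
```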
Stewart Baker: So, Gabriel, do you think there is such a thing as abuse-resistant encryption?
Gabriel Kaptchuk: You know, it’s hard to say, when you’ve written a paper called “abuse-resistant law enforcement access mechanisms,” that you think they don’t exist. It’s difficult to put that back in the bag. The work we did in that paper was to try to understand this design space more and to ask: if we are in a world where the folks who are using TOLA start issuing technical capability notices left and right, and suddenly there are keys everywhere, which is the worst-case scenario, how is it even possible to build a system that meets the technical requirements without being a total disaster? That’s what we were trying to ask, and that’s what we’re calling abuse-resistant. That’s not a global notion of abuse-resistance. We were actually very careful to say: we need to talk about what it means to be abuse-resistant. We need definitions on the ground, something on paper, so that the cryptographic community can go back and actually answer a specific technical question, instead of saying, aha, it’s abuse-resistant, and that’s it. No, we need something a little more formal to work with. The particular notion we worked with in that paper was to ask: OK, you have warrants that are activating backdoors in some way. Is there some way that, if that key gets stolen, at least we would know? At least we would all be able to tell. We would be able to say, something terrible has happened: a foreign government has taken this key and is rampantly using it to decrypt people’s stuff. If we’re in that world, can we at least detect it and say, we need to re-key the system right now, because something very bad is happening? These are notions of abuse-resistance that haven’t been part of the conversation, and we risk going towards really, really bad solutions if we don’t explore this space.
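One way to picture the detectability notion Kaptchuk describes is a hash-chained audit log in which every authorized decryption must be recorded, so that any key use outside the log is observable evidence of compromise. The sketch below is a toy under strong assumptions (an honest, publicly auditable log operator), not the construction from his paper.

```python
# Minimal sketch of "detectable" key use: every warrant-authorized
# decryption appends a hash-chained log entry. A decryption with no
# matching entry signals that the key has leaked or is being abused.
# Illustrative toy only, under strong trust assumptions.
import hashlib
import json
import time

class AuditLog:
    def __init__(self) -> None:
        self.entries = []
        self.head = "genesis"

    def record(self, warrant_id: str, ciphertext_digest: str) -> None:
        """Append an entry chained to the previous log head."""
        entry = {
            "warrant": warrant_id,
            "ciphertext": ciphertext_digest,
            "time": time.time(),
            "prev": self.head,
        }
        self.head = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        self.entries.append(entry)

    def authorizes(self, ciphertext_digest: str) -> bool:
        """Auditors check observed decryptions against the log."""
        return any(e["ciphertext"] == ciphertext_digest for e in self.entries)

# Usage sketch: if an auditor sees plaintext for some ciphertext but
# log.authorizes(digest) is False, the system should be re-keyed.
```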
Stewart Baker: So let me push on a point that has always bothered me about the argument that these keys are going to be everywhere, they’re going to be compromised, all of the communications are going to be exposed, and that that’s a risk we can’t take. It does seem to me that everybody who has software on our phones or on our computers has the ability to gain access to that computer or that phone and to compromise the security of my communications on my phone. I am trusting every single provider of every single app that is on my phone. Obviously, that’s a worry, but we expect the manufacturer to undertake the security measures to prevent that from becoming the disaster we’ve been talking about here. Why doesn’t that same approach work here: saying to the company that provides the communication service, you also have to have a mechanism for providing access, and we expect you to maintain that every bit as securely as you maintain the security of your patch update system? Why is that not the beginning of an approach here?
Gabriel Kaptchuk: Yeah. Let’s talk about this. Let’s start to split these into technical categories, because there are multiple things happening here. The first thing is whether or not I need to trust, I don’t know, Duolingo: they might have the ability to access my unencrypted messages, when you say I need to trust every provider of every single app on my phone. It turns out Apple has done a really good job sandboxing these things, so that it’s actually highly, highly non-trivial. They have made sure that we have to trust Apple, but nobody else. Great. Now, let’s talk about Apple for a moment. Let’s say there are these software update keys that are part of the ecosystem today, for exactly the reasons that you mentioned. I think one really important part of this puzzle is thinking about the hotness of these keys. This is maybe a bit of a technical term, but: how much access does this key need? How live is it? A software signing key isn’t living on a computer that somebody has access to. That thing is living inside of a TPM, offline, sitting somewhere, and if you want to go sign an update, you literally have somebody get up and walk over and do the thing. That reduces the exposure of that key. And you’re not doing this every day. I don’t know how many times I should be updating my phone, but we’re not getting updates that frequently from Apple. So it’s a very slow and methodical capability that’s audited by a lot of people, with a lot of eyes on it. This is a very different world from getting access to people’s messages. You think that if there is this key, it’s only going to be asked for once in a while? No, it’s going to be fielding thousands, tens of thousands, hundreds of thousands of requests from countries around the globe. And a lot of requests are going to come with very, very short turnaround times: we need this content decrypted in the next five minutes because there’s a kid somewhere and we need to find them. That is a request we are going to see, because we already see it for unencrypted information. And moreover, we have seen that capability exploited in practice. Verizon handed over data to somebody who impersonated a member of law enforcement, because they said, hey, I need this data right now. They just handed over the data first, figuring they’d do the due process later, and that person was using a compromised account of some kind. This happened in, I think, 2023. So the hotness of these keys makes a tremendous amount of difference, because the number of times you have to access a key really shifts the dynamics around it. That’s one piece of the conversation. There’s more to unpack there, but I’ll stop there for now.
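A rough way to express the “hotness” argument is that a key’s exposure grows with how often it is used and how automated and network-reachable each use is. The sketch below encodes that intuition; every number and weight in it is an illustrative assumption, not a measurement.

```python
# Back-of-the-envelope sketch of key "hotness": an offline, ceremony-gated
# signing key versus an always-online lawful-access decryption key.
# All figures and weights are assumed for illustration only.
from dataclasses import dataclass

@dataclass
class KeyProfile:
    name: str
    uses_per_year: int
    online: bool           # reachable by automated, remote requests?
    manual_ceremony: bool  # human, audited steps required per use?

def relative_exposure(k: KeyProfile) -> float:
    score = float(k.uses_per_year)
    score *= 10.0 if k.online else 1.0       # online keys face remote attack
    score *= 0.1 if k.manual_ceremony else 1.0  # ceremonies add oversight
    return score

signing = KeyProfile("software-update signing key", 12,
                     online=False, manual_ceremony=True)
access = KeyProfile("hypothetical lawful-access key", 100_000,
                    online=True, manual_ceremony=False)

for k in (signing, access):
    print(f"{k.name}: ~{relative_exposure(k):,.0f} exposure units/year")
```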
Stewart Baker: All right, I want to make sure we have left enough time, and I’ll ask Mia to keep me honest here. Should we be moving to questions from the audience? And if we should, Mia, I’ll ask you to begin the process of assembling them.
Mia McAllister: Yes, we have about 15 minutes left in the session. So let’s move to questions in the audience, and then we’ll pivot online. I know there are already some in the chat.
Andrew Campling: Is this working? Yes. Hi, Andrew Campling speaking. I’m a trustee for the Internet Watch Foundation. I should firstly say that there’s not agreement in civil society on this issue. There are lots of different points of view; that’s true of all of the different parts of the multi-stakeholder community. And there’s a lot of frustration, certainly for some of us, that the weaponization of privacy is being used to override the rights of children and other vulnerable groups, completely forgetting that privacy is a qualified right. All of the human rights of children are being transgressed, up to and including their life, as we heard just now. So I think we just need a reality check on that. And also, we shouldn’t use encryption interchangeably with security. They’re not the same; they’re quite different. And when we start to encrypt indicators of compromise and other metadata, we weaken security and therefore we completely trash privacy anyway, and it’s generally a bad practice. We haven’t talked about the scale of the problem, so just to give some non-abstract sense to this: we’re looking at about 150 million victims of child sexual violence per annum around the world, and we are seeing over 100 million reports of CSAM images and videos per annum. That’s three every second. This is something which has been greatly magnified by the internet. This is a tech sector problem, not a societal problem; it’s on us to fix it. And end-to-end encrypted messaging apps are widely used to find and share CSAM. There’s an enormously large sample size of research available to back that up, so we know that the messaging apps are a big part of the problem space here. We don’t need to backdoor them. Client-side scanning would immediately stop the sharing of known CSAM; it has no impact on privacy if it’s known CSAM images, and it certainly doesn’t break encryption either. And simple things like age estimation or verification would at least keep adults out of child spaces and vice versa. So there are some easy steps we could take here with known technology which would immediately affect this problem. Let’s also not forget the sector is hugely hypocritical here. A lot of these problems apply in democracies; they don’t apply in other types of states. As a trivial example, Apple Private Relay is not available in China because it’s illegal in China. The companies care a lot less about the negative impacts of some of these technologies in democracies, but concede to the autocratic states and trade it for market access. So we’ve got a sector here which is very hypocritical. And finally, Vint Cerf in a session earlier this week said, sometimes we do need to pierce the veil of anonymity for law enforcement. I think that’s absolutely the right approach. We can’t treat privacy as an absolute right when that’s wrong in law and has serious consequences. So I’m not sure there’s a question there, but with the conversation so far, let’s talk about some of the victims. There are fixes here, and some groups are stopping us from making progress when progress could be made tomorrow if there was a willingness to do the easy things. Thank you.
Stewart Baker: All right, well, that is sort of a question, in the sense of a long set of propositions followed by the words “do you agree?” So let me ask Mallory if she does agree. There were a lot of ideas there: that anonymity needs to be limited (I’m not sure that’s raised by the encryption debate, because you can have encrypted communications that are fully attributable); that client-side scanning would be a straightforward approach to this; that age limits on access to communications services would be worth doing; and that we’re a bit too high on our horse when we say encryption is about privacy, because it certainly also becomes a vector for transmission of malware that wrecks people’s security. So it’s a double-edged sword. So Mallory, with those thoughts uppermost, what do you find in that that you can agree with?
Mallory Knodel: Well, that’s an interesting way of phrasing the question, Stewart. Thank you. It’ll challenge me. But first I wanted to say, I’m particularly frustrated by the fact that the EU child protection regulation has been stalled for years because of the encryption mandate. If that were removed, that whole piece of legislation, which has all kinds of aspects of child safety, could have moved forward ages ago. The fact that this is the one thing that’s been holding it back should infuriate everyone who cares about child safety. So maybe it’s not worth saying these folks have held this issue back or those folks have held this issue back. Because again, what we’re trying to do is come up with a list of requirements and constraints. And that’s going to differ per jurisdiction, per culture, et cetera. We’re in different places in the world. I think we can all agree that’s the promise of the interconnected internet: we all come with our own version of that and interconnect, and that’s the whole idea. One-size-fits-all platforms are not, I don’t think, the way forward, and I would certainly agree with that. I think some of the things that have been said accommodate these kinds of other design ideas. But the issue is that backdoors, or whatever you’re calling these measures, have been mandated for everyone at scale. If we can start to chip away at that idea, then I think you get all kinds of different messaging apps that can thrive, with varying degrees of encryption and varying degrees of scanning. Mandating that everyone do it the same way, and the same way for everyone: those are the problems. And if you look at, for example, the statement from the Internet Architecture Board about the EU mandated backdoors and chat control, they get to the heart of that. It’s something Gabriel said before: you can’t make encryption not exist; people are going to use it. Even if you were able to sweep up, say, the largest providers, well, I think you’d just get WhatsApp, right? So you sweep up WhatsApp, and everyone else could do what they want. Then you’ve just disenfranchised all the WhatsApp users, because that would be a fundamental change to the software that everybody downloaded when they downloaded WhatsApp. And you might then get migration to the other services that aren’t swept up in that piece of legislation, that do provide stronger encryption and don’t provide backdoor access, just to game this out. So yes, I am all for a very plural world in which we have lots and lots of different communications providers. What I don’t think is fair, or what we actually want, is requiring them all to work exactly the same way and requiring them all to have struck the same balance between user privacy and content moderation, because different users and different jurisdictions do want a different answer to that. OK.
Stewart Baker: Mia, do you have more questions or do you want to go back to other panelists?
Mia McAllister: Yeah, Dan, I want to bring you in. We see we have someone online from Germany. Are there any questions you want to address in the chat, Dan?
Dan Suter: It looks like it. Hey, look, Mia, I can see that there are a lot of comments there, from Ben, from Andrew. We’ve obviously heard from Andrew, and equally in relation to Leah. I think the thing that’s really coming over, and obviously we know this, is how difficult this space is. We often hear from industry, “we need to be regulated,” but equally then we hear, “well, then we’re going to have companies that are going to leave and go offshore.” There’s so much that can be done, and we need to move to the place where we are actually doing it. Look, these are thorny issues, wicked problems, as former Prime Minister Ardern in New Zealand used to say, and that requires people to come into the room and to discuss and understand our commonality, because often there are points here where we do have a common approach. I hear absolutely everything that Andrew just said in this question-and-answer session, and believe you me, as a former defence lawyer and as a prosecutor, I am absolutely with him on speaking more about the victim’s voice. We really should be. It’s so important. And equally, from a New Zealand legislation point of view, we need the content. We have a high court ruling that says we cannot prosecute child sexual abuse matters without the evidence of the content. We have no choice here. Do we change the legislation and say, well, actually, we can convict people on the basis of metadata? Is that really where we want to go? I don’t think that’s the case, and that’s why I ask whether regulation pushes into this space to ensure that we can make children safer online. And do you know what, we are being pushed into a place of self-reporting, and that equally isn’t a good space to be in. Take my 15-year-old child and the sextortion case we talked about: is he going to be self-reporting in relation to that case? Should we be pushing the responsibility onto my 15-year-old, or other children? Again, I don’t think we want to be in that space, but I’m sure we would also hear that there is agreement in relation to that. So that’s why we need to absolutely come together. But who’s going to lead that? That’s a big question that is left hanging here, because I can really see the positivity coming out of this panel. But who is going to take the lead? Is it where most of the service providers are located? Is that what’s required? Is it a multilateral institution? We know how that can be particularly difficult; having taken part in the UN Cybercrime Convention negotiations, it’s not easy, right, in terms of multilateral process. But we really do need somebody to come to the fore and say, right, we’re going to get the right people in the room. We need the technologists. We need the academics. We need civil society. We need the NGOs. We need governments. And we need to come together to do this, because we do have the victims. We do have people who are dying. We need to move this point on sooner rather than later, for all the good reasons that we’ve all discussed today. But passing back to you, Mia.
Mia McAllister: Thank you, Dan. We have time for one more question in the room. I’ll look to this side. Oh, thank you. Is that your hand? Okay. It looks like there are no more questions in the room. We’re going online. Any more questions online? You can just come off mute. All right. Not seeing any.
Stewart Baker: Yes. Well, then we can give the audience back three minutes of their life. They get to go to break early. I do think that our panel has done a great job of enlightening us about the nature of the considerations that are driving this debate and why it has been so prolonged and so difficult. And so I hope the audience will join me in thanking Mallory, Dan, Katie, and Gabriel for their contributions. Thank you. Thanks, everyone. It was a pleasure moderating. Appreciate it.
Dan Suter
Speech speed
155 words per minute
Speech length
1925 words
Speech time
744 seconds
Incremental approach with safeguards in Australia and UK
Explanation
Dan Suter explains that Australia and the UK have implemented legislation with a graduated approach to accessing encrypted data. This includes voluntary and mandatory requests for industry assistance, with robust safeguards and oversight mechanisms in place.
Evidence
Examples of TOLA in Australia and the Investigatory Powers Act in the UK, including technical assistance requests, technical assistance notices, and technical capability notices.
Major Discussion Point
Legislation and Regulation of Encryption
Agreed with
Mallory Knodel
Katie Noyes
Agreed on
Importance of balancing security, privacy, and child safety
Differed with
Mallory Knodel
Differed on
Approach to regulating encryption
Need for consistent engagement between governments and tech firms
Explanation
Dan Suter emphasizes the importance of governments consistently engaging with tech firms on child safety and lawful access issues. He suggests developing a collective approach towards a safety by design ethos that does not undercut cybersecurity or privacy.
Evidence
Reference to the 2023 and 2024 Five Country Ministerial communiqués.
Major Discussion Point
Legislation and Regulation of Encryption
Agreed with
Katie Noyes
Mallory Knodel
Agreed on
Need for multi-stakeholder collaboration
Importance of content access for prosecutions
Explanation
Dan Suter emphasizes the importance of access to content for successful prosecutions in child sexual abuse cases. He argues that relying solely on metadata is not sufficient for convictions in many jurisdictions.
Evidence
Reference to a New Zealand high court ruling requiring content evidence for child sexual abuse prosecutions.
Major Discussion Point
Balancing Security, Privacy and Child Safety
Call for leadership to bring stakeholders together
Explanation
Dan Suter calls for leadership to bring various stakeholders together to address the challenges of encryption and child safety. He emphasizes the need for a collaborative approach involving technologists, academics, civil society, NGOs, and governments.
Major Discussion Point
Multi-stakeholder Approach to Solutions
Mallory Knodel
Speech speed
169 words per minute
Speech length
2521 words
Speech time
892 seconds
Laws could force companies to leave jurisdictions
Explanation
Mallory Knodel argues that strict encryption laws could force companies to leave certain jurisdictions. She suggests that this could result in a loss of access to helpful service data for law enforcement.
Evidence
Example of Session, an end-to-end encrypted app, leaving Australia due to concerns about TOLA.
Major Discussion Point
Legislation and Regulation of Encryption
Differed with
Katie Noyes
Gabriel Kaptchuk
Differed on
Effectiveness and risks of encryption backdoors
End-to-end encryption protects users but hinders investigations
Explanation
Mallory Knodel acknowledges that end-to-end encryption protects users’ privacy but can hinder law enforcement investigations. She argues for a balanced approach that respects both privacy rights and the need for child safety.
Major Discussion Point
Balancing Security, Privacy and Child Safety
Agreed with
Dan Suter
Katie Noyes
Agreed on
Importance of balancing security, privacy, and child safety
Differed with
Dan Suter
Differed on
Approach to regulating encryption
Need to focus on broader solutions beyond backdoors
Explanation
Mallory Knodel advocates for exploring broader solutions beyond backdoors. She suggests focusing on innovation and user-centric approaches that allow for diverse communication platforms with varying degrees of encryption and content moderation.
Major Discussion Point
Technical Approaches and Innovations
Agreed with
Dan Suter
Katie Noyes
Agreed on
Need for multi-stakeholder collaboration
Katie Noyes
Speech speed
180 words per minute
Speech length
3476 words
Speech time
1158 seconds
EU exploring access to data for law enforcement
Explanation
Katie Noyes discusses the European Commission’s recent report on Access to Data for Effective Law Enforcement. She highlights the shift in conversation towards responsibility and the need for a balance between safety, security, and privacy.
Evidence
Reference to the European Commission’s report and the G7 Roma-Lyon Group’s lawful access working group.
Major Discussion Point
Legislation and Regulation of Encryption
Agreed with
Dan Suter
Mallory Knodel
Agreed on
Importance of balancing security, privacy, and child safety
Need for collaboration between government, industry and civil society
Explanation
Katie Noyes emphasizes the importance of a multi-stakeholder approach to solving encryption and child safety issues. She argues that neither government, private sector, nor civil society alone can solve these problems.
Evidence
Reference to the partnership with UC Berkeley and the summit to discuss these issues.
Major Discussion Point
Multi-stakeholder Approach to Solutions
Agreed with
Dan Suter
Mallory Knodel
Agreed on
Need for multi-stakeholder collaboration
Differed with
Mallory Knodel
Gabriel Kaptchuk
Differed on
Effectiveness and risks of encryption backdoors
Potential of homomorphic encryption and other technologies
Explanation
Katie Noyes mentions the potential of homomorphic encryption and other technologies as possible solutions. She suggests exploring prospective data-in-motion solutions that could provide lawful access without affecting all users.
Major Discussion Point
Technical Approaches and Innovations
Gabriel Kaptchuk
Speech speed
211 words per minute
Speech length
3306 words
Speech time
939 seconds
Risks of mandating backdoors or weakening encryption
Explanation
Gabriel Kaptchuk discusses the risks associated with mandating backdoors or weakening encryption. He argues that creating a universal key for lawful access would be a high-value target for attackers and could compromise the security of all users.
Evidence
Reference to the potential exploitation of lawful access capabilities by foreign governments.
Major Discussion Point
Legislation and Regulation of Encryption
Differed with
Mallory Knodel
Katie Noyes
Differed on
Effectiveness and risks of encryption backdoors
Exploring abuse-resistant lawful access mechanisms
Explanation
Gabriel Kaptchuk suggests exploring abuse-resistant lawful access mechanisms. He proposes the idea of creating keys that only work for certain types of content, potentially allowing for limited lawful access without compromising overall security.
Evidence
Reference to his paper on abuse-resistant law enforcement access mechanisms.
Major Discussion Point
Technical Approaches and Innovations
Challenges with perceptual hashing and content matching
Explanation
Gabriel Kaptchuk discusses the challenges associated with perceptual hashing and content matching technologies. He explains that current technologies are not reliable enough to accurately identify problematic content without risking false positives.
Evidence
Example of Apple’s neural hash function being reverse-engineered and producing collisions.
Major Discussion Point
Technical Approaches and Innovations
Andrew Campling
Speech speed
154 words per minute
Speech length
559 words
Speech time
216 seconds
Potential for client-side scanning of known CSAM
Explanation
Andrew Campling suggests that client-side scanning of known Child Sexual Abuse Material (CSAM) could be an effective solution. He argues that this approach would not impact privacy or break encryption while addressing the issue of CSAM sharing.
Evidence
Reference to research showing the widespread use of encrypted messaging apps for sharing CSAM.
Major Discussion Point
Balancing Security, Privacy and Child Safety
Frustration with lack of progress on child protection measures
Explanation
Andrew Campling expresses frustration with the lack of progress on implementing child protection measures. He argues that the tech sector is being hypocritical and that simple steps could be taken immediately to address the problem of online child sexual exploitation.
Evidence
Statistics on the scale of child sexual violence and CSAM reports globally.
Major Discussion Point
Multi-stakeholder Approach to Solutions
Agreements
Agreement Points
Need for multi-stakeholder collaboration
Dan Suter
Katie Noyes
Mallory Knodel
Need for consistent engagement between governments and tech firms
Need for collaboration between government, industry and civil society
Need to focus on broader solutions beyond backdoors
The speakers agree on the importance of collaboration between various stakeholders, including governments, tech firms, civil society, and academia, to address the challenges of encryption and child safety.
Importance of balancing security, privacy, and child safety
Dan Suter
Mallory Knodel
Katie Noyes
Incremental approach with safeguards in Australia and UK
End-to-end encryption protects users but hinders investigations
EU exploring access to data for law enforcement
The speakers acknowledge the need to balance security, privacy, and child safety concerns when addressing encryption issues.
Similar Viewpoints
Both speakers express concerns about the potential negative consequences of mandating backdoors or weakening encryption, including security risks and the possibility of companies leaving certain jurisdictions.
Gabriel Kaptchuk
Mallory Knodel
Risks of mandating backdoors or weakening encryption
Laws could force companies to leave jurisdictions
Both speakers suggest exploring innovative technical approaches to address the challenges of encryption and lawful access, such as homomorphic encryption and abuse-resistant mechanisms.
Katie Noyes
Gabriel Kaptchuk
Potential of homomorphic encryption and other technologies
Exploring abuse-resistant lawful access mechanisms
Unexpected Consensus
Recognition of the complexity of the problem
Dan Suter
Mallory Knodel
Katie Noyes
Gabriel Kaptchuk
Need for consistent engagement between governments and tech firms
Need to focus on broader solutions beyond backdoors
Need for collaboration between government, industry and civil society
Exploring abuse-resistant lawful access mechanisms
Despite their different perspectives, all speakers unexpectedly agree on the complexity of the encryption and child safety issue, acknowledging that there are no simple solutions and that a nuanced, collaborative approach is necessary.
Overall Assessment
Summary
The main areas of agreement include the need for multi-stakeholder collaboration, the importance of balancing security, privacy, and child safety, and the recognition of the complexity of the issue. There is also some consensus on exploring innovative technical solutions.
Consensus level
Moderate consensus with significant implications. While there are differences in approach, the speakers generally agree on the need for collaboration and innovative solutions. This consensus suggests potential for progress in addressing encryption and child safety challenges, but also highlights the ongoing complexity and need for careful consideration of various perspectives.
Differences
Different Viewpoints
Effectiveness and risks of encryption backdoors
Mallory Knodel
Katie Noyes
Gabriel Kaptchuk
Laws could force companies to leave jurisdictions
Need for collaboration between government, industry and civil society
Risks of mandating backdoors or weakening encryption
Mallory Knodel and Gabriel Kaptchuk emphasize the risks of mandating encryption backdoors, including potential exodus of companies from jurisdictions and security vulnerabilities. Katie Noyes, while acknowledging these concerns, advocates for a collaborative approach to find solutions that balance security and privacy needs.
Approach to regulating encryption
Dan Suter
Mallory Knodel
Incremental approach with safeguards in Australia and UK
End-to-end encryption protects users but hinders investigations
Dan Suter supports an incremental regulatory approach with safeguards, as implemented in Australia and the UK. Mallory Knodel, while acknowledging the need for balance, emphasizes the importance of end-to-end encryption for user protection and expresses concerns about regulatory approaches that could undermine this protection.
Unexpected Differences
Interpretation of recent security incidents
Mallory Knodel
Katie Noyes
End-to-end encryption protects users but hinders investigations
EU exploring access to data for law enforcement
There was an unexpected disagreement about the interpretation of the Salt Typhoon hacks. Mallory Knodel suggested it demonstrated the risks of built-in lawful access backdoors, while Katie Noyes disputed this interpretation, stating that the FBI’s investigation did not support this conclusion. This highlights how even technical incidents can be interpreted differently by various stakeholders in this debate.
Overall Assessment
Summary
The main areas of disagreement revolve around the effectiveness and risks of encryption backdoors, the appropriate regulatory approach to encryption, and the interpretation of security incidents. There is also disagreement on the balance between user privacy and law enforcement needs.
Difference level
The level of disagreement among the speakers is significant, reflecting the complex and contentious nature of the encryption debate. While there is some common ground on the need for innovative solutions and multi-stakeholder collaboration, the fundamental differences in approach and priorities suggest that reaching a consensus on encryption policies will remain challenging. This implies that future discussions and policy-making in this area will likely require careful negotiation and compromise among various stakeholders.
Partial Agreements
All speakers agree on the need for innovative solutions to balance security, privacy, and child safety. However, they differ on the specific approaches: Katie Noyes advocates for closer collaboration with law enforcement, Mallory Knodel emphasizes user-centric approaches and diverse platforms, while Gabriel Kaptchuk proposes exploring abuse-resistant lawful access mechanisms.
Katie Noyes
Mallory Knodel
Gabriel Kaptchuk
Need for collaboration between government, industry and civil society
Need to focus on broader solutions beyond backdoors
Exploring abuse-resistant lawful access mechanisms
Takeaways
Key Takeaways
There is ongoing tension between protecting privacy/security through encryption and enabling law enforcement access to combat child exploitation
A multi-stakeholder approach involving government, industry, civil society and academia is needed to find balanced solutions
Technical innovations may offer ways to enable limited lawful access without fully compromising encryption
There is frustration with lack of progress on child protection measures due to encryption debates
Different jurisdictions and users have varying needs/preferences regarding privacy vs. content moderation
Resolutions and Action Items
Continue exploring technical solutions like client-side scanning and abuse-resistant lawful access mechanisms
Engage in more collaborative discussions between stakeholders to find common ground
Consider approaches that allow for diversity in encryption/privacy levels across platforms rather than one-size-fits-all mandates
Unresolved Issues
Who will take the lead in organizing multi-stakeholder collaboration on solutions?
How to balance user privacy/security with need for content access in investigations
Whether and how to implement age verification or limits on encrypted platforms
How to address global nature of platforms/crimes while respecting jurisdictional differences
Suggested Compromises
Explore prospective data-in-motion solutions that don’t affect all users
Consider content-based or limited lawful access mechanisms rather than full backdoors
Allow for diversity in platform approaches rather than mandating one solution for all
Focus on known CSAM detection and user reporting tools as interim measures
Thought Provoking Comments
We need to really raise our ambition and develop a collective approach engaging with each other and towards a safety by design ethos, including designed and lawful access that does not undercut cyber security or privacy.
speaker
Dan Suter
reason
This comment shifts the framing from an adversarial stance to one of collaboration, suggesting a more holistic approach that balances multiple priorities.
impact
It set a more constructive tone for the discussion and introduced the idea of ‘safety by design’ as a potential path forward.
Already there is data that can be obtained that can be provided and that is provided based on leaks from a few years ago and sort of, I don’t know, it was like a slide deck that the law enforcement community was using to explain which of these encrypted services have which metadata and how you can get it.
speaker
Mallory Knodel
reason
This comment introduces nuance by pointing out that even with encryption, some useful data is still available to law enforcement.
impact
It challenged the binary framing of the debate and suggested that existing capabilities may not be fully utilized.
I think what we’re seeing is the deployment of end-to-end encryption technologies on many, many communication platforms as being a very clear signal that this is what users want.
speaker
Gabriel Kaptchuk
reason
This comment reframes the debate in terms of user demand and market forces rather than just policy considerations.
impact
It shifted the discussion to consider user preferences and the practical realities of the technology landscape.
We have solid data and it’s solid data not coming from law enforcement this time, it’s coming from outside non-government organizations. So many of you are familiar with the National Center for Missing and Exploited Children, my colleagues around here in the room. It’s a U.S.-based non-profit that really takes tips and leads from the electronic service providers. Last year was the highest number of tips ever received by the electronic service providers, like META for Facebook and Instagram, if you’re wondering what an ESP is, but it was 36 million reports.
speaker
Katie Noyes
reason
This comment introduces concrete data from a neutral source to illustrate the scale of the problem.
impact
It grounded the discussion in real-world impacts and statistics, moving beyond theoretical arguments.
And is there some way that, like, if that key gets stolen, at least we would know. At least we would all be able to tell. We would be able to say, like, something terrible has happened, right?
speaker
Gabriel Kaptchuk
reason
This comment introduces a novel technical approach to mitigating risks associated with lawful access mechanisms.
impact
It opened up discussion of more nuanced technical solutions that could potentially bridge the gap between privacy and law enforcement needs.
Overall Assessment
These key comments helped move the discussion from abstract policy debates to more nuanced considerations of technical realities, user preferences, and practical impacts. They introduced new frameworks for thinking about the issue (safety by design, abuse-resistant mechanisms) and grounded the conversation in concrete data and real-world examples. This shifted the tone from adversarial to more collaborative, exploring potential middle-ground solutions and acknowledging the complexity of balancing multiple priorities.
Follow-up Questions
Has there ever been an order or capability notice that required modification of encryption to allow for lawful access?
speaker
Stewart Baker
explanation
This is important to understand the real-world impact of legislation like the Investigatory Powers Act and TOLA on encryption and privacy.
Is there a way to create an encryption key that only works for certain types of content?
speaker
Gabriel Kaptchuk
explanation
This could potentially provide a middle ground between full encryption and lawful access, addressing both privacy and law enforcement concerns.
How can we develop a collective approach among countries for engaging with tech firms on child safety and lawful access?
speaker
Dan Suter
explanation
A coordinated approach could lead to more effective solutions and consistent policies across jurisdictions.
What are the possibilities and limitations of homomorphic encryption in addressing the encryption debate?
speaker
Katie Noyes
explanation
This technology could potentially allow for data analysis without compromising encryption, offering a new avenue for balancing privacy and security.
How can we design prospective data-in-motion solutions that affect only specific subjects rather than all users?
speaker
Katie Noyes
explanation
This approach could potentially provide lawful access while minimizing the impact on overall user privacy and security.
Who should take the lead in bringing together stakeholders to find solutions to the encryption debate?
speaker
Dan Suter
explanation
Identifying a leader or organizing body is crucial for moving the conversation forward and implementing practical solutions.
How can we better incorporate victims’ voices into the encryption and child safety debate?
speaker
Andrew Campling
explanation
Understanding the real-world impact on victims is crucial for developing effective policies and solutions.
What are the potential impacts of client-side scanning on privacy and encryption?
speaker
Andrew Campling
explanation
This technology has been proposed as a potential solution, but its implications need to be thoroughly examined.
How can age estimation or verification be implemented effectively in online spaces?
speaker
Andrew Campling
explanation
This could potentially address some child safety concerns without compromising encryption.
Disclaimer: This is not an official session record. DiploAI generates these resources from audiovisual recordings, and they are presented as-is, including potential errors. Due to logistical challenges, such as discrepancies in audio/video or transcripts, names may be misspelled. We strive for accuracy to the best of our ability.
Related event
Internet Governance Forum 2024
15 Dec 2024 06:30h - 19 Dec 2024 13:30h
Riyadh, Saudi Arabia and online