WS #125 Balancing Acts: Encryption, Privacy, and Public Safety
Session at a Glance
Summary
This workshop focused on balancing encryption, privacy rights, and public safety, particularly in relation to child protection online. Experts from various fields discussed the challenges and potential solutions to this complex issue.
The discussion highlighted the tension between privacy and security, with some arguing that privacy concerns are sometimes weaponized at the expense of child safety. Participants emphasized the need to reject framing this as a choice between privacy and security, instead advocating for solutions that address both.
Key challenges identified included global inconsistencies in laws and standards, rapidly evolving technologies, and the difficulty of protecting against online abuse while maintaining privacy. The importance of international collaboration was stressed, with calls for finding common ground and developing harmonized legal and technical standards.
Participants suggested several approaches, including client-side scanning for known child sexual abuse material, age verification tools, and considering sub-contexts for different user groups. The need for public awareness and education about encryption and its impacts was emphasized.
The discussion also touched on the role of internet standards bodies like the IETF, with calls for greater multi-stakeholder engagement in these technical forums to ensure societal implications are considered. Participants agreed that finding solutions requires input from diverse stakeholders, including government, private sector, and civil society.
Overall, while acknowledging the complexity of the issue, the panel expressed optimism that balancing encryption, privacy, and public safety is a “mission possible” with continued dialogue and collaborative efforts.
Keypoints
Major discussion points:
– Balancing encryption, privacy rights, and public safety, especially regarding child protection online
– The need for multi-stakeholder collaboration and international cooperation on encryption policies
– Challenges of evolving technologies and inconsistent global regulations around encryption
– Educating the public and raising awareness about encryption’s impacts
– Engaging technical standards bodies like IETF to consider societal implications of encryption decisions
The overall purpose of the discussion was to explore the complex challenges of balancing encryption, privacy, and public safety, and to identify potential paths forward through multi-stakeholder collaboration and public education.
The tone of the discussion was thoughtful and solution-oriented. While speakers acknowledged the difficulty of the issues, there was an emphasis on finding pragmatic ways to make progress rather than viewing it as an impossible task. The tone became more optimistic and action-oriented by the end, with calls for stakeholders to get involved in technical standards bodies and educate the public.
Speakers
– David Wright: Director of the UK Safe Internet Centre and CEO of UK charity SWGFL
– Andrew Campling: Director of 419 Consulting
– Taddei Arnaud: Global Security Strategist at Symantec by Broadcom
– Makola Honey: Manager of policy research and development unit at Independent Communications Authority of South Africa, vice chairperson of ITU-T study group 17
– Boris Radanovic: Head of engagements and partnerships at SWGFL
– Alromi Afnan: Vice chairman of ITU-T study group 17, director of cyber security operations centre at CST
Additional speakers:
– Cynthia Lissoufi: Works with ITU, from South Africa
– Catherine Bielek: Infectious disease physician at Harvard Medical School
Full session report
Balancing Encryption, Privacy Rights, and Public Safety: A Multi-Stakeholder Approach
This workshop, part of the Internet Governance Forum (IGF), brought together experts from diverse backgrounds to discuss the complex challenges of balancing encryption, privacy rights, and public safety, with a particular focus on child protection online. The discussion highlighted the tension between privacy and security concerns, while emphasising the need for collaborative, multi-stakeholder solutions to address these interconnected issues.
Key Challenges and Framing of the Debate
The participants identified several key challenges in addressing encryption and privacy:
1. Global inconsistencies in laws and standards
2. Rapidly evolving technologies
3. Difficulty in protecting against online abuse while maintaining privacy
4. Balancing various human rights and interests
A significant point of contention emerged regarding the framing of the debate. Andrew Campling, Director of 419 Consulting, argued that privacy rights are sometimes weaponised at the expense of child safety, stating, “In my view the weaponization of privacy is being used and has been and is continuing to be used to override all of the human rights of children and other vulnerable groups and I think that’s a fundamental problem.” Campling also highlighted the scale of the issue, noting, “We’re seeing roughly 100 million reports of CSAM images and videos every year and that’s roughly three new images being found every second.”
In contrast, Boris Radanovic, Head of Engagements and Partnerships at SWGFL, advocated for rejecting the framework of privacy versus security altogether. He used a vivid analogy to illustrate his point: “We should utterly reject the framework of conversation of having privacy versus security. And if we reject it, I’ll just remind everybody that most of us flew to this wonderful country, and what if 90% of our flights had 90% of a chance to land in Ankara, maybe in Zagreb, maybe in London? None of us would take that option or those odds.” This reframing encouraged participants to think about achieving both privacy and security simultaneously rather than trading one for the other.
Proposed Solutions and Approaches
The discussion yielded several proposed solutions and approaches to address the challenges:
1. Technical Solutions:
– Client-side scanning for known child sexual abuse material (CSAM) images, as suggested by Andrew Campling
– Consideration of sub-contexts with different encryption requirements for various groups, proposed by Arnaud Taddei, Global Security Strategist at Symantec by Broadcom
2. International Collaboration:
– Makola Honey, Manager of Policy Research and Development Unit at Independent Communications Authority of South Africa, emphasised the importance of international collaboration to find common ground
– Alromi Afnan, Vice Chairman of ITU-T Study Group 17, highlighted the need to address global inconsistencies in laws
3. Public Awareness and Education:
– Boris Radanovic stressed the need for adaptable education for different capabilities and age groups
– Alromi Afnan noted that public awareness is a key part of online safety
– Panelists suggested developing targeted educational programs to help users understand the complexities of encryption and privacy
4. Engagement with Technical Standards Bodies:
– Andrew Campling called for civil society groups to engage with technical standards bodies like the Internet Engineering Task Force (IETF)
– Campling also highlighted how changes in underlying technology can affect parental controls and other safety measures
5. Learning from Other Models:
– Some participants suggested using the COVID-19 pandemic response as a model for balancing individual privacy and public safety needs
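To make the first category above concrete: client-side scanning for known material, as discussed in the session, is usually described as matching content on the user's device against a list of fingerprints of already-identified images before the message is encrypted and sent. The sketch below is illustrative only; every name in it is hypothetical, and real deployments rely on robust perceptual hashing (PhotoDNA is the commonly cited example) rather than the plain cryptographic hash used here.

```python
import hashlib

# Hypothetical hash list of known, previously verified images.
# The entry below is the SHA-1 of the bytes b"abc", standing in for a real
# image fingerprint. Production systems use robust perceptual hashes so
# that resized or re-encoded copies still match; a plain cryptographic
# hash is used here only to keep the sketch simple.
KNOWN_HASHES = {
    "a9993e364706816aba3e25717850c26c9cd0d89d",
}

def fingerprint(data: bytes) -> str:
    """Compute a hex digest standing in for a perceptual image hash."""
    return hashlib.sha1(data).hexdigest()

def check_before_upload(content: bytes, known_hashes: set) -> bool:
    """Return True if the content matches a known-image fingerprint.

    The check runs on the user's device, before the message is
    encrypted, so the message contents never leave the device in
    cleartext and the transport encryption itself is untouched.
    """
    return fingerprint(content) in known_hashes
```

Nothing in this sketch decrypts or weakens the encryption of the channel; the trade-off debated in the session concerns what runs on the endpoint, not the cipher.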
Multi-Stakeholder Approach and International Cooperation
A key area of agreement among participants was the need for a multi-stakeholder approach involving diverse perspectives. Speakers emphasised the importance of international collaboration and the development of harmonised legal and technical standards. Makola Honey suggested that international bodies can convene neutral dialogues to find balanced solutions, stating, “We need to find common ground, and international bodies can facilitate these discussions in a neutral way.”
The role of internet standards bodies like the IETF was highlighted, with calls for greater multi-stakeholder engagement in these technical forums to ensure societal implications are considered. Andrew Campling emphasised this point, stating, “What I think we need to do is to get people from the groups that are here, at least some of them, to engage over there, so civil society groups, governments, regulators, others who have got sufficient technical knowledge to engage in the standards bodies need to attend and pay attention to what is happening there, and the implications for some of the decisions being taken.”
Unresolved Issues and Future Considerations
Despite the productive discussion, several issues remained unresolved:
1. Developing global standards that balance needs of different regions and technical capabilities
2. Effectively educating the public about complex encryption issues
3. Addressing challenges posed by emerging technologies like quantum computing
4. Resolving conflicts between different legal and regulatory frameworks across countries
Arnaud Taddei contributed significantly to the discussion by emphasizing that security is not provable, stating, “Security is not provable. We can only prove insecurity.” This perspective added depth to the conversation about the challenges of implementing robust security measures.
Conclusion
While acknowledging the complexity of the issues at hand, the panel expressed optimism that balancing encryption, privacy, and public safety is a “mission possible” with continued dialogue and collaborative efforts. The discussion emphasised the need for ongoing multi-stakeholder engagement, international cooperation, and public education to address these challenges effectively. As the digital landscape continues to evolve, finding solutions that protect both privacy and public safety remains a critical goal for policymakers, technologists, and civil society alike.
Session Transcript
David Wright: is now 1.45 here. We will make a start if that’s okay with everybody. And a very warm welcome to this workshop, workshop number 125. And here we are looking at balancing acts in terms of encryption, privacy, and public safety. My name is David Wright. I’m director of the UK Safe Internet Centre and CEO of a UK charity, SWGFL. And I’m delighted to be able to introduce you to the panel that we have today, both here and online as well. This is obviously the last workshop of this IGF, and it’s a real pleasure to be able to close out this IGF ahead of the closing ceremony with this particular subject. So it is, we are going to be looking at this complex balance between encryption, privacy rights, and public safety, and the need for structured multi-stakeholder discussion. We have an hour with you, so we are going to have to keep comments, questions, perhaps brief, but it is such an important subject that we see. I’m just first of all going to introduce the panel to you. And for the panellists, I’m going to have to probably just condense some of the bios just because of time. And I’m going to run through them in terms of the sequence of questions as well. So I’d like to welcome Andrew Campling, who’s here with us. Andrew is Director of 419 Consulting, a public policy and public affairs consultancy focused on the tech and telecom sectors. He has over a decade of non-executive experience backed by nearly 40 years of experience in a wide range of increasingly senior roles in a mainly business-to-business technology context. He has been engaging in initiatives linked to encrypted DNS, encrypted SNI and related developments in internet standards, primarily to understand their impact in real-world deployments. It’s worthwhile pointing out that Andrew is a trustee of the Internet Watch Foundation, which is a global charity, one of our partners within the UK Safe Internet Centre, and holds an MSc in strategic marketing management and an MBA. 
Andrew is currently studying law in his spare time and plans to complete his LLM in the next couple of years. Online joining us is Arnaud Taddei. If we might be able to bring up Arnaud onto the screen. Arnaud is a global security strategist at Symantec by Broadcom. He’s an executive advisor on security strategy and transformation for the top 150 Symantec customers. As part of his mission, Arnaud participates in specific standardization defining organizations and in particular, and this warrants congratulations Arnaud, was elected chair within the ITU SG17 representing the UK and works at the IETF. He started his career in 1993 at the famous CERN IT division in Geneva, which created the World Wide Web and where he led the team responsible for communication, authentication and authorization. In 2000 he joined Sun and became one of the 100 elected global principal engineers. In 2007 he joined Symantec and held chief architect roles up to director of research as direct report to Dr Hugh Thompson, Symantec CTO and current RSA Conference chairman. I’d then also like to move on to the next panelist, which we can also bring Honey onto the screen, please. So Honey Makola is the manager of the policy research and development unit at the Independent Communications Authority of South Africa, where she guides the regulator in navigating its evolving roles across various aspects of ICT developments including cyber security. She also serves as a vice chairperson of the ITU-T study group 17 which focuses on setting international standards for cyber security. Within study group 17 Honey co-convenes the correspondence group on child online protection, working to identify gaps in technical measures and promote initiatives that create safer online environments for children. If I can now draw us back into the room, I’d like to introduce Afnan. 
Afnan Alromi is an accomplished cyber security leader with over 12 years of experience in managing complex projects and shaping cyber security strategies. As vice chairman of the ITU-T study group 17 cyber security and director of the cyber security operations centre at CST, she plays a key role in advancing cyber security resilience locally and globally. Afnan holds advanced degrees in software engineering and computer science along with several industry professional certifications and is known for her expertise in strategic planning, vulnerability management and fostering international collaborations. Afnan, welcome. Finally, if I can welcome my colleague Boris. Boris Radanovic is an expert in the field of online safety and currently serves as the head of engagements and partnerships at SWGFL, the UK-based charity. Like me he also works with the UK Safe Internet Centre, which is part of the European InSafe network. His work involves educating and raising awareness about online safety for children, parents, teachers and other stakeholders across the world. Boris has worked extensively with various European countries including Croatia, where he worked at the Safe Internet Centre, and he’s been involved in numerous missions to countries like Belarus, Serbia, Montenegro and North Macedonia to present online safety strategies to government officials and NGOs. His focus is on protecting children from online threats, such as cyberbullying, child sexual exploitation and scams, as well as empowering professionals through workshops and keynote speeches. So I would just like to welcome the panel. Also this afternoon, forgive me, I’m joined by my colleague Niels, who is going to moderate the online conversations for when we get through to chats. If I can just invite the panellists just to give a quick opening couple of sentences. Andrew, if we can start with you, please.
Andrew Campling: Good afternoon, everyone, and also hello to everyone online. As David said, my name is Andrew Campling. I’m a trustee of the Internet Watch Foundation, amongst other things. I think this is an incredibly important issue, which we’ll get into in a moment. And I think, although we’ve focused on the trade-offs and specifically talk about privacy, as I’ll expand on it in a short while, I really want to get into the debate about privacy versus other human rights. Because I think we over-inflate the importance of privacy and completely ignore often all of the other human rights, including fundamental rights, as opposed to privacy as a qualified right. But that’s something we’ll come on to when David asks us questions, I’m sure.
David Wright: Thank you, Andrew. Arnaud, if I can throw it to you, please.
Taddei Arnaud: Yes, can you hear me correctly? Yes? Yes, we can. Thank you for the chance to be in this workshop. The topic is really heartbreaking when you start to… understand what is at stake. It’s both concerning to see the level of harm that is increasing, perhaps accelerating, and at the same time we are facing a real design issue and we need to make trade-offs that are really difficult for a number of humans. So observing this from ITU and SG17 is an interesting journey and hopefully we are maturing and putting ourselves in the conditions where we can have a meaningful discussion. Thank you.
David Wright: Thank you. If I can now just turn to Honey.
Makola Honey: Yes, thank you. I just wanted to draw attention a little bit to the work that I do as the convener of the correspondence group, where we are focusing on identifying and addressing gaps in child online protection standardization within the study group 17. We have done a lot of work in reviewing the regulations, the standards that are currently in place, and you know the work has progressed well and we are on our way in identifying the gaps. But I would also like to take this opportunity please to invite the people in the group, as well as online, to please join the correspondence group on child online protection. This can be done through subscription on MyWorkspace. But for the purpose of today’s meeting, for me, I think encryption is a very important and powerful tool that can help us safeguard communications and information, but it also creates significant challenges in protecting children online. So for the purpose of today’s meeting, I just want us to try and find a balance between privacy on the one hand, and other issues such as the protection of children online on the other. It’s a challenging balance, but one that I believe is essential for the effectiveness of child online protection, and I look forward to the engagements this afternoon.
David Wright: Thank you. Honey, thank you very much. Now turning back to the room, Afnan, if I can throw it to you, please.
Alromi Afnan: Good afternoon, everyone. I’m looking forward to this wonderful discussion today and to engaging in this topic, to discussing the challenges that my colleague Honey just outlined, and also to seeing how we can succeed in balancing encryption and privacy rights with public safety at the same time. So looking forward to this discussion, thank you.
David Wright: Thank you, Afnan. And finally, Boris, if you can introduce.
Boris Radanovic: Thank you very much. I appreciate the invitation and ability to contribute, especially from a diverse set of points looking at this. And I love the title. It says encryption, privacy, and public safety, and I think that is the framework of conversation that I think we should all have and support. And one of the questions in my mind that we can later hopefully answer is how do we create meaningful and impactful discussions on these topics that takes into account a wide array of different perspective needs and abilities and representation, but equally respecting the direction that we all want to take to a better and safer world, which includes protections of children in itself. So really proud to be here.
David Wright: Thank you, Boris. Okay. So moving on to the particular questions that we’re going to pose, and then panelists will share and discuss with everybody, after which point we will then open the floor and the virtual floor to questions. So please do hold on to questions. There will be a time. Because of timing as well, I’m going to keep panelists to perhaps four minutes. So please keep contributions succinct. So Andrew, I’m going to turn to you without any further delay. further ado first, and I wonder if you could elaborate a little on how should governments and tech companies approach the creation of lawful access mechanisms without infringing on privacy rights, straight into this point.
Andrew Campling: Fantastic, thank you David for the question and to provoke hopefully a response from some of the participants in the room and online. In my view the weaponization of privacy is being used and has been and is continuing to be used to override all of the human rights of children and other vulnerable groups and I think that’s a fundamental problem. As I said earlier, remembering that privacy is a qualified right and we need to think about all of the human rights and also again to provoke a response, encryption let’s remember is not the same as security, they’re fundamentally different things but they’re often conflated and for example if you begin as is happening increasingly in the internet standards world to encrypt indicators of compromise and metadata, you end up with weakened security and if you weaken security you have no privacy but you think you have and that’s also a big problem. So very briefly let’s put a scale to the problem and I’m going to focus on child sex abuse because that’s what the Internet Watch Foundation does. We’re seeing roughly 150 million victims of child sexual violence every year around the world and we actually are recording in the order of 100 million reports of CSAM images and videos every year and that’s roughly three new images being found every second, new images or videos. So even in the course of this workshop that’s a scary number of images and videos being found. The internet has magnified the scale of the problem of CSAM significantly. It happened pre-internet, but the ability to publish and share the images globally means, and remembering every time an image is shared that’s a crime and there’s a victim, the scale of the problem is huge compared to what it was pre-internet. We know from research that end-to-end encrypted messaging platforms are widely used to find and share CSAM and there’s a very large sample size behind that research. 
But to get directly to the problem, in terms of things like lawful interception, you don’t need to backdoor encryption to help solve this problem. Client-side scanning for known CSAM images would immediately reduce the size of the problem. It doesn’t break encryption, it doesn’t break privacy, so that’s an easy way to make an impact, as would be the use of tools like age estimation and age verification to keep adults off of platforms intended for children and children off of platforms intended for adults to try and keep the victims away from the criminals. So those would be my suggestions, places to start, and hopefully that will provoke a response from people in a few minutes. Thank you.
David Wright: I’m pretty sure it will do, Andrew, thank you. Just can I ask a follow-up question, just to prime the microphone again. Are developments in internet standards helping or harming human rights?
Andrew Campling: So in my view, the increased use and requirement to use encryption in some of those standards, I could give examples but probably that’s too much detail for here, is making the problem worse, not better. It’s making it harder to find where the crimes are happening and it’s making it easier for the criminals to hide. So some of the developments are actually problematic in the standards bodies, an area where I’m active, and they also coincidentally weaken security as well. So I think that’s why we need civil society to engage in places like the standards bodies. It’s mainly technologists making these decisions and we need, dare I say it, multi-stakeholder engagement to actually shine a light on things that are causing huge societal problems.
David Wright: Thank you. Thank you very much. Okay. I’m going to move on now to Arnaud, if I can, please, onto the screen. So Arnaud, the question that we would pose to you. So what technical innovations or solutions do you see as viable for achieving a balance between privacy and public safety?
Taddei Arnaud: I like the Mission Impossible framing. It’s a very difficult problem. So on one hand, one of the issues is that we have only one model for the internet as we know it today. And so anything we do is going to impact various communities. So far, so good: until some versions of TLS and other considerations, we sort of managed to keep the community together and everybody could find what it needed from the setup. But for various reasons, some directions were set, for good, bad and ugly reasons, and it’s not a judgment. It’s just that they were set in a certain direction and now it is pushing the solution for one specific part of the spectrum of all the humans. So that means that now we have situations where some are going to get benefits and some are going to have a problem with what is happening. So this is very difficult to move from, because we have only one design and it’s difficult to get out of this thing. Now, that doesn’t mean we cannot be creative. For other areas we started to realize that maybe the problem is the anthropological assumption made in the background: one model for all the humans. That means a very narrow model for all the humans, to make sure it fits the maximum. But when you do that you lose the fact that there are sub-contexts with very specific needs. Child online protection is one sub-context, education is one of those sub-contexts. I would add elderly people as another sub-context. And all these sub-contexts have different requirements and needs, and when we have to take a step back and make design choices, the issue of a design is always the trade-offs. Which trade-offs are you going to make for a specific set of use cases and requirements? That’s what the engineer will think. So when you go in this approach, perhaps the direction we could consider is sub-contexts. 
So it’s not limited here to child online protection, but I see for example sub-contexts that may already be happening. I discovered the advent of what we call enterprise browsers, which allow specific requirements for enterprise use cases. Equally you see more and more family solutions by some hyperscalers. So the question I have is: is the premise of the solution not to consider that we should perhaps re-highlight the concept of sub-contexts, and from there we could start to envision perhaps some technical solutions? I will stop at that, thank you.
David Wright: Thank you, thank you very much for trying to take on Mission Impossible. Okay, moving swiftly on to Honey, who we can’t see at the moment, but if I can, there we go, there we go. If I can pose you the question: how can international collaboration improve or complicate the encryption debate, especially when balancing privacy with cross-border safety issues?
Makola Honey: Thank you for the question. So I have a lot of experience in international collaboration, and also having recently checked the cybersecurity resolutions of the WTSA 2024, I want to draw a little bit on that experience. And I would like to start with what complicates the debates in the collaboration. First of all, the participants come to the collaboration table, if I may, with different legal and regulatory frameworks. They could be divided into the ones that are pro data protection and privacy, then you get the ones that are for government control, for national security purposes, then you get the ones that want balance. So those differences make it very difficult to agree on a unified approach to encryption across borders, so you can even imagine the frustration for the companies operating across borders regarding encryption. Then there’s also the imbalance in cybersecurity capabilities across the different nations. You get nations with advanced cybersecurity capabilities that may argue for stronger encryption to protect the critical infrastructure
And from my work with the ITU so far, there’s also a question of how do you develop global standards that are balanced in addressing the needs of the different regions, taking into account the imbalances in technical capabilities, the skills that different countries are having, and the level of skills actually, and the culture of privacy in itself. But the complications are just the negatives. Moving to how it can improve, taking into account obviously having what complicates it in the background. I think the starting point in the encryption debate is that there are two sides, but how do we develop common ground? And that is what the international collaboration can improve. It has the ability to facilitate dialogue between nations with the different stance on encryption. And moving from that established common ground, countries can then establish harmonized legal and technical standards. There will be compromise. to be made by different groups, but remembering or recalling that the common ground is important for facilitating the debate. And I personally believe, and from my experience, that it’s during this international dialogue where innovative solutions can be birthed. For example, if you cannot compromise, what do you do? You look for solutions. And that can be in the form of research into privacy preserving technologies, which is what the work of the CG Correspondence Group on Travel and Protection of the ITU Study Group 17 is doing. You know, we’re looking for the solution that balances the two public-private partnerships, you know, development, the work that is done by the ITU development sector in sharing information and also capacity building. But the most important thing to also remember in all of this improvement that can be brought about by international collaboration, it requires active participation and contribution from the private sector as well, not just government and regulators. 
So what makes the collaboration work is everybody finding that common ground first and then starting to move on there and see how they can compromise. Ani, thank you very much. And I very much share the hope that we will find solutions at events like this, despite it, as Arnaud was saying, being mission impossible.
David Wright: Okay, I’m next going to go back to the room and to Afnan. Afnan, if I can pose to you: from your perspective, what are the most critical challenges in balancing encryption with public safety and privacy rights?
Alromi Afnan: Thank you, David. First of all, before I go into answering your question, I would like to thank my colleagues here in this panel discussion for their great insights and the points they brought in. They’ve actually pointed out a couple of challenges that I was hoping and planning to discuss in this talk. Although, as we say, encryption is an important tool that helps us secure our sensitive data and keep it safe, it also poses a couple of challenges. And in today’s session, although we have little time, I’m going to discuss just a couple that we and the attendees are most familiar with. The first challenge that we can sense and see is global inconsistency, and I think Arnaud and Honey mentioned a couple of points on that aspect, together with conflicting international laws, which makes this more challenging to discuss. For example, countries have varied legal standards, laws and regulations, and international companies working in those different countries have to comply with each standard and fulfill all those standards as well. So this brings a hard challenge in that aspect. Another challenge comes with evolving technologies, and we’ve discussed this as well in SG17, where we have a specific Question on emerging and evolving technologies. The rapid pace of technology advancements means that we have to keep up with them and address the encryption standards they bring in. One of the evolving technologies that we are discussing in SG17 is quantum computing, and quantum computing in the future will most likely break some of the encryption standards that we have today. This poses a challenge that we need to consider, and we need to start the transition to quantum-safe infrastructure. 
Another very important challenge, and we have a couple of initiatives in Saudi Arabia looking into this aspect, is protecting against abuse online. Many of the tools and chat applications we use in daily life have encryption implemented to secure our communication, and although this is vital and important, it creates a challenge for law enforcement in keeping the public safe, and most importantly children, because we cannot see what kind of content is being communicated or the abuse that could be happening in those conversations. So these are just some of the challenges we need to consider when balancing encryption with public safety and privacy rights. The only way to reach a balance is through collaboration, with feedback from different areas: from government, from the private sector, from civil society, and from all parties that can contribute to that discussion.
David Wright: Thank you. Thank you very much. For the ensuing conversation, I am going to encourage everybody to think about questions for when we open the floor; it does appear we have some questions online too. I say that just before I throw it over, finally, to my colleague Boris. My question here, Boris, is: what role should the public play in this discourse, and how can awareness be effectively raised on the impact of encryption policies on privacy and security?
Boris Radanovic: Thank you. Short answer: education. But how to do it properly is a much longer discussion. I think we should start by defining frameworks for meaningful discussion, with communication goals and a structure that allow exactly these kinds of conversations across multiple levels of representation, diversity, and abilities and disabilities, which could contribute in ways I cannot now. On a broader point, I think we should all be aware that vision can only pull as hard as reality can follow. And the current reality is that 150 million children across the world are being sexually abused every year, and that number is rising. That is a reality we need to face. While the vision we can all agree on is magnificent, the reality is something we need to take into account. Thinking about the discussions, I am going to raise more questions than answers here, but I think this is the perfect space: how do we make sure that this discussion is not dominated by a certain area, agenda, stakeholder, or interest? How do we have a meaningful level playing field for anybody contributing to this discussion? How do we make sure that we develop technological solutions that put the benefit of the user, or the benefit of the child, first and foremost, and then continue developing those solutions? All of that builds up into what I personally consider our principal duty as adults: to create a better and safer world for the people and young children following in our footsteps. And I will come back to Arnaud and say: I love Tom Cruise and the Mission Impossible movies, Arnaud, but I don't know if you remember that in each of those movies, a great team of people, each working to their own abilities and capabilities, working together, make the movie, in the end, Mission Quite Possible.
I know that doesn't make a good marketing title, but I think it should be a good notion. And to pick up on something Honey said, which I think is important: what we are trying to do is not easy, but we have to ask ourselves, what is easy and what is right? And lean, I would suggest, on the side of what is right, and find solutions for that. To come back on that point for my final remark: we need to find a way to develop global standards with local sensitivities that respect many of the things I mentioned. And I wholeheartedly ask all of you listening to us today, online and here: do ask us questions. We have a discussion on so many levels and with so many representations that we all need to understand, but all of us, at one point, I don't know if you remember, were a child, and we all needed somebody to stand up for us and defend our interests. So I ask you today to look through that prism while we are discussing this topic. Thank you very much.
David Wright: Okay. So that concludes the contributions from the panellists to set the scene for everybody. I am going to ask: if somebody behind me puts their hand up, can you help me out? I haven't got eyes in the back of my head. I do also like this theme of mission possible or mission impossible; depending on which one, perhaps we should be reaching out to the producers or Tom Cruise to find us a way through here. So I open the floor to any particular questions, if anyone has them. I can see behind me, yes. Okay, if I can ask you to just introduce yourselves, that would be helpful to start with, too. Thank you.
Audience: Yes, thank you for giving me the floor. My name is Cynthia Lissoufi. I come from South Africa, and I work with most of the panelists in this session at the ITU. It is quite refreshing to listen to the diverse views of different stakeholders on this important topic, which is dear not only to South Africa but to many of the countries that participate in the ITU's work, specifically in Study Group 17, the ITU's technical study group on issues of standards and security. Speaking for South Africa, we believe we stand a good chance, because we are looking at the upcoming WSIS+20 review process, where we are also bringing in the issue of the Global Digital Compact. We believe that, as a community of stakeholders concerned with this particular issue, we can find ways forward. What I am picking up here today is that we are all concerned, but, as we have said, the issue is how we deal with it. I am also hearing that we need continuous discussion, and to continue it we need to take advantage of all the processes currently happening, to make sure this issue is not pushed behind other priorities, because different stakeholders will fight for their own priorities. So all I am pleading for, to all the stakeholders in this room: let us take advantage of the processes that are happening, and let us make sure that the issue of child online protection takes the forefront in all of these decisions, especially at the UN level. Thank you.
David Wright: Okay, yeah. What we'll do is we'll take three questions, and then we'll come to the panel.
Audience: Thank you. My name is Catherine Bielek. I'm an infectious disease physician at Harvard Medical School. Not to add another layer of complexity to this, but I certainly wonder if public health and pandemic response might inform this balance as well. There are lessons I think we can draw from how we navigated the COVID-19 pandemic in terms of data privacy, security, and public health and safety. This is perhaps a little simplified, but when we did contact tracing for the COVID-19 pandemic, people could give up their own qualified right to privacy; they could volunteer that information. And certainly, how much is surveilled does not necessarily dictate how much data is kept, how it is kept, and where it is kept. I think that is important for other pandemics, or syndemics, which are overlapping pandemics, especially as related to HIV, which is my area and which can carry a lot of stigma or criminalization laws. When that information is kept, there is surveillance related to it, but in the United States it is kept in a secure, encrypted facility at a state health department, for instance. And the amount that you surveil is not necessarily proportional to the amount that you keep. So my question is about how these lessons might apply to this discussion in other areas as well.
David Wright: Online, I'm seeing a lot of interaction and a lot of compliments for the speakers as well. I've got a question here from Cheryl: does balancing necessarily mean we need to rank rights and risks to properly weigh them against each other? If not, how do we begin an objective, comprehensive review? If so, how do we do this on a global level? There was another question which basically comes down to: can somebody please clear the air, because there is a lot of misinformation and a lot of fake news in this discussion. For example, when we talk about privacy versus child protection, is it true that if we want to move towards child protection we are giving up on privacy? I think there are a lot of questions to resolve in that one. Nils, thank you. Okay, so those three questions are addressed to the panel. I'm going to go to Andrew first.
Andrew Campling: So let me have a go at two of those, but briefly. Firstly, on the weakening-of-encryption question, I would argue, and I'll be as precise as I can without hopefully getting too detailed, that specifically detecting known child sex abuse material needs to have no impact whatsoever on encryption. To expand that ever so slightly: if the end-to-end messaging applications agreed to scan any images before they were uploaded, to see if they contained known CSAM, and then encrypt, there would be no privacy implications, because you don't learn what the image is. You simply learn that it isn't known CSAM, via something called hash matching. For those of you with knowledge in that area: you don't need to look at the content of the message either. You are simply asking, does this image, in a mathematical sense, match a database of known CSAM? So that doesn't, in my opinion, have any privacy implications, unless there's a match. And if there's a match, then you've committed a crime and your qualified right to privacy is surrendered anyway. So that's fine. Then, briefly, the other point, which I think was about ranking or trading off different rights. Yes, and I would always say that if you have to trade rights, you ought to bias towards the most vulnerable in society. At the moment, in my opinion, the weaponization of privacy is largely benefiting privileged adults at the expense of many different vulnerable groups, and that is an unacceptable trade-off. If we have to make trade-offs, we should advantage the vulnerable, not the privileged. The other way around is wrong, in my view.
David Wright: Andrew, thank you. Thank you very much. Arnaud, if I can just bring you in here.
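Andrew's hash-matching point can be sketched in a few lines. This is a simplified illustration, not any platform's actual pipeline; deployed systems such as PhotoDNA use perceptual rather than cryptographic hashes so that re-encoded images still match, and the hash list here is a stand-in (the digest shown is simply the SHA-256 of `b"test"`). The shape of the idea: the client hashes the image, checks the digest against a list of known hashes, and only then encrypts and uploads.

```python
import hashlib

# Hypothetical list of digests of known illegal images. Real systems
# distribute such lists in privacy-preserving, non-reversible forms.
KNOWN_HASHES = {"9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"}

def matches_known(image_bytes: bytes) -> bool:
    """Client-side check: learns only whether the image is on the list."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

def send(image_bytes: bytes, encrypt, upload) -> str:
    if matches_known(image_bytes):
        return "blocked/reported"        # match: known material
    upload(encrypt(image_bytes))         # no match: encrypt and send as normal
    return "sent"

# The stand-in digest above is sha256(b"test"), so b"test" "matches":
assert send(b"test", encrypt=lambda b: b, upload=lambda b: None) == "blocked/reported"
assert send(b"holiday photo", encrypt=lambda b: b, upload=lambda b: None) == "sent"
```

Note the property Andrew describes: for a non-matching image, the check reveals nothing about its content, and the message is encrypted exactly as before.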
Taddei Arnaud: Yes, thank you. And thank you to Boris for Mission Impossible or Mission Possible. I like it, but I had to be provocative, of course. No, to come back on the issue: it is a real design problem, in the sense of the theory of design. And I really liked the previous intervention (I could not capture the name of the person who made the analogy with the COVID-19 learnings): that is exactly what we should do. We should learn from others and from other areas where they have resolved the problem, because sometimes there is a lot of hype ("this is not private", "the data are not secure", "people are losing their rights", and so on) where, in some areas, that is simply not so. The reason this looks like Mission Impossible is something else. The underlying problem is that very few people realize that security cannot be proven. At the low altitude of encryption, yes, you can perhaps prove some cryptography mathematically. But the moment you elevate the altitude, you lose the possibility of proving that your system is secure. If you ask anybody whether a control they have put in place is secure and can be trusted, the answer is fundamentally no, it cannot. That is the problem of who guards the guards, and I could not find a way to resolve it. In fact, the problem is not that this is impossible per se; the problem is that few people have grasped that point. So when security discusses privacy, it is an unequal battle, because security has little to offer the privacy side. We are going around in circles, with some people trying to split us for dogmatic reasons, rather than recognizing things as they are, being pragmatic, and recognizing the design problem. If we bring it back to a design problem, we can then bring back in the ethics, the anthropology, the experience, the law. We could do something about it. That is where the mission becomes possible.
So one approach, of breaking the problem into pieces and creating sub-contexts, comes back to the question of whether we should weight some people differently from others. Of course not; that would be terrible. If we end up in a place where we have to prioritize some humans over others, I don't think we have done the job. It has to be equal: all humans should be respected in this story. So if we could take a step back, learn from others (I really liked the COVID-19 example), and regroup all the possibilities with a divide-and-conquer strategy, we could split the technical design and open up the options. The risk of doing this is that it would significantly impact the way we have built and developed our entire Internet, from the browsers to the CDNs to the servers, because we would now need to represent a richer human. The underlying problem today is that the human model behind this whole design is very, very narrow, and we are locked: we can't do anything, because if we help one group, we lose properties for another. So what if we re-enriched the model behind the scenes? How many possibilities would we create? That is something I could propose.
David Wright: Arnaud, thank you very much. Boris?
Boris Radanovic: Thank you. I'll try my best to cover it all. Thank you so much for the questions. I'll come to the first one: Cynthia, on bringing child online protection to the highest levels, I wholeheartedly support you, and the work in SG17, and if I can do anything to help keep this on the agenda, this is one of the places to do it, absolutely. Yes, I love the idea of using the COVID learnings, especially about volunteering rights, and seeing how that works in a different space and with different impact; like Arnaud, I would be interested to see how that works. On the question of how we balance that risk globally: I think that is the biggest challenge we face, but I have to agree with Arnaud that we cannot be the ones holding conflicting things and having to choose one over the other. Which brings me to the point I want to make, with no disrespect to the person asking that question: we should utterly reject the framing of this conversation as privacy versus security. And if we reject it, I'll just remind everybody that most of us flew to this wonderful country; what if 90% of our flights had a 90% chance of landing in Ankara, or maybe Zagreb, or maybe London? None of us would take those odds. So let's reject the framing of privacy versus security and focus on the title: privacy and security. There are solutions, there are ways we can achieve that. They might be difficult, they might be hard, and I'll come back to: what is easy, and what is right? To answer the online speaker's question: we should absolutely detest and reject the notion that that is the discussion we are having. None of us wants to give up our privacy, but none of us wants to ignore what has also been said: that we cannot fully trust that any system is secure.
But what I can tell you is that we know, and have references, research, and evidence, that there are currently unintended consequences. They are doing harm to the often unheard, unseen, and unsupported people and young children across the world. So I go back and ask: what is right and what is easy? Let's start doing the right thing, even though, and I still hope, Arnaud, we will find the mission quite possible in the end, and maybe laugh at this one day. But I am worried about quantum computing making this whole discussion basically pointless,
David Wright: I feel that's an entirely different workshop. Okay, Afnan, do you want to go first? Arnaud, I know I'll come to you next.
Alromi Afnan : Thank you, I'll make it very short. I just want to thank the floor for those great questions. To come back to the pandemic and COVID: I think one lesson learned from it is the importance of public awareness. That is a big part of it. Online safety is a right you should be granted, and since the pandemic made most of us spend much of our time remote, part of the lesson here is awareness among the public of what is right for them and what they can subscribe to or work towards. So this is just a comment, thank you.
David Wright: Thank you, Afnan. Arnaud?
Taddei Arnaud: Yes, very quickly, to come back on something I probably need to re-qualify a bit. When I point to the fact that we cannot trust security, I totally agree with you, Boris; it is exactly where I want us to go. We need to stop this debate about privacy versus security. The fact is that at the moment security cannot be proven trustworthy, and we have nothing to offer privacy; I see that as an opportunity. Now, to come back to the person who drew the analogy with health: it is exactly the same thing. We forget that in the real world your immune system has defects. You will miss a virus; you may have an auto-immune disease. Can I trust my security by design? No. And that is why we created the health system. But can I trust the health system? Absolutely not, either. If the surgeon makes a mistake, I die. If I take too many medicines, I die. So it is a paradox, but I would like people to consider it a positive paradox: if we could heal precisely this, bringing our security and privacy people together, let's do something about it. Then we can establish a new approach that could be fruitful, not prioritizing humans over each other, but on the contrary having the right design for each of our different contexts. And those can evolve over time, from when people are children up to when they are elderly. That's it. Thank you.
David Wright: Arnaud, thank you very much. Honey?
Makola Honey: Yes, thank you. I just want to echo what my colleagues were saying about the very purpose of this workshop: we are here to find balance, so it is not necessarily about weighing one against the other. The last part of the question asked how we begin an objective, comprehensive review, and how we do that on a global scale. In my opinion, there is a need for a global body responsible for a global framework. At the moment we have, at the regional level, the African Telecommunications Union and other regional bodies, and internationally we have the International Telecommunication Union. I think those international bodies have a responsibility to become neutral conveners of the differing stakeholders with their differing viewpoints, so that there can be unrestricted dialogue on finding the balance in the solution, because ignoring any of the views, whether on one extreme or the other, without really looking at the matter and discussing the situation, can oversimplify the issue of encryption, and that is not what we want. So, to answer the question: I think bodies such as the international regulatory bodies are very important in creating that space for dialogue.
David Wright: Okay, thank you very much, and hopefully those were suitable and adequate responses to the questions posed. We have just a little over five minutes left. Are there any other questions that anybody has? Any more questions online? Okay. Perhaps this question could be the one to close with, given we have just a few minutes left, and it is about public awareness. Public understanding of encryption is often limited; I think we've heard about that. How can stakeholders better educate citizens, everybody, about the impacts of encryption on privacy and public safety? Who wants to take that?
Boris Radanovic: I'll try to shorten it. Again, it's the same word I used before: education, but more specifically adaptable education, because people at different levels and with different capabilities need to understand this topic in different ways. Somebody much, much smarter than me said that if you cannot explain your topic in five minutes to a five-year-old, you might not be an expert in it. We need to find sensitive, local environments in which to expand on topics that are way too complex even for the smartest people in the world. That either means we don't understand the field well enough, or we don't have the right people to explain it. Yes, awareness campaigns. Yes, stakeholders who genuinely make the effort to educate people in the right way, without an agenda leaning left or right. Having a body that can assess that, tell us who is doing it better or worse, and inspire improvement. We have been doing decades of awareness-raising on child sexual abuse, on intimate image abuse, and on child online protection in general; SWGfL alone is 25 years old next year, so we know the principles that can build that. But all of those principles rest on education. Sometimes it took us a decade to educate a whole nation on why we need to do one thing or another. It will take time. So the short answer is: to educate the general public and raise their awareness, we need the right people in the right places educating them, and we need to allow some time to pass so we can do that on a global, or at least much larger, scale. I hope that answers the question enough.
David Wright: That's a good go, Boris. Andrew?
Andrew Campling: Yeah, so I think I would start by being less ambitious, and, dare I say, by repeating a point I made earlier: a lot of the decisions about encryption are made not here but in some of the standards development organisations, such as the IETF, the Internet Engineering Task Force, in which I'm active, and which makes design choices about the underlying internet standards. What I think we need to do is get people from the groups that are here, at least some of them, to engage over there. Civil society groups, governments, regulators, and others who have sufficient technical knowledge to engage in the standards bodies need to attend and pay attention to what is happening there, and to the implications of some of the decisions being taken. Otherwise, I think we risk developing internet standards which create societal problems, not because the people behind the standards are bad or evil, but because they don't have the necessary knowledge. So, dare I say it, and it's probably appropriate to finish on this point: the multi-stakeholder approach is the way forward. Through our different communities we can then spread the message back into the other groups we engage with, but I'd like to at least introduce some element of the multi-stakeholder approach into the technical bodies first, and then work backwards.
David Wright: Just before you put the microphone down there, Andrew: can you give us an example of one of those underlying technology changes that may well have an impact, and what would that look like? A real-life example, in case not everybody has the technical background.
Andrew Campling: Okay, and again I'll keep this hopefully at a high level. Some of the current changes being made in the underlying standards, something called Encrypted Client Hello, for example, will make it increasingly difficult for parental controls to work. So for those of you that rely on parental controls to stop your children accessing adult-type content, or indeed for schools that use the same sort of controls, those systems will potentially stop working, not because you've stopped using them, but because the underlying technology has changed. So that would be an example where, because there's a lack of multi-stakeholder discussion, it's being overlooked.
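Andrew's Encrypted Client Hello example can be sketched as follows. This is a simplified model, not a real TLS parser, and the domain names are hypothetical: many filters today read the plaintext SNI (the server name) from the TLS ClientHello and block listed domains; with ECH the real name moves into an encrypted inner ClientHello, so an on-path filter sees only a generic outer "public name", often the CDN's, and the block no longer fires.

```python
# Simplified model of SNI-based filtering (hypothetical names throughout).
BLOCKLIST = {"adult-content.example"}

def filter_tls(outer_sni: str, ech_in_use: bool) -> str:
    """What a network filter can decide from an observed ClientHello."""
    if ech_in_use:
        # The real destination is inside the encrypted inner ClientHello;
        # the filter only ever sees the outer public name.
        return "allow"
    return "block" if outer_sni in BLOCKLIST else "allow"

# Without ECH the filter sees the real hostname and can block it:
assert filter_tls("adult-content.example", ech_in_use=False) == "block"
# With ECH it sees only, say, the CDN's shared name, and the block fails:
assert filter_tls("cdn-shared.example", ech_in_use=True) == "allow"
```

Filters can still fall back on blunter tools, such as blocking the resolver or an entire shared CDN address, which is part of why the trade-off Andrew describes matters.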
David Wright: That's why we need that multi-stakeholder approach. So I guess there's a point for everybody here, both in the room and online; that sounds like a bit of a call to action. If you weren't aware of it, or indeed if you have the opportunity to engage with the IETF and have, as Andrew said, that level of technical understanding, then please do. Please go and understand how your browser and the internet are being designed in terms of some of those standards, and the unintended consequences you may well see. As has been widely said, the multi-stakeholder approach is a really important aspect, so there's a call to action for everybody about the IETF. We have literally a couple of minutes left, and I am just going to have a look around: does anyone have any particular closing remarks amongst the panelists? I would go around, but we don't have time. Any concluding remark? Okay, thank you very much. So, in 60 seconds: we clearly covered a lot of subjects here, and I've already got pages of notes. Andrew, the term that really opened the responses, the weaponization of privacy to override and impact children's safety, was a bold statement to open with. We've also very much heard that this is not privacy against security; privacy and security are one thing. This does require a multi-stakeholder approach; it is not one-dimensional, and it requires all of us to get involved so that the output reflects that multi-stakeholder contribution. But I will finish with this: this is not mission impossible. Arnaud, Boris, this is mission possible. I think we've concluded by seeing the way through.
So in that regard, Honey, we have found a particular solution at this particular workshop. So thank you very much, everybody, for those questions as well. It has been a real pleasure to moderate this panel of such amazing, esteemed, and truly world-leading experts. So I would ask you to join me in thanking them for their contributions as we close out. Thank you very much. Thank you.
Andrew Campling
Speech speed
150 words per minute
Speech length
1324 words
Speech time
528 seconds
Privacy should not override other human rights
Explanation
Andrew Campling argues that privacy rights are being weaponized to override other human rights, particularly those of children and vulnerable groups. He emphasizes that privacy is a qualified right and should not take precedence over fundamental rights.
Evidence
The Internet Watch Foundation records approximately 100 million reports of CSAM images and videos every year, with roughly three new images being found every second.
Major Discussion Point
Balancing Encryption, Privacy and Public Safety
Agreed with
Taddei Arnaud
Boris Radanovic
Agreed on
Balancing privacy and security
Differed with
Boris Radanovic
Differed on
Approach to balancing encryption, privacy, and public safety
Client-side scanning for known CSAM images
Explanation
Andrew Campling proposes client-side scanning for known CSAM images as a solution that doesn’t break encryption or privacy. He argues that this approach would immediately reduce the scale of the problem without compromising user privacy.
Evidence
End-to-end encrypted messaging platforms are widely used to find and share CSAM, based on research with a large sample size.
Major Discussion Point
Technical Solutions and Innovations
Differed with
Taddei Arnaud
Differed on
Approach to technical solutions
Civil society groups should engage in technical standards bodies
Explanation
Andrew Campling suggests that civil society groups, governments, and regulators with sufficient technical knowledge should engage in standards development organizations like the IETF. This engagement is necessary to prevent the development of internet standards that create societal problems.
Evidence
Changes in underlying standards, such as Encrypted Client Hello, can make it increasingly difficult for parental controls to work.
Major Discussion Point
Public Awareness and Education
Agreed with
Taddei Arnaud
Makola Honey
Boris Radanovic
Agreed on
Need for multi-stakeholder approach
Taddei Arnaud
Speech speed
148 words per minute
Speech length
1394 words
Speech time
565 seconds
Need to consider sub-contexts with different requirements
Explanation
Taddei Arnaud proposes considering sub-contexts with specific needs, such as child protection, education, and elderly care. He suggests that this approach could help in making better design choices and trade-offs for specific use cases and requirements.
Evidence
Examples of sub-contexts include enterprise browsers for specific enterprise use cases and family solutions by hyperscalers.
Major Discussion Point
Balancing Encryption, Privacy and Public Safety
Agreed with
Andrew Campling
Makola Honey
Boris Radanovic
Agreed on
Need for multi-stakeholder approach
Consider sub-contexts like child protection, education
Explanation
Taddei Arnaud emphasizes the importance of considering different sub-contexts when designing technical solutions. He suggests that this approach could help address specific requirements for different groups, such as children or the elderly.
Major Discussion Point
Technical Solutions and Innovations
Agreed with
Andrew Campling
Boris Radanovic
Agreed on
Balancing privacy and security
Differed with
Andrew Campling
Differed on
Approach to technical solutions
Learn from health/pandemic response models
Explanation
Taddei Arnaud draws parallels between cybersecurity and health systems, highlighting that both have inherent imperfections. He suggests learning from health system models to develop a new approach that balances security and privacy needs.
Evidence
Examples of imperfections in health systems, such as the possibility of surgeon errors or adverse effects of medicines.
Major Discussion Point
Public Awareness and Education
Makola Honey
Speech speed
131 words per minute
Speech length
1097 words
Speech time
499 seconds
International collaboration can help find common ground
Explanation
Makola Honey argues that international collaboration can facilitate dialogue between nations with different stances on encryption. This collaboration can lead to the establishment of harmonized legal and technical standards.
Evidence
The work of the Correspondence Group on Child Protection of ITU Study Group 17 in researching privacy-preserving technologies.
Major Discussion Point
Balancing Encryption, Privacy and Public Safety
Agreed with
Andrew Campling
Taddei Arnaud
Boris Radanovic
Agreed on
Need for multi-stakeholder approach
Research privacy-preserving technologies
Explanation
Makola Honey suggests that international collaboration can lead to innovative solutions, such as research into privacy-preserving technologies. This approach aims to balance privacy concerns with other needs, such as child protection.
Evidence
The work of the Correspondence Group on Child Protection of ITU Study Group 17.
Major Discussion Point
Technical Solutions and Innovations
International bodies can convene neutral dialogues
Explanation
Makola Honey emphasizes the role of international bodies like the International Telecommunication Union in convening neutral dialogues between differing stakeholders. She argues that these bodies have a responsibility to ensure unrestricted dialogue to find balanced solutions.
Major Discussion Point
Public Awareness and Education
Alromi Afnan
Speech speed
0 words per minute
Speech length
0 words
Speech time
1 second
Global inconsistency in laws creates challenges
Explanation
Alromi Afnan points out that global inconsistency in laws and regulations creates challenges for balancing encryption, privacy, and public safety. This inconsistency makes it difficult for international companies to comply with varied legal standards across different countries.
Major Discussion Point
Balancing Encryption, Privacy and Public Safety
Address challenges of quantum computing
Explanation
Alromi Afnan highlights the challenge posed by evolving technologies, particularly quantum computing. She argues that quantum computing may break current encryption standards, necessitating a transition to quantum-secure infrastructure.
Evidence
Discussions in SG17 about emerging and evolving technologies, including quantum computing.
Major Discussion Point
Technical Solutions and Innovations
Public awareness is key part of online safety
Explanation
Alromi Afnan emphasizes the importance of public awareness in online safety. She argues that awareness is a crucial aspect of the right to online safety, especially in the context of increased remote activities since the pandemic.
Evidence
Lessons learned from the COVID-19 pandemic about the importance of public awareness.
Major Discussion Point
Public Awareness and Education
Boris Radanovic
Speech speed
0 words per minute
Speech length
0 words
Speech time
1 second
Should reject framing of privacy vs security
Explanation
Boris Radanovic argues for rejecting the framework of conversation that pits privacy against security. He emphasizes the need to focus on achieving both privacy and security, rather than treating them as mutually exclusive.
Major Discussion Point
Balancing Encryption, Privacy and Public Safety
Agreed with
Andrew Campling
Taddei Arnaud
Agreed on
Balancing privacy and security
Differed with
Andrew Campling
Differed on
Approach to balancing encryption, privacy, and public safety
Need adaptable education for different capabilities
Explanation
Boris Radanovic emphasizes the need for adaptable education to help different groups understand the complex topic of encryption and its impacts. He argues that education should be tailored to different levels of capability and understanding.
Evidence
SWGFL’s 25 years of experience in raising awareness of child sexual abuse, intimate image abuse, and child online protection.
Major Discussion Point
Public Awareness and Education
Agreed with
Andrew Campling
Taddei Arnaud
Makola Honey
Agreed on
Need for multi-stakeholder approach
Audience
Speech speed
164 words per minute
Speech length
529 words
Speech time
192 seconds
Use COVID-19 pandemic response as model
Explanation
An audience member suggests using lessons learned from the COVID-19 pandemic response as a model for balancing privacy and public safety in the context of encryption. This approach could provide insights into managing data privacy and security while addressing public health and safety concerns.
Evidence
Examples of contact tracing during the COVID-19 pandemic, where people could voluntarily give up their right to privacy for public health purposes.
Major Discussion Point
Technical Solutions and Innovations
Agreements
Agreement Points
Need for multi-stakeholder approach
Andrew Campling
Taddei Arnaud
Makola Honey
Boris Radanovic
Civil society groups should engage in technical standards bodies
Need to consider sub-contexts with different requirements
International collaboration can help find common ground
Need adaptable education for different capabilities
Speakers agreed on the importance of involving various stakeholders in discussions and decision-making processes related to encryption, privacy, and public safety.
Balancing privacy and security
Andrew Campling
Taddei Arnaud
Boris Radanovic
Encryption should not override other human rights
Consider sub-contexts like child protection, education
Should reject framing of privacy vs security
Speakers emphasized the need to balance privacy rights with other important considerations such as public safety and child protection, rather than treating them as mutually exclusive.
Similar Viewpoints
Both speakers highlighted the need for technical solutions to address specific challenges in balancing encryption, privacy, and public safety.
Andrew Campling
Alromi Afnan
Client-side scanning for known CSAM images
Address challenges of quantum computing
Both suggested learning from health and pandemic response models to inform approaches to balancing privacy and security in the context of encryption.
Taddei Arnaud
Audience
Learn from health/pandemic response models
Use COVID-19 pandemic response as model
Unexpected Consensus
Importance of public awareness and education
Boris Radanovic
Alromi Afnan
Need adaptable education for different capabilities
Public awareness is key part of online safety
Despite coming from different backgrounds, both speakers emphasized the critical role of public awareness and education in addressing encryption and online safety challenges.
Overall Assessment
Summary
The main areas of agreement included the need for a multi-stakeholder approach, balancing privacy with other rights and considerations, and the importance of technical solutions and public education.
Consensus level
Moderate consensus was observed among speakers on the need for balanced approaches and multi-stakeholder involvement. This implies a recognition of the complexity of the issue and the need for collaborative efforts in addressing encryption, privacy, and public safety challenges.
Differences
Different Viewpoints
Approach to balancing encryption, privacy, and public safety
Andrew Campling
Boris Radanovic
Encryption should not override other human rights
Should reject framing of privacy vs security
Andrew Campling argues that privacy rights are being weaponized to override other human rights, particularly those of children, while Boris Radanovic emphasizes the need to focus on achieving both privacy and security rather than treating them as mutually exclusive.
Approach to technical solutions
Andrew Campling
Taddei Arnaud
Client-side scanning for known CSAM images
Consider sub-contexts like child protection, education
Andrew Campling proposes specific technical solutions like client-side scanning, while Taddei Arnaud suggests a more context-based approach considering different requirements for various groups.
Unexpected Differences
Framing of the encryption debate
Andrew Campling
Boris Radanovic
Encryption should not override other human rights
Should reject framing of privacy vs security
While both speakers are concerned with balancing various rights and interests, their framing of the issue is unexpectedly different. Andrew Campling’s approach of prioritizing certain rights over others contrasts with Boris Radanovic’s rejection of the privacy vs. security framing altogether.
Overall Assessment
summary
The main areas of disagreement revolve around the approach to balancing encryption, privacy, and public safety, as well as the specific technical and policy solutions proposed.
difference_level
The level of disagreement among the speakers is moderate. While there is a general consensus on the importance of addressing the issue, there are significant differences in the proposed approaches and solutions. These differences reflect the complexity of the topic and the need for continued multi-stakeholder dialogue to find effective and balanced solutions.
Partial Agreements
All speakers agree on the need for broader engagement and collaboration, but they differ in their specific approaches. Andrew Campling focuses on engaging with technical standards bodies, Makola Honey emphasizes international collaboration, and Boris Radanovic stresses the importance of adaptable education.
Andrew Campling
Makola Honey
Boris Radanovic
Civil society groups should engage in technical standards bodies
International collaboration can help find common ground
Need adaptable education for different capabilities
Takeaways
Key Takeaways
The discussion should focus on balancing encryption, privacy and public safety rather than pitting them against each other
A multi-stakeholder approach involving diverse perspectives is crucial for addressing these complex issues
Technical solutions like client-side scanning for CSAM could help balance privacy and safety
International collaboration and common standards are needed, while accounting for local contexts
Public education and awareness about encryption impacts is important but challenging
Resolutions and Action Items
Stakeholders should engage with technical standards bodies like IETF to provide input on encryption standards
International bodies like ITU should convene neutral dialogues to find balanced solutions
More research is needed into privacy-preserving technologies that also enable child protection
Unresolved Issues
How to develop global standards that balance needs of different regions and technical capabilities
How to effectively educate the public about complex encryption issues
How to address challenges posed by emerging technologies like quantum computing
How to resolve conflicts between different legal/regulatory frameworks across countries
Suggested Compromises
Consider sub-contexts (e.g. child protection, education) with different encryption requirements rather than one-size-fits-all approach
Use client-side scanning for known CSAM images before encryption to balance privacy and safety
Learn from pandemic response models on balancing individual privacy and public health needs
Thought Provoking Comments
In my view the weaponization of privacy is being used and has been and is continuing to be used to override all of the human rights of children and other vulnerable groups and I think that’s a fundamental problem.
speaker
Andrew Campling
reason
This comment challenges the common narrative around privacy and frames it as potentially harmful to vulnerable groups, particularly children. It introduces a provocative perspective that privacy rights may be overemphasized at the expense of other human rights.
impact
This comment set the tone for much of the subsequent discussion, prompting other participants to consider the balance between privacy and other rights, particularly child protection. It led to a deeper examination of the trade-offs involved in encryption policies.
Client-side scanning for known CSAM images would immediately reduce the size of the problem. It doesn’t break encryption, it doesn’t break privacy, so that’s an easy way to make an impact.
speaker
Andrew Campling
reason
This comment offers a specific technical solution to address child sexual abuse material (CSAM) without compromising encryption or privacy. It provides a concrete example of how technology could be used to balance competing interests.
impact
This suggestion sparked further discussion about technical solutions and their potential impacts. It shifted the conversation from abstract principles to practical implementations.
For other areas we started to realize that maybe the problem is the fact that we have in the background of that the anthropological assumption that was made behind was one model for all the humans. That means a very narrow model for all the humans to make sure it fits the maximum.
speaker
Taddei Arnaud
reason
This comment introduces the idea that the current approach to internet design may be based on an overly simplistic model of human needs and behaviors. It suggests that a more nuanced approach might be necessary.
impact
This perspective broadened the discussion beyond technical solutions to consider the underlying assumptions of internet architecture. It led to considerations of how to design systems that can accommodate diverse needs and contexts.
We should utterly reject the framework of conversation of having privacy versus security. And if we reject it, I’ll just remind everybody that most of us flew to this wonderful country, and what if 90% of our flights had 90% of a chance to land in Ankara, maybe in Zagreb, maybe in London? None of us would take that option or those odds.
speaker
Boris Radanovic
reason
This comment challenges the framing of the debate as a trade-off between privacy and security. It uses a vivid analogy to illustrate why this framing is problematic and unacceptable.
impact
This reframing of the issue shifted the discussion away from seeing privacy and security as opposing forces. It encouraged participants to think about how to achieve both simultaneously rather than trading one for the other.
What I think we need to do is to get people from the groups that are here, at least some of them, to engage over there, so civil society groups, governments, regulators, others who have got sufficient technical knowledge to engage in the standards bodies need to attend and pay attention to what is happening there, and the implications for some of the decisions being taken.
speaker
Andrew Campling
reason
This comment highlights the importance of multi-stakeholder engagement in technical standards development. It points out a gap in current processes where important societal implications may be overlooked.
impact
This suggestion provided a concrete action item for participants and shifted the discussion towards practical steps for improving the decision-making process around internet standards and encryption policies.
Overall Assessment
These key comments shaped the discussion by challenging common assumptions, introducing new perspectives, and shifting the focus from abstract principles to practical solutions. They encouraged a more nuanced understanding of the complex interplay between privacy, security, and other human rights, particularly in relation to child protection. The discussion moved from identifying problems to proposing solutions, with an emphasis on multi-stakeholder engagement and the need for more diverse representation in technical decision-making processes. The overall tone shifted from seeing encryption as a binary choice between privacy and security to exploring ways to achieve both simultaneously.
Follow-up Questions
How can we develop global standards with local sensitivities that respect diverse needs and capabilities?
speaker
Boris Radanovic
explanation
This is important to ensure that encryption and privacy standards can be applied effectively across different countries and contexts while respecting local needs.
How can we address the challenge of protecting against online abuse while maintaining encryption for secure communication?
speaker
Alromi Afnan
explanation
This is crucial for balancing the need for privacy and security with the protection of vulnerable groups, especially children.
How can we apply lessons from COVID-19 pandemic response to balance data privacy, security, and public health/safety in other contexts?
speaker
Catherine Bielek (audience member)
explanation
Learning from past experiences in managing sensitive data during a crisis could inform approaches to balancing privacy and security in other areas.
How can we objectively rank rights and risks on a global level to properly weigh them against each other?
speaker
Cheryl (online participant)
explanation
This is important for developing a framework to address conflicts between different rights and risks in encryption policies.
How can we better educate citizens about the impacts of encryption on privacy and public safety?
speaker
David Wright (moderator)
explanation
Improving public understanding of encryption is crucial for informed debate and policy-making on these issues.
How can we ensure multi-stakeholder engagement in technical standards bodies like the Internet Engineering Task Force (IETF)?
speaker
Andrew Campling
explanation
This is important to ensure that societal implications are considered when developing internet standards that affect encryption and privacy.
Disclaimer: This is not an official session record. DiploAI generates these resources from audiovisual recordings, and they are presented as-is, including potential errors. Due to logistical challenges, such as discrepancies in audio/video or transcripts, names may be misspelled. We strive for accuracy to the best of our ability.
Related event
Internet Governance Forum 2024
15 Dec 2024 06:30h - 19 Dec 2024 13:30h
Riyadh, Saudi Arabia and online