Encryption’s Critical Role in Safeguarding Human Rights | IGF 2023 WS #356

10 Oct 2023 00:00h - 00:30h UTC

Disclaimer: It should be noted that the reporting, analysis and chatbot answers are generated automatically by DiploGPT from the official UN transcripts and, in case of just-in-time reporting, the audiovisual recordings on UN Web TV. The accuracy and completeness of the resources and results can therefore not be guaranteed.

Full session report

Sharon Polsky

Encryption plays a vital role in maintaining confidentiality and privacy across various sectors, including law and healthcare. Lawyers, doctors, and policymakers rely on encryption to safeguard sensitive information and prevent unauthorized access: it protects client confidentiality and patient privacy, and encrypted communications allow lawmakers and policy analysts to discuss strategy securely.

Another important argument supporting encryption is the universal need for privacy, whether it is for personal, business, or national security purposes. Encryption is viewed as a fundamental tool that helps individuals protect their privacy. This positive sentiment emphasizes the significance of maintaining encryption as a fundamental aspect of upholding privacy rights.

However, Polsky raised concerns about how poorly the implications of technology are understood in legislation: uninformed legislation can lead to unintended consequences. Many people, including lawmakers and policymakers, use technology without fully comprehending its intricacies, and this lack of education makes it difficult to craft effective, well-informed legislation.

Child protection is a pressing concern, but laws that aim to protect children by breaking encryption could themselves cause harm. Such laws would create vulnerabilities in encryption, endangering everyone's privacy; worse, children reporting abuse might be mistakenly flagged as suspects if encrypted content is automatically scanned and reported.

Polsky highlighted the need for lawmakers to correctly understand the technology they regulate. Many current lawmakers lack a thorough understanding of encryption; some members of parliament come from non-technical backgrounds, which hinders their grasp of its nuances.

Canadian legislation such as Bill C-18 and Bill C-26 has raised concerns about infringements on privacy and freedom, since these laws assert Canadian authority over internet content globally. Such broad regulatory reach can undermine privacy and freedom and raises questions about government overreach.

It is also worth noting that platforms outside Canada are not bound by the Canadian Charter that protects individuals against government overreach; the legislation instead requires these foreign companies to carry out censorship measures, which may conflict with their existing policies and obligations.

The importance of regulators properly understanding what they regulate is emphasized: technology should be regulated only with a thorough understanding of its impacts and consequences. Polsky argued that regulators must possess comprehensive knowledge of the technologies they oversee.

Education is proposed as a long-term solution to bridge the gap in understanding technology implications. Starting from the youngest grades, education should cover topics like how laws are made, political structures, and critical decision-making related to technology, equipping future generations to create effective legislation and understand the risks associated with technology.

Tech companies are criticized for prioritizing shareholder returns over user privacy: corporations primarily focus on maximizing profits for shareholders, so promises to protect user privacy are unreliable and bound eventually to fail.

There is growing public awareness of the monetization of personal information. People have become increasingly frustrated at seeing their personal information used for financial gain, and they increasingly expect control over their personal information and how it is used.

In conclusion, encryption is seen as an essential tool for maintaining confidentiality and privacy in various sectors, but there are concerns about the lack of understanding of technology implications in legislation. The legislation aimed at protecting children through breaking encryption has raised concerns about potential unintended consequences. Education is proposed as a long-term solution, and there is an increasing focus on the need for regulators and policymakers to possess a comprehensive understanding of technology. Tech companies are criticized for prioritizing shareholder returns over user privacy, and individuals are becoming more aware of the monetization of their personal information. The expectation is that companies will have to adapt their practices to meet the demand for better privacy control.

Rand Hammoud

Encryption is widely regarded as crucial for ensuring online security, safety, and trust. It plays a vital role in safeguarding human rights by providing a secure means of communication and organization for activists, lawyers, and human rights defenders. These individuals rely on encryption to protect their freedom of expression and assembly.

However, concerns have been raised regarding the vulnerability of encryption to exploitation by the surveillance industry. It has been argued that these vulnerabilities are harnessed by a billion-dollar surveillance industry, leading to human rights abuses such as enforced disappearances and extrajudicial killings. Such abuses pose significant risks to activists, undermining their ability to protect their rights.

Governments often assert that undermining encryption is necessary for national security. However, there is widespread fear that such actions would make surveillance cheaper and easier, potentially resulting in privacy infringements. There is growing use of spyware against human rights activists and journalists, highlighting the urgency to ban spyware vendors and technologies associated with human rights abuses. Spyware is unregulated and unchecked, and despite the existence of legal frameworks branding surveillance as illegitimate, it continues to be used.

Hammoud argued that existing international standards already render surveillance capabilities, as invasive as they currently are, illegitimate. The claim that law enforcement requires spyware to maintain national security and safety is contested, as there is no evidence of its effectiveness in these areas. On the contrary, there is ample evidence that spyware infringes upon individuals' rights and diminishes their safety.

Undermining encryption is tantamount to assuming everyone is guilty until proven innocent, fundamentally contradicting the presumption of innocence. This highlights the need for an international framework defining the limits of surveillance and protecting encryption, aligned with the spirit of existing rights protections and fostering greater accountability and transparency.

However, advocacy against abusive surveillance is harder under autocratic governments: advocacy avenues are limited and rights-respecting frameworks are difficult to implement in such contexts. In those settings, economic arguments, such as appeals to the economic interests of companies, can sometimes be employed instead.

Overall, there is a pressing need for a more comprehensive, global, and international framework governing the use of surveillance technologies. Given the borderless nature of technology, jurisdiction-dependent regulations are inadequate. By establishing clear guidelines and regulations, a more balanced and accountable approach can be adopted, ensuring the protection of human rights and promoting global security.

In conclusion, encryption is integral to online security and the protection of human rights. However, the vulnerabilities of encryption and the misuse of surveillance technologies pose significant risks to individuals and their rights. Upholding encryption and establishing a global, rights-based framework for surveillance technologies are crucial steps to safeguarding privacy, enhancing accountability, and preserving fundamental rights in the digital age.

Tate Ryan-Mosley

This analysis explores various arguments regarding end-to-end encryption and backdoor access. Advocates emphasise the importance of end-to-end encryption in ensuring internet security, particularly in messaging apps like Signal, Telegram, and WhatsApp. These apps employ end-to-end encryption to safeguard user data, ensuring that only the intended recipients can access and decipher messages. Notably, the tech companies that create such encrypted apps do not themselves possess decryption keys, which strengthens user security.
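
The "no keys" property follows from public-key cryptography: each device holds its own private key, and anything a server relays is ciphertext it cannot open. The sketch below illustrates the idea using the PyNaCl library; this is a minimal illustration only, not the protocol these apps actually use (Signal and WhatsApp implement the more elaborate Signal protocol, with forward secrecy and other properties omitted here).

```python
# Minimal sketch of the end-to-end property, using PyNaCl (libsodium bindings).
# Illustration only: real messengers use the Signal protocol, not this scheme.
from nacl.public import PrivateKey, Box

# Each user generates a key pair on their own device; private keys never leave it.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts a message to Bob with her private key and his public key.
sender_box = Box(alice_key, bob_key.public_key)
ciphertext = sender_box.encrypt(b"meet at noon")

# A relaying server sees only this ciphertext. Without a private key there is
# nothing to hand over: no "master key" exists that could decrypt it.

# Bob decrypts on his own device with his private key and Alice's public key.
receiver_box = Box(bob_key, alice_key.public_key)
assert receiver_box.decrypt(ciphertext) == b"meet at noon"
```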

Critics of backdoor access argue that creating a backdoor to encryption would compromise its security: a master key, or any form of backdoor access, would be difficult to control and could be misused by bad actors or governments. Tech companies vehemently oppose weakening encryption, as doing so would have significant implications for user privacy and data protection.

The United Nations (UN) supports strong encryption and concurs with those who assert that encryption backdoors contravene freedom of expression. The UN underscores the imperative nature of robust encryption to enable human rights advocates and journalists to function securely, preserving confidentiality and security.

Lawmakers are currently grappling with the task of addressing harmful online content moderation while maintaining encryption security. They are deliberating ways to gain access to secure communication channels, particularly given the increasing migration of internet users to private platforms like messaging apps. This shift has made monitoring and preventing the dissemination of abusive or harmful information more challenging.

Furthermore, it is essential for lawmakers to possess accurate knowledge of technology to prevent unintended consequences in their legislation. A pertinent example is the scrutiny of the UK online safety bill and similar legislation in Canada, which may inadvertently compromise encryption in an effort to safeguard children. Concerns have been raised that such well-intentioned legislation could endanger everyone, including children, by enabling unauthorized access through encryption backdoors.

Alongside discussions on encryption and backdoor access, the analysis highlights the media’s coverage of non-Western countries. It argues that the press should strive for better representation and reporting of international stories, acknowledging issues such as biases and racism that can influence media coverage. The press is encouraged to maintain openness to improvement and be accountable for their reporting.

In conclusion, the analysis underscores the crucial role of encryption in internet security, while emphasizing the need to strike a balance between public safety and preserving privacy and human rights. It underscores the significance of encryption in protecting free speech, human rights, and the work of journalists. It also highlights the necessity for lawmakers and the press to possess a comprehensive understanding of technology to make informed decisions and enhance their practices.

Roger Dingledine

The discussions revolved around the topic of encryption and privacy, specifically examining their impact on society. Encryption was highlighted as a vital tool that allows individuals to have control over their personal information, offering them the ability to determine who can access their data and ensuring a sense of privacy and security. It was particularly emphasised that encryption is invaluable for vulnerable groups such as minorities and human rights activists, as it plays a crucial role in ensuring their safety.

However, the proposal for backdoor access to encryption was strongly rejected. The argument put forth was that incorporating a backdoor feature in encryption would undermine the entire concept, compromising the safety of everyone. It was emphasised that if a mechanism to break encryption is created, it can be exploited anywhere in the world, regardless of the country, leading to potential misuse. This raised concerns about the weakening of society and the possible dangers associated with backdoors in encryption.

The discussions also highlighted the intrinsic connection between security and privacy. It was argued that security and privacy are essentially two sides of the same coin, both crucial aspects of individuals' lives, and identity theft was cited to illustrate how the two intertwine. Furthermore, it was mentioned that FBI agents, who play a significant role in maintaining security, themselves rely on tools like Tor, underscoring the importance of both security and privacy in their work.

Another significant point of discussion was the adverse effects of false positives generated by automated content moderation tools. It was highlighted that AI-powered systems are not infallible and can produce false positives. This means that innocent users may be falsely reported and labelled as criminals due to errors in content moderation. The potential consequences of such misreporting were stressed, as they can have serious implications and ruin lives.
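
To make the scale concrete, here is the back-of-envelope arithmetic behind this point, using the hypothetical figures Dingledine gives later in the session transcript (10 billion images scanned per day, a 2% false positive rate):

```python
# Back-of-envelope arithmetic on false positives in automated content scanning.
# The volume and error rate are the hypothetical figures used in the session.
images_scanned_per_day = 10_000_000_000   # 10 billion images per day
false_positive_rate = 0.02                # 2% of benign images misflagged

wrongly_flagged = images_scanned_per_day * false_positive_rate
print(f"{wrongly_flagged:,.0f} images wrongly flagged per day")  # 200,000,000

# Even an order-of-magnitude improvement leaves an enormous absolute number:
print(f"{images_scanned_per_day * 0.001:,.0f} at a 0.1% error rate")  # 10,000,000
```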

The discussions also touched upon the unrealistic expectations of politicians who desire technological solutions that provide both privacy and enable surveillance. It was argued that such a solution is currently not technologically feasible and can potentially result in exploitation. Tech companies were criticised for deceiving governments by promising to develop such technology for significant sums, despite its impossibility. The need to strike a balance between privacy and surveillance was emphasised, particularly considering the long-term effects of compromising safety.

Regarding specific tools, the discussions highlighted the significance of encryption in Tor. It was mentioned that Tor is not solely for resisting surveillance but also for resisting censorship. The widespread use of tools like Tor was deemed vital for their effectiveness and safety. It was emphasised that as more common tools incorporate real encryption, it becomes a normal part of everyone’s daily life, rather than being perceived as a sign of political dissent.

Additionally, the discussions raised concerns about the compromising stance of some tech companies on privacy. It was noted that certain tech companies prioritise profit over users’ privacy rights, especially when accessing large markets like China, Russia, Saudi Arabia, and India. This practice was criticised as it enables dangerous actions against user privacy.

In conclusion, the discussions on encryption and privacy shed light on the importance of encryption in safeguarding personal information and the need to have control over its access. The idea of backdoor access to encryption was strongly rejected, highlighting its potential for misuse and the weakening of society. The inherent connection between security and privacy was underscored, with a particular focus on the negative consequences of false positives from automated content moderation tools. The unrealistic expectations of politicians in balancing privacy and surveillance were criticised, while the importance of widespread use of tools like Tor was emphasised. The compromising stance of certain tech companies on privacy for market access was also challenged. Overall, the discussions provided insights into the complex and multifaceted nature of encryption and privacy in contemporary society.

Speaker

The analysis reveals that companies often fail to prioritize privacy, despite claiming to do so. This can be attributed to their primary focus on maximizing returns for shareholders, which raises concerns about the genuine value placed on privacy in corporate decision-making.

Another pressing concern is the negative impact of cybercrime and spyware on economies. Billions of dollars are lost to cybercrime each year, with industry statistics supporting these claims. Moreover, the economic damage caused by cyber threats can surpass the economies of certain nations, emphasizing the need for effective measures to combat cyber threats and protect against economic losses.

On a more positive note, it is acknowledged that Artificial Intelligence (AI) has the potential to contribute positively, particularly in the field of medical advancements. The application of AI in healthcare can drive innovation, improve patient outcomes, and enhance overall well-being. This suggests that if properly harnessed, AI technology could play a significant role in advancing healthcare and addressing societal challenges.

In light of the alarming statistic that millions of people’s genetic identities have been compromised through privacy breaches, it is concluded that government action is imperative. Government intervention is needed to protect individuals’ privacy rights, maintain the integrity of sensitive data, and establish robust regulations that hold companies accountable for any lapses in privacy protection.

In summary, the analysis highlights the tendency of companies to overlook privacy concerns in their pursuit of maximum shareholder returns. The negative impact of cybercrime and spyware on economies serves as a wake-up call, emphasizing the need for comprehensive cybersecurity measures. While opportunities for positive contributions through AI exist, safeguarding privacy must remain a priority. Ultimately, government action is necessary to address privacy breaches, protect individuals’ data, and safeguard the interests of society as a whole.

Smith

The speaker managing the Q&A politely asked participants to form a queue at the microphone. She emphasized the limited time remaining and the desire to address both upcoming questions within the given timeframe, requesting concise questions to leave enough time for comprehensive answers. She also asked participants to introduce themselves before posing questions, fostering respect and engagement.

As the queue formed, there was a sense of urgency to address the remaining questions. With only one minute left, the speaker urged the next person in line to ask their question promptly so as not to miss the chance for a response. This showed the speaker's commitment to addressing all inquiries before the session ended.

In conclusion, the speaker’s management of the Q&A session demonstrated professionalism, consideration, and a strong focus on maximizing the remaining time to accommodate participants’ questions.

Audience

During the discussions, various important topics were explored, shedding light on the challenges and complexities surrounding technology, human rights, privacy, and accountability.

One significant point of discussion was the danger encrypted apps can pose to users in countries with authoritarian regimes. The Turkish government was cited as an example, using the mere presence of encrypted apps on a device as evidence against individuals, and it was noted that autocratic nations often learn and adopt oppressive policies from each other. The call was made to consider local context and oppressive governmental practices before assuming that encrypted apps are safe for all users globally.

The biased media coverage of technological issues and human rights abuses was also extensively addressed. It was argued that Western-centric media tends to give more attention to issues in Western countries. Non-Western governments’ tech requests or laws often do not receive as much coverage, despite the potential replication of policies in similar geopolitical contexts. The need for a more global perspective in technology and human rights reporting was emphasized.

The lack of accountability for big tech companies in their interactions with autocratic nations was another key concern. It was pointed out that big tech companies' compliance with autocratic governments is increasing, and that these companies are often willing to compromise on human rights for financial gain. There was a call for increased scrutiny and accountability to ensure that these companies are held responsible for their actions in autocratic nations.

The potential for mandated encryption backdoors was also raised, particularly in the context of the UK’s online safety bill. One audience member expressed concern about this possibility and the implications it may have for privacy. The stance was against the implementation of mandated encryption backdoors.

Surveillance capitalism, the practice of tech companies using user data for profit, was identified as a concerning aspect of privacy. It was acknowledged that while governments are mostly blamed for surveillance, tech companies also play a significant role in exploiting user data for financial gain.

The rights of victims of child sexual abuse material (CSAM) were highlighted as often being overlooked. This raised the issue of the need for greater attention and support for victims of such abuse.

The discussion also revealed that, in many cases, tech companies prioritize their revenues over human rights. It was pointed out that companies encrypt data extracted from users primarily to prevent competitors from accessing it, rather than for the protection of user rights.

Double standards in abiding by privacy laws were identified as a problem. Tech companies were found to comply with laws in autocratic states but often ignore those in democratic states, indicating a lack of consistent and ethical practices.

The potential cybersecurity risks associated with data encryption on internet protocols were highlighted. It was argued that the inappropriate use of encryption can weaken cybersecurity, emphasizing the need for careful consideration and implementation.

Finally, the importance of adapting advocacy messaging to different regions was raised. It was noted that different parts of the world may require tailored approaches to effectively communicate and advance human rights and justice.

In conclusion, these discussions shed light on the complex issues surrounding technology, human rights, privacy, and accountability. They highlighted the dangers of encrypted apps in authoritarian regimes, the biased media coverage of technological issues, the need to hold big tech companies accountable, concerns about privacy and surveillance capitalism, overlooked rights of CSAM victims, tech companies prioritizing revenue over human rights, double standards in privacy laws compliance, potential cybersecurity risks of encryption, and the importance of adapting advocacy to different regions. These discussions call for greater awareness, scrutiny, and efforts to ensure the protection of human rights, privacy, and justice in the rapidly evolving digital landscape.

Session transcript

Tate Ryan-Mosley:
plain text. And so with end-to-end encryption, even the tech companies that make encrypted apps actually do not have the keys, as they would call it, to break the ciphertext. But more on that later. Most commonly, when we talk about end-to-end encryption for the average internet user, we're talking about messaging apps like Signal, Telegram, and WhatsApp. But there are different variations of encryption. So HTTPS, for example, protects websites and website activities, and even some devices are fully encrypted with passwords and passcodes, like an iPhone, for example. And encryption has actually been debated from a policy perspective really since the beginning of time, for 20 or 30 years, as authorities have sought access to encrypted messages and devices. This access is commonly called a backdoor, and authorities or law enforcement agencies that have advocated for backdoor access often will say, you know, we just want access to some messages on a case-by-case, restricted, small-scale, targeted allowance. In the past, of course, tech companies argued that doing so would have pretty substantial risks to encryption as a whole, because the creation of a sort of master key, which doesn't exist today, would be really hard to control from bad actors, inappropriate government uses, and just generally weaken encryption. Opponents of backdoor access say that, of course, law enforcement can't really be trusted with this type of access, plus it's not really how the technology works. And additionally, strong encryption is necessary for human rights advocates, journalists, and free speech more generally. Historically, the UN has actually sided with the opponents of backdoor access, saying that encryption backdoors are contrary to the freedom of expression. So in the past, we've seen the encryption debate pop up really during times of crisis, when law enforcement agencies are looking for a particular piece of intelligence in a high-profile case like the San Bernardino shootings in the US or the Paris bombings, both of those in 2015. But currently we're seeing this debate crop up in the form of online safety and content moderation most commonly. There've been a handful of bills in the US and globally, in the US at the state level, but also Australia, the UK, and places like Canada that we'll talk a little bit about today, that are threatening encryption. So we're gonna talk about all of this today in light of also the growing use of surveillance technologies by governments around the world and what we might do to strengthen encryption protections. So as a reminder, we will have some time for questions at the end. So please do think of them throughout our chat so that you're ready to shoot them out to our lovely panelists at the end of this. So now that we're kind of all on the same page about what we're talking about, I wanna pass the first question to Roger, which is Roger, why do governments, law enforcement agencies, anybody really want backdoor access? What are they getting at?

Roger Dingledine:
Yeah, so that's a broad question. I mean, the fundamental conflict here is between society being safe and national intelligence, law enforcement, governments wanting control in these cases. So the way that I look at this is the question is about privacy. And by privacy, I mean control or choice about your information. So if you are successfully having privacy, and one of the ways to get that is through this encryption that we're talking about, then you get to choose who learns things about you. So that's my definition of privacy. And one of the interesting characteristics of it is vulnerable populations need it more, find it more valuable. So if you already have a lot of power, if you're a nation state or a Russian mafia or whatever large powerful group, you already have power. It's not so important for you to have an extra layer of privacy. Whereas if you're a minority, LGBT, journalist, human rights activist, and so on, then this is one of the most important things for you to retain control of your own safety.

Tate Ryan-Mosley:

Yeah, and Roger, kind of sticking with you on that point, when governments or law enforcement agencies, you know, whatever party is in control, asks for backdoor access to encryption, from a technical point of view, why is that a slippery slope? Like, why is that such a risky request?

Roger Dingledine:

Yeah, so there are several problems here. One of the big problems is, math doesn't know what country it's in. Technology doesn't know what country it's in. So if you, let's say you have a country with perfect rule of law, I don't know where you'd find one of those, but let's say you have one of those. And in that situation, the judicial process gets to decide who can break the encryption and whose messages we'll look at. That same tool is going to be used elsewhere in the world, and there are other countries who are going to try to reuse the same mechanism for breaking the encryption. So even if in the US we had a perfect judicial system, which we don't, what do the tech companies do? What do the tools do when the judge in Saudi Arabia asks for that same access? So the fact that there are different countries in the world is one of the main challenges to having this whole backdoor concept make any sense at all. And I guess the other way of saying that is this notion of a backdoor that law enforcement keeps asking for weakens society as a whole. It makes everybody less safe. That's not a worthwhile trade-off.

Tate Ryan-Mosley:
Rand, I want to pass it to you, because you work with protecting free expression and people on the ground who are doing human rights work. How have you seen encryption being used to protect activists, or even citizens who are just expressing their voices?

Rand Hammoud:
Thanks, Tate. I think one of the main things that comes up when it comes to encryption and protecting or safeguarding or enabling even fundamental rights is the fact that it is one of the biggest technologies today that is the foundation of security and safety and trust online. And so it's enabled activists, lawyers, human rights defenders, dissidents to securely communicate, organize and protect their freedom of expression and assembly. And so if we go ahead and undermine encryption, we are thus undermining their ability to do so. And we need to place this conversation within the context of an already pervasive surveillance industry, where even with strong encryption and even when we do have data that is encrypted and safe, we already have a large billion-dollar industry that is working day in and day out to find vulnerabilities to exploit and surveil these individuals and place them at risk, thus putting them in harm's way and even causing and enabling grievous human rights abuses such as enforced disappearances and extrajudicial killings. And so the conversation around safeguarding encryption needs to also be aware of the already existing surveillance capabilities of governments and malicious actors.

Tate Ryan-Mosley:
Yeah, I think that’s such a good point. And one thing, Sharon, I want to ask you about is even from an economic perspective, encryption is essential to data protection activities at normal businesses, right? So, yes, as Roger spoke about, you have these grave power imbalances between activists and states, but also you have people at their jobs who are protecting sensitive information who rely on encryption. Can you talk about that use case as well a little bit?

Sharon Polsky:
Absolutely, and you're right, it's not just the human rights people and the advocates, but it is everyday people in business, and the one area that is seldom mentioned is also the lawmakers themselves. Whether you are a lawyer who has to maintain client confidentiality, or you're a doctor and you have to maintain the confidentiality of your patient information. If you're a lawmaker, a strategist, a policy analyst, and you're in discussion with your colleagues, you don't want somebody else being able to infiltrate and figure out what you are strategizing. So everybody has privacy issues, whether it's for personal privacy or for business and economic reasons, and actually for national security reasons. Maintaining encryption is absolutely fundamental.

Tate Ryan-Mosley:
Yeah, and I think, Rand, I want to pass it back to you. We're in a room, I'm sure, with some policymakers, at a policymaking conference. How do you think we should respond to governments who want backdoor access to encrypted technologies, and who stands to gain and who stands to lose? Do you trust them?

Rand Hammoud:
I think to piggyback on what my fellow panelists just said, undermining encryption is also a national security issue. And so when you look at it that way, no one stands to gain. It will place national governments at risk; the same governments that are advocating for undermining encryption will themselves be at risk. And then democratic processes are included within those risks. Because when you think about journalists, activists, essential people that uphold democratic processes being at risk, or having to self-censor because they know that they could be surveilled at mass scale, really, when you talk about undermining encryption, then that whole process is lost. And so I think, from my point of view, there is no one left to gain except, you know, individuals or malicious actors who want to surveil those people, and who want to gain access and, you know, who want to exercise population control, because essentially, that is what undermining encryption will do. It will make surveillance so much cheaper; it will take us back to, you know, pre-Snowden revelation days when, you know, there was mass surveillance from governments and companies. And so there is no one to gain, there is no one that is going to gain, and we shouldn't be trusting backdoor accesses, or any sort of pretexts that really are not even technologically sound.

Tate Ryan-Mosley:
Yeah, and I feel like you're picking up on one kind of key tension that has been in this narrative for a long time, which is, you know, are security and privacy opposing things? Can we have both? How do you achieve both? And Roger, I wanted your perspective on that, like, to what extent is this binary of security and privacy real?

Roger Dingledine:
Yeah, so security and privacy are the same thing in a lot of ways. Imagine you give out your financial data, and then somebody does identity theft on you. So going back to your example of encryption being a national security thing, I was at an FBI conference years ago, and I talked to a bunch of FBI people, and some of them use Tor, and some of them fear Tor. And one guy was saying, surely you have some sort of backdoor, right? Surely you have some way to learn what people are doing on the Tor network. And I explained to him, I pointed to his colleagues and said, these people just told me today that they use Tor and rely on Tor every day for their job. Do you want me to have a way to learn what they're doing on the internet? So from that perspective, national security, security, privacy, they're all sides of the same coin.

Tate Ryan-Mosley:
Yeah, Sharon, do you want to expand on that?

Sharon Polsky:
I have to agree with Roger, it is all connected, and it's all too often that people will talk about one aspect or another without connecting the dots, and you absolutely have to. But the problem I've found through my career, and that's been dealing with governments and policy people and corporations, is that there's been very little education about these things. We use the internet, we use computers, but a lot of people, unless you live it, unless you're a Roger and you design these protective mechanisms, most people just use them. And that's a problem, because they know how to use it to a very small degree, they don't understand the implications of what they're doing quite often, and that also falls over to the lawmakers and the people who prepare the research and the briefing notes for the lawmakers. If they don't understand what the technology is about, what the risks really are, and the unintended consequences of the legislation they draft, then they are building something that is going to create a world of problems, and for that I look to things like various pieces of legislation in Canada, some have just come in, some are still on the books being debated, and the so-called Online Safety Act in Britain. They're all being promulgated as necessary to protect children, and doesn't everybody want to protect children, that's the argument. Of course we want to protect children, they are among the most vulnerable, but if you undermine encryption to ostensibly protect children, other people will also be able to get through that back door and endanger not only the children, but everybody else, and it is the very children who will be endangered, because the way the laws are being written, the content will have to be scoured automatically, proactively, and automatically reported to police if it is suspected as potentially, maybe, possibly being child sexual abuse material. So what happens when a child has been abused and wants to report? Their content gets stopped and reported, and they are the ones who become the suspects. In Canada, a child is chargeable under the Criminal Code of Canada as of 12 years old. Imagine the possibilities and the unintended consequences of breaking encryption.

Tate Ryan-Mosley:
Yeah, and I'm really glad you brought that up, and I want to get a little bit further into the specifics here, because I think, you know, this is where we're hearing a lot of the encryption debate. You know, if we have a lot of encrypted messaging, if we have a lot of really secure portals for communications, we can't moderate those spaces. And we know that internet users are increasingly moving to private spaces in this current, you know, moment of social media. And so lawmakers are saying, hey, you know, what can we do about all this abuse information? What can we do about all of this, you know, bad, harmful content that's being passed between people, that tech companies themselves and governments have no visibility into? And you brought up the UK online safety bill. This was obviously a big one. Australia, India, the U.S., we've also seen some discussions of, you know, providing either technical or real backdoor access to encrypted messages. I'd love to know, Sharon, can you tell me something kind of specific about some of the bills in Canada, where you see, you know, an unintended consequence or a misunderstanding from lawmakers of the technology or the ramifications?

Sharon Polsky:
Absolutely. And really, I don't have the imagination to make up the stories, the examples that I will cite. I had a conversation with one of our current members of Parliament about a year ago. We were talking about this because the legislation in Canada was just being formulated, and I said, but if you break encryption for some, so that all the content can be monitored, and she stopped me and went, break encryption? No, I don't think that's how it works. And changed the subject. She, like many of our current members of Parliament, comes from journalism. They're educated, they're worldly, that's great, but they don't get it. We have, you might have heard of, Bill C-18 that just became law to update the Broadcast Act, and that sounds wonderful, except it now includes not just radio and television but governing the Internet globally. Canada has taken it upon themselves to declare that they will govern the content. Combine that with another piece of legislation on the books, Bill C-26, and we refer to them by their numbers because, unlike the United States, Canada has a history of creating legislation with very lengthy, hard-to-say names, not nice, concise, easily said acronyms. So Bill C-26 is another piece of legislation, and that one is going to amend the Telecommunications Act to create the Critical Cyber Systems Protection Act, and like the others it'll infringe on privacy and freedom. All of these will narrow identified gaps. They do, if you look at it from a certain perspective, have a legitimate application: protecting children, preventing terrorism, preventing all the ills and harms that we see so often, the very same things that were going on long before the Internet became a thing. But the problem is everything is going to be surveilled, as Rand said. That is a problem particularly because, when everything is surveilled, the various pieces of legislation say some content will be deemed, by separate agencies, misinformation, disinformation, unwanted content. The government will not be the one to do the censoring. The law will have the platforms do the automatic, routine, mandatory proactive screening. Those are outside of Canada, outside of the reach of Canadian law, of course. So it's actually a very interesting way that they've created it, because, similar to the Americans who have constitutional rights to freedom of speech, we have a charter-protected right to freedom of expression, and the charter protects Canadians against overreach by government. So it's not going to be the government committing overreach, it's going to be the companies that the charter doesn't cover. The companies will just do as the law requires. And that affects everybody, from children to the elderly, in every walk of life, including the politicians themselves.

Tate Ryan-Mosley:
And on that point, I mean, luckily for us, we have someone on this panel who runs a tech company, Roger. How do you think about balancing privacy with content moderation? I mean, I know this is not the Tor Project's bread and butter, but we do know that there has been a proliferation of child sexual abuse material on some private messaging apps. So is there an approach that balances these two things? Can you achieve some level of moderation and encrypted privacy?

Roger Dingledine:
Yeah. So, fortunately, Tor is a communications tool, not one of these platforms, so we don't have content to moderate in the way that Facebook and so on have. But everything Sharon said is right, and it's worse than that, because she was talking about how, even if the technology behaves in a perfect way, it's still bad for society. The reality is, for example in the UK online safety bill, they're imagining there will be magic AI machines that just look at pictures and perfectly decide correctly if they're bad pictures or not bad pictures. And the reality is AI doesn't work that way. It's not perfect. You're going to have some false positives. Let's say 2% of the time it says that's a bad picture when it shouldn't, and there are 10 billion pictures being sent each day. Then 2% of the users are going to get reported each day for being criminals. And maybe they can drive the false positive rate from 2% down to 1%, so now it's only tens of thousands of people being misreported and having their lives ruined because the math screwed up a little bit for them. So it's definitely a challenge here, because the politicians want this reality to be possible and it isn't, but they want it to be possible. And there are all sorts of for-profit tech shark scam companies that say, oh, yes, yes, yes, give me millions of dollars and I'll build a magic thing for you, and it will be magic. And the reality is it's not going to work. It's not going to do what people want. But the politicians really want it. They would love to have a technology solution to be able to give people privacy while also surveilling all of them, but the reality is that the tech does not support the things that they're wanting.

Tate Ryan-Mosley:
Yeah, and just some context if people aren't familiar: I'm sure you're referring to a handful of technologies, some of which are, you know, message franking, client-side scanning, server-side scanning. And really the idea behind these types of technologies, they are different, so I'm sorry for painting with a broad brush here, is that they basically allow a machine to evaluate the content underneath the encryption, so that there's not a person, you know, necessarily reviewing the content of encrypted messages, but there's a machine checking and saying, oh, you know, this might be child sex abuse material, for example. And in the UK law, you know, it was a stipulation of the UK online safety bill that, where technically feasible, I think, was the terminology they used, you had to use those types of technologies, and then recently, just a month ago, that part of the bill was changed because the technologies do not exist. And I really like, Roger, how you said, you know, let's talk about reality today. And Rand, I want to pass this back to you: talking about reality today, what sort of protections do human rights advocates and journalists need right now when it comes to, you know, protecting their own privacy and protecting themselves against government surveillance?

Rand Hammoud:
So, I think there are two main subjects to this kind of answer. When it comes to protecting themselves from government surveillance, it mainly takes us into the idea that, you know, even before we get into undermining encryption, we already are in a space where spyware is largely used against, you know, human rights activists, dissidents, et cetera. And with the most recent reports that Amnesty put out, it's become even cheaper today; for example, a Predator infection costs only €9,000, when years ago it was much, much more expensive. And so the technology is proliferating, and it is off the shelf, it is unregulated, unchecked, and governments, and who knows what other actors, are just using it against human rights activists, lawyers, journalists. And so the first thing that we need to tackle, or the governments need to tackle, is to first ban spyware vendors and technologies that have already been used to enable human rights abuses, and then talk about establishing the safeguards that are needed in order to have a more human-rights-respecting framework to use certain digital surveillance technologies in a way that does not infringe on human rights, if such a framework can exist. But we first need to be able to have multiple safeguards that would ensure that even if these technologies are used, there is a mechanism to access remedy, a mechanism for investigations, et cetera, which, even in spaces where it exists today, is largely not respected. And we see that where there are multiple democracies with legal frameworks that deem this surveillance illegitimate, but it is still happening. And so the conversation around the protections, the legal protections that we need, should also look into why the technology is proliferating in such a way and the pretext behind why it exists, or the need behind why it exists. And the pretext that law enforcement needs this kind of technology today to ensure that everyone is safe is completely false. We have not seen any evidence that this technology has helped in any way to maintain national security or make anyone safe. But we have plenty of evidence of it making people less and less safe and infringing on their rights.

Tate Ryan-Mosley:
Yeah, absolutely. I think that's a really interesting point. And Roger, I want to pass it back to you. I mean, what can tech companies do, and how are tech companies responding to both, I would say, increased surveillance, increased demand for access to citizen data, and also to this kind of policy moment? I mean, tech companies are beholden to the laws that govern them. So what are you seeing from the tech side?

Roger Dingledine:
Yeah, so "tech companies" is not a monolith. There are a bunch of different sides to the technology world. In terms of the huge companies like Apple, it's interesting to notice that Apple is mostly on society's side in this, where their users want safety, and Apple wants to give them safety. And it's actually in Apple's interest to give them safety, because if Apple had the ability to watch everything that they're saying over messaging, then they're a target for people trying to break in and harm the users. So in this sense, we're aligned with groups like Apple. On the other hand, we haven't said the word crypto wars yet, but we have to look at history, and we have to look at the fact that governments have been asking for weakening security over and over for years. For example, in the West, for internet routers, like the backbone pieces of the internet, each router has a port called a lawful intercept port. And the idea is you go to a judge and you say, I want to be able to watch all the internet traffic going along this part of the internet, because there's a bad guy and I want to be able to watch him. And the judge thinks about it and says, okay, sounds good. And then you plug into the lawful intercept port and you get to listen to all of the internet traffic there. And I was years ago talking to a group in the German foreign ministry, and they were trying to figure out, should we regulate, as Rand was talking about, should we regulate these spyware tools? How do we decide what counts and what doesn't count? And there was an engineer from Dubai telecom there who was like, you guys put the lawful intercept port in. And when my prince in Dubai asks, what's that port? And I say, oh, that's the lawful intercept port. And he says, plug it in. The jurisdiction is wildly different, but the tool works the same in Dubai versus the US versus Europe. So to bring it back to Tate's question, part of what the tech companies need to think about here is that this is a recurring theme where governments keep asking for more and more access, more and more weakening. And there are side effects, such as having lawful access ports on backbone internet routers, which can be used well and wisely and are often not used well and wisely. So every time we think about weakening safety for society, we need to think through where that's going to go in the future.

Tate Ryan-Mosley:
Yeah, and we're just about ready to take some audience questions, but Roger, I wanted to ask you to expand on that last point before we bring in any audience questions. So have your questions at the ready. But when it comes to thinking about this globally, as you said, technology doesn't know boundaries. There is this kind of competitive market for both spyware and privacy technology. How do you think about how we might foster a global, encryption-protecting framework for governance? Again, a big question for you.

Roger Dingledine:
Yeah. So the answer isn't to make all of society less safe. That cannot be the answer. And it is frustrating that the US and the UK and Europe are so excited to do that. And it's especially frustrating at the same time as each of these countries is signing the Freedom Online Coalition, the Declaration for the Future of the Internet, the Global Compact, all these acronyms we're hearing about at IGF this week. We've got countries saying that they value safety for society, yet here they are trying to pass these laws each year. So yeah, how do we, I guess, so it can't be mass weakening. A lot of countries then look at the targeted attacks, the ones that Rand was talking about, where they go to some Israeli company and they buy the ability to break into their specific target's phone and bypass encryption and other mechanisms. And in a sense that's better. At least it's not mass attacks. At least it's not harming everybody. But the reality there is we keep seeing these targeted attacks being used against not just journalists and bloggers and activists, but French politicians and Parliament members in Germany and so on. So I'd like to live in a world where the targeted attacks are the better answer, but that seems like a pretty bad answer also. As a technology person, I'm good at explaining why things won't work. But the best solution that I have is we need to maintain strong security for all of society, meaning we need encryption to work well. And as Rand was saying, we need to start regulating and deciding what small arms dealers are allowed to do in the software vulnerability exploit space. And I mean, yeah, we could go on and on about this, but I'll pause for other people to jump in.

Sharon Polsky:
And I'm gonna do just that, because I think, for the people who are going to create the regulations, if they don't have a proper, correct understanding of what it is they're regulating, what the impacts are of not regulating, of regulating in a certain way, or of regulating completely, if they don't get it, then regulating is going to be… a Band-Aid approach. The long term that should have started many, many years ago is education from the youngest grades, not just in how to use a computer, how to use these wonderful devices that do provide convenience for the good among us and the opportunists among us, but educating people about everything from how laws are made, to what democracy is, to what different types of political structures exist. Give them the education so they can make critical decisions and grow up to build systems that don't present the very same problems we're tackling and struggling with now.

Roger Dingledine:
Ultimately, we need to normalize what encryption is. So one great success story is HTTPS. It used to be that governments and law enforcement said, but if everybody has encryption when they go to websites, society will collapse. Think of the children. What would happen if we aren’t able to watch what you do when you connect to a website? And now, whenever you do your online banking or you log into the IGF website or any website, you use HTTPS. It’s normal. They fought that fight. We won. Let’s look to that as an example where we need to somehow figure out how to make society safer for the next round also.

Tate Ryan-Mosley:
Yeah, and I want to pass it over to Rand to get your perspective on this. What can we do to take a positive step forward globally?

Rand Hammoud:
I think the answer is actually quite simpler than many policymakers would like to hear, because they would want to believe that it's a complicated matter, and so use that to not pass progressive laws. But really, the international standards that we already have are quite strong. We already have many rights-respecting laws and rights-pushing laws. And so when we look at international standards for due process, for fair trials, for freedom of expression, et cetera, they already render surveillance capabilities, as invasive as they stand right now, illegitimate. Surveillance in the sense that would be promoted when encryption is undermined basically assumes that everyone is guilty until proven innocent, which is the opposite of what should happen. And it brings to the consciousness of the state people who are not guilty of anything. And so it already is sort of an unlawful kind of attack. So really what we need to do is be able to enshrine in an international framework what surveillance and encryption mean, inspired by the spirit of what we already have, which is strong international protections for our rights as they stand.

Tate Ryan-Mosley:
Yeah, and I feel like the infrastructure approach is something that is increasingly, I don’t mean to put words in your mouth, but it feels like that’s a similar approach that you’re advocating for that has also been applied to areas like anti-censorship technologies and that space as well. So I want to pause and see, are there any questions either online or in the room? If it’s online, you can just add them to the chat. And in the room, please make yourself known and I will take care of you.

Roger Dingledine:
We’ve got some hands in the room. So go to the microphone and-

Smith:
Yeah, go to the mic and get in line, please and thank you. Please introduce yourself before your question, that’d be great.

Audience:
Can you hear me? Yeah. Hi, I'm Handa Nuslu. I worked for Google in the trust and safety function and policy implementation for a while, and then I founded Turkey's Internet Observatory, Gözlemeve. So what I want to ask is, I actually have questions for three of you. So for Roger firstly: when we talk about protecting human rights activists, I feel like the conversation is sometimes assuming a functioning democracy and a functioning government that is really willing to protect the citizens. That really doesn't apply outside of Western Europe and outside of the US. So what we see here, for example in Turkey, is that when there is any encrypted app found on someone's phone or someone's computer, that can be used as evidence to support a case that someone is doing something illegal. So with the Tor Project, if I'm using Tor on my computer, that can actually endanger me. It might be more safe in terms of surveillance, but it's not safe if we talk about the tools of oppressive governments, for example. So I was just wondering, if you're talking about human rights activism and if you're talking about protecting democracies, is there any context or any information that you ever get on how autocracies work? Because the problem is these countries, they learn from each other. So any law that pops up in a country is likely to be transferred. And so, for example, as someone who works in technology and human rights and democracy, we sometimes do not suggest Telegram or Signal. Yes, it is encrypted, yes, it's open source, but it might put you in more danger because of this. So that's my question to Roger. I also have a question for Tate, actually. I've been following MIT Tech Review, and we do have a lot going on in the Middle East and in other countries, and what we see is that these are not being reported often. So there might be an issue in the US and it will get a lot of news and presence. But when the Turkish government or some other government has a big tech request and the big tech company complies, or some other stuff happens, these things would get a lot more coverage, I feel like, if they were happening in other countries. But like I said, when problems happen in a country, it's not just for that country. It's probably going to be replicated. If there is a law popping up in a certain country, for example against encryption, it is very likely to be replicated in a similar geography. So I was wondering if you have any insights on maybe improving the coverage, on going beyond the Western look on how human rights issues and human rights activists could be protected. And for you, sorry, Sharon. Yes, I'm bad at remembering names sometimes. But my question is, because you mentioned that you do talk a lot with government bodies and you are in interaction with them: what percentage of your work is actually focusing on holding big tech companies accountable? And if that is a perspective, because, again, big tech compliance with autocratic governments is growing a lot. And these companies, they really want to earn a lot of money, and they are willing to give up every single human right. And so, for example, Messenger is encrypted, but we have learned from Facebook officials that they do actually give information, chat information, once it's requested. And these are not requests based off of security reasons. So it's not a request to identify someone who has been missing for a while.
They’re mostly politically motivated. So these are my three big questions to you. Thank you.

Roger Dingledine:
Should we try to answer them now, or should we take more? What’s the right way to… One at a time. One at a time. Okay. So you’re absolutely right that there are not as many functioning democracies in the world as we would like; in fact, if you know of a good functioning democracy, please let me know. In terms of the safety of having tools like Tor installed in dangerous places, there’s actually a really interesting synergy, because Tor is not just for resisting surveillance, it’s also for resisting censorship. And in a lot of countries, like Iran, and now Russia, and Turkey, and so on, there’s a lot of censorship. So the average Tor user in Iran is using it to get to Facebook, because Facebook is blocked. And that means the average Tor user in Iran is an ordinary Facebook user who’s just getting around the censorship. Yes, there are some political dissidents in there, but the average user is an ordinary citizen. And that ordinariness is an important security property for having these tools. And similarly, as the whole world moves to not just Telegram or Signal but WhatsApp and iMessage, and as more ordinary tools get real encryption, it becomes a normal thing that everybody has, not a sign that you’re a political dissident. So you’re absolutely right. The tools need to become pervasive and ordinary in order to be safe.

Tate Ryan-Mosley:
I can briefly answer the question, and also, just a reminder so that we can get to all the questions: we can all try to be brief in our responses. Thank you so much for that question; it’s a very important one. I don’t know if I can give you a very satisfying answer, other than that it shouldn’t be that way, and that I, as an individual reporter, and we as Technology Review, are constantly trying to do better about this. Frankly, you get into all of these issues with journalism, local journalism, and journalism business models right now, and racism, and where people pay attention, and who people pay attention to; I think those are all parts of the answer to your question. But certainly, the press can and should do better at covering countries outside of the West. So thank you for encouraging me to do so, and feel free to send me tips at any point as well. I will do my best to cover international stories more.

Sharon Polsky:
I appreciate your question. Do we deal directly with the tech companies? No, we tend not to. We deal with putting on the record what is going on. So when we spoke to the Canadian Parliament about facial recognition, or about spyware, we put on the record the billions of dollars, the statistics from industry, as to what those industries, cybercrime and spyware, contribute to an economy, which is often larger than some nations’ entire economies. And we put on the record what the impact is. And of course, it’s very simple: as you said, companies are not interested in your privacy or mine. They are interested in providing the greatest return possible for their shareholders. That is their raison d’être. So for them to say, and this isn’t singling out one company or another, “we take your privacy seriously, we will protect it,” I think that’s a promise that nobody should try to make, because it’s inevitably going to fail. We need to see governments recognizing what the problems are, realizing that the tech companies certainly do provide employment and innovation, for perfectly legitimate and wonderful purposes. Using AI for medical advancement, that’s great. Using AI so I can pay whatever the fee is today to spit into a vial and have my DNA analyzed by a company in the United States that says in its so-called privacy policy online, “we will protect your privacy,” and is then breached? This just happened, and millions of people’s genetic identities have been spirited away. You can’t change your genetics; you can change your password. Do governments understand? Do the bureaucrats and the lawmakers and the policymakers understand? No. When it happens to them, that, I find, is when things might start to change. So we do a lot to increase their awareness of these risks.

Speaker:
Thank you for these questions and answers. Why don’t we get a question from this line here?

Audience:
Good morning. I’m Masayuki from Japan; I’m an academic. This may be a bit extreme, but it’s related to the previous question. Do you have a plan of action for when an encryption backdoor is somehow mandated? We only barely avoided the worst with the UK Online Safety Bill, and I think the fight will continue, in Japan and everywhere. Thank you.

Roger Dingledine:
Do we have a plan of action for when backdoors are really, truly required? Is that the question? We will never put a backdoor in Tor. We will never undermine Tor’s security. I don’t care what the laws say. So we’re going to have to wrestle with whatever the political and policy implications of that are. We’ve got the EFF, the ACLU, and a bunch of legal organizations in Europe, the US, and around the world who want to fight these things, and I hope they succeed. We will never weaken Tor’s security.

Sharon Polsky:
If I can add to that, I think the most important part is that people are now becoming aware, and I don’t mean just people in technology, in the privacy realm, or certain policymakers; I mean the general public has gotten fed up with seeing their personal information monetized. They are starting to ask questions. I’m working with some people who are developing systems that will completely change the dynamic. No longer will you have to submit to whatever the so-called privacy policy on a website is; you will have control over whether, when, how much, and to whom your personal information goes. You will be in control, flipping things around. Companies aren’t going to like it, but when the people who are their bread and butter say we’ve had enough, they will have to change how they do things, and that’s going to be a plan of action.

Smith:
So we have four more minutes, and I want to try to get both of these questions in, so if the answers could be brief, that would be great. This line next.

Audience:
Andrew Campling. I’m a consultant on Internet standards and a trustee of the Internet Watch Foundation. A couple of quick comments, and I’ll try to be brief. The title of this discussion is about human rights, but it’s mainly been about privacy. Up until the last answer, we largely ignored surveillance capitalism, if we’re going to talk about privacy. We focus on evil governments, and that seems to deflect attention from what the tech sector itself does to users, which is arguably a lot worse. We’ve ignored the rights of the victims of CSAM to focus on the rights of others at their expense, and I think we need to acknowledge and talk about that. We’re treating privacy as an absolute right, whereas certainly in Europe it’s a conditional right, while other human rights are absolute. Often we’re now protecting the conditional right to privacy at the expense of the absolute rights of people whose other rights are being infringed, such as the CSAM victims. We need to acknowledge that the blind use of encryption can weaken privacy: when you apply encryption to Internet protocols, that can actually weaken cybersecurity, and if you don’t have good cybersecurity, you have no privacy, even when you think you do. I think that’s a significant problem. We need to acknowledge that most of the tech companies, and I accept probably not the ones here, are not defending my human rights; they’re defending their revenues, because they’re encrypting the data that they extract from my endpoint when they surveil me, and they don’t want their competitors to access that data. That’s why they want the encryption, not to protect my rights; that’s an interesting byproduct used to justify the encryption. And finally, acknowledging the comment you just gave on Tor’s position on backdoors, almost all of the big tech companies absolutely compromise their approach to privacy in order to have market access in some of those very problematic states. You don’t have Private Relay in China because it’s illegal there. They will cheerfully ignore the laws in democracies, but will comply with the laws in more autocratic states, and I think that’s pretty problematic as well. I’ll stop there. Thank you.

Roger Dingledine:
Yeah, we could definitely have a whole session on surveillance capitalism and the evils of large tech companies, and how they’re primarily trying to maximize their profit rather than actually caring about their users. One of the points we tried to make here is that there are some synergies, some overlaps, where, at least in this case, Apple is interested in privacy, first of all because it’s good for marketing, because people are asking for it, but also because it gives them less surface area for attack, so they have less to worry about from people trying to attack their users. But you’re right that that doesn’t make Apple great. And it’s also an excellent point that many tech companies choose to design their approaches with China, Russia, Saudi Arabia, India, and all the other big, interesting markets around the world in mind, and that causes them to do bizarre and dangerous things for their users.

Smith:
I think we have, like, one minute left, so if you could ask your question, hopefully we can fit in an answer.

Audience:
I just want to build on the first question, really, and ask about the mechanics of advocacy in different countries and parts of the world. One of the examples you mentioned was India, and I’m just wondering whether there’s a sense in which you need to adapt the messaging and the arguments around this to different parts of the world.

Tate Ryan-Mosley:
Rand, do you want to take that? I feel like you have a good perspective, certainly better than I would.

Rand Hammoud:
Yeah, sure. I think that’s a very good point. Using the same narratives within different contexts isn’t always fruitful; it’s not as productive as you would hope. When we are trying to do any advocacy within autocratic states that have no regard for human rights, we cannot be using a human-rights-based argument. That’s when you talk about national security and how encryption is also in the interest of the state, or use economic arguments: there is business espionage, so how do you protect the economic advantages, the competitive advantage, of certain companies? That’s when other companies come on board and, as Roger was saying, try to become allies in this space. So it is definitely incredibly important to make sure we’re using the appropriate narrative within the advocacy spaces we are in, but also to be very mindful that the advocacy avenues in some contexts are just not there. It is really difficult to talk about a rights-respecting framework for the use of surveillance technologies in autocratic governments, or even in democracies these days. That is why we need to look at it as a more global or international framework, because you cannot depend on the jurisdiction where this technology is utilized. The infrastructure is there, and we cannot control how well or how badly it is utilized. And so that’s why we need to look at a more international framework for its use.

Tate Ryan-Mosley:
OK, I just want to say thank you so much to everybody for all of your questions and comments, to all the panelists, and to all of you for participating in today’s panel. I hope you all learned something. I certainly did, and I hope you have a great time at the rest of the day’s events.

Audience:
Thank you.

Speaker | Speech speed | Speech length | Speech time
Audience | 136 words per minute | 1480 words | 652 secs
Rand Hammoud | 180 words per minute | 1406 words | 469 secs
Roger Dingledine | 175 words per minute | 2566 words | 880 secs
Sharon Polsky | 151 words per minute | 1751 words | 694 secs
Smith | 176 words per minute | 83 words | 28 secs
Speaker | 226 words per minute | 19 words | 5 secs
Tate Ryan-Mosley | 177 words per minute | 1987 words | 673 secs