Beneath the Shadows: Private Surveillance in Public Spaces | IGF 2023

11 Oct 2023 00:45h - 01:45h UTC

Event report

Speakers and Moderators

Speakers:
  • Swati Punia, Civil Society, Asia-Pacific Group
  • Felipe Freitas, Government, Latin American and Caribbean Group (GRULAC)
  • Chris Wilson, Private Sector, Western European and Others Group (WEOG)
  • Beth Kerley, Civil Society, Western European and Others Group (WEOG)
Moderators:
  • Barbara Simão, Technical Community, Latin American and Caribbean Group (GRULAC)

Disclaimer: It should be noted that the reporting, analysis and chatbot answers are generated automatically by DiploGPT from the official UN transcripts and, in case of just-in-time reporting, the audiovisual recordings on UN Web TV. The accuracy and completeness of the resources and results can therefore not be guaranteed.

Full session report

Audience

During the discussion, various topics related to technology and data were explored, including the use of blockchain technology for collecting biometric data. An audience member asked for opinions on this matter. The sentiment towards this question was neutral, with no specific arguments or evidence provided for or against using blockchain for biometric data collection. However, it was mentioned that blockchain might be beneficial in controlling access to data, suggesting a potential advantage in using this technology for biometrics.

Another concern raised by an audience member was the issue of real-time surveillance in India. The sentiment expressed was negative, with the argument focusing on the lack of protection and rights for users in the face of such surveillance. The audience member questioned whether individuals are adequately informed when their data is being processed and if they are aware of being under surveillance in public areas. Unfortunately, no supporting facts or evidence were provided to further substantiate these concerns.

Furthermore, an audience member from Australia discussed the increasing use of advanced technology in accumulating data and enhancing private surveillance. This sentiment was negative, and the argument emphasized the implications this has for user privacy. It was highlighted that developed nations are acquiring wealth and control through the collection of data using advanced technologies. However, no specific evidence or examples were provided to support this claim.

In conclusion, the discussions surrounding blockchain technology, data security, biometric collection, and surveillance touched upon important implications for data protection and user rights. While the use of blockchain for biometric data collection was not extensively debated, the potential of blockchain in controlling data access was acknowledged. The concerns raised about real-time surveillance in India and the increasing use of advanced technology in data accumulation and private surveillance highlighted the need for protections and solutions to safeguard user privacy. Nonetheless, the lack of concrete evidence and specific supporting facts weakened the arguments presented.

Beth Kerley

The rapid increase in networked surveillance of physical spaces, alongside traditional digital surveillance, has become a growing concern. It exposes individuals to potential targeting by both public and private entities. A 2019 estimate predicted that the number of surveillance cameras installed globally would exceed 1 billion by 2021, blurring the lines between public and private surveillance.

Emerging technologies such as biometric surveillance and 'emotion recognition' are giving those who control cameras in public spaces new capabilities. Facial recognition technologies are being sold as part of the surveillance package, enabling the identification of individuals in real time. Emotion recognition technology is also being used in different countries to monitor students, drivers, and criminal suspects. These new developments raise ethical and privacy concerns as they can be intrusive and have significant implications for personal freedom and autonomy.

The involvement of private companies in surveillance poses challenges to transparency and accountability. Private entities are inclined to protect their intellectual property, making it difficult for citizens and non-governmental organizations (NGOs) to understand how surveillance systems operate. Additionally, contracts between public and private partners often lack specific provisions on how private entities can utilise the resulting data. This lack of clear guidelines raises the risk of misuse and potential violation of privacy rights.

While biometric identification has its controversies, it also has legitimate uses that should not be overlooked. It distinguishes itself from biometric surveillance, which involves the monitoring and tracking of individuals without their explicit consent. Biometric identification allows users to intentionally use their physical attributes, such as fingerprints or facial recognition, to access a space or account. However, appropriate safeguards are needed to ensure that biometric data is properly protected and not misused by unauthorised entities.
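To make this distinction concrete, the sketch below is a minimal editorial illustration (not something presented in the session) contrasting consent-based 1:1 verification with 1:N identification over people who never enrolled; the type names and the matching logic are hypothetical stand-ins.

```python
# Minimal sketch: biometric identification vs. biometric surveillance.
# Hypothetical types and matching logic, for illustration only.
from dataclasses import dataclass

@dataclass
class Enrollment:
    user_id: str
    template: bytes      # reference the user enrolled deliberately
    consent_given: bool  # recorded at enrollment time

def verify(enrollment: Enrollment, probe: bytes) -> bool:
    """1:1 verification: the user presents a trait on purpose to gain access."""
    if not enrollment.consent_given:
        raise PermissionError("no recorded consent for this template")
    # A real matcher would compute a similarity score; equality is a stand-in.
    return probe == enrollment.template

def identify(gallery: list[Enrollment], probe: bytes) -> list[str]:
    """1:N identification: matching everyone in view against a gallery.
    There is no enrollment step to attach consent to -- this is the pattern
    the text above calls biometric surveillance."""
    return [e.user_id for e in gallery if probe == e.template]
```

The structural difference is the consent check attached to a deliberate enrollment; surveillance has no such hook, which is why it calls for different safeguards.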

The integration of sensitive data with blockchain technology is met with scepticism. Storing sensitive data in a system designed to be unerasable raises concerns about data security and privacy. The immutability of blockchain can be seen as both a benefit and a risk, as any potential breaches or unauthorised access may have long-lasting consequences.
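One commonly discussed way to reconcile an append-only ledger with erasability, not raised in the session but sketched here as a hedged illustration, is "crypto-shredding": only a hash commitment is recorded immutably, the encrypted record lives off-chain, and deletion means destroying the per-record key. All class and variable names are hypothetical.

```python
# Hypothetical sketch of "crypto-shredding": the immutable ledger stores only
# a hash commitment, the encrypted record stays off-chain, and "erasure"
# means destroying the per-record key. Names are illustrative, not a real API.
import hashlib
from cryptography.fernet import Fernet  # pip install cryptography

class BiometricVault:
    def __init__(self):
        self.keys = {}        # per-record keys; deleting one "shreds" the record
        self.off_chain = {}   # encrypted templates, stored off the ledger
        self.ledger = []      # append-only commitments, standing in for a chain

    def enroll(self, user_id: str, template: bytes) -> str:
        key = Fernet.generate_key()
        self.keys[user_id] = key
        ciphertext = Fernet(key).encrypt(template)
        self.off_chain[user_id] = ciphertext
        commitment = hashlib.sha256(ciphertext).hexdigest()
        self.ledger.append((user_id, commitment))  # the immutable part
        return commitment

    def erase(self, user_id: str) -> None:
        # The ledger entry remains, but without the key the ciphertext is
        # unrecoverable -- the usual answer to the "unerasable" worry, though
        # it does not help if the key or plaintext ever leaked beforehand.
        self.keys.pop(user_id, None)
        self.off_chain.pop(user_id, None)
```

Even under this design, the long-lasting-consequences concern stands: a breach that captures keys or plaintext before shredding cannot be undone.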

European digital rights groups argue for a ban on real-time surveillance in public spaces. They believe that real-time surveillance is difficult to control and regulate, which can lead to potential abuses of power. Striking a balance between security and privacy is crucial to maintain public trust.

Furthermore, public awareness and understanding of surveillance systems and the information possessed by the government are vital. In countries like Estonia, where elaborate e-government systems are in place, public awareness is a key safeguard to ensure trust in surveillance practices.

In conclusion, the rapid expansion of network surveillance in physical spaces, coupled with emerging technologies, raises significant concerns regarding privacy, transparency, and accountability. The involvement of private companies, appropriate safeguards in biometric identification, scepticism towards integrating blockchain with sensitive data, and the need for public awareness and trust all play crucial roles in shaping the future of surveillance systems. Striking a balance between security and individual rights is essential for the responsible development and use of surveillance technologies.

Yasadora Cordova

The debate centres around the issue of control and consent regarding users’ biometric and personal data. One perspective in the debate argues that user control is vital in order to prevent the misuse of data and protect privacy. They suggest implementing rules and ethical frameworks that increase user awareness of data collection. This approach emphasises the importance of separating different types of identification technologies to improve user control and promote data privacy.

Another viewpoint suggests that control over sensitive biometric data should be entrusted to a neutral third party or citizens' council. The proponents of this argument raise concerns about law enforcement having access to, and retaining the ability to edit, videos, as this encroaches on personal freedom and raises privacy concerns. They caution against potential abuse of data by law enforcement agencies.

Furthermore, it is argued that user control over their data is essential not only for privacy but also to prevent potential misuse. The introduction of new rules and ethical frameworks is proposed to enhance user awareness of data collection practices. By doing so, users would have more control over their personal information and be able to protect their privacy more effectively.

A related point that arises in the debate is the need for user control in data privacy. It is observed that both industries and governments are collecting data indiscriminately. The cost of maintaining the integrity of personally identifiable information is said to be increasing. Therefore, it is suggested that obtaining permission or consent for the use of a dataset is crucial, ensuring that the dataset belongs to the person it represents.

Transparency and ethical considerations in data handling are also highlighted as significant concerns. It is noted that structuring and cleaning data are among the most expensive activities in the machine learning process. The demand for transparency through regulation is seen as a potential driver for governments and industries to clarify their data practices. Transparency is seen as the foundation upon which user control can be built.
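As a rough illustration of how transparency can underpin user control, here is a minimal, hypothetical consent-registry sketch: every access must cite a purpose the data subject actually granted, and every attempt, allowed or not, lands in an audit log. It sketches the principle only and does not describe any system mentioned in the session.

```python
# Hypothetical sketch: a purpose-bound consent registry with an audit log.
# Every read of a personal record must cite a purpose the subject granted.
from datetime import datetime, timezone

class ConsentRegistry:
    def __init__(self):
        self.grants = {}      # user_id -> set of permitted purposes
        self.audit_log = []   # transparency: every access attempt is recorded

    def grant(self, user_id: str, purpose: str) -> None:
        self.grants.setdefault(user_id, set()).add(purpose)

    def revoke(self, user_id: str, purpose: str) -> None:
        self.grants.get(user_id, set()).discard(purpose)

    def access(self, user_id: str, purpose: str, requester: str) -> bool:
        allowed = purpose in self.grants.get(user_id, set())
        self.audit_log.append((
            datetime.now(timezone.utc).isoformat(),
            requester, user_id, purpose, allowed,
        ))
        return allowed
```

The audit log is the transparency layer; revocation working at any time is what makes the consent meaningful rather than a one-off checkbox.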

In conclusion, the debate surrounding user control and consent over biometric and personal data highlights the importance of protecting privacy and preventing data misuse. Various arguments propose different ways to achieve this, including implementing rules and ethical frameworks, entrusting control to neutral parties or citizens' councils, and promoting transparency in data handling. These discussions aim to strike a balance between leveraging the benefits of technology and safeguarding individuals' rights and privacy in the digital age.

Barbara Simao

Private surveillance companies in Brazil, such as Gabriel and Yellow Cam, are providing readily accessible 24/7 surveillance solutions to neighborhoods without oversight or accountability. This poses major risks for privacy, human rights, transparency, and data sharing. The lack of sufficient information and oversight surrounding these surveillance practices is of particular concern, potentially impacting historically marginalized groups and leading to exclusion. Additionally, the demand for private surveillance solutions highlights a lack of trust in public government solutions. The regulatory gaps in Brazil regarding the use of technology and data for public security contribute to the lack of oversight and accountability. Users should be informed about the risks, legal grounds, and potential access to their data. Moreover, more legal guarantees and safeguards need to be developed to regulate the activities of private surveillance companies. Overall, greater transparency, public awareness, and comprehensive regulatory frameworks are essential to protect privacy and individual rights in the context of private surveillance in Brazil.
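One concrete safeguard implied by this summary, sketched below purely as an editorial illustration, is to gate law-enforcement access to privately held footage behind a documented legal basis and an inspectable audit trail, rather than the informal "private channels" described in the session. The API and field names are hypothetical.

```python
# Hypothetical sketch: warrant-gated access to privately held camera footage.
from datetime import datetime, timezone
from typing import Optional

class FootageStore:
    def __init__(self):
        self.footage = {}     # camera_id -> list of stored clips
        self.access_log = []  # audit trail that regulators could inspect

    def request_access(self, camera_id: str, requester: str,
                       legal_basis: Optional[str]) -> list:
        # legal_basis would be e.g. a warrant or court-order reference number.
        granted = legal_basis is not None
        self.access_log.append({
            "time": datetime.now(timezone.utc).isoformat(),
            "camera": camera_id,
            "requester": requester,
            "legal_basis": legal_basis,
            "granted": granted,
        })
        if not granted:
            raise PermissionError("access requires a documented legal basis")
        return self.footage.get(camera_id, [])
```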

Swati Punia

Swati Punia raises concerns about surveillance automation and its approach to crime and criminality. She argues that current surveillance practices tend to focus on handling petty crimes, while larger, structural crimes like financial crimes are often overlooked. Swati emphasises the need to reassess our conceptions of crime and criminality to address these systemic issues more effectively.

Swati highlights the importance of an interdisciplinary approach in civil society to tackle surveillance-related challenges. She believes that conversation and collaboration among academics, lawyers, and NGOs are crucial in effectively addressing these issues. Swati points out that working in silos can limit the effectiveness of addressing systemic problems, and therefore, calls for shared learning and interdisciplinary efforts.

Furthermore, Swati stresses the need for collaboration and shared learning among the global majority to address surveillance-related challenges. She suggests that conferences and discussions can provide platforms for stakeholders from different parts of the world to engage in dialogue and share their experiences. Understanding shared experiences within similar socio-political and cultural contexts can lead to more effective solutions and responses.

Another aspect Swati discusses is the importance of digital literacy and empowerment. She notes that even educated individuals may lack digital literacy skills, such as understanding financial matters online. Swati suggests that the government should do more in terms of digital empowerment, ensuring that individuals have the necessary skills and knowledge to navigate the digital landscape.

In terms of technology, Swati argues that it should focus on building privacy and security by design. She proposes that with the lack of digital literacy, there should be technologies that inherently secure and respect the user’s privacy. Swati believes that prioritising privacy and security in technological developments can mitigate potential harms and protect individuals’ rights.

Swati also highlights the role of civil society organisations (CSOs) in capacity building. She mentions that her organisation, the Center for Communication Governance, actively works on initiatives such as the privacy law library, regional high court tracker, and professional training. Swati believes that CSOs play a vital role in enhancing understanding and expertise in surveillance-related matters.

Lastly, Swati suggests that countries like India should not simply copy-paste solutions from Europe or other developed countries. Instead, they should consider their own social, cultural, and political environments when implementing digital solutions. Swati notes that many developing nations are rapidly adopting advanced privacy norms without sufficient preparation, which may not be suitable given their unique contexts.

In conclusion, Swati Punia’s discussion on surveillance automation highlights the need to reassess our approach to crime and criminality. She advocates for an interdisciplinary approach in civil society, collaboration and shared learning among the global majority, digital literacy and empowerment, privacy and security by design in technology, and the role of civil society organisations in capacity building. Swati encourages countries like India to consider their own context when implementing digital solutions in order to better address surveillance-related challenges.

Moderator

The session titled “Beneath the Shadows: Private Surveillance in Public Spaces” focused on exploring the involvement of the private sector in surveillance and public security solutions, highlighting the associated risks, implications, and necessary safeguards. Although the on-site speaker, Estela Aranha, was unable to attend, the session featured three online speakers in addition to Bárbara Simão, the Head of Research in Privacy and Surveillance at Internet Lab.

Bárbara Simão provided an overview of the topic, emphasising the role of the private sector in surveillance solutions and public security. Internet Lab, a think tank based in Brazil, specialises in digital rights and Internet policy. Bárbara holds a Master’s Degree in Law and Development and has extensive experience in digital rights research.

Beth Kerley, a programme officer with the research and conferences section of the National Endowment for Democracy's International Forum for Democratic Studies, contributed to the session. Beth, who has a background in history and foreign service, discussed the challenges associated with private surveillance in public spaces. She offered insights based on her experience as a former associate editor of the Journal of Democracy.

Swati Punia, a technology policy researcher based in New Delhi, India, focused on the intersection of technology, law, and policy in society. With her legal background and expertise in privacy, data protection, and emerging technologies, she highlighted the importance of addressing these issues, particularly in developing countries. Swati’s current research involves exploring the potential of non-crypto blockchain in India and its implications for socio-economic challenges and privacy in the global South.

Representing the private sector's perspective, Yasadora Cordova, the principal privacy researcher at Unico Idetec, a biometric identity company, shared valuable insights. Having collaborated with organizations such as the World Bank, the United Nations, Harvard University, and TikTok, Yasadora has worked on projects concerning digital citizenship, online security, and civic engagement.

During the session, questions from the audience were addressed, allowing for engaging discussions. The speakers also shared their final thoughts on the importance of regulation and policy to tackle the concerns surrounding private surveillance in public spaces.

Overall, the session provided a valuable contribution to the ongoing discourse surrounding the role of the private sector in surveillance and public security. The speakers’ diverse backgrounds and expertise added depth and richness to the discussion, offering attendees and online participants valuable insights to consider in this ever-evolving domain.

Session transcript

Moderator:
who's here, thank you for being present, and hello to everyone who is following us online. And welcome to our session, which is titled Beneath the Shadows: Private Surveillance in Public Spaces. The general idea of the session for us is to discuss a little bit the role of the private sector in surveillance solutions and public security solutions. So we are here trying to cover in general how the private sector has been present in public security and surveillance solutions, the risks and implications of that, and which safeguards are important. For today's panel, we will have three online speakers. Unfortunately, our on-site speaker, Estela Aranha, was not able to be present today, but we will have three online speakers, plus Bárbara Simão, who is the Head of Research in Privacy and Surveillance at Internet Lab, and who will also be introducing the subject a little bit. So I will briefly introduce Bárbara and pass the word to her so she can give us an overview of the topic, and afterwards we'll pass to our online speakers. So Bárbara is the Head of Research, as I mentioned, for Privacy and Surveillance at Internet Lab. Internet Lab is a think tank on digital rights and internet policy, which is based in Brazil. She holds a Master's Degree in Law and Development from the Fundação Getúlio Vargas in São Paulo, and she graduated in Law from the Faculty of Law at the University of São Paulo. She was an exchange student at Paris Panthéon-Sorbonne. She worked as a researcher in the field of digital rights at the Brazilian Institute of Consumer Defense between 2017 and 2020, and she also served as a counsellor for digital health data protection at Fiocruz. So, Bárbara, the floor is yours.

Barbara Simao:
Hello, everyone. Good afternoon. Actually, good morning, good afternoon, or good evening, depending on the time zone you're at. As Heloisa mentioned, I'm Barbara, and I'm Head of Research for Privacy and Surveillance at Internet Lab. And first of all, I'd like to thank you so much for coming here, for being present. And I will give you just a brief overview of what we are talking about and why we decided this would be an interesting topic for discussion. I'll just share my screen because I have a few images that I would like to show you. Let's see if this goes smoothly. I think you're able to see it, right? Yes, we can see it. Okay. So, well, the topic of the session is private surveillance in public spaces. And Heloisa already mentioned and introduced a bit of what we are talking about. But I'd just like to give you an overview of what's happening in Brazil that made us think it was interesting to bring this topic to discussion today. So, in the past couple of years, we are seeing the growth of these private companies called Gabriel, Yellow Cam, and, well, different names for one same kind of business. These are companies that sell private solutions, private cameras, private totems with cameras, with, well, 24-7 ability to access them, and that are shared between neighborhoods. So any group of neighbors, any type of local community, can buy a camera or access them for a monthly fee, and they are installed on public streets. So they are offering these totems or cameras that can be easily accessed by anyone. I bring here some excerpts from their websites. It is in Portuguese, but I will translate it for you. In general, they claim that their modern cameras are solutions for security anywhere. They claim that their mission is to make streets, neighborhoods, and cities more intelligent, to protect anyone inside and outside home. And Yellow Cam, one of these companies, claims that the app to access the cameras is 100% free. The download can be made by anyone in the Play Store or Apple Store. And in the app, it's possible to locate on a city map the cameras installed and visualize the images in real time, 24-7. And it's even possible to search for images that were taken at different times or dates. They claim also that the installation of these cameras can make the region safer and that they can be accessed by the public authorities, including the police. And they claim the tendency with the cameras is that they can decrease criminality rates in these regions over time. So they're basically selling these 24-7 solutions that can be acquired by local communities, by a group of neighbors, and that can be accessed without any kind of oversight and accountability. And that's what's concerning for us. And also the fact that some news pieces here in Brazil announced that these companies were having private channels of communication with the police stations. So the police stations weren't actually demanding a warrant to access the images held by these private companies. They were accessing them almost in real time because of these private channels of communication that existed between the companies and the public authorities. So we think this is an important topic for us to cover because it can pose a number of risks for privacy and human rights. It can have impacts on transparency and data sharing between private and public bodies.
And besides that, it can even affect the right to the city, considering the fact that surveillance may affect more heavily certain groups of people that are already excluded. And this case is somewhat similar to what happened in the Clearview AI case, which, for those who aren't familiar with it, was a company that held a database with over 3 billion images obtained through data scraping. So they scraped public image databases and shared these images with police stations around the world for the identification and resolution of criminal cases. And these images were collected without any kind of information, without any kind of oversight. And there wasn't any kind of accountability about the company's practices. And it was a case that caught the world's attention, because Clearview AI was actually fined by many different jurisdictions because of the lack of legal grounds for what it was doing. So I think this session is to discuss these topics, to discuss the relation between public security, criminal procedure, and these private surveillance solutions that are arising. And not only in Brazil; in many countries these home security solutions are also being sold. So I think it's important for us to discuss that through the lens of the impacts it can have on privacy and human rights and on transparency. So we prepared a few policy questions for you. And in general, we want to understand: what are the broader societal implications of extensive surveillance and their impacts on human rights? How does private surveillance affect historically marginalized groups? How does the lack of transparency required from private surveillance companies affect human rights? What are the dangers concerning third-party sharing with other private institutions or public authorities without transparency? What are the liabilities that insufficient legal protections regarding the shared use of data pose to individuals and groups? And does the current regulatory landscape for privacy and data protection give us sufficient protections to ensure enforcement of human rights and equitable access to public spaces? So these are a few questions we prepared for today's session. And these are a lot of questions to discuss in only one hour. So without further ado, and having given this brief introduction, I would like to pass the floor to Beth Kerley, who will also join us for this panel. And Heloisa, I think you'll present her, right?

Moderator:
Yes. So Beth Kerley, thank you for joining us today. Beth is a program officer with the research and conferences section of the National Endowment for Democracy's International Forum for Democratic Studies. She was previously associate editor of the Journal of Democracy and holds a PhD in history from Harvard University and a Bachelor of Science in Foreign Service from Georgetown University. Thank you for being here today, and the floor is yours. Beth, can you hear us?

Beth Kerley:
Hi, I'm sorry, I was muted. Barbara, I think you need to unmute my video as well. I can't. Now we are hearing you. I'm able to unmute my audio but not my video. I guess I can start talking and perhaps my face will show up later on in the proceedings. So, thanks, Barbara, and thanks, everybody. I'm sorry I can't be there in person, but I'm really looking forward to this discussion. And so, Barbara, that was really fascinating; I hadn't seen those slides before, but I think those cases that you shared are really great illustrations of some of the broader points I was hoping to make here. And so, oh, there, I have a video too. And so I think what I'm going to do in these remarks is first try and situate those examples in some broader global trends that we've been tracking, and also highlight how the potential use of emerging technologies like biometric surveillance in connection with cameras in public spaces poses additional risks. So, to frame the comments a little bit: in an essay on what he calls subversion, Ron Deibert at Citizen Lab has written about the risks from surveillance vendors making more widely available, to both government and private clients, capabilities that would previously have been available to just a few well-resourced states. His focus in that article is on the profound challenges to democracy from commercial spyware, which tracks us through the devices that we carry with us. But I would argue that this question of the growing accessibility, spread, and, if you like, 'democratization' of surveillance technologies, and their intertwining with the broader surveillance capitalist ecosystem, very much applies to the devices that other people place in the physical world around us as well. And in that regard, there are three main points that I'd like to cover. First, network surveillance of physical spaces is rapidly emerging alongside traditional digital surveillance as a pervasive reality that changes the conditions for engaging in public life and exposes people to targeting by both public and private entities. Second, emerging technologies such as biometric surveillance or so-called emotion recognition are enabling the entities that control cameras in public spaces to do new things with them. And third, commercial suppliers play a crucial role in the spread of physical surveillance technologies to both public and private sector clients, and their involvement, as Barbara very correctly stated, presents challenges to enforcing transparency and accountability norms. So, on the growth of surveillance cameras: already in 2019 there was an estimate that by 2021, which is two years ago, the number of surveillance cameras installed globally would exceed 1 billion. A significant number of those cameras, more than half, are in China. But established and emerging democracies are also home to staggering numbers of cameras. Those include smartphone cameras, as well as cameras that have been installed in commercial settings, which was traditionally an anti-theft measure, but now you see commercial entities installing surveillance as actually a kind of consumer convenience, letting people skip the checkout line to have their faces scanned instead. The line between public and private surveillance can be very thin. So the system that Barbara described would be a perfect example of that. In India, there's similarly an app that lets citizens share footage from their private CCTVs with the police, and that's just one of many cases.
Those kinds of partnerships can reflect genuine public concerns about crime, but they also raise challenging questions on how privacy and anti-discrimination safeguards can be applied when law enforcement functions get outsourced to untrained citizens. Citizens can of course themselves also misuse access to CCTV, for instance to digitally stalk strangers or acquaintances or to engage in blackmail. But the blurry line between public and private surveillance also works the other way around, by which I mean to say that the private vendors who supply surveillance tech to public entities play an important role, one that's increasing as that tech itself gets more complicated. For instance, when companies sell smart city packages, their selling points, strengths, and profit logic can play a large role in determining what's included as part of those packages. Some great reporting from Access Now has shown how companies like Dahua have incentivized officials in Latin America to adopt their surveillance tools by offering so-called donations. And finally, vendors after the point of adoption can become very closely involved in managing the tools and of course the data from public surveillance projects. So simple CCTV cameras present plenty of risks, but the new AI tools that researchers like IPVM have identified as the drivers of the video surveillance market are multiplying these risks by letting people make sense of the images that are captured quickly and at scale. In a 2022 report for the forum, Steve Feldstein identified 97 countries where surveillance technologies involving AI are in use. And I think it's safe to say, given all the trends around us, that that number has since grown. Facial recognition technologies are probably the most widely discussed quote-unquote enhancement to surveillance cameras. And they might be sold as part of the package together with cameras for so-called live facial recognition, but facial recognition can also be applied to ordinary camera footage after the fact using software from private vendors like Clearview AI, which was mentioned earlier. And the risks of facial recognition have been pretty widely discussed. I think it's the best-known type of AI surveillance. So just to very quickly recap: when it doesn't work, facial recognition can lead to false arrests, a harm that has very specifically affected Black communities in both North and South America, as documented by Joy Buolamwini and others. And when it does work, facial recognition, alongside other forms of biometric surveillance like voice or gait recognition, makes it much easier to use cameras to track specific individuals. This again has potentially legitimate purposes, but it can very easily lend itself to political abuses, as with its use to track and identify protesters and dissidents in Russia and Belarus, which we've already seen. And it also puts the potential for private citizens to
And it’s also not hard to understand the ways in which it might be abused to ensure at least the perception of conformity with government policies. But there’s still a strong commercial interest in that kind of technology, whether to monitor students, drivers, and criminal suspects in China, or to test and target ads in Brazil and the United States. And again, we see this kind of technology actually installed in public spaces so that the billboard is looking back at you, so to speak. And finally, AI means that surveillance cameras in public spaces aren’t working on their own. Analytical tools can combine information from biometric surveillance tech with information from other online and offline sources, like shopping records or government databases, to build profiles of people or of groups. On an aggregate level, all this information collection can both exert a chilling effect and enable abusive behaviors by data holders, both public and private. And just to go through a few of those, first, profiles of people and groups can be the basis for targeted information operations meant to deceive and polarize something that Samantha Hoffman has worked on. As Cathy O’Neill has described, profiles can enable discrimination, whether in the form of withholding state resources, targeting advertisements in a way that disadvantages certain people, or through negative treatment by law enforcement. Third, digital rights activists worry that the mere presence of biometric surveillance in public spaces, and this would certainly extend to cameras more generally, whether or not they’re working, can have a chilling effect on people’s willingness to attend public protests, journalists’ ability to meet with sources, and other vital civic activity. And finally, profiles can enable governments to track people’s behavior in minute detail and exercise control through subtle systems of rewards and penalties in the manner. that is somewhat loosely envisioned by China’s social credit initiatives. So finally, why does it matter that private companies are so deeply involved in surveillance? I think whether we’re talking about genuinely private surveillance or public private partnerships, there are a few basic challenges and these include first data access. So vendors who partner with governments on surveillance projects are likely to have a commercial interest in keeping the data that’s collected. And that’s all the more true as companies are seeking to train and refine AI tools that depend on data. Democratic governments on the other hand have an interest in following principles like data minimization and purpose limitation for data collection. And in the project that we worked on together as part of the Forum Smart Cities Report, I know Barbara and her co-author, Belinda Santos pointed out that a lot of the ICCT contracts that she was seeing in Brazil did not have specific provisions on how those private partners could use the resulting data. And that kind of gap is a broader trend which raises a lot of risks that vendors may be reusing data which perhaps there was some privacy infringement but it was collected for a public purpose that was so important it was worth it. But then it gets reused by commercial companies for reasons that would not have justified that infringement or it may get resold through the ecosystem of data brokers or even shared with foreign governments if we’re talking about foreign companies that are operating in different countries. Second, transparency. 
Public institutions in democratic societies are, at least in theory, supposed to follow transparency norms, whereas private companies are not subject to the same rules, and they're naturally going to be inclined to try to protect their intellectual property. This can make it difficult for citizens, NGOs, and journalists to find out how the surveillance systems that are watching them work. And I would argue that this is going to become a more important issue as surveillance technologies themselves get more complex and need to be evaluated for issues like encoded bias. And finally, when private surveillance feeds into public surveillance, it can be difficult to maintain clear lines of accountability for abuses. Again, these challenges are likely to grow as citizens experience infringements, such as unfavorable government decisions that they can't have explained, made by inscrutable technologies based on a mix of public and private data that's been collected about them. So private surveillance, especially in an age where the trend is toward cloud-based and AI-enabled surveillance, is very deeply entwined in a broader surveillance ecosystem that crosses boundaries of sector, of country, and of the physical and the digital world. And that ecosystem is enabling new types of infringements on human rights. We see these being taken to an extreme in authoritarian settings, but they're relevant to all of us as we grapple with the ways in which technology is changing the landscape for privacy. And these risks really raise the urgency, especially with private actors playing such a large role, of multi-stakeholder engagement to develop new guardrails for democratic rights in a world where incredibly powerful surveillance tech is now in easy reach for our governments, companies, and even private people. And on the question of solutions, since I think I'm about at my time, I am going to shamelessly turn things over to the next speaker in hopes that they will provide some answers. So again, thanks very much, and I look forward to the discussion.

Moderator:
Thank you, Beth. Thank you so much for the rich contributions to this discussion. And now I will pass to Swati Punia. Swati is a technology policy researcher based in New Delhi, India. She is a lawyer by training and has earned certificates in digital trade and technology, cyber law, and corporate law. Currently, she works with the Centre for Communication Governance, an academic research centre based at the National Law University of Delhi, on issues at the intersection of technology, law, and policy in society. Her focus areas include privacy, data protection, data governance, and emerging technologies. At present, she is examining the non-crypto blockchain ecosystem in India and studying its potential for addressing socioeconomic challenges, creating inclusive governance models, and embedding privacy in the context of developing countries of the global South. Prior to joining CCG, Swati worked with a leading Southern voice on fostering consumer sovereignty in the digital economy. Swati, thank you for joining us today, and the floor is yours.

Swati Punia:
Thank you so much, Barbara and Heloisa. It's so lovely to be on the same panel as all of you, and thank you to Beth for laying out so aptly and elaborately the impact and implications of surveillance. And I think that allows me to dive deep into the question that was asked of me, which is essentially: what are the social inequalities and the discrimination, you know, involved in these kinds of surveillance acts, and what can civil society do in terms of bridging some of these big cracks that we're seeing develop in society? I think, to just hinge it to what Barbara was mentioning, what's happening in Brazil is not a standalone thing. We're seeing that happen across the world. And India is unfortunately not behind any of these trends. We're emulating all of these trends that we're seeing in terms of automating surveillance. And a number of Indian states, I know, were named among the biggest surveillance states in the world, not just in the country. It seems like every state in India is sort of in a race, competing to automate surveillance. That seems to be the top priority. Having said that, the good part is that civil society has been an active player and has been studying and researching and looking at this development. And they do move to the courts, as we've seen in the last couple of years, when these kinds of instances come up and rise. But I think essentially what I want to highlight is that, given that you have all of these instances happening and these kinds of systems put in place, the most important thing is the public-private partnership aspect. Often we're seeing these public-private partnerships add efficiency, but here I think the main question is: to what end and for what purpose? They're not just deployed to develop the technology for the state and deploy it; they often are also involved in the management of it and in upgrading the systems. And nobody really knows, it is anybody's guess, whether they're involved with the data management or not. Nobody knows, when the police stop a person on the road for, like, random biometrics, random face recognition, and all of those clicks that they take, where that data lands and what purpose it is used for. And unfortunately, this is despite India, six years back in 2017, having the landmark judgment on the right to privacy passed by the Supreme Court of India, which gave a very spectacular turn to the jurisprudence on fundamental rights, where the Supreme Court tied the right to privacy to the right to life, liberty, and dignity, reading it as an important facet of ensuring equality and freedom of speech and expression, and at the same time placing, you know, people at the heart of new-age policymaking. But we've seen not enough happening on this side, though one is going to be positive, with the new data protection act coming into place and all of that. But one important thing that the Supreme Court categorically mentioned there was that privacy cannot be used to further systemic inequalities.
Now, what that means is that everyone recognizes the fact that automation is not creating something new; it is often exaggerating what is already pervasive and exists in society. And we all know that the kinds of societies we live in are not exactly balanced, and we have a range of inequalities within our societies that are really deeply entrenched. So I think the main problem then is not... what I mean to say is, we shouldn't just go to automation, but take a step back and see how we really understand crime and criminality as a concept, to really go back there and start from there. If automation is just a tool that exaggerates everything, then should we take a step back and try to see what our misunderstandings and misconceptions are of what a crime is really made of? Because if you see, all these CCTVs and all of these gadgets are really being put into force to handle petty crimes on the street, right? In a very set place, you know, where they're saying, okay, you're putting in so much resources, money, and effort to handle this one type of crime, but is that what contributes to the larger, you know, criminality in the society? What is the percentage of it? Where is this kind of behavior of the state, or of the private sector in conjunction with the state, leading us, and to create what kinds of society? So in that sense, I think if we go back, we see that every state has a way of defining crime and criminality, and we can all, I think, come together on this understanding that a lot of the people we look at as criminals are often people from historically marginalized communities, people who come from below the poverty line, people who have already experienced and lived through unequal treatment from the state and from society. There are certain religious castes and sects who already suffer these kinds of discrimination, and that kind of social inequality then gets highlighted through technology and even exaggerated and entrenched. And the fear is that a lot of these inequalities, through the use of all these automated techniques that Beth also talked about, will get regularized into the way things function going forward. So I think the million-dollar question then is: who is going to make the assessment that the kinds of crimes we're trying to handle are the real crimes? I'm not saying that maybe, you know, some of this should not be done, but to what end and for what purpose? Another thing that we all sort of see
is that the reason some of this apparatus is being put together is, you know, to check people's behavior. And that sort of understanding seems to develop and even become popular: that if you check somebody's behavior, then good behavior will get internalized if you're constantly being surveilled. And, you know, one cannot deny this completely, because yes, we've seen a lot of studies which support that constant surveillance might make a dent in creating good behavior, and some sort of internalization can happen. But again, I would go back to the same question: how many of these kinds of crimes, which are getting corrected through this behavioral surveillance, are we trying to tackle, and what are these crimes? And are there bigger crimes, like financial crimes and everything, which maybe need more attention? So maybe what needs to be looked at is: are we trying to plug small loopholes and small gaps while turning a blind eye to those big cracks and holes which are getting deeper and wider? And one important aspect of criminality and crime is this: is crime, generally, as a concept, behavioral or structural? I think we should go back to thinking about that, because my limited understanding of the whole issue is that crime is generally structural; it is not behavioral. And there are a lot of studies, at least that I'm aware of, in the Indian context, written by people across civil society. One work I'd love to highlight is by a colleague, Shivangi Narayan, who, in the Indian scenario, did an ethnographic study of one of the states in India, the national capital, Delhi, where she categorically gets into how policing and the construction of the idea of criminality impact society, and why we define and decide to employ certain kinds of measures, and how they do not really work for creating a better society, or rendering a better society, but actually just the opposite. So in that sense, I think we need to go back to some of these ontological and taxonomical questions and then assess where we are moving towards and why. I think civil society's role is extremely important. It is often, of course, working in its own silos: within civil society, academics are working within their closed space, lawyers by themselves, and the larger NGO system in its own space. I think a lot of conversation with each other is important so that you can share work and build a better understanding. For example, lawyers might be looking at laws that exist even to date: despite India having the landmark judgment on the right to privacy six years back, a lot of the way things are getting defined in India in terms of surveillance is still being decided by laws that predate this Puttaswamy judgment, and even after that, we don't see much change happening. At the same time, lawyers working on these kinds of laws and understandings should be talking with people like NGOs who are doing ground research, who understand how these marginalized and vulnerable communities really get impacted, and who bring out those instances and experiences in conjunction with their study of secondary research, laws, and policy framing. I think that will help us build a clearer picture and better resources, and conversations on these kinds of issues will help us go in this direction.
And another thing is, I think, conferences and discussions like the one that Heloisa and Barbara are hosting, which allow people from different geographies, people from across the world, in the Southern belt and the global majority, to come together to discuss these issues and figure out, okay, these are the similarities and these are the divergences. Because often, my understanding is, there are a lot of similarities, synergies, and shared experiences that we go through in the kind of familiar socio-political and cultural context that we have in this part of the world. So it will be fabulous, I think, in terms of growing together and understanding and learning from each other's experiences, what we can change and how we can look at the subject. I'll stop there, and happy to come in again. Thank you.

Moderator:
Thank you, Swati, for the rich reflections you have made. Now I will pass the word to Yasadora Cordova, who is representing the private sector. Yasadora is the principal privacy researcher at Unico Idetec, a biometric identity company. She has worked with various organizations such as the World Bank, the United Nations, Harvard University, and TikTok on projects related to digital citizenship, online security, and civic engagement. Thank you so much for joining us today, Yasadora, and the floor is yours.

Yasadora Cordova:
Right, thank you so much for the invitation. It's always a pleasure to be at any event that Internet Lab invites me to, and I have just a little bit of information to add. The first thing I would like to feature is that, as we navigate the intricacies of identification technologies, I want to delve into the nuanced distinctions between biometrics and facial recognition, because that's where the question of user control takes center stage. So biometrics, as a comprehensive concept, involves recognizing individuals through unique physiological or behavioral attributes, such as fingerprints or iris scans, for example. Crucially, what sets biometrics apart is the insistence on user consent or authorization. So, for example, in countries where a large number of people have no digital literacy, it's easier for them to use their biometrics to buy, to have access to social benefits, or even to complete transactions using their own identity than to keep passwords, for example, because it's safer. So I think when you talk about biometrics, you have to also emphasize the importance of user control over the data collected about them. The users see that their data is being collected, and they're using these biometrics because they want to open up a set of opportunities that they didn't have before, because they couldn't keep their password safe or they couldn't use the system because it was too complicated. In contrast, facial recognition, which is a subset of biometrics, hinges on the analysis of facial features for identification. This method can operate without explicit user consent or even awareness. So it raises concerns about privacy, freedom of expression, and personal control. So here the crucial point emerges: user control is paramount. The fact that entities like law enforcement can retain and edit videos recorded by body cams, for example, underscores the potential misuse of data. So the power to control such sensitive information should ideally rest with a neutral third party, a citizens' council, or something like this, at the very least, to preserve the user's autonomy over their own identity. So preserving user control becomes not just a matter of privacy, but a safeguard against potential misuse. And it's not an expensive safeguard. It highlights the need for robust ethical frameworks and regulations, but it also highlights the need to put the data under the control of those who are actually the origin of the data, if we're talking about biometrics. So we could create rules, international rules, or talk about rules that could separate those two different types of identification technologies, so that we could have better frameworks to protect people who are being filmed and having their biometrics, their facial biometrics, collected, as, for example, Clearview AI did, and demand that these companies have a way to inform users that their data is being collected, offering an option for these users to withdraw their consent, to withdraw these companies' permission to negotiate this data, or to collect or keep this data in their database, instead of just, how can I put this, instead of just assuming that it's an impossible question. There is a use for biometrics. Biometrics is already being used to create opportunities in some countries and make technology better and safer. But this is not going to happen if the user is not part of the decisions over their own data.
So I think the crucial conversation should not be around the type of data that is being collected, because it could be biometrics, or it could be another very sensitive type of data being collected without you being aware of it. So I think the question of who controls this data is more important right now than what type of data is being collected. I think that's it. And this is also a solution that can reach end users and help us build trust and give control back to the users. That's what I had to say. I'm happy to take questions or feedback later, if you have any. And that's it.

Moderator:
Thank you so much, Yasadora. So now we have around nine minutes, and I will quickly open the floor to those who are here and may have a question. I will ask you to come close to the table to get a microphone, and we will do a quick round of interventions. For those who are online and have any questions or interventions, please write them in the chat; we have someone here who will read them out. And after that, we'll do a quick round of wrap-up with our speakers. So we do have two questions here. Please.

Audience:
And thank you for sharing your very interesting thoughts about data security and who should control the data. I would like to hear your opinion on blockchain technology: how, and whether, do you think blockchain could be a solution specifically for collecting biometric data? Do you think the blockchain technology itself might help control access to the data?

Moderator:
Thank you. I think we have another question there.

Audience:
Yeah. My question pertains to India, essentially. There is a very recent development, earlier this year, where it was made known to the public that there is something called real-time surveillance happening. And this was in a reply to a right to information request, and the reply was from the Internet Service Providers Association. So in light of this, with our act having come into being, though it is yet to come into force, my question is: are there any safeguards that the speakers would like to highlight? I understand one such safeguard was just mentioned, but in terms of others for protecting users and giving them certain actionable rights, for instance, being made aware of all the data that is being processed, and even a notice showing that they are under surveillance, specifically in public areas. So I just wanted your thoughts on that. Thank you.

Moderator:
So now I think we have one question in the chat. We have one question online from Ayawalesh Bashi. I'm sorry if I mispronounced your name. From Australia, from civil society. And the question is: advanced technologies such as AI, blockchain, IoT, NFC, NFT, and QC increase private surveillance in public spaces. All these technologies are creating big data and information, and these days data and information are wealth, wealth accumulated in developed nations. All these technologies perform activities and services via the internet. So the question is: what will be the solution for end users? So far, that's the only question we have in the chat. So thank you.

Moderator:
So thank you. I will get back to our speakers and do a quick round of wrap-up. I will ask you if you want to add any final considerations, including any considerations you may have on how regulation and policy in general can work to address these concerns. And please feel free to pick the question you feel most comfortable answering; you don't need to answer all of them. So I will start back with Beth.

Beth Kerley:
Sure. So, difficult questions there. But I think on the question of types of safeguards, it definitely does depend on what type of tech we're talking about. So I agree, and following up on Yasadora's remarks, I would distinguish not just between facial recognition and other forms of biometrics, but also between biometric identification and biometric surveillance. The things you were talking about would mainly fall under biometric identification, where users basically intentionally use a certain physical aspect as the way to access a space or access their account, or what have you, within a particular system. And in that context, I think it's easier to apply the consent framework. Of course, there are also other forms of biometric surveillance besides facial recognition that are very hard to opt into, like voice recognition or gait recognition. Something like a fingerprint, I think, you know, that's the one I am willing to actually use on my phone and my computer; it's slightly harder for someone to kind of get from you unawares. So I would agree with that distinction. And it's a different question: when we're talking about biometric identification, I think there are indeed valid purposes for it, but there's a really heightened need to establish appropriate safeguards. Because sometimes, even if you're giving it over for a legitimate reason right now, it can end up later on in the hands of entities who you would prefer not to have it. And unlike a password, you can't change your fingerprint as easily. And I do think that's a fundamental distinction there. But I would agree that identification versus surveillance is important. And in terms of blockchain, I am less of an expert on blockchain. Instinctively, I think putting sensitive data in a system that is designed to be unerasable is a move that we should definitely think twice about, but I'm open to arguments on that one. And real-time surveillance, finally, I think that is really the hardest thing to put safeguards around. And that's why a lot of European digital rights groups, in the context of the EU AI Act, have been arguing that that's something that should simply be banned, having constant awareness of who's going in and out of public spaces. I think at the very least you need to delete any data that is collected that way very quickly, and I definitely agree with the suggestion of making people aware of when they're being surveilled and what information about them the government possesses. In settings that have very elaborate e-government systems, like Estonia, that's actually part of the safeguards that are built in to ensure trust. So that could certainly be part of the answer. I do not have the comprehensive solution, unfortunately, to the challenge of emerging technologies and surveillance. Otherwise, I could write one report and go home.

Moderator:
Thank you, Beth. I think none of us has the solution, so thank you, actually, for all your contributions. And I will pass now to Swati.

Swati Punia:
Thank you, Alyosha. You rightly say that none of us has the solution, but it’s good that at least we’re coming together to discuss this and to think of ways we can work together for a better response in society. Beth talked about blockchain, and my next panel, right after this one, is on blockchain; those interested, please join us there. But I’ll speak to the point on the consent and notice issue. Maybe this is just how my brain has been wired over the last few days, but I want to step back and look at some of the concepts we’re bringing into the digital era of policymaking and regulation. With notice and consent, how is somebody from these vulnerable and marginalized communities supposed to use it, or even people like us, the educated spectrum of people? A lot of us really don’t have the digital literacy. I would say for myself that I don’t have enough financial literacy despite being educated. I really think the main issue is that governments are doing barely anything beyond using the word empowerment. Of course, that sounds very nice and is used across all sorts of regulations, but for somebody to use, implement and understand notice and consent, you need some level of digital literacy. People wouldn’t even recognize harms when they happen to them. So I feel that a lot of the technology being deployed in the name of trust should be focused on building privacy and security by design, given the kinds of communities and the public that we have, and the work we need to do on digital skilling and understanding should be taken much more seriously. And I think that’s where the CSOs are playing a massive role. Just to give an example, at the Center for Communication Governance we’ve been building a privacy law library which traces privacy jurisprudence across 17 or 18 jurisdictions in the world. We also run a regional high court tracker where we map how India is looking at privacy, how the rights there are expanding and how the courts are tackling these issues. And we do capacity building not just for students and professionals, but also for judges and bureaucrats, because many of the people who will now be enforcing and implementing the new act really don’t understand the nuts and bolts of how to go about things. India, and I think a lot of similar countries, are jumping directly to a privacy 3.0 or 4.0 situation without having lived through it gradually, as Europe and some other countries did. So we have to be cognizant of that kind of social, cultural and political environment, and then think of ways that will fit our specific pegs, not just copy-paste.

Moderator:
Thank you, Swati. Now we’ll pass first to Yasadora and then to Barbara, so we can close the session. Due to time constraints, we won’t be able to take any more questions. I know there is someone online with their hand up, but we really need to close the session, so I do encourage you to get in touch with us and with our other speakers. Please, Yasadora.

Yasadora Cordova:
I’ll be real quick, I promise. I think we find ourselves in an era where data is amassed indiscriminately, not just biometric data, and this is propelled by both industries and governments; there is a demand for data. Amidst this deluge of information, safeguarding the integrity of personally identifiable information has become increasingly expensive; it is a daunting task. The intricacies of structuring and cleaning data, which are integral steps in the machine learning cycle, are a challenge, and this process is undeniably among the most expensive activities in machine learning. So I propose a pivotal shift in focus towards user control. We know that you can’t control what you don’t see, and this resonates in the realm of data privacy, because if we require permission or consent over a dataset, we need to make sure that dataset belongs to that person. If we demand this through regulation, we might end up compelling both governments and industries to bring their data practices to light. This shift is not merely about implementing complex blockchain solutions; it’s a call to collaborate, to build transparent systems hand in hand with regulators and technologists. Of course, we will still have lots of work to do even once we can conceive of systems that are transparent about user data, but it’s crucial to recognize that transparency is the bedrock upon which user control stands. So it’s not just a technological challenge; it’s a societal demand, a societal imperative. And I believe we have to work collaboratively to shape a future where individuals have a meaningful say in how their data is utilized, for real, in actual systems, and where ethical considerations guide technology feature backlogs toward a responsible and sustainable data-driven future, I guess. So that’s it.
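One way to picture the user control described here is a consent check sitting between a dataset and any processing purpose. The sketch below is a toy illustration; ConsentLedger and filter_rows are hypothetical names, not part of any system discussed in the session, and a real deployment would additionally need authentication, audit logging and a way to propagate revocations to downstream copies of the data.

```python
from dataclasses import dataclass, field

@dataclass
class ConsentLedger:
    # Toy registry mapping (user_id, purpose) -> consent granted?
    grants: dict = field(default_factory=dict)

    def grant(self, user_id: str, purpose: str) -> None:
        self.grants[(user_id, purpose)] = True

    def revoke(self, user_id: str, purpose: str) -> None:
        self.grants[(user_id, purpose)] = False

    def allowed(self, user_id: str, purpose: str) -> bool:
        # Default deny: no recorded consent means no processing.
        return self.grants.get((user_id, purpose), False)

def filter_rows(rows, ledger, purpose):
    # Keep only rows whose subjects consented to this purpose.
    return [r for r in rows if ledger.allowed(r["user_id"], purpose)]

ledger = ConsentLedger()
ledger.grant("alice", "model-training")
rows = [{"user_id": "alice", "x": 1}, {"user_id": "bob", "x": 2}]
print(filter_rows(rows, ledger, "model-training"))  # only alice's row
```

The default-deny rule is the design choice doing the work: a dataset row is unusable for a purpose unless an affirmative grant exists, which is one concrete reading of making consent over a dataset enforceable rather than declarative.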

Barbara Simao:
Well, I think Swati and Yasadora already answered a lot of what I wanted to say. But when we are talking about solutions and regulation, especially in the case of Brazil, I think the appeal of private surveillance solutions for the population in general comes from a place of insecurity, from not trusting public, government-provided solutions, and people look to them as a way to overcome the lack of security they feel in general. So I think the solution would be societal, as Yasadora mentioned, in the sense that it would also require a high level of trust by people in public institutions. And when we talk about regulation, especially in Brazil, we lack rules on the use of technology and data collection for public security purposes. Not that these private companies actually provide public security: they are private solutions, so they are not exactly providing public security, but when we ask what they are doing, they can invoke the argument of public security. So it’s a tricky regulatory scenario, and in Brazil we still have a lot to develop in this sense; there is plenty of room for more legal guarantees around it. Awareness should also be raised, so that the people who acquire these solutions are informed about the risks, about the grounds on which these companies can share data with public authorities, and about who might have access to it. And well, I think that’s it. I’m not sure I added much to the discussion, but I would like to thank you all for coming, especially given the time zones, which I know weren’t good for everyone. Thank you so much, and that’s it.

Moderator:
Thank you, everyone. Thank you to our speakers for all their contributions and for joining us today, and thank you to everyone who was here today, both in person and online, and who made excellent contributions. I hope you continue to have a great IGF. Thank you.

Speaker statistics

Speaker             Speech speed            Speech length   Speech time
Audience            140 words per minute     374 words      160 secs
Barbara Simao       141 words per minute    1483 words      633 secs
Beth Kerley         172 words per minute    2770 words      967 secs
Moderator           142 words per minute    1085 words      460 secs
Swati Punia         182 words per minute    2570 words      846 secs
Yasadora Cordova    128 words per minute    1156 words      543 secs