Day 0 Event #92: Eyes on the Watchers, Challenging the Rise of Police Facial Recognition
23 Jun 2025 15:15h - 16:00h
Session at a glance
Summary
This discussion focused on facial recognition technology (FRT) used by police forces and its impact on civil liberties, presented by the International Network of Civil Liberties Organizations (INCLO). The speakers outlined how FRT works by comparing facial templates from images against reference databases, emphasizing that it is a probabilistic technology prone to errors and biases. INCLO developed 18 principles to govern police use of FRT after observing widespread problems across their 17 member organizations in different countries.
The presentation highlighted several concerning real-world cases demonstrating FRT’s dangers. Robert Williams from Detroit was wrongfully arrested after being misidentified as the ninth most likely match by an algorithm, despite two other algorithms failing to identify him. The speakers noted that documented cases of misidentification disproportionately affect Black individuals, and that the US Federal Trade Commission banned the retail chain Rite Aid from using FRT after it made thousands of wrongful accusations between 2012 and 2020.
Three detailed case studies illustrated the principles’ importance. In Argentina, CELS successfully challenged Buenos Aires’ FRT system in court, revealing that police had illegally accessed biometric data of over seven million people while claiming to search only for 30,000 fugitives. The court found the system unconstitutional due to lack of oversight, impact assessments, and public consultation. Hungary’s recent case demonstrated FRT’s weaponization against civil liberties, where the government banned Pride parades and threatened to use FRT to identify participants, creating a chilling effect on freedom of assembly.
The discussion concluded that these cases validate INCLO’s principles, which call for legal frameworks, impact assessments, public consultation, judicial authorization, and independent oversight to protect fundamental rights while acknowledging that some organizations advocate for complete bans on police FRT use.
Keypoints
**Major Discussion Points:**
– **INCLO’s 18 Principles for Police Use of Facial Recognition Technology**: The International Network of Civil Liberties Organizations developed comprehensive principles to mitigate harms from police FRT use, including requirements for legal basis, impact assessments, public consultation, independent oversight, and prohibition of live FRT systems.
– **Technical Limitations and Discriminatory Impacts of FRT**: Discussion of how facial recognition is a probabilistic technology prone to false positives/negatives, with documented cases of wrongful arrests disproportionately affecting Black individuals, and the arbitrary nature of algorithmic matching systems.
– **Argentina Case Study – Systematic Abuse of FRT Systems**: Detailed examination of Buenos Aires’ facial recognition system that was supposed to target only fugitives but illegally accessed biometric data of over 7 million people, leading to a court ruling the system unconstitutional due to lack of oversight and legal compliance.
– **Hungary’s Weaponization of FRT Against LGBTQ+ Rights**: Analysis of how the Hungarian government banned Pride events and expanded FRT use to identify participants in “banned” assemblies, demonstrating how facial recognition can be deliberately used to suppress freedom of assembly and peaceful protest.
– **Community Engagement and Advocacy Strategies**: Discussion of the need for creative grassroots education and awareness campaigns to inform the public about FRT risks, since many people are unaware these systems exist or understand their implications.
**Overall Purpose:**
The discussion aimed to present INCLO’s newly developed principles for regulating police use of facial recognition technology, using real-world case studies from Argentina and Hungary to demonstrate both the urgent need for such safeguards and the severe consequences when proper oversight and legal frameworks are absent.
**Overall Tone:**
The tone was serious and urgent throughout, with speakers presenting factual, evidence-based concerns about facial recognition technology’s impact on human rights. The tone became particularly grave when discussing the Hungary case, highlighting the immediate threat to LGBTQ+ rights and freedom of assembly. While maintaining an academic and professional demeanor, there was an underlying sense of alarm about the rapid deployment of these technologies without adequate safeguards.
Speakers
– **Olga Cronin**: Senior policy officer at the Irish Council for Civil Liberties, member of INCLO (International Network of Civil Liberties Organizations)
– **Tomas Ignacio Griffa**: Lawyer at the Centro de Estudios Legales y Sociales (CELS) in Argentina, also an INCLO member
– **Adam Remport**: Lawyer at the Hungarian Civil Liberties Union, also a member of INCLO
– **Audience**: Multiple audience members including Pietra from Brazil who is part of a project doing community activations about facial recognition in police use
– **June Beck**: Representative from Youth for Privacy
– **MODERATOR**: Workshop moderator (role/title not specified)
**Additional speakers:**
– **Victor Saavedra**: INCLO’s technologist (mentioned as joining online but no direct quotes in transcript)
– **Timalay N’Ojo**: Program manager of INCLO Surveillance and Digital Rights Pillar of Work, based at the Canadian Civil Liberties Association in Toronto (mentioned as joining online but no direct quotes in transcript)
Full session report
# INCLO Workshop on Facial Recognition Technology and Civil Liberties
## Executive Summary
This workshop at the Internet Governance Forum, presented by the International Network of Civil Liberties Organizations (INCLO), examined the threats that facial recognition technology (FRT) poses to fundamental human rights. The discussion featured presentations from civil liberties lawyers across three jurisdictions—Ireland, Argentina, and Hungary—who demonstrated how FRT systems are being abused by law enforcement agencies worldwide. The speakers presented INCLO’s newly developed 18 principles for governing police use of FRT, supported by case studies from Argentina and Hungary that illustrated both the urgent need for such safeguards and the consequences when proper oversight is absent.
## INCLO Network and Participants
Olga Cronin, Senior Policy Officer at the Irish Council for Civil Liberties, opened by introducing INCLO’s global network of 17 member organizations, including the ACLU in the United States, the Egyptian Initiative for Personal Rights, KontraS in Indonesia, CELS in Argentina, and the Hungarian Civil Liberties Union. The workshop included both in-person and online participants, with Victor Saavedra and Timalay N’Ojo joining virtually.
## Technical Foundation and Problems of Facial Recognition Technology
### How Facial Recognition Operates
Cronin explained that facial recognition is a biometric technology using artificial intelligence to identify individuals through facial features. The system creates mathematical representations from images, which are compared against reference databases. Crucially, FRT is fundamentally probabilistic rather than definitive, relying on threshold values that create trade-offs between false positive and false negative rates.
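The pipeline described above (a probe template scored against every entry in a reference database, with a tunable threshold deciding what counts as a candidate) can be sketched in a few lines of code. This is a minimal illustration only, not any vendor's actual algorithm: real systems derive high-dimensional embeddings from neural networks, and all names, vectors, and threshold values below are invented.

```python
import math


def cosine_similarity(a, b):
    """Cosine similarity between two face templates (embedding vectors)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


def match_candidates(probe, reference_db, threshold=0.80):
    """Rank reference identities by similarity to the probe template.

    Only identities scoring at or above `threshold` are returned,
    ordered from most to least similar: the "candidate list" with
    similarity scores that an operator would ultimately see.
    """
    scored = [(identity, cosine_similarity(probe, template))
              for identity, template in reference_db.items()]
    return sorted(
        [(ident, score) for ident, score in scored if score >= threshold],
        key=lambda pair: pair[1],
        reverse=True,
    )


# Toy 3-dimensional "templates" stand in for real face embeddings.
reference_db = {
    "person_A": [0.9, 0.2, 0.3],
    "person_B": [0.2, 0.8, 0.5],
    "person_C": [0.88, 0.15, 0.42],
}
probe = [0.87, 0.12, 0.41]  # template extracted from the probe image

# Lowering the threshold returns more candidates (more false positives);
# raising it returns fewer (more false negatives). No setting removes both.
print(match_candidates(probe, reference_db, threshold=0.95))
print(match_candidates(probe, reference_db, threshold=0.995))
```

Running the sketch with the two thresholds makes the trade-off concrete: the looser setting surfaces two plausible candidates, the stricter one only a single match, and neither output says anything definitive about identity.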
### The Robert Williams Case
The arbitrary nature of FRT was illustrated through the Detroit case of Robert Williams, who was wrongfully arrested after an algorithm identified him as the ninth most likely match for a shoplifting incident. However, two other algorithms were also run and produced different results: one returned 243 candidates that did not include Williams, and another returned no results at all. Despite these contradictory outputs, Williams was still arrested, demonstrating what Cronin called “the arbitrariness of this and it’s not this silver bullet solution that it’s often presented to be.”
### Documented Bias and the Rite Aid Case
The workshop highlighted that documented misidentification cases disproportionately affect Black individuals. Cronin referenced Rite Aid, a retail chain banned from using facial recognition after making thousands of wrongful accusations between 2012 and 2020, demonstrating the systemic nature of these problems.
## INCLO’s 18 Principles Framework
### Development and Core Requirements
INCLO developed 18 principles based on experiences across member organizations in different countries. The principles establish fundamental requirements including: sufficient legal basis through proper legislative processes, prohibition on using FRT to identify protesters or collect information on peaceful assemblies, and mandatory fundamental rights impact assessments prior to implementation.
A critical component is the clear prohibition on live FRT systems, which Cronin described as “a dangerous red line.” Live FRT involves real-time identification in public spaces, creating unprecedented mass surveillance capability.
### Oversight and Accountability
The principles mandate independent oversight bodies with robust monitoring powers and mandatory annual reporting. They also require comprehensive documentation of FRT use, including detailed records of deployments, database searches, and results obtained.
Cronin acknowledged jurisdictional differences, noting that while many INCLO members would prefer complete bans on police FRT use, “we know that that fight has been lost in certain jurisdictions,” necessitating strong safeguards where prohibition isn’t achievable.
## Argentina Case Study: Systematic Abuse
### Background and Scope Creep
Tomas Ignacio Griffa from the Centro de Estudios Legales y Sociales (CELS) presented Buenos Aires’ facial recognition system, implemented in 2019. Although the system was claimed to target only 30,000 fugitives from justice, court proceedings revealed it had actually been used to make consultations about more than seven million people, with over nine million total consultations recorded.
This massive discrepancy demonstrated that “the Buenos Aires police and perhaps other offices were accessing this biometric data for other purposes, entirely different from searching for fugitives.” The system drew on the CONARC fugitive database and the RENAPER identity database, and biometric data was accessed far beyond the authorized fugitive list.
### Legal Violations and Constitutional Ruling
The system operated without proper legal authorization, lacked oversight mechanisms, and had no procedures for documenting or controlling access. Information was manually deleted, preventing audit trails. The Argentine court ultimately ruled the system unconstitutional, finding violations of fundamental rights and legal requirements.
An ongoing issue involves the government’s refusal to disclose technical details, claiming trade secrets, which prevents proper assessment of bias and discrimination in the system.
## Hungary Case Study: Targeting LGBTQ+ Communities
### Political Context and Deliberate Weaponization
Adam Remport from the Hungarian Civil Liberties Union described how Hungary’s FRT system, existing since 2016, was weaponized against LGBTQ+ communities. After passing legislation banning “LGBTQ+ propaganda,” the government banned Budapest’s Pride parade and expanded FRT use to cover all petty offences.
The government actively communicated that participants in banned assemblies would be identified through facial recognition and fined, creating a deliberate chilling effect. As Remport explained, FRT was “actively used to discourage people from attending demonstrations.”
### Lack of Transparency as a Weapon
Remport identified how “the lack of transparency, the lack of knowledge that people have of what is going to actually happen to them is also as discouraging as the outright threats of using FRT.” This strategic opacity creates self-censorship and suppresses democratic participation.
He noted that public awareness was minimal because “people never really cared about FRT, because they didn’t actually know that it existed, precisely because of the lack of communication on the government side.” By the time awareness emerged, “the system already exists, with the rules that we have now, and which can be abused by the police and the government.”
## Human Rights Implications
### Multiple Rights Affected
Speakers emphasized that FRT affects multiple human rights simultaneously: human dignity, privacy, freedom of expression, peaceful assembly, equality, and due process. Cronin described how the technology turns people into “walking license plates,” creating unprecedented tracking capabilities.
### Targeting Marginalised Communities
A recurring theme was FRT’s systematic use against marginalised communities. Cronin noted the technology is being used against Palestinians, Uyghur Muslims, and protesters worldwide, while case studies demonstrated targeting of Black individuals and LGBTQ+ communities.
## Community Engagement and Advocacy
### Public Awareness Challenges
Speakers identified lack of public awareness as a significant challenge. The strategic use of opacity prevents communities from understanding surveillance systems affecting them.
### Creative Approaches
In response to Pietra from Brazil’s question about community activation regarding facial recognition in police use, Cronin emphasized the importance of creative grassroots approaches through local artists and community organizations for building public awareness and resistance.
### Jurisdictional Variations
Speakers acknowledged different jurisdictions require different advocacy strategies. Some organizations advocate for complete bans, others focus on strong regulatory frameworks where prohibition isn’t politically feasible.
## Audience Questions and Emerging Issues
June Beck from Youth for Privacy raised concerns about laws banning face masks in public spaces as responses to citizens protecting themselves from FRT surveillance, highlighting the “arms race” between surveillance technology and privacy protection measures.
Questions about effective community education strategies revealed ongoing uncertainty about building public awareness and resistance to FRT deployment.
## International Precedents
Cronin mentioned the Bridges case in the UK, where Liberty successfully challenged South Wales Police’s use of automatic facial recognition, demonstrating that legal challenges can succeed when proper procedures aren’t followed.
## Conclusions
The workshop demonstrated that FRT poses serious threats to fundamental human rights across diverse jurisdictions. The case studies from Argentina and Hungary validated INCLO’s 18 principles by showing real-world consequences when safeguards are absent. Success in Argentina and ongoing resistance in Hungary provide models for advocacy strategies, while INCLO’s principles offer frameworks for ensuring any FRT deployment respects basic human rights and democratic values.
The speakers conveyed urgency about addressing FRT deployment before systems become further entrenched, emphasizing that coordinated civil society action can achieve meaningful victories in protecting democratic freedoms.
Session transcript
Olga Cronin: Hi everyone and thanks a million for joining us here today. INCLO is very happy to be here and very grateful to the organizers of IGF. INCLO stands for the International Network of Civil Liberties Organizations and it’s a network of 17 national civil liberties and human rights organizations worldwide, with member organizations across the global north and south that work together to promote fundamental rights and freedoms. I won’t mention all 17 members, to save time, but they include the ACLU in the US, the Egyptian Initiative for Personal Rights in Egypt, KontraS in Indonesia, the Association for Civil Rights in Israel and Liberty in the UK, and we welcome two new members just this year, Al-Haq based in the West Bank and Conectas in Brazil. We also have member organizations in Ireland, Hungary and Argentina, which is why we are here today. My name is Olga Cronin, I’m a senior policy officer at the Irish Council for Civil Liberties, which is a member of INCLO. Adam Remport there, on my far right, is a lawyer at the Hungarian Civil Liberties Union, also a member of INCLO, and on my right is Manuel Tufro, a lawyer at the, excuse my Spanish, Centro de Estudios Legales y Sociales in Argentina, otherwise known as CELS, also an INCLO member. We are also joined online by INCLO’s technologist Victor Saavedra and Timalay N’Ojo, the program manager of INCLO’s Surveillance and Digital Rights Pillar of Work, who is based at the Canadian Civil Liberties Association in Toronto. Most people in this room probably already know what FRT is, but just very, very briefly: facial recognition is a biometric technology and it uses artificial intelligence to try and identify individuals through their facial features.
Generally speaking, FRT works by taking a face print or biometric template of a person’s face from an image, which could be sourced from CCTV or social media or body-worn cameras, and comparing that template of an unknown person against a database of stored face prints or biometric templates of people whose identity is known. The image of the unknown person would generally be called a probe image, and the database of stored biometric templates of known people would generally be called a reference database. If you’re wondering what kind of reference databases of stored biometric facial templates are used by police, you can think of passport databases or driver’s license databases or police mugshot databases. So these systems are built on the processing of people’s unique biometric facial data, the unique measurements of your face; you can compare it to DNA or iris scans or your fingerprints. Very quickly, there are three points I’d like to make about FRT: the live and retrospective use of FRT, the threshold values that are fixed for probable matches, and the fact that it’s a probabilistic technology. Real-time or live facial recognition involves comparing a live camera feed of faces against a predetermined watch list to find a possible match that would generate an alert for police to act upon. Retrospective basically means comparing still images of faces of unknown people against a reference database to try and identify that person. Now, the European Court of Human Rights and the Court of Justice of the European Union have viewed live and real-time use of FRT as more invasive than retrospective use, but it should be said that tracking a person’s movements over a significant length of time can be as invasive, if not more invasive, than one instance of real-time identification. For an FRT system to work there’s a threshold value fixed to determine when the software will indicate that a match or a possible match has occurred.
If this threshold is fixed too low or too high, it can respectively create a high false positive rate or a high false negative rate. There is no single threshold that eliminates all errors. So when you think about what a police officer will get in their hand afterwards, if they use FRT, they will essentially get a list of potential candidates: person A with a percentage score next to them, a similarity score; person B with another similarity score. How long this list is could be anyone’s guess, because it largely depends on the reference database and a number of other factors. Just very quickly, I’ve included this picture of a man called Robert Williams from Detroit. This is what’s called an investigative lead report from the Michigan State Police in respect of Robert Williams, a father of two who was wrongfully arrested and detained after he was misidentified as a shoplifter by FRT in January 2020. We could do a whole session on Robert’s case, but I just thought it was interesting to show the probe image that was used in his case. You can see it there, it’s a CCTV still, and the picture on the right is Robert’s driver’s license picture. You’ll also see, forgive the slide, it’s just popped over different fonts, apologies, but basically it’s important to note that Robert was arrested after an algorithm identified him as the ninth most likely match for the probe image, but there were two other algorithms run. One returned 243 candidates, Robert wasn’t on that list, and another returned no results at all, and yet he was still arrested and detained. So really the point of this is just to show the arbitrariness of this, and it’s not this silver bullet solution that it’s often presented to be. And there are an increasing number of people who have been wrongly accused due to FRT, and you’ll notice that all the people in these images are people who are Black.
They are all from the States; Sean Thompson and Sarah, not her real name, are from the UK, and there are increasing numbers of these misidentifications happening all the time. In 2023 the US Federal Trade Commission banned the retail pharmacy chain Rite Aid from using FRT in shops because, between 2012 and 2020, thousands of people were wrongfully accused of being shoplifters and told to leave stores, predominantly people who were Black, and these were all misidentifications. So with FRT there’s an immediate danger of misidentification, it’s unreliable, it has this bias and discriminatory aspect, but there are also the larger and longer-term consequences, and that is this mass surveillance concern. FRT gives police a seismic shift in this kind of surveillance power, it does turn us into walking license plates and it tilts that power dynamic further into the hands of police. So, you know, we know and have heard of the use of FRT against Palestinians, we know and have heard of the use of FRT against Uyghur Muslims and protesters in Russia, but the most recent situation regarding the use of FRT, that’s been in the news at least, is the use of FRT this weekend at Pride in Hungary, which Adam will talk to in a bit. This is just a brief slide to outline the different human rights that are affected by FRT at a minimum: the right to dignity, privacy, freedom of expression, peaceful assembly and association, equality and non-discrimination, rights of people with disabilities, the presumption of innocence, the right to an effective remedy and the right to fair trial and due process. In INCLO, as I said, we have members in 17 jurisdictions and, over the last number of years, since we brought out a report about the emerging issues with FRT in 2021, we could see that this is becoming a significant issue.
We knew about the biometric database of Palestinians, and we saw our member organization in Russia bring a case to the European Court of Human Rights over Russia’s use of FRT against protesters. There have been wrongful arrests in Argentina, which my friend Tomas will talk about. There was the famous Bridges case in the UK, the Clearview AI scandal in Canada, all of these various aspects, and essentially we stood back and we thought: in many of these jurisdictions there’s no legislation to underpin this use of FRT. Different jurisdictions have different data protection rights or privacy rights, or perhaps none at all, and essentially we could see how patchwork it was. Different organisations within our membership were calling for different things, some were calling for bans, some were calling for moratoriums, some were calling for legislation, and so what we decided to do was create a set of principles for police use of FRT in the hope that it could mitigate some of these harms.
I won’t stay too long on it, but basically our methodology was this: we created a steering group within the network, we met throughout, we agreed what information we needed, we surveyed our members to find out what information is actually available in their jurisdictions, we agreed on the harms and risks, and we looked at the cases that were coming through and at media stories as well, because not everything ends up in court. Then we agreed upon a set of questions that we feel should always be asked when it comes to police use of FRT, and essentially the principles are our attempt at answering those questions. We did have some great expert feedback from a number of experts, academics and otherwise, and we did that virtually and in person. Essentially these are the principles. I don’t want to take up all the time, but there are 18 principles, and the first principle is about a legal basis. Essentially what we’re saying here is that any interference with a right (and FRT interferes with many rights, as I mentioned earlier) must have a legal basis, and that legal basis must be of sufficient quality to protect against arbitrary interferences. We say that police cannot use FRT unless there is a legal basis. We also say that it should never be used in certain circumstances, and that includes that it should not be used to identify protesters or collect information on people attending peaceful assemblies, which is very pertinent to what Adam is going to talk about. The second principle concerns a mandatory fundamental rights impact assessment: here we’re saying that the police need to carry out a series of impact assessments with respect to all fundamental rights prior to any new use of FRT, and that these assessments must include an assessment of the strict necessity and proportionality of the FRT use.
We have copies of the principles here if anyone wishes to go through them in more detail. They are quite detailed, so I won’t go into each of them, but obviously those assessments, we’re saying, must explicitly outline the specific parameters of use: who will use it, who it will be used against, where it will be used, why it will be used and how it will be used, the rights impacted, the nature and extent of the risks, how those risks will be mitigated, a demonstrated justification for how and why the benefits of the deployment will outweigh the rights impacts, and the remedy available to someone who is either misidentified or whose biometric data was processed when it should not have been, which will speak to Tomas’s point in a minute. Principle three is about the fundamental rights impact assessments that I just mentioned having to be independent of the vendor assessment. It’s not enough for a vendor to say that this is X and this is Y and everything is OK, and I’d like to mention here the Bridges case, the Court of Appeal case in the UK, which our colleagues at Liberty took, because in that case the Court of Appeal held that the public sector equality duty under the Equality Act there requires public authorities to give regard to whether a policy could have a discriminatory impact. Essentially in that case it was held that the South Wales Police had not taken reasonable steps to make inquiries as to whether the FRT algorithm the police were using risked bias on racial or sex grounds. And the court actually heard from a witness who was employed by a company specialising in FRT, and he said that these kinds of details are commercially sensitive and cannot be released, and we hear this a lot.
But in the end the court held that while that was understandable, it wasn’t good enough, and it determined the police never sought to satisfy themselves, either directly or by way of independent verification, that the software didn’t have an unacceptable bias. Principle four is no acquisition or deployment of any new FRT without a guarantee of future independence from the vendor. This is about vendor lock-in, the risk that a customer would not be able to transition to another vendor. Principle five says that all versions of all assessments must be made public before deployment. Principle six is about the obligation of public consultation: here we’re saying that before any policing authority deploys FRT it must hold meaningful public consultation. Principle seven, authorities must inform the public how probe images are used in FRT operations. Principle eight is about the technical specifications of any FRT system and how they must be made public before any deployment. Principle nine is that live FRT should be prohibited. We believe that live FRT is just too dangerous and should be banned; it is a red line. But as I said before, retrospective FRT can be just as dangerous. Principle ten is about mandatory prior judicial authorisation. Eleven is about record of use, and here we’re saying that the police must document each and every FRT search performed and provide this documentation to the oversight body. I haven’t mentioned it yet, but principle sixteen provides for an independent oversight body. Principle twelve ensures that an FRT result alone would not be a sufficient basis for questioning. Principle thirteen is the obligation to disclose: there should be mandatory disclosure of the details of the FRT operation applied against individuals. Principle fourteen, any FRT misidentification of a person must be reported, and principle fifteen provides for mandatory annual reporting by authorities of those misidentifications.
Principle sixteen is the independent oversight body that I mentioned before. Under principle seventeen, that oversight body must publish annual reports. Principle eighteen is that the impact assessment must be made available to the oversight body before the system is deployed. I need to move on very quickly to hand this over to Tomas, but basically we hope that these principles will both help reduce FRT harms and empower civil society and the general population to step forward, ask the right questions, push back and advocate for safeguards with a clear understanding of these technologies. We hope that the information can be used to voice our opposition, but also as an advocacy tool when debating and discussing FRT with law and policy makers. So for now I will pass it over to Tomas, who can speak to the situation in Argentina and how it met with the principles.
Tomas Ignacio Griffa: Thank you very much Olga, hello everyone. I’m going to be talking a little bit about our experience in Argentina at CELS regarding FRT. We’ve been working since 2019 on litigation against the implementation of facial recognition technology in the city of Buenos Aires. I think this case provides a very interesting example of the importance of the ideas behind the principles that Olga was explaining just a moment ago. Very briefly, I’m going to talk about how the facial recognition system in the city of Buenos Aires works and what its legal framework looks like; what the process in which we questioned the constitutionality of the system was like; the principles that were set forth in the ruling by the local judges; and finally, how all this highlights the relevance of the principles that we were talking about. First, regarding the system: the fugitive facial recognition system in the city of Buenos Aires, or Sistema de Reconocimiento Facial de Prófugos in Spanish, was implemented by a ministerial resolution in April 2019. According to the resolution, the system was to be employed exclusively to look for fugitives, that is to say people with pending arrest warrants, and exceptionally for other tasks specifically mandated by judges in individual cases. The system worked with the National Fugitive Database, CONARC in Spanish, which provided the identities of the people that had to be searched for, that is to say the fugitives, and with the National Identity Database, the Registro Nacional de las Personas or RENAPER in Spanish, which was supposed to provide the biometric data regarding the people that had to be searched for, the pictures of these fugitives. The system was operated by the local police, and in 2020 the local legislative branch sanctioned a statute that provided a legal basis for the system.
So, regarding the case: it was a constitutional protection procedure, or amparo in Spanish. It was started by the Argentinian Observatory of Informatics Rights, another Argentine NGO, and CELS also took part in the case. The case started with a focus on what research on facial recognition technology around the world has repeatedly shown, that is to say, the risk of mistakes and wrongful identifications, racial and gender biases, impacts on the right to privacy, and so on. However, as the judge started gathering information about the system, it became quite clear that there was another big problem, which was its practical implementation. As I said, the facial recognition system was intended to work by cross-referencing data between a national database of fugitives and wanted people, which consists of some 30,000 names, and the biometric data gathered by the National Identity Database. So the National Identity Database was supposed to provide the biometric data on those 30,000 people. However, when the judge asked the National Identity Database how many individual consultations the government of the city of Buenos Aires had made, it turned out that the government had made more than nine million consultations regarding more than seven million people. So clearly the Buenos Aires police, and perhaps other offices, were accessing this biometric data for other purposes, entirely different from searching for fugitives, and to this day we do not know exactly how and why this data was accessed. During the trial, a group of experts performed an audit of the system. They found that thousands of people who were not fugitives had been searched for by the facial recognition system without any legal basis.
They also found that information regarding the use of the system had been manually deleted in such a way that it was impossible to recover, and that it was impossible to trace which specific public officers had operated the system. With all this, the local judge ruled that the facial recognition system employed by the city of Buenos Aires was unconstitutional. She found that the system had been implemented without complying with the legal provisions for the protection of the constitutional rights of citizens. In the ruling she also detailed that the legislative commission that was supposed to oversee how the system worked had never been created; that the other local body, the Defensoría del Pueblo in Spanish, which was supposed to audit the system as well, was not provided with the information it needed to perform this task; that there were no prior studies to ascertain the impact of the system on human rights; and that there were no instances of public participation prior to the introduction of the system. The ruling also explained that, as the court-appointed experts showed, it was proven that the system was illegally employed to search for people who did not have pending arrest warrants, even though, as I said before, local statutes provided that this was the only way the system could be employed. The ruling also held that local authorities had illegally accessed the biometric data of millions of people under the guise of employing this system. The local chamber of appeals affirmed this decision and also added that the implementation of the system had to be preceded by a test performed by experts to ascertain whether the software has a differential impact on people based on their race or gender. And finally, very briefly, I'm going to talk about the latest developments in the case.
This order to perform a test to ascertain whether the system has a differential impact on people based on race or gender is still being carried out to this day. The government wanted to do a sort of black-box test by selecting a number of people and testing the system on them. Our position is that a test of this kind is not enough, and that it is necessary for the government to disclose the technical details of the software and the datasets with which the software was trained. The government's position is that this information is a trade secret belonging to the company that provides the software, so this debate is still ongoing. Finally, going back to the principles: the case predates the principles, as it started in 2019, but I think it is a very good example of the relevance of the ideas behind the principles and the possible consequences of ignoring them. The serious irregularities that the judge found in the implementation of the facial recognition system are, we could say, the exact opposite of what the principles stand for. So, very briefly, before giving the floor to Adam: thousands of people were searched for using the system without any legal basis, directly against the ideas set forth in principle number one. The system was implemented without any prior assessment of its impact on fundamental rights, which brings our attention to principle number two. There were supposed to be two oversight bodies according to the legal framework of the facial recognition system. This looked great in theory; however, in practice, one of them was never even created and the other was not provided with the information it required to perform its function. This obviously relates to principle 16. No public consultation took place before introducing and employing the facial recognition system, which of course goes against principle number six.
The use of the system was not properly documented, information was manually deleted and could not be recovered, and it was not possible to tell which public officers had performed each operation. This, of course, relates to principle 11. And the latest developments that I was talking about, regarding how the test ordered by the Chamber of Appeals will be carried out, highlight the importance of being able to access technical information regarding the system, such as the source code, the data employed to train the algorithm, and so on. This relates to principles 8 and 13. Thank you very much.
Olga Cronin: Thanks a million, Tomas. I think it's safe to say that what you just described is exactly that: had the principles been known about and complied with beforehand, it could have been a very different scenario. It shows how things can go very wrong. Speaking of how things can go wrong, we're now going to turn to Hungary, and Adam is going to talk about the recent legislative change there that has effectively banned Pride this weekend and also allows the police to use FRT to identify people who have defied that ban. So over to you, Adam.
Adam Remport: Thank you very much for having me. I would like to present a case which demonstrates the practical problems with facial recognition, the ones that are often formulated quite abstractly but which have very real-life consequences: the case of the Hungarian government essentially banning the Pride parade. The background of the case is that Pride is not a new event; it has been held since 1995. But in February of this year the Prime Minister said that it would be banned because that was, in the government's view, necessary for child protection. So new laws were enacted. They essentially banned any assembly, quote-unquote, "displaying or promoting homosexuality", and another law made it possible for facial recognition technology to be deployed for all petty offenses. I will tell you more about what petty offenses are in this context. The legal background of the case is that Hungary has had a facial recognition technology act since 2016. It established a facial recognition database which consists of pictures from IDs, passports and other documents with facial images on them. Specific authorities can request facial analysis in certain specified procedures from the Hungarian Institute of Forensic Sciences, which is responsible for operating the facial recognition system. A novelty of Hungarian FRT use was that in 2024, FRT was made available for use by the police in so-called infraction procedures, and in 2025 this was extended to all infraction procedures. The reason why this is important is that participating in an event that has previously been banned by the police is an infraction. So if demonstrators gather at the Pride event after it has been banned by the police, they would collectively commit infractions, probably in the tens of thousands. So let's find out how the FRT system actually works in this scenario.
The police are known to record demonstrations, and they can use CCTV and other available sources to gather photographs or images of demonstrations. If they find that an infraction is happening, they can initiate an infraction procedure and, in the course of that procedure, send the facial images to the central FRT system, which runs an algorithm and identifies the closest five matches. These are returned to the police, and it is the police officer operating the system who has to decide whether there is a match or not. I have to point out that this system has never been used en masse, so it has never been used against tens of thousands of people, and it is not known how the system will technically handle, or how the judiciary and the police will operationally handle, this kind of situation. So what can we tell about the case in light of the FRT principles? Well, the first principle is that it must be ensured that FRT is not used to identify protesters or collect information on people attending peaceful assemblies. This first principle is immediately violated by this kind of FRT use against peaceful demonstrators. Another principle is that certain uses are banned outright, such as using an FRT system on live or recorded moving images or video data. We can see why it is a problem that the police record entire demonstrations. They don't even necessarily have to follow a demonstration with live facial recognition. For the chilling effect to take place, it is enough to record everyone taking part in the demonstration, to later systematically find everyone in the police's recordings, and then to send fines to them, which is probably how this will play out in Hungary, or at least how the government plans it to play out. It is an interesting case study of the lack of transparency around facial recognition.
One of my conclusions will be that FRT in this case is used actively to discourage people from attending demonstrations, but the lack of transparency, the lack of knowledge people have about what is actually going to happen to them, is just as discouraging as the outright threats of using FRT. In the case of the Hungarian system, we can tell that there was no public consultation whatsoever before the introduction of the entire system in 2016. The introduction of facial recognition as such was done in Hungary without any public consultation and without it being communicated to the public, which means that there has been no public awareness of even the existence of the FRT system, up until now, when the situation has gotten worse. There was no consultation with the public, no data protection impact assessment, and no consultation with the data protection authority before the present broadening of the scope of FRT, which of course includes this massive crackdown on the freedom of assembly. This also violates one of the principles. It can also be said that there are no records of use, no statistics available that would tell you how the system works, how effective it is, or against which kinds of infractions it is deployed, and persons prosecuted can almost never find out whether FRT has been used against them or not. There are no impact assessments before individual uses of the system, which means that the police can simply initiate these searches without assessing the possible effects they would have on someone. This is also against the principles. There is no vendor lock-in assessment either, which is also important because the Hungarian Institute of Forensic Sciences, which operates the system, explicitly said that they are merely clients using this facial recognition algorithm, which raises the question of whether the data are being transferred to third countries or not.
And of course, since there are no risk assessments, they haven't been published either, which also goes against the principles. A lack of sufficient oversight is also what I would like to mention. There is no prior judicial authorization. This is important because, as I have told you, it is necessary to start an infraction procedure before FRT can even be deployed, and the law is not clear about how many people a single infraction procedure can be initiated against at the same time. The system has never really been used against more than three or four people at once, and it has certainly never been used on a scale of tens of thousands of people. Prior judicial authorization could act as a check on this kind of mass surveillance, if a judge could assess whether it was necessary to surveil tens of thousands of people at the same time, but that is not possible. There is no independent oversight body either, and of course there are no annual reports and no notification of the impact assessments to the oversight body, since the oversight body does not exist. So these all go against the principles in a very concrete manner, so that you can see that these principles are not just abstract rules: when they are not met, there are actual harms in real life. My conclusion would be that what we can see is a weaponization of facial recognition technology. Instead of mitigating the risks, there is a deliberate abuse of FRT's most invasive properties. Essentially, the government actively communicated that facial recognition would be used against people, that they cannot hide, because they will be found with facial recognition. It is inevitable. This of course has a massive chilling effect on the freedom of assembly, and we could also say that even the lack of transparency is, in a way, weaponized, because if there is no information on the system, it is impossible for people to calculate the risks.
This will have a chilling effect on them, because they won't know whether it is true that they will all actually be found and fined. So, I would like to conclude here. Some possible next steps are the legal avenues that can be taken, like the Law Enforcement Directive in the EU or the AI Act, and the INCLO principles can, I think, also be used in advocacy at international fora. Thank you.
Olga Cronin: Thanks, Adam. If you don't mind, I might just ask a follow-up question. Given the situation that's happening in Hungary, given it's so imminent this weekend and has got such international attention, how are people in Hungary feeling? What's the public opinion about the use of FRT? Has it changed? Did people care before, and how is it now?
Adam Remport: Well, people, I think, never really cared about FRT, because they didn't actually know that it existed, precisely because of the lack of communication on the government's side. So the situation had to become this bad and severe for people to start to even care about the problem. But now the system already exists, with the rules that we have now, and it can be abused by the police and the government. So many are concerned now, but proactive communication should have come from the government.
Olga Cronin: I just wonder if we have any questions.
Audience: Hi. Can you hear me?
Olga Cronin: Yes.
Audience: I'm from Brazil. My name is Pietra. The use of facial recognition in Brazil is growing really fast; I think it's very similar to what is happening in Argentina. But I was really shocked by the Hungarian case. I'm part of a project that is trying to do some community activation about police use of facial recognition. So I wanted to hear from you whether you've ever done anything with community activation. And I also wanted to ask whether you believe there is a way to use facial recognition, or whether you think it should be banned, because in Brazil we are discussing a lot about banning all systems that use facial recognition. So I wanted to hear what you think about it. Thank you.
Olga Cronin: Thank you. I can have a go at answering some of those questions. I think the idea of getting into communities and doing that education and awareness piece, which is what you're talking about, and maybe activating them or stirring them into taking action, is really, really important, mainly because of the same issue Adam just mentioned. People don't really understand it and don't really know about it, and people in positions of authority speak of it as a silver-bullet solution, as if it's as simple as Ctrl-F, with no problems and no issues, and they can absolutely downplay the risks. So I think you have to get creative, maybe with local artists. ICCL created a mural with a local artist in Dublin to highlight the dangers of FRT. It's also about looping in other civil society organizations who might not work in this space, and getting down to that grassroots level. You have to get imaginative: you're trying to get the word out there, so use all the tools available to you that you would use in general for communications. When it comes to a ban: INCLO, like I said, has 17 members in 17 different jurisdictions, so there are already 17 different sets of safeguards and protections in place. Some members are calling for a ban, some are calling for a moratorium, and other groups are calling for legislation. It really is specific to the jurisdiction and what's happening there. But what we do know is that the risks and the harms are present and pressing. The risk of mass surveillance, and how quickly this can be deployed against us, is clear and obvious. So from many of our perspectives in INCLO, we would call for a ban. We don't wish the police to use it.
But we know that that fight has been lost in certain jurisdictions. So this is an attempt to try and make it better, at least. I hope that helps.
June Beck: Hello, my name is June Beck from Youth for Privacy. Since we're talking about facial recognition technologies, there has also been a lot of movement to penalize wearing masks in public as an attempt to protect yourself against facial recognition technology. So I was wondering whether INCLO or any organization has thoughts, processes or discussions on how bans on face masks, for example, are in conversation with FRT. I don't wish to take over the conversation.
Olga Cronin: We're out of time, but I would say that that is happening. More and more laws are being passed to ban face masks at protests. It's on the cards in Ireland as well, and the law is changing to be more restrictive in England. It is happening, and it's impossible to see how that is not a response to the use of FRT by police, and the response of the public to cover their faces. So it's not something that we've worked on specifically yet, but it's absolutely something that we are working on individually, if you like. Thank you.
Olga Cronin: Thanks a million. Sorry, we've gone over time. We're very happy that you joined us, and we hope you enjoyed it and found it insightful. We have hard copies of the principles if you wish. Thank you very much. Goodbye.
MODERATOR: Workshop two.
Olga Cronin
Speech speed
161 words per minute
Speech length
3253 words
Speech time
1206 seconds
FRT is a biometric technology using AI to identify individuals through facial features by comparing face prints against databases
Explanation
Cronin explains that facial recognition technology works by comparing a face print or biometric template from an image (probe image) against a database of stored face prints of known people (reference database). The technology uses artificial intelligence to try and identify individuals through their unique facial features.
Evidence
Examples of reference databases include passport databases, driver’s license databases, or police mugshot databases. The system compares images from CCTV, social media, or body-worn cameras against these stored templates.
Major discussion point
Technical overview of FRT systems
Topics
Human rights | Legal and regulatory
FRT systems are probabilistic and prone to errors, with threshold values creating false positive or negative rates
Explanation
Cronin argues that FRT is not a perfect technology but rather probabilistic, meaning it provides probability scores rather than definitive matches. The threshold values set to determine matches can be problematic – if set too low they create high false positive rates, if set too high they create high false negative rates.
Evidence
Police officers receive a list of potential candidates with percentage similarity scores. There is no single threshold that eliminates all errors completely.
Major discussion point
Technical limitations and reliability issues
Topics
Human rights | Legal and regulatory
Agreed with
– Tomas Ignacio Griffa
– Adam Remport
Agreed on
FRT systems are inherently unreliable and prone to errors with serious consequences
The technology demonstrates arbitrariness, as shown by Robert Williams case where different algorithms produced different results
Explanation
Cronin uses the Robert Williams case to illustrate how arbitrary and unreliable FRT can be. Williams was wrongfully arrested despite being only the ninth most likely match, and other algorithms either didn’t include him in results or returned no results at all.
Evidence
Robert Williams was arrested after being identified as the ninth most likely match by one algorithm, but two other algorithms produced different results – one returned 243 candidates without Williams on the list, another returned no results at all.
Major discussion point
Unreliability and arbitrariness of FRT systems
Topics
Human rights | Legal and regulatory
Agreed with
– Tomas Ignacio Griffa
– Adam Remport
Agreed on
FRT systems are inherently unreliable and prone to errors with serious consequences
FRT has immediate dangers of misidentifications and bias, particularly affecting Black individuals disproportionately
Explanation
Cronin argues that FRT systems demonstrate clear bias and discrimination, with Black individuals being disproportionately affected by misidentifications. This creates immediate dangers for these communities who are wrongfully accused and face consequences.
Evidence
All the people shown in images of wrongful FRT identifications are Black individuals. The US Federal Trade Commission banned Rite Aid from using FRT in 2023 because it wrongfully accused thousands of people, predominantly Black individuals, of shoplifting between 2012 and 2020.
Major discussion point
Racial bias and discrimination in FRT
Topics
Human rights
FRT affects multiple human rights including dignity, privacy, freedom of expression, peaceful assembly, equality, and due process
Explanation
Cronin presents a comprehensive view of how FRT impacts various fundamental human rights. She argues that the technology doesn’t just affect privacy but has broader implications across multiple areas of human rights protection.
Evidence
Specific rights mentioned include: right to dignity, privacy, freedom of expression, peaceful assembly and association, equality and non-discrimination, rights of people with disabilities, presumption of innocence, right to effective remedy, and right to fair trial and due process.
Major discussion point
Comprehensive human rights impact
Topics
Human rights
Agreed with
– Tomas Ignacio Griffa
– Adam Remport
Agreed on
FRT violates multiple fundamental human rights and requires comprehensive legal safeguards
The technology enables mass surveillance and gives police seismic shift in surveillance power, turning people into “walking license plates”
Explanation
Cronin argues that beyond immediate misidentification risks, FRT creates broader long-term concerns about mass surveillance. The technology fundamentally shifts the power dynamic by giving police unprecedented surveillance capabilities over the general population.
Evidence
Examples of mass surveillance use include FRT against Palestinians, Uyghur Muslims, and protesters in Russia. The technology allows tracking people’s movements over significant lengths of time.
Major discussion point
Mass surveillance capabilities and power imbalance
Topics
Human rights | Legal and regulatory
FRT is being weaponized against marginalized groups including Palestinians, Uyghur Muslims, and protesters
Explanation
Cronin argues that FRT is not just a neutral technology but is being actively used as a tool of oppression against vulnerable and marginalized communities. This demonstrates the broader political and social implications of the technology.
Evidence
Specific examples include use of FRT against Palestinians, Uyghur Muslims, protesters in Russia, and the recent use at Pride in Hungary.
Major discussion point
Weaponization against marginalized communities
Topics
Human rights
Agreed with
– Adam Remport
Agreed on
FRT is being weaponized against marginalized communities and protesters
Any FRT use must have sufficient legal basis and should never be used to identify protesters or collect information on peaceful assemblies
Explanation
This is the first principle in INCLO’s framework, establishing that FRT use requires proper legal authorization and explicitly prohibiting its use against people exercising their right to peaceful assembly. Cronin argues this is fundamental to protecting democratic rights.
Evidence
This principle is directly relevant to the Hungary case where FRT is being used against Pride participants.
Major discussion point
Legal basis and protection of assembly rights
Topics
Human rights | Legal and regulatory
Agreed with
– Tomas Ignacio Griffa
– Adam Remport
Agreed on
FRT violates multiple fundamental human rights and requires comprehensive legal safeguards
Mandatory fundamental rights impact assessments must be conducted prior to any new FRT use
Explanation
Cronin argues that before deploying FRT, authorities must conduct comprehensive assessments of how the technology will impact fundamental rights. These assessments must include strict necessity and proportionality analysis and outline specific parameters of use.
Evidence
Assessments must explicitly outline who will use it, who it will be used against, where, why, and how it will be used, the rights impacted, the nature and extent of risks, how risks will be mitigated, and justification for why benefits outweigh rights impacts.
Major discussion point
Prior impact assessment requirements
Topics
Human rights | Legal and regulatory
Agreed with
– Tomas Ignacio Griffa
– Adam Remport
Agreed on
Lack of transparency and public consultation enables FRT abuse
Live FRT should be prohibited as it represents a dangerous red line
Explanation
Cronin takes a strong position that real-time or live facial recognition technology is too dangerous and should be completely banned. While acknowledging that retrospective FRT can also be dangerous, she argues live FRT crosses a red line that should not be crossed.
Evidence
The European Court of Human Rights and Court of Justice of the European Union view live FRT as more invasive than retrospective use.
Major discussion point
Complete prohibition of live FRT
Topics
Human rights | Legal and regulatory
Independent oversight bodies must be established with mandatory annual reporting requirements
Explanation
Cronin argues that proper oversight is essential for any FRT deployment, requiring independent bodies that can monitor use and publish regular reports. This creates accountability and transparency in the system.
Evidence
Principles 16 and 17 specifically address the need for independent oversight bodies and their obligation to publish annual reports.
Major discussion point
Independent oversight and accountability
Topics
Legal and regulatory
Creative community activation through local artists and grassroots organizations is essential for public awareness
Explanation
In response to a question about community engagement, Cronin argues that raising public awareness about FRT requires creative approaches including working with local artists and grassroots organizations. She emphasizes the need to get imaginative in communications efforts.
Evidence
ICCL created a mural with a local artist in Dublin to highlight the dangers of FRT. She suggests looping in civil society organizations who might not work in this space and getting down to grassroots level.
Major discussion point
Community engagement strategies
Topics
Sociocultural
Different jurisdictions require different approaches – some calling for bans, others for moratoriums or legislation
Explanation
Cronin acknowledges that INCLO’s 17 member organizations across different jurisdictions have varying approaches to FRT regulation. While many would prefer a complete ban, the reality is that some jurisdictions have already implemented systems, requiring different strategic approaches.
Evidence
INCLO has 17 members in 17 different jurisdictions with different existing safeguards and protections. Some groups call for bans, others for moratoriums, and others for legislation.
Major discussion point
Jurisdictional differences in regulatory approaches
Topics
Legal and regulatory
Disagreed with
– Audience
Disagreed on
Regulatory approach – ban versus regulation with safeguards
Tomas Ignacio Griffa
Speech speed
170 words per minute
Speech length
1383 words
Speech time
486 seconds
Buenos Aires implemented FRT system in 2019 supposedly only for fugitives but accessed biometric data of over 7 million people illegally
Explanation
Griffa explains that while the Buenos Aires FRT system was officially designed to search for fugitives using a database of 30,000 people, investigation revealed that authorities had actually accessed biometric data of over 7 million people through more than 9 million consultations. This massive overreach violated the system’s stated purpose and legal framework.
Evidence
The system was supposed to work with the National Fugitive Database (CONARC) of about 30,000 names, but when the judge investigated, it was discovered that the government had made over 9 million consultations regarding more than 7 million people.
Major discussion point
Massive scope creep and illegal data access
Topics
Human rights | Legal and regulatory
Agreed with
– Olga Cronin
– Adam Remport
Agreed on
FRT systems are inherently unreliable and prone to errors with serious consequences
The system was ruled unconstitutional due to lack of legal compliance, missing oversight bodies, and no human rights impact studies
Explanation
Griffa describes how the local judge found the FRT system unconstitutional because it was implemented without proper legal safeguards. The ruling highlighted that required oversight bodies were either never created or not provided with necessary information, and no prior human rights impact studies were conducted.
Evidence
The legislative commission supposed to oversee the system was never created, the Defensoría del Pueblo was not provided with information needed to audit the system, there were no previous studies on human rights impact, and no instances for public participation prior to introduction.
Major discussion point
Constitutional violations and lack of safeguards
Topics
Human rights | Legal and regulatory
Agreed with
– Olga Cronin
– Adam Remport
Agreed on
FRT violates multiple fundamental human rights and requires comprehensive legal safeguards
Thousands were searched without legal basis, information was manually deleted, and it was impossible to trace which officers operated the system
Explanation
Griffa explains that expert audits revealed systematic violations including searching people who weren’t fugitives, deliberate destruction of evidence, and lack of accountability mechanisms. This demonstrates how FRT systems can operate without proper controls or oversight.
Evidence
Expert audits found thousands of people searched without legal basis (people who were not fugitives), information regarding system use was manually deleted in a way that made it impossible to recover, and it was impossible to trace which specific public officers had operated the system.
Major discussion point
Systematic violations and evidence destruction
Topics
Human rights | Legal and regulatory
Agreed with
– Olga Cronin
– Adam Remport
Agreed on
FRT systems are inherently unreliable and prone to errors with serious consequences
Government claims technical details are trade secrets, preventing proper assessment of bias and discrimination
Explanation
Griffa describes ongoing legal battles over transparency, where the government refuses to disclose technical details of the FRT software, claiming they are trade secrets belonging to the vendor. This prevents proper assessment of whether the system has discriminatory impacts based on race or gender.
Evidence
The Chamber of Appeals ordered a test to determine if the system has differential impact based on race or gender. The government wants to do a black box test, but CELS argues it’s necessary to disclose technical details of the software and training datasets. The government claims this information is a trade secret.
Major discussion point
Transparency vs. trade secrets in bias assessment
Topics
Human rights | Legal and regulatory
Agreed with
– Olga Cronin
– Adam Remport
Agreed on
Lack of transparency and public consultation enables FRT abuse
Adam Remport
Speech speed
123 words per minute
Speech length
1493 words
Speech time
727 seconds
Hungarian government banned Pride parade and expanded FRT use to all petty offenses, enabling mass surveillance of demonstrators
Explanation
Remport explains how the Hungarian government used child protection as justification to ban Pride parades and simultaneously expanded FRT capabilities to cover all petty offenses. Since participating in a banned event constitutes a petty offense, this creates a legal framework for mass surveillance of LGBTQ+ demonstrators.
Evidence
In February, the Prime Minister said Pride would be banned for child protection. New laws banned assemblies ‘displaying or promoting homosexuality’ and made FRT available for all petty offenses. Participating in a banned event is an infraction, so demonstrators would collectively commit infractions in the tens of thousands.
Major discussion point
Legal framework enabling mass surveillance of LGBTQ+ community
Topics
Human rights
Agreed with
– Olga Cronin
Agreed on
FRT is being weaponized against marginalized communities and protesters
The system violates multiple INCLO principles by targeting peaceful protesters and lacking transparency, consultation, or oversight
Explanation
Remport systematically demonstrates how Hungary’s FRT use violates numerous INCLO principles, including the fundamental prohibition on using FRT against peaceful demonstrators, lack of public consultation, absence of impact assessments, and missing oversight mechanisms.
Evidence
Violations include: using FRT against peaceful demonstrators (violates principle 1), no public consultation before system introduction, no data protection impact assessment, no consultation with data protection authority, no records of use or statistics available, no impact assessments before individual uses, no vendor lock-in assessment, no prior judicial authorization, no independent oversight body, and no annual reports.
Major discussion point
Systematic violation of FRT principles
Topics
Human rights | Legal and regulatory
Agreed with
– Olga Cronin
– Tomas Ignacio Griffa
Agreed on
FRT violates multiple fundamental human rights and requires comprehensive legal safeguards
FRT is being deliberately weaponized with government actively communicating that participants will be found and fined
Explanation
Remport argues that rather than trying to mitigate FRT risks, the Hungarian government is deliberately exploiting the technology’s most invasive properties as a weapon against LGBTQ+ rights. The government actively threatens that facial recognition will inevitably find and punish participants.
Evidence
The government actively communicated that facial recognition would be used against people and that they cannot hide because they will be found with facial recognition – it is inevitable. This creates a massive chilling effect on freedom of assembly.
Major discussion point
Deliberate weaponization of FRT against LGBTQ+ community
Topics
Human rights
Agreed with
– Olga Cronin
Agreed on
FRT is being weaponized against marginalized communities and protesters
Lack of public awareness about FRT existence due to no consultation or communication from government
Explanation
Remport explains that the Hungarian public was largely unaware that FRT systems even existed because the government implemented them without any public consultation or communication. This lack of transparency itself becomes a tool of oppression, as people cannot assess risks or make informed decisions.
Evidence
There was no public consultation when the FRT system was introduced in 2016, and it wasn’t communicated to the public, meaning there was no public awareness of the system’s existence until the current crisis. People never cared about FRT because they didn’t know it existed.
Major discussion point
Weaponized lack of transparency
Topics
Human rights | Legal and regulatory
Agreed with
– Olga Cronin
– Tomas Ignacio Griffa
Agreed on
Lack of transparency and public consultation enables FRT abuse
Audience
Speech speed
134 words per minute
Speech length
137 words
Speech time
61 seconds
Need for community activation and discussion about whether FRT should be completely banned or regulated
Explanation
An audience member from Brazil asks about community engagement strategies and whether FRT should be completely banned or regulated. This reflects broader civil society concerns about how to effectively organize against FRT and what the ultimate policy goal should be.
Evidence
The questioner mentions being part of a project doing community activations about facial recognition in police use, and notes that in Brazil they are discussing banning all FRT systems.
Major discussion point
Community organizing strategies and policy goals
Topics
Human rights | Sociocultural
Disagreed with
– Olga Cronin
Disagreed on
Regulatory approach – ban versus regulation with safeguards
June Beck
Speech speed
181 words per minute
Speech length
206 words
Speech time
68 seconds
Growing concern about laws banning face masks at protests as response to public attempts to avoid FRT surveillance
Explanation
Beck raises the issue of how governments are responding to people’s attempts to protect themselves from FRT by wearing masks, with increasing laws that penalize mask-wearing at protests. This creates a concerning dynamic where people lose the ability to protect their privacy.
Evidence
More laws are being passed to ban face masks at protests, including proposed changes in Ireland and England making restrictions more severe.
Major discussion point
Erosion of privacy protection methods
Topics
Human rights | Legal and regulatory
MODERATOR
Speech speed
51 words per minute
Speech length
46 words
Speech time
54 seconds
Workshop session transition and organization
Explanation
The moderator announces the transition to workshop two multiple times at the end of the session. This represents the organizational structure of the conference and the need to manage multiple concurrent sessions.
Evidence
Repeated announcements of ‘Workshop two’ to signal the end of the current session and transition to the next workshop
Major discussion point
Conference organization and session management
Topics
Sociocultural
Agreements
Agreement points
FRT systems are inherently unreliable and prone to errors with serious consequences
Speakers
– Olga Cronin
– Tomas Ignacio Griffa
– Adam Remport
Arguments
FRT systems are probabilistic and prone to errors, with threshold values creating false positive or negative rates
The technology demonstrates arbitrariness, as shown by Robert Williams case where different algorithms produced different results
Buenos Aires implemented FRT system in 2019 supposedly only for fugitives but accessed biometric data of over 7 million people illegally
Thousands were searched without legal basis, information was manually deleted, and it was impossible to trace which officers operated the system
Summary
All speakers agree that FRT technology is fundamentally unreliable, with Cronin demonstrating this through the Robert Williams case, Griffa showing massive overreach in Buenos Aires, and Remport highlighting lack of transparency in Hungary’s system
Topics
Human rights | Legal and regulatory
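The unreliability argument above rests on a technical point worth making concrete: an FRT system reduces a face comparison to a similarity score checked against a threshold, and where that threshold sits trades false positives against false negatives. A minimal sketch of that trade-off, using made-up scores (all names and numbers below are illustrative assumptions, not data from the session):

```python
# Illustrative sketch (not from the session): how a match threshold on
# similarity scores trades false positives against false negatives.
# All scores below are invented for demonstration.

def error_rates(genuine, impostor, threshold):
    """Return (false_positive_rate, false_negative_rate) at a threshold.

    A score >= threshold counts as a claimed match.
    """
    false_positives = sum(1 for s in impostor if s >= threshold)
    false_negatives = sum(1 for s in genuine if s < threshold)
    return (false_positives / len(impostor), false_negatives / len(genuine))

# Hypothetical similarity scores (0 = no resemblance, 1 = identical).
genuine = [0.91, 0.85, 0.78, 0.72, 0.66]   # same-person comparisons
impostor = [0.70, 0.61, 0.55, 0.48, 0.40]  # different-person comparisons

# A permissive threshold flags innocent people; a strict one misses matches.
permissive = error_rates(genuine, impostor, 0.60)  # (0.4, 0.0)
strict = error_rates(genuine, impostor, 0.80)      # (0.0, 0.6)
print(permissive, strict)
```

There is no threshold that eliminates both error types at once, which is why the speakers describe the technology as inherently probabilistic rather than a definitive identification tool.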
FRT violates multiple fundamental human rights and requires comprehensive legal safeguards
Speakers
– Olga Cronin
– Tomas Ignacio Griffa
– Adam Remport
Arguments
FRT affects multiple human rights including dignity, privacy, freedom of expression, peaceful assembly, equality, and due process
Any FRT use must have sufficient legal basis and should never be used to identify protesters or collect information on peaceful assemblies
The system was ruled unconstitutional due to lack of legal compliance, missing oversight bodies, and no human rights impact studies
The system violates multiple INCLO principles by targeting peaceful protesters and lacking transparency, consultation, or oversight
Summary
All speakers agree that FRT has broad human rights implications and requires strong legal frameworks with proper oversight, impact assessments, and safeguards to prevent abuse
Topics
Human rights | Legal and regulatory
Lack of transparency and public consultation enables FRT abuse
Speakers
– Olga Cronin
– Tomas Ignacio Griffa
– Adam Remport
Arguments
Mandatory fundamental rights impact assessments must be conducted prior to any new FRT use
Government claims technical details are trade secrets, preventing proper assessment of bias and discrimination
Lack of public awareness about FRT existence due to no consultation or communication from government
Summary
All speakers emphasize that governments are implementing FRT systems without proper public consultation, transparency, or impact assessments, which enables systematic abuse and prevents accountability
Topics
Human rights | Legal and regulatory
FRT is being weaponized against marginalized communities and protesters
Speakers
– Olga Cronin
– Adam Remport
Arguments
FRT is being weaponized against marginalized groups including Palestinians, Uyghur Muslims, and protesters
The technology enables mass surveillance and gives police a seismic shift in surveillance power, turning people into ‘walking license plates’
Hungarian government banned Pride parade and expanded FRT use to all petty offenses, enabling mass surveillance of demonstrators
FRT is being deliberately weaponized with government actively communicating that participants will be found and fined
Summary
Both speakers agree that FRT is not a neutral technology but is being actively used as a tool of oppression against vulnerable communities, particularly LGBTQ+ individuals and political protesters
Topics
Human rights
Similar viewpoints
Both speakers emphasize the critical importance of independent oversight mechanisms for FRT systems, with Cronin advocating for this in INCLO principles and Griffa showing the consequences when such oversight is absent in Argentina
Speakers
– Olga Cronin
– Tomas Ignacio Griffa
Arguments
Independent oversight bodies must be established with mandatory annual reporting requirements
The system was ruled unconstitutional due to lack of legal compliance, missing oversight bodies, and no human rights impact studies
Topics
Legal and regulatory
Both speakers strongly oppose the use of FRT against peaceful protesters and demonstrators, viewing this as a fundamental violation of democratic rights and freedoms
Speakers
– Olga Cronin
– Adam Remport
Arguments
Any FRT use must have sufficient legal basis and should never be used to identify protesters or collect information on peaceful assemblies
The system violates multiple INCLO principles by targeting peaceful protesters and lacking transparency, consultation, or oversight
Topics
Human rights
Both acknowledge the ongoing debate about whether FRT should be completely banned or regulated, recognizing that different jurisdictions may require different approaches based on their specific circumstances
Speakers
– Olga Cronin
– Audience
Arguments
Different jurisdictions require different approaches – some calling for bans, others for moratoriums or legislation
Need for community activation and discussion about whether FRT should be completely banned or regulated
Topics
Human rights | Legal and regulatory
Unexpected consensus
Trade secrets cannot justify lack of transparency in bias assessment
Speakers
– Olga Cronin
– Tomas Ignacio Griffa
Arguments
Mandatory fundamental rights impact assessments must be conducted prior to any new FRT use
Government claims technical details are trade secrets, preventing proper assessment of bias and discrimination
Explanation
Both speakers, drawing from different legal contexts (UK Bridges case and Argentina case), reach the same conclusion that commercial trade secret claims cannot override the need for transparency in assessing discriminatory impacts of FRT systems
Topics
Human rights | Legal and regulatory
Creative community engagement is essential for FRT awareness
Speakers
– Olga Cronin
– Audience
Arguments
Creative community activation through local artists and grassroots organizations is essential for public awareness
Need for community activation and discussion about whether FRT should be completely banned or regulated
Explanation
There was unexpected consensus on the need for creative, grassroots approaches to public education about FRT, moving beyond traditional advocacy to include artistic and community-based methods
Topics
Sociocultural
Overall assessment
Summary
There is strong consensus among all speakers that FRT poses serious threats to human rights, is being systematically abused by governments, and requires comprehensive legal safeguards. All speakers agree on the technology’s unreliability, its disproportionate impact on marginalized communities, and the need for transparency and oversight.
Consensus level
Very high level of consensus with no fundamental disagreements. The speakers complement each other’s arguments with concrete examples from different jurisdictions (Ireland/UK, Argentina, Hungary) that all support the same conclusions about FRT dangers. This strong consensus strengthens the case for international coordination on FRT regulation and suggests broad civil society agreement on the need for restrictive approaches to FRT deployment.
Differences
Different viewpoints
Regulatory approach – ban versus regulation with safeguards
Speakers
– Olga Cronin
– Audience
Arguments
Different jurisdictions require different approaches – some calling for bans, others for moratoriums or legislation
Need for community activation and discussion about whether FRT should be completely banned or regulated
Summary
While Cronin acknowledges that INCLO members have varying approaches (some calling for complete bans, others for regulation), and mentions that ‘from many of our perspectives in INCLO, we would call for a ban,’ she also recognizes that ‘we know that that fight has been lost in certain jurisdictions’ requiring a pragmatic approach with safeguards. The Brazilian audience member specifically asks whether FRT should be banned completely, highlighting this strategic disagreement within the civil liberties community.
Topics
Human rights | Legal and regulatory
Unexpected differences
Overall assessment
Summary
The speakers show remarkable alignment on the fundamental problems with FRT and the need for strong protections, with only minor strategic disagreements about regulatory approaches
Disagreement level
Very low level of disagreement. The speakers are essentially presenting a unified front against current FRT practices, with their different case studies (international principles, Argentina’s legal victory, Hungary’s weaponization) all supporting the same core argument that FRT poses serious threats to human rights and requires either prohibition or very strict regulation. The only meaningful disagreement is strategic – whether to pursue complete bans or work within existing systems to implement strong safeguards. This low level of disagreement actually strengthens their collective message but may indicate a need for more diverse perspectives in the discussion to fully explore the complexities of FRT regulation.
Partial agreements
Takeaways
Key takeaways
Facial Recognition Technology (FRT) poses significant risks to human rights including privacy, freedom of assembly, and equality, with documented bias against Black individuals and potential for mass surveillance
INCLO developed 18 principles for police use of FRT to mitigate harms, including requirements for legal basis, impact assessments, independent oversight, and prohibition of live FRT
Real-world case studies from Argentina and Hungary demonstrate how FRT can be misused when proper safeguards are absent – Argentina’s system illegally accessed 7+ million people’s data while Hungary weaponized FRT against LGBTQ+ Pride participants
FRT systems are inherently unreliable and probabilistic, prone to false positives/negatives, with different algorithms producing contradictory results as shown in the Robert Williams case
Public awareness and community engagement are crucial since many people are unaware FRT systems exist or how they operate in their jurisdictions
The technology enables governments to weaponize surveillance against marginalized groups and peaceful protesters, creating chilling effects on freedom of assembly
Resolutions and action items
INCLO created and published 18 principles for police use of FRT as an advocacy tool for civil society organizations
Legal challenges can be pursued through EU Law Enforcement Directive and AI Act provisions
Community activation through creative means like local artists and grassroots organizations should be implemented to raise public awareness
Hard copies of INCLO principles were made available to workshop participants for further distribution and use
Unresolved issues
Whether FRT should be completely banned versus regulated varies by jurisdiction – no consensus reached on universal approach
Technical details of FRT systems remain hidden behind trade secret claims, preventing proper bias assessment
How to effectively handle mass deployment of FRT against large groups (tens of thousands) remains technically and operationally unclear
The relationship between laws banning face masks at protests and FRT deployment needs further examination
Ongoing legal battle in Argentina over government’s refusal to disclose technical specifications claiming trade secrets
Long-term effectiveness of community engagement strategies for FRT awareness remains to be determined
Suggested compromises
INCLO’s 18 principles represent a compromise approach – recognizing that complete bans may not be achievable in all jurisdictions while establishing minimum safeguards
Allowing retrospective FRT use while prohibiting live FRT as a middle-ground approach, while noting that retrospective use can be equally invasive
Requiring independent technical assessments rather than relying solely on vendor claims about bias and accuracy
Thought provoking comments
Robert was arrested after an algorithm identified him as the ninth most likely match for the probe image but there were two other algorithms run. One returned 243 candidates, Robert wasn’t on that list, and another returned no results at all and yet he was still arrested and detained. So really the point of this is just to show the arbitrariness of this and it’s not this silver bullet solution that it’s often presented to be.
Speaker
Olga Cronin
Reason
This comment is deeply insightful because it exposes the fundamental unreliability and arbitrariness of FRT through a concrete example. It demonstrates how the same person can be identified differently by different algorithms, yet law enforcement still acted on inconclusive results. This challenges the common perception of FRT as infallible technology.
Impact
This comment established the foundation for the entire discussion by immediately dismantling the myth of FRT reliability. It shifted the conversation from theoretical concerns to concrete evidence of systemic failures, setting up the framework for all subsequent case studies and principles discussed.
However, when the judge asked the National Identity Database how many individual consultations the government of the city of Buenos Aires had made, it turned out that the government had made consultations about more than seven million people, more than nine million consultations in total regarding more than seven million people. So, clearly the Buenos Aires police and perhaps other offices were accessing this biometric data for other purposes, entirely different from searching for fugitives.
Speaker
Tomas Ignacio Griffa
Reason
This revelation is particularly thought-provoking because it exposes the massive scope creep from the stated purpose (30,000 fugitives) to actual implementation (7+ million people). It demonstrates how FRT systems can be systematically abused beyond their intended scope without proper oversight.
Impact
This comment fundamentally shifted the discussion from technical accuracy issues to systemic abuse and mission creep. It provided concrete evidence for why the INCLO principles around oversight, documentation, and legal frameworks are essential, making the abstract principles tangible through real-world consequences.
It is an interesting case study of the lack of transparency around facial recognition. One of my conclusions will be that FRT in this present case is used actively to discourage people from attending demonstrations, but the lack of transparency, the lack of knowledge that people have of what is going to actually happen to them is also as discouraging as the outright threats of using FRT.
Speaker
Adam Remport
Reason
This insight is particularly profound because it identifies how uncertainty itself becomes a weapon. The comment reveals that the chilling effect doesn’t require actual deployment – the mere possibility, combined with lack of transparency, creates self-censorship and suppresses fundamental rights.
Impact
This comment elevated the discussion to examine the psychological and societal impacts of FRT beyond direct misidentification. It introduced the concept of ‘weaponized uncertainty’ and connected technical surveillance capabilities to broader democratic freedoms, deepening the conversation about systemic effects on civil liberties.
Well, people, I think, never really cared about FRT, because they didn’t actually know that it existed, precisely because of the lack of communication on the government side. So, the situation had to become this bad and severe for the people to start to even care about the problem. But now, the system already exists, with the rules that we have now, and which can be abused by the police and the government.
Speaker
Adam Remport
Reason
This comment is insightful because it highlights a critical democratic deficit – how surveillance systems can be implemented without public awareness or consent, and by the time people become aware, the infrastructure for abuse is already in place. It reveals the strategic nature of opacity in surveillance deployment.
Impact
This response crystallized the importance of proactive transparency and public engagement principles discussed earlier. It connected the technical principles to democratic governance, showing how the lack of early intervention creates fait accompli situations where rights are harder to protect retroactively.
I was wondering, since we’re talking about facial recognition technologies, there’s also been a lot of movement to penalize wearing masks in public as an attempt to protect yourself against facial recognition technology. So I was wondering if INCLO or any organization have thoughts or processes or any kind of discussions on how the ban of facial masks, for example, is also in conversation with FRT.
Speaker
June Beck
Reason
This question is thought-provoking because it identifies the emerging ‘arms race’ between surveillance technology and privacy protection measures. It reveals how FRT deployment creates secondary policy responses that further erode privacy rights, creating a compounding effect on civil liberties.
Impact
This question expanded the scope of the discussion beyond FRT itself to examine the broader ecosystem of surveillance and counter-surveillance measures. It highlighted how FRT creates cascading policy effects that multiply its impact on civil liberties, adding another layer of complexity to the regulatory challenges discussed.
Overall assessment
These key comments fundamentally shaped the discussion by moving it through distinct phases: from establishing the technical unreliability of FRT, to demonstrating systematic abuse in practice, to revealing the strategic use of opacity as a tool of social control, and finally to examining the broader ecosystem of surveillance and counter-surveillance measures. The comments created a progression from individual harms to systemic abuse to democratic deficits, making the abstract principles concrete through real-world consequences. The discussion evolved from a technical presentation to a nuanced examination of how surveillance technology intersects with democratic governance, civil liberties, and social control. Each comment built upon previous insights, creating a comprehensive picture of how FRT threatens not just individual privacy but the broader fabric of democratic society.
Follow-up questions
How to effectively conduct community activation and education about facial recognition technology risks
Speaker
Pietra (audience member from Brazil)
Explanation
This is important because many people are unaware that FRT systems exist or understand their risks, making community education crucial for building awareness and resistance to harmful implementations
Whether facial recognition technology should be completely banned or if there are acceptable use cases
Speaker
Pietra (audience member from Brazil)
Explanation
This represents a fundamental policy question that different jurisdictions are grappling with, as some advocate for total bans while others seek regulatory frameworks
How laws banning face masks in public relate to and interact with facial recognition technology deployment
Speaker
June Beck (Youth for Privacy)
Explanation
This highlights an emerging area where governments may be restricting protective measures against surveillance, creating a concerning dynamic between FRT use and citizens’ ability to protect their privacy
How the Hungarian FRT system will technically and operationally handle mass surveillance of tens of thousands of people simultaneously
Speaker
Adam Remport
Explanation
This is critical because the system has never been tested at this scale, and the capacity limitations could affect both the effectiveness and the human rights impacts of such mass deployment
What specific purposes the Buenos Aires police used to access over 7 million people’s biometric data beyond searching for fugitives
Speaker
Tomas Ignacio Griffa
Explanation
This represents a significant violation of the system’s stated purpose and legal framework, and understanding these unauthorized uses is crucial for accountability and preventing similar abuses
Whether technical details of FRT software and training datasets should be disclosed for bias testing versus protecting trade secrets
Speaker
Tomas Ignacio Griffa
Explanation
This ongoing legal debate in Argentina highlights the tension between transparency needed for accountability and vendors’ claims of proprietary information, which affects the ability to properly audit these systems for bias
Whether data from the Hungarian FRT system is being transferred to third countries given the vendor relationship
Speaker
Adam Remport
Explanation
This raises important questions about data sovereignty and international data transfers that could have significant privacy and security implications for Hungarian citizens
Disclaimer: This is not an official session record. DiploAI generates these resources from audiovisual recordings, and they are presented as-is, including potential errors. Due to logistical challenges, such as discrepancies in audio/video or transcripts, names may be misspelled. We strive for accuracy to the best of our ability.