Pre 4: Dynamic Coalition on data and trust: Stakeholders Speak – Perspectives on Age Verification
12 May 2025 07:00h - 08:15h
Session at a glance
Summary
This EuroDIG pre-session focused on examining age verification systems from multiple stakeholder perspectives, exploring their effectiveness in protecting minors online while considering associated risks and alternatives. The discussion was moderated by Regina Filipová Fuchsová and featured panelists from the Internet Society, youth organizations, and academic research backgrounds. The session began by acknowledging growing concerns about social media’s impact on young people’s mental health, citing research showing increased rates of depression, suicide, and self-harm among adolescents since 2010, with Australia’s planned mandatory 16+ age limit for social media platforms serving as a key policy example.
Tatiana Tropina from the Internet Society outlined significant concerns about current age verification technologies, emphasizing that they create privacy and security risks for all users, not just minors. She highlighted how these systems require sensitive data like government IDs or biometric information, potentially exposing users to data breaches and creating barriers for marginalized communities who may lack required documentation. The discussion revealed that age verification tools could disproportionately affect vulnerable populations, including older adults, people without bank accounts, and those with disabilities, while potentially creating a false sense of security.
Youth representatives Niels Zagema and Paulo Glowacki emphasized that young people find many age verification methods invasive and creepy, undermining trust in digital systems. They stressed the importance of youth participation in developing solutions and noted that Germany has extensive experience with age verification systems, having implemented over 100 approved solutions. Researcher Natálie Terčová cautioned against viewing age verification as a silver bullet, emphasizing that children’s online experiences are shaped by complex factors including family dynamics, socioeconomic status, and individual circumstances.
The panelists agreed that effective child protection online requires a multi-layered approach combining education, digital literacy programs, parental involvement, platform responsibility, and youth participation in policy development. They concluded that protecting children online should not come at the expense of privacy, security, and digital inclusion for all users, rejecting the framing of child safety versus user rights as a necessary trade-off.
Keypoints
## Major Discussion Points:
– **Age verification is not a “silver bullet” for protecting minors online** – All speakers agreed that age verification systems alone cannot solve the complex problems of online safety for children, and that current implementations create significant privacy, security, and access risks for all users, not just minors.
– **Privacy and security risks of current age verification systems** – These systems require collection of sensitive personal data (government IDs, biometric scans, financial information) that creates vulnerabilities to data breaches, tracking, and misuse while potentially creating a false sense of security.
– **Access barriers and digital exclusion concerns** – Age verification tools disproportionately affect marginalized communities, older adults, people without government IDs or bank accounts, and those with disabilities, potentially pushing vulnerable populations toward less secure parts of the internet.
– **Need for multi-stakeholder, holistic approaches** – Speakers emphasized that protecting children online requires a combination of digital literacy education, parental involvement, platform responsibility, youth participation in policy development, and improved platform design rather than relying solely on technical solutions.
– **Balancing child protection with rights and autonomy** – The discussion highlighted tensions between protecting children and preserving their privacy, autonomy, and right to access information, with emphasis on creating age-appropriate spaces rather than simply excluding children from online environments.
## Overall Purpose:
This was a pre-conference working session at EuroDIG examining age verification from multiple stakeholder perspectives (technical community, youth representatives, researchers, civil society). The goal was to analyze the challenges and risks of age verification systems while exploring complementary approaches that could protect minors online without undermining fundamental rights like privacy, inclusion, and digital autonomy.
## Overall Tone:
The discussion maintained a consistently thoughtful and collaborative tone throughout. Speakers were critical of current age verification approaches but constructive in their criticism, focusing on identifying problems and proposing alternatives rather than simply opposing child protection efforts. The tone was academic and policy-oriented, with speakers building on each other’s points and showing mutual respect for different perspectives. There was a shared sense of urgency about finding better solutions that don’t create trade-offs between child safety and other fundamental rights.
Speakers
– **Moderator (Ramon)**: Remote moderator for the session
– **Regina Filipová Fuchsová**: Industry Relations Manager at EURID, session moderator
– **Tatiana Tropina**: Senior advisor at the Internet Society, involved in research, policymaking and capacity-building projects at academic institutions and with various consultancy projects for international organizations, civil society and think tanks
– **Niels Zagema**: Fellow and Dutch youth representative for European affairs at the National Youth Council, member of USDIG, advocates for the rights of young people in national and EU policy processes
– **Natálie Terčová**: Researcher and lecturer, European representative on ICANN’s At-Large Advisory Committee, chair of the IGF Czechia, focuses on children’s online experiences, digital skills, and online safety
– **Paulo Glowacki**: Member of EURID Youth Committee, holds a degree in international relations, currently pursuing a master’s degree in international law at the Geneva Graduate Institute, active member of the German Youth IGF
**Additional speakers:**
– **Tapani Tarvainen**: From Electronic Frontier Finland
– **Torsten Krause**: From the Digital Opportunities Foundation in Germany, child rights advocate
Full session report
# EuroDIG Pre-Session: Age Verification Systems – A Multi-Stakeholder Analysis
## Executive Summary
This EuroDIG pre-session examined age verification systems from multiple stakeholder perspectives, exploring their effectiveness in protecting minors online while considering associated risks and alternatives. The discussion was moderated by Regina Filipová Fuchsová, Industry Relations Manager at EURID, with remote co-moderation by Ramon. The session brought together representatives from the Internet Society, youth organizations, academic research, and civil society to analyze the complex challenges surrounding online child protection.
Regina opened the session by referencing growing concerns about social media’s impact on young people’s mental health, citing Jonathan Haidt’s research showing increased rates of depression, suicide, and self-harm among adolescents since 2010. Australia’s planned mandatory 16+ age limit for social media platforms, scheduled to take effect in December, served as a key policy example driving current debates. The discussion evolved beyond technical solutions to examine fundamental questions about children’s rights, digital inclusion, and the balance between protection and autonomy.
## Key Participants and Perspectives
The panel featured diverse expertise: Tatiana Tropina from the Internet Society provided technical and policy analysis; Niels Zagema represented youth perspectives from the Netherlands; Paulo Glowacki contributed insights from German youth advocacy; and Natálie Terčová offered research perspectives on children’s online experiences. Additional contributions came from audience members including Tapani Tarvainen from Electronic Frontier Finland and Torsten Krause from Germany’s Digital Opportunities Foundation.
## Major Areas of Discussion
### Age Verification as a Standalone Solution
All speakers rejected age verification as a complete solution for protecting minors online. The panelists agreed that current implementations create significant privacy, security, and access risks while providing false security.
Tatiana Tropina emphasized that age verification systems fail to address the complex nature of online risks, stating that the issue should not be framed as a trade-off between child safety and privacy rights. This perspective challenged the commonly accepted premise that protecting children online requires sacrificing other fundamental rights.
Niels Zagema noted from a youth advocacy perspective that young people find many age verification methods invasive, undermining trust in digital systems. Paulo Glowacki added that age verification impacts how young people access educational content and participate online.
Natálie Terčová provided research context, explaining that children’s online experiences are shaped by complex factors including family dynamics and individual circumstances that cannot be addressed through technical age checks alone. She also criticized the research methodology underlying some claims about social media’s impact, noting that “correlation is not causality” and that the scientific community has concerns with certain framings of this research.
### Privacy and Security Vulnerabilities
Significant concerns were raised about privacy and security implications of current age verification technologies. Tropina outlined how these systems require sensitive data such as government IDs or biometric information, creating vulnerabilities to data breaches and misuse.
The discussion revealed tension between GDPR’s data minimization principles and age verification systems that demand extensive personal information. Terčová highlighted this legal contradiction, noting that companies face penalties for allowing minors through age verification systems.
Paulo Glowacki raised concerns about transparency, noting that current age verification technologies function without clear understanding of their algorithms or decision-making processes.
Tapani Tarvainen from the audience highlighted a crucial paradox: children may need to browse the internet anonymously for their own protection, as requiring them to identify themselves to potentially harmful actors could increase rather than decrease their vulnerability.
### Digital Exclusion and Access Barriers
The discussion examined how age verification systems create barriers for marginalized communities. Tropina explained that these tools disproportionately affect vulnerable populations, including people without government IDs, those with disabilities, and individuals from marginalized communities.
Biometric verification systems were criticized for poor performance on people with non-white skin tones and challenges for people with disabilities. The speakers noted that pushing vulnerable populations away from mainstream platforms could drive them toward less secure parts of the internet.
### Legal and Ethical Considerations
Glowacki referenced the UN Convention on the Rights of the Child, which balances freedom of expression and access to information with privacy rights, suggesting children have rights both to protection and to access beneficial online content.
Terčová raised concerns about the developmental impact of requiring teens to repeatedly prove their age, suggesting this might feel invasive during a critical identity development period.
The discussion also touched on broader regulatory trends, with concerns expressed about proposed EU regulation that could expand beyond age verification into general surveillance.
### Alternative and Complementary Approaches
Rather than dismissing child protection concerns, speakers focused on alternative approaches that could address online safety without the drawbacks of current age verification systems.
Tropina advocated for adult websites to self-label with metadata, empowering parents to set appropriate device controls without requiring platforms to collect sensitive user data. She emphasized the need for multi-stakeholder development of international standards if age verification tools are implemented.
Digital literacy education emerged as a crucial complementary measure, though speakers noted this must extend beyond traditional classroom settings. Terčová’s research indicated that enhanced digital literacy and ethical values matter more than parental mediation, especially for older adolescents.
The discussion explored creating youth-specific safe platforms rather than simply restricting access to existing ones, focusing on age-appropriate environments designed with young people’s needs in mind.
### Technical Implementation Challenges
Torsten Krause from the audience referenced German experience with age verification solutions, describing double-blind mechanisms that provide yes/no answers without revealing user identity, suggesting that privacy-preserving approaches might be technically feasible.
However, this perspective contrasted with Tropina’s more skeptical assessment of current implementations. She emphasized that even improved technical solutions would require independent auditing with publicly available reports to ensure transparency and accountability.
### Youth Agency and Participation
A crucial theme was recognizing children and young people as active participants rather than passive consumers of digital services. Terčová emphasized not underestimating children, noting they will find ways to work around regulations and restrictions.
This perspective reframed the adult role from controller to guide, acknowledging that restrictive approaches may be counterproductive if they fail to account for young people’s agency. Zagema stressed that youth participation in design and regulation debates is essential but must be meaningful to avoid tokenism.
## Areas of Agreement
The speakers demonstrated agreement on several key points:
– Age verification alone cannot solve complex issues of child online safety and may create additional problems
– Multi-layered approaches combining technological safeguards with education and digital literacy are needed
– Meaningful youth participation in developing policies is important
– Current age verification implementations raise significant privacy and security concerns
– Trust is fundamental – systems that undermine trust may defeat their own purpose
## Points of Disagreement and Tension
Several areas of disagreement emerged:
**Parental Control Role**: Zagema expressed concerns about excessive parental influence on children’s autonomy, while Terčová noted that parental mediation works for younger children but older adolescents rely more on peers.
**Technical Solution Feasibility**: Tropina remained skeptical about technical age verification solutions, while audience members presented German approaches as potentially viable privacy-preserving options.
**Research Foundations**: Terčová explicitly criticized certain research methodologies regarding social media’s impact on youth mental health, despite the session opening with references to this research.
## Unresolved Issues
Several complex issues remained unresolved:
– How to balance child protection with privacy rights and digital inclusion
– Technical feasibility of privacy-preserving age verification systems
– Regulatory fragmentation across different jurisdictions
– Tension between GDPR data minimization and age verification data collection requirements
– Preventing over-blocking of beneficial content like health education resources
## Recommendations and Next Steps
The discussion generated several recommendations:
– Continue discussion at the main EuroDIG session
– Implement independent auditing of age verification systems with public reports
– Develop international standards through multi-stakeholder processes
– Focus on creating youth-specific safe platforms and age-appropriate digital environments
– Implement adult website self-labeling to empower parental controls
– Enhance digital literacy programs beyond traditional classroom settings
– Explore privacy-preserving technical mechanisms with transparency requirements
## Conclusion
This EuroDIG pre-session challenged simplistic approaches to online child protection while maintaining focus on keeping children safe online. The discussion elevated the debate from technical implementation details to fundamental questions about children’s rights, digital inclusion, and technology’s role in society.
The key message was rejecting false trade-offs between child safety and other fundamental rights. The speakers advocated for sophisticated, multi-layered approaches that recognize children as active participants with agency while providing appropriate guidance and protection.
The discussion demonstrated that meaningful progress on online child protection requires moving beyond technical solutions to address broader questions of digital literacy, social support, platform responsibility, and youth empowerment. While age verification systems may have a role in comprehensive child protection strategies, they should not be viewed as standalone solutions to complex social and technological challenges.
Session transcript
Moderator: My name is Ramon, and I will be your remote moderator for this session today. More information about the session and the speakers is available on the EuroDIG wiki, and I will share the link very soon. We encourage you to raise questions, and to raise your hand if you have any questions or if you want to present something. If you have any questions, please put a “Q” in the chat and ask your question so that I can address it to the room. Before we start the session, I would like to highlight the session rules: please enter the Zoom session with your full name. If you want to ask a question, raise your hand using the Zoom function. You will be unmuted and the floor will be given to you, and then you can ask your question. When you’re speaking, please switch on your video, state your name and your affiliation, and do not share any links to the Zoom meeting, not even with your colleagues, please. Let me just, there we go. We will now send the information again in the Zoom chat, and I hand over to our moderator, Regina Fuchsová.
Regina Filipová Fuchsová: Thank you very much, Ramon, for the information. A warm welcome from me as well to this session. We are starting basically the EuroDIG Conference today with these pre-sessions on day zero, one of them being the perspectives on age verification from the viewpoint of different stakeholders. We are organizing it within the Dynamic Coalition on Data and Trust, and in cooperation with the Youth Committee of EURID and also the Czech IGF. My name is Regina Filipová Fuchsová, and I work at EURID as the Industry Relations Manager, and I will try my best to moderate the session today and give everybody enough space to raise their points and questions. We are concentrating on the various aspects of age verification as a measure to protect minors online. So let me please start with a brief note on what underlying issues we are actually addressing here. We have a bit more than the first decade of experience with social media behind us, so we can already analyze the impact from different perspectives over a longer period of time. Evidence from leading researchers across continents shows a consistent link between social media use and a decline in young people’s mental health. And this is independent of local social context. Unrestricted access to social media exposes children to cyberbullying, adult abuse, image-based abuse, and illegal or age-inappropriate content. We are referring here, for example, to a well-known social psychologist, Professor Jonathan Haidt, and his book The Anxious Generation, where he writes that since 2010, tech companies have basically been exploiting our kids’ attention and mental health for profit.
His research shows evidence that between the years 2010 and 2020, the situation in the United States became especially alarming, and he documents it with data such as the high increase in the suicide rate among adolescents, major depressive episodes, very high for both girls and boys in this age group, and also emergency room visits connected with self-harm, which rose for girls by 188 percent in the decade, which is very alarming and much higher than for boys of the same age group. He also claims that the rise of phone-based childhood has contributed to four key harms, including social and sleep deprivation, attention fragmentation, and addiction. Acknowledging this situation, nonprofit organizations worldwide are already working to address the most harmful aspects of children’s online exposure, with their missions to educate and also equip families, mainly in the field of prevention of harm and also healing when the situation occurs. The minimum age requirements for social media access are now very central in regulatory debates, also in the European Union. Most platforms currently set the minimum age at 13. This is a threshold which can be easily bypassed, and it’s insufficient for protecting younger teens, who are especially vulnerable to algorithm-driven content and also harmful comparisons. Australia is an emerging leader in this area. We already mentioned it in the description of this session. Its government plans to implement the Online Safety Amendment Act, which introduces a mandatory minimum age of 16 for certain social media platforms. This law is scheduled to take effect in December later this year. Since we are at EuroDIG here, we are, of course, interested in the possible response of European countries and the EU itself, taking into account the risks of regulatory overreach that come with attempts like this to regulate, or also the opportunities for rights-respecting industry communication, privacy, security, and self-regulation.
Age verification and the connected systems should ideally strike a balance between protection, privacy rights, and digital inclusion. What is important are the ethical considerations and the participation of those affected, so that such systems do not enable their surveillance or exclusion. So we will focus today on practical issues and case studies as well, experienced by different stakeholders, as our speakers represent a wide variety of them. Let me introduce to you our panelists and speakers. We have two persons here in person and two online. We have Tatiana Tropina from the Internet Society. She is a senior advisor at the Internet Society. Earlier she was involved in research, policymaking and capacity-building projects, both at academic institutions and also with various consultancy projects for international organizations, civil society and think tanks. Thank you for coming. We have on my right-hand side Niels Zagema, a fellow and Dutch youth representative for European affairs at the National Youth Council. So Niels is both a member of USDIG and also one of the Dutch young representatives for European affairs elected by Dutch youth through the Dutch National Youth Council. He advocates for the rights of young people in national and EU policy processes. Thank you for coming. And then online we have Natálie Terčová. Natálie is a researcher and lecturer. She serves as the European representative on ICANN’s At-Large Advisory Committee, and she is also the chair of the IGF Czechia. Her areas of focus include children’s online experiences, digital skills, and online safety. Hi, Nati. And our second speaker online is Paulo Glowacki, a member of the EURID Youth Committee. He has been active in internet governance since 2022, when he first participated in the IGF in Addis Ababa.
He holds a degree in international relations, and he is currently pursuing a master’s degree in international law at the Geneva Graduate Institute. Paulo is also an active member of the German Youth IGF, and yeah, and he is joining us today from Geneva, right?
Paulo Glowacki: Hello. Yes, hello, everyone.
Regina Filipová Fuchsová: Okay, so let’s dive into the discussion. I would like to start with an introductory question to all the speakers, and ask why is the topic of age verification important for each of you, for your organization or the stakeholder group you are representative of? And also, if you see the age verification as the silver bullet in protecting minors online. So can I ask Tanja to start?
Tatiana Tropina: Thank you very much, Regina, and thank you very much for having me and Internet Society on this session. Why is it important for us at the Internet Society or as a technical community as a whole? At the Internet Society, we believe that internet is for everyone. We work to make the internet open, globally connected, secure, and trustworthy. And these are not just abstract values. Internet exists as a force for good in the society. It exists to improve people’s life. This is at the core of our mission. So when we look at the age verification tools as they are now, we do not see compatibility with open, globally connected, secure, and trustworthy Internet. In fact, we believe that in the current way, shape, and form, Internet users around the world will be less secure with the age verification rule, with mandatory age verification tools. These tools do not affect only minors. They affect everybody, minors and adults alike. And we see several big risks that these tools bear if they are implemented as they are now. First of all, they introduce privacy and security and data misuse risks, as I said, not only for minors but for adults, basically for everyone. Another risk we see is the barriers for access to the Internet they introduce. And this affects the groups that already are disadvantaged, that already face these barriers for access, those groups who are already excluded somewhat, like older adults, like marginalized and disadvantaged communities. And overall, these risks can cause a chilling effect on the Internet use, on the everyday Internet use. And as I said, some of the groups could be affected very disproportionately. And at the end, when we see what is going on around the globe, We are right now at EuroDIG, but we have to think globally when we think about Internet as a global network. 
When various states, or stakeholders for that matter, start rolling out their own age verification solutions or start mandating them via regulation, it can create a patchwork of different approaches and different barriers. And at the end, it can disrupt the interoperable global nature of the Internet. And Regina and everyone, I know that this is only the introductory statement. I hope that I will have a chance to elaborate on these risks a bit more later in this discussion. But to wrap up, I just want to say that what Regina called a delicate balance between protection of minors online, which we absolutely appreciate and support as a very good goal, and privacy, security, and other rights. As currently framed, these solutions and this debate, instead of introducing a delicate balance, introduce a trade-off between security of minors and privacy, security of users, access. And it should not be a trade-off. We should not frame it as a trade-off. And a bit more on this later, I hope. Thank you very much.
Regina Filipová Fuchsová: Thank you very much, Tanja. Indeed, we will dive more in the risks as well of age verification systems. Can I now ask Natálie for her introductory statement? Thank you.
Natálie Terčová: Sure. Good morning, everyone. Hope you can hear me well. Let me know by nodding. Okay, good. Thank you. Perfect. Okay. So, first of all, thank you so much for inviting me to the session. It’s a pleasure. As Regina already correctly stated, I work as a researcher during my day job, and my topic of focus is digital literacy of children and youth related to their online experiences and also online risks. And in this case, age verification on the surface of it does promise a way to keep, let’s say, harmful online content away from children and those who are very vulnerable. And it really does matter to me because it forces tech companies and us as a society to recognize that children and young people do have some unique needs online as end-users of the internet, and sometimes they need to be treated in a different way. And in this case, if we talk about age, it can be preventing an eight-year-old from wandering into a chat room which is 18 plus, full of intimate content that could be not useful and maybe potentially harmful for someone that age, or making sure that a 10-year-old isn’t targeted with some, let’s say, inappropriate ads, and that they know how to navigate their way in this. And in my academic work, actually, we did see psychological development playing a big role in how kids really interact online. So of course, we know that younger children are more vulnerable, and teens are, on the other hand, testing the boundaries and also trying to work their way around them. So of course, it is very good to have some, let’s say, gates in place, some barriers, like age checks.
However, what is really important to say here, and I am drawing not only from my experience but from robust data we have from around the world, from very well-known scientists and academics, is that age checks alone cannot solve all the issues. What really matters, and has been proven many times, is the role of family, the SES, the socio-economic status, and also so many other little aspects, like the prerequisites that children come with to the online environment, and these shape the experience they have online. And another thing that we should very briefly remember is that whenever you restrict someone from something, they will find their way to still access it, but making these detours and trying to find alternative routes can be really dangerous. So, not to be too long in this opening statement, I just want to say that I do not see age verification as a silver bullet. I really believe that no single tool can be a silver bullet or an ultimate solution, and we should focus very much on education, enhancing digital literacy, but also wise parenting strategies, we know a lot about parental mediation and how these things really matter, and of course good platform design. But that is something I will talk more about, I guess, as we go on with the session. Thank you so much.
Regina Filipová Fuchsová: Thank you very much Natalie, so yeah it looks so far that it’s not like a one-word answer to the problems which we depicted at the beginning. Can I ask Niels for his introductory statement now?
Niels Zagema: Yes you can, thank you Regina. So I’m the youth representative for the Dutch National Youth Council, which means I speak to a lot of young people, also in classrooms, and of course we hear a lot about the dangers of the Internet, but we also know that online safety is an important priority. We hear from young people that they are worried, but as the Youth Council and as a youth representative, we don’t believe that age verification is the silver bullet. I also, to be honest, have not heard anyone say it’s the silver bullet yet, but maybe that’s because I’m in different circles. In fact, when implemented poorly, I think that age verification really undermines the freedom that makes the Internet such a powerful space for young people, and nowadays the first remark about connectivity is negative, which is, of course, understandable, but the positives often get neglected. I mean, first of all, I think young people do want safe and appropriate online environments, but I think we can get that through different means, and it’s not trading one evil for another evil. I, for example, was in a classroom, and I asked them, what do you think of age verification? They said, well, if I have to scan my face or upload a passport just to watch YouTube, first of all, that would feel invasive and a bit creepy, and it doesn’t really show trust. And I think, I mean, I agree with it personally, but it also shows that young people are concerned about the topic of age verification, but that’s mainly when you bring the topic to them, so not really by themselves. And also, in the media you hear a lot about age verification, and it can, of course, mean different things and has a lot of technical details, and therefore it’s also difficult for young people, but also, of course, broader society, to form a general opinion. So to go back to the question of whether age verification is a silver bullet, I don’t think so.
I think you can focus more on education beyond the classroom, on empowering youth and creating more safe spaces for young people. But I do think it’s a valid concern to be addressed, and I see both sides of the argument. The most important thing for youth, I think, is that they can have trust in those systems, and age verification can either erode or build that trust; that’s a way of looking at it. How do we want young people to have trust in those digital environments? So, thank you, and I think we can move on to the next speaker, which is Paulo.
Regina Filipová Fuchsová: Thank you very much indeed.
Paulo Glowacki: Well, thank you everybody for having me. First of all, thank you Natalie and Regina for the great collaboration in organizing this. I have to give a big shout out to those two and especially to Katrin Worasz, who can’t be here today, but she was very helpful in preparing me on this topic, because I’m a member of the EURID youth committee, although I don’t speak on behalf of EURID or the youth committee. I’m engaged in these issues, but not necessarily an expert on everything, so she was very helpful in briefing me. After this preliminary remark, let me start out with our, or my, general statement on this issue: age verification is relevant because it directly impacts how young people access online spaces, though everybody accesses online spaces. It’s especially relevant for young people because we get educational content, entertainment, and even civic participation online. So a large degree of our lives depends on online spaces, and from the perspective of German young people, especially those engaged in digital rights and youth organizations that I have talked to in the past couple of weeks, the debate is not just about protection, as previous speakers have already outlined; it’s really about striking the balance between safety and freedom, between privacy and participation, and asking where we really need it. Germany is sort of a special case in the world, or in Europe, as far as I’m aware. Australia is pushing forward now with the social media age ban, but Germany has actually had a very long history of at least age checks, and also age verification, in place for so-called high-risk content, which can be gambling, pornography, etc., that is already restricted, and providers are already implementing age verification measures. They are not always mandated, but many providers have chosen to do so. 
There’s a very vibrant ecosystem of laws and regulatory actors in Germany. And that comes, of course, from the perception that we do need to protect children from harmful content online. But we also need to avoid oversimplifying the solution, because it’s not the silver bullet; let me agree with my previous speakers on that. It can become digital gatekeeping, as Tanja has outlined. But that’s perhaps why Germany has implemented this vibrant ecosystem of providers coming to the authorities, who can then approve the solutions; only then are they implemented. So trust and agency definitely matter to us. Young people want to be involved in the development of the solutions, but of course we also want to be asked how this issue is taken forward. And like I said, Germany can be considered a front runner on this issue, I would say, but it’s definitely a very complex regulatory landscape. We have certain laws, like I said, implemented. We have the German Commission for the Protection of Minors in the Media, known as the KJM, and we have the German Association for Voluntary Self-Regulation. So there’s really a vibrant ecosystem going on, and we have over 100 solutions already approved in Germany. I think that shows that we’re sometimes in a different position in the European and the global landscape, given where we come from. But I definitely share the concerns, and also the hopes, perhaps, regarding this technology. Let me stop here.
Regina Filipová Fuchsová: Okay, thank you very much, Paulo. So I think we can move on to take a closer look at the main challenges and risks associated with age verification that have come up in the last two years.
Tatiana Tropina: In addition to IDs or biometric scans, it can also include financial accounts, for example, right? So imagine what kind of information is amassed there. And immediately it exposes the users, and the platforms who collect and store this information, to the risks of security breaches, privacy breaches, and data misuse. But let’s even forget about misuse and abuse. How can we ensure that this data is handled properly, that it’s not being sold to third parties, that it’s not being used to track users? When people know that they can be traced, and here I come to the word creepy yet again, they will be less willing to use very legitimate services that put in the effort to perform age verification checks. Several speakers mentioned trust in the Internet, which is very important and which is declining, and this will just have a further chilling effect on that trust. So the bottom line for me, when it comes to privacy and security risks, is that age verification technologies do not offer any holistic approach to making people, or rather minors in the first place, safe online. Rather, they can create a false sense of security while creating various vulnerabilities in security and privacy. In this regard, as a first step, and I know that we are going to talk about solutions a bit later, we believe that at a minimum these age verification tools should be independently audited on whether they actually comply with privacy and security guidance, with reports available publicly so researchers can access them. So this is on privacy and security risks. The second risk I mentioned was the risk to accessibility, and in this regard age verification tools can have a very chilling effect as well on everyday Internet use. I’m coming back again to what information is collected and how: government-issued ID, financial accounts. 
Think about people who don’t have a government-issued ID, or who live in a foreign country and whose ID is not readable in that particular country. Think about people who cannot provide their government-issued ID for any legitimate reason; it will affect them significantly. The same with financial accounts. It sounds strange, but not everybody has a bank account. So these people will not use trustworthy services; they will probably go to very dark corners of the internet. And frequently, these are going to be people who are already affected by a lack of inclusion and a lack of access. People who don’t have bank accounts or government-issued IDs are already likely to belong to vulnerable, marginalized population groups, and to older people. When we think about biometric tools and cameras, again, it sounds strange, but not everybody has a webcam or a phone with a camera. We also know that these tools do not perform well on people who don’t have white skin tones. They can affect access for older people, who may find the use of these tools challenging. You may think it’s simple, but for many, it’s not. It’s not simple for people with cataracts. It’s not simple for people with certain health conditions, or for people who are recovering from a stroke. It’s a major access barrier. So in this regard, we believe that access would be significantly hampered. And as the last point on accessibility, it’s not only about older people and marginalized population groups; it’s also about young people. When you think about how camera-based biometric verification checks perform on young people, they have an error range of several years, so some teenagers or young adults might also find them challenging. The range is not clear, so this affects both old and young people. 
And finally, I said at the beginning that at the Internet Society we care about an interoperable, globally connected Internet, and in this regard the impact of all these access barriers, security risks and lack of trust can be significant. As I said at the beginning, as more countries roll out these programs of mandatory age verification tools, we might end up with Internet interoperability being impacted, breaking at various layers. And this is moving forward; if I may say, the horse is out of the barn. We cannot roll these tools back. But if we look at the international level, we probably need international standards: standards where stakeholders participate on an equal footing, developed in a multi-stakeholder manner, so that all these risks can be factored in and addressed, and not simply framed, as I said at the beginning, as a trade-off, where we sacrifice this and that for the safety of minors online. So we have to take this question seriously, and if it comes to the point where we agree that we have to have these verification tools, then we have to standardize them properly. Thank you.
Regina Filipová Fuchsová: Thank you very much, Tanja. You did a really thorough analysis. I would just ask who from the speakers would like to complement this overview of challenges and risks. Maybe also it would be good to tackle some from the technical field, but also legal. But just feel free to add your comments, whoever would like to. Yeah, Niels? Okay.
Niels Zagema: I mean, it’s not from the technical or legal field, so my apologies for that, but I do resonate with what Tanya said; it was very extensive. What it reminded me of was the restriction that comes with the administrative burden. A lot of people, for example, go to government offices and are treated as a number, because they need a passport or they stand in line. I think age verification falls in line with this broader movement where people are not treated as people, but as numbers in a system. So I think that’s a connection to make. It can also provide a false sense of security. We say it’s there to provide security, but does it actually do that? And if it is implemented, is that sense of security warranted, and do people stop treating the underlying risk as something that is actually that bad? Another point is that there’s no such thing as a universal child; the needs of each person are different, and therefore it’s difficult to put an age on it. Currently, you see that with social media there’s an age limit of 13, for example, but because it’s not really enforced, no one really cares about it, and therefore there is no discussion on what the age actually should be. I know analysis is currently still going on for guidelines on what the age should be, but this is, again, dependent on the needs of children and of persons in general. So it should be more flexible, and not try to fit everyone into a box where they cannot really be themselves. That’s what I’d like to add. But again, thanks to Tanja for the extensive overview of the risks.
Regina Filipová Fuchsová: Thank you. Would our online speakers like to add anything?
Natálie Terčová: I can follow up if that’s fine.
Regina Filipová Fuchsová: Yes, please. Thank you.
Natálie Terčová: I think so. I want to say I admire the long list, and I fully agree with what was said by my colleague before me. All these things are very important to mention. There’s a lot of bias going on, also now with new emerging technologies: as was mentioned, the color of the skin, medical conditions and things like that, and all of these can hinder people’s access. So this is very good, and thank you so much for mentioning it. I might add a bit on the legal and ethical aspects, because it may seem a bit funny: we have laws like the GDPR that push for stricter age checks to protect children, and that does sound great. However, how do you verify age without violating privacy laws? The GDPR calls for data minimization, to collect as little data as possible, and it also calls for privacy by design, meaning you shouldn’t collect more personal data than necessary. Yet many age verification systems, as we heard, demand exactly that: personal information, IDs, and all sorts of these things. So there’s a very big tension between protecting children, in our case, and protecting their data, everyone’s data, and their privacy. Of course, on the other side of the spectrum, companies are also worried about liability if they get age verification wrong. What are the consequences? They could face penalties for letting a minor slip through, or breach privacy regulations by storing sensitive data or handling it badly. So legally it’s really a tightrope walk, I would say. Ethically, it’s important to mention something we know from psychological research, which is a kind of wrestle we have here with children’s autonomy, because teens in particular are developing their sense of self and privacy in this period. And it shows that if we were to force them to constantly prove their age, it might feel too invasive and send them a message that the 
privacy comes second. And this is a message we definitely should not send. So it’s definitely a delicate balance. I wouldn’t say we should throw the idea out, but rather implement it in a sensible and rights-respecting way. That would be my addition on this. Thank you so much. And Paulo, if you want to complement me, feel free.
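The GDPR tension Natálie describes, data minimization and privacy by design versus data-hungry age checks, can be illustrated with a small sketch. This is a hypothetical example, not any real system: the ID document is inspected once, never stored, and only the derived boolean survives.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical sketch of a minimization-friendly age check: verify once,
# discard the document, retain only the boolean outcome the service needs.

@dataclass
class VerificationResult:
    over_16: bool       # the only fact the service actually needs
    checked_on: date    # for audit purposes; no ID, no name, no birth date retained

def minimal_age_check(id_document: dict, today: date) -> VerificationResult:
    """Inspect the document in memory, derive the answer, store nothing else."""
    birth = id_document["birth_date"]
    age = today.year - birth.year - ((today.month, today.day) < (birth.month, birth.day))
    # The document dict is not persisted anywhere; only the derived boolean survives.
    return VerificationResult(over_16=age >= 16, checked_on=today)

result = minimal_age_check({"name": "…", "birth_date": date(2010, 6, 1)},
                           today=date(2025, 5, 12))
print(result.over_16)  # False: a child born in June 2010 is not yet 16 in May 2025
```

The design choice is that liability-relevant state (the timestamp and the yes/no answer) is kept, while the sensitive input is never written down, which is roughly what "privacy by design" asks for.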
Paulo Glowacki: Thank you very much. Perhaps drawing from my legal background: there are, of course, a lot of relevant documents out there, but I would just highlight one, the UN Convention on the Rights of the Child. Globally, I think, that gives us a good standing; it has 140 signatories and has been ratified almost universally. Article 13 in particular contains the right to freedom of expression and access to information online, and Article 16 the right to privacy. Those rights, of course, need to be balanced and taken into account globally. There’s also General Comment No. 25 of the UN Committee on the Rights of the Child, from 2021, that addresses children’s rights in relation to the digital environment. These rights are also reflected at the European level, and those are, I would say, risk-mitigating laws, or a treaty, that we have in place. Now, at the European level, which is even more legalized and regulated, I see one challenge emerging on the horizon: the proposed regulation to prevent and combat child sexual abuse, the CSA draft, which is currently under discussion at the European level. Nobody really knows where it’s going to go, but it contains the issue of chat control, so really controlling minors’ chats and decrypting chats, which, of course, goes beyond age verification. But that’s a huge issue, and perhaps what I’m getting at is: if you have a hammer, everything looks like a nail, right? The technology of age verification is not in itself a bad technology, I would say. The question is, how do we apply it? 
And I fully agree with what Tanya has outlined on the privacy concerns and the chilling effect. I would add, perhaps, the lack of transparency of most of these technologies. AI technologies, most of them, are black boxes: we don’t know how they function, we don’t know how the algorithms are working. And a common issue in content moderation is always over- and under-blocking. Even if we do not implement age verification technologies but just restrict content by age, we always have stuff being filtered out. Just look at, for example, useful sex education or mental health content that could be beneficial to people who, due to their socioeconomic circumstances, would not otherwise have access to it; that would be blocked, or could be blocked, for them. So these are concerns that we need to continuously address as we go. And drawing back to the legal challenges: not every country has ratified the UN Convention on the Rights of the Child, and countries have reservations on it, so we do risk having a fragmented landscape out there globally. The EU also needs to take this into account. The Digital Services Act is yet another relevant document here; Article 28 specifically targets the protection of children. So I think all of these need to be taken into account and balanced out, because they frame the discussion in legal terms. That’s what I would add.
Regina Filipová Fuchsová: Thank you very much, Paulo. It brings us also to the, let’s say, everlasting discussion about whether the online environment requires more, or different, measures and protections than the offline world. Before we take a closer look at the positive side, at what complementary approaches could be considered, are there any questions from the audience, either in general or for any of the panelists? We can also take them online. Please.
Audience: Thank you. I’m Tapani Tarvainen from Electronic Frontier Finland. First, I must note I very much agree with the speakers; Tatiana especially has made the case very clear. But there’s one point I want to highlight: the main problem with age verification, at least, is that it seems to require identification of users, and that is dangerous in itself. It’s especially dangerous for the children in question. If a child tries to access, let’s say, a site with age-inappropriate material, let’s put it that way, I would rather not have that site know the identity of the child, because the site managers might, let’s say, have some motives that are not ideal for the interests of the children. So children especially need to be able to browse the internet anonymously. Now, there are some technical ideas for how you could verify age without revealing the identity to the site in question, but that’s something we might discuss at some point. I would be curious to hear if you think those might actually work. There are some theoretical possibilities, but I don’t know if any have been implemented so far. Thank you.
Regina Filipová Fuchsová: Would you like to react, Tanja? Thank you.
Tatiana Tropina: Yes, absolutely. Thank you very much, Tapani. Regina, what you said when you summed up the questions, asking whether the online world needs the same solutions and protections as the offline world, corresponds very much to what Tapani said. I think there are two layers to this question. First of all, if we perform age verification in the physical world, should we perform it in the online world? Because let’s be clear, we do perform age checks in the physical world, right? A child cannot buy alcohol; a child cannot buy a pack of cigarettes. Does that mean we have to transpose these ideas and solutions to the online world? And here the question becomes acute, because in the offline world, age verification is temporary. You show your ID or you don’t, and you walk away; it’s done and gone. Online, it’s you who walks away, but the data is not gone; the data stays. And this is why I would say no, we cannot equate the two. We have to factor in different risks, and as you said, Tapani, it can endanger those very children. Interestingly, I made a note when Natalie was speaking about platforms, providers and operators being at risk of a fine if a minor slips through age verification. I understand that risk. The problem for me is that they are not at risk if they exclude an older person, or disadvantaged groups, from access to legitimate services. They are not at risk of any fine there, and that feels like a big inequality in terms of what kinds of barriers and risks we are creating for our society. Also, in the physical world we can improve societal trust and protection by performing age verification checks, but in the online world, from what I’ve heard, we’re not improving trust; we’re decreasing it. We’re trying to solve huge societal problems and harms with only technical tools. 
And here, again, I would agree with Natalie that the problem has many more aspects than just the technology, and this formalistic approach, this layer of where I perform verification and how accurate it is, is simply not working. Thank you.
Regina Filipová Fuchsová: Thank you, Tanja. We have another question.
Audience: Hello, good morning. My name is Torsten Krause. I’m coming from the Digital Opportunities Foundation in Germany, and this is more of a statement than a question. I’m aware that we are discussing this issue from different angles and perspectives, but I’m wondering if we are at the same level in this discussion, because Paulo laid out in his remarks that in Germany we have a longstanding history of age checks, more than 20 years, with more than 100 systems in place over that history. And I think the hurdle to overcome here is that we are discussing the protection of minors as keeping children out. That’s also what’s being discussed in Australia, and I don’t like this approach, because as a child rights advocate I am in favor of the participation of children, and I think we have to make a kind of shift: age assurance mechanisms and tools can be a key tool, a kind of precautionary measure, for creating age-appropriate digital environments, where we don’t keep children out, but maybe keep adults out, to have safe spaces for children. And when we compare it with the offline world, as Tanya does, what I really like is that there we have such checks too: you will see if an adult is going into a kindergarten, a safe space for children. You will see that it is not a person of the same age. Online, if we have a chat room meant for children, we don’t know whether adults are going in because they want to offend, for example. We will not recognize that. But age checks can be a tool to find a solution for this age-appropriate, safe participation for children. I totally agree on the concerns with the current mechanisms of these tools. We have barriers, because not everyone has a financial account or a bank account, and not everyone has an ID; that’s true. It’s creepy to scan your face, and it’s not just creepy. 
From a child rights perspective, it’s totally invasive and not safe to work with all this data of children. But I would like to see us not stop at this stage, but think about how to overcome these challenges and how to create tools that secure the privacy and anonymity of all users while checking their age, and how to realize that. The German government has developed such a system together with Fraunhofer SIT. It is a double-blind mechanism working with data that already exists, so no new data is generated, and in the end the service just gets a yes-or-no answer, without knowing who the user is, on whether they should be allowed into this space or to get this content. And I think that’s how we should discuss this: how to use it as a key to make the digital environment safe for us all, by securing our privacy and anonymity and widening the participation of all users in the end. Thanks.
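The double-blind mechanism Torsten describes can be sketched roughly as follows. This is a simplified illustrative model, not the actual Fraunhofer SIT design; all names are hypothetical, and a real deployment would use asymmetric signatures or zero-knowledge proofs rather than a key shared between issuer and service, so that the issuer also cannot see which service asked.

```python
import hashlib
import hmac
import secrets

# Toy model of a "double-blind" age check: the issuer, who already holds the
# user's verified data, tells the service only yes/no, never the identity.

ISSUER_KEY = secrets.token_bytes(32)  # shared issuer/verifier key in this toy model only

def issuer_attest(user_record: dict, min_age: int) -> dict:
    """Issuer side: derive a one-time token carrying only a boolean answer."""
    answer = b"yes" if user_record["age"] >= min_age else b"no"
    nonce = secrets.token_hex(16)  # one-time value: prevents replay and linking
    sig = hmac.new(ISSUER_KEY, nonce.encode() + answer, hashlib.sha256).hexdigest()
    # Note: no name, no ID number, no birth date ever leaves the issuer.
    return {"nonce": nonce, "answer": answer.decode(), "sig": sig}

def service_verify(token: dict) -> bool:
    """Service side: check the signature; learn only 'old enough or not'."""
    expected = hmac.new(ISSUER_KEY,
                        token["nonce"].encode() + token["answer"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token["sig"]) and token["answer"] == "yes"

token = issuer_attest({"age": 17}, min_age=16)
print(service_verify(token))  # True: access granted, identity never revealed
```

The point of the sketch is the data flow, not the cryptography: the only information crossing the boundary between issuer and service is a signed yes/no, which is what "working with data that already exists, generating no new data" amounts to.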
Regina Filipová Fuchsová: Thank you very much. It actually brings us nicely to the part where we wanted to discuss the complementary approaches that should be considered, approaches that could achieve the same goals without undermining rights such as privacy, inclusion and autonomy, as we discussed. So can I ask the speakers for their views on what these complementary approaches could look like? Paulo, can I start with you this time?
Paulo Glowacki: Yes, thank you. Thank you. Since I have to leave at 10, this will probably be my last intervention, but once again already thank you for the organization and for all the questions that we’ve received from the floor. Perhaps let me start out with agreeing with Torsten, thank you for bringing that up again. I think you’re much more an expert on this than I am. So if you stick in the room, please, that would be great. I think also shout out to the work you guys have been doing. Let me turn to the alternatives or complementary approaches that you have asked about, Regina. I think we’ve already brought them up a bit. Natalie has highlighted, I mean, platform responsibility, of course, is one issue. We have that coming with the DSA. I mean, it’s a question of implementation now, I think, legally speaking. And then I think a big issue or a big thing that we have had in Germany, as we have experienced with dealing with these issues for a long time in a rather unique way, perhaps, is also parental and community involvement, right? We have this concept of Jugendmedienschutz, so youth media protection, where we really have a variety of projects and models that combine regulation, but also education, dialogue with young people. I think that’s very, very crucial. Of course, parental control mechanisms could be much more helpful if applied in the apps. So where the issues arise, I think we should avoid pushing this issue down the technology stack onto the operating system level. I think that we really run into big issues there. And then, of course, digital literacy. I think that’s something that we have also pushed for as the Euro Youth Committee, as the Youth Internet Governance Forum Germany. We’re keen on collaborating on these issues. So please do reach out there. But I think it’s a very important measure to complement age verification technologies. And last but not least, youth participation. 
I think youth participation in the design of these systems and in the debates on regulation is very crucial, because it also shows us where we really need the protection. As Niels has said, it appears creepy, but it only appears creepy once you bring the topic to the young people. I think it’s important to reach out and keep them involved. There are, of course, lots of risks and lots of benefits out there, and thinking about it the way Torsten has pushed us to think is, I believe, very helpful in creating safe online spaces for everyone, without harming our privacy and data protection. I’ll leave it there. And once again, thank you, everybody, for being here and having me.
Regina Filipová Fuchsová: I actually cannot hear in the room. I don’t know if it’s only me. One, two, three. It’s much better if the button is red, I just realized. So I was just thanking Paulo for his contribution and for attending this panel, and good luck with your next lecture, which you have to join. And we will continue. We don’t need to repeat what was already said about the alternatives, but maybe you would like to complement or highlight something, Tanya, on this matter, because we heard a lot about the problematic aspects. So perhaps you can offer us some light at the end of the tunnel from your perspective as well.
Tatiana Tropina: Thank you, Regina. Yes, I know that I highlighted some of the ways forward already, like the development of standards in a multi-stakeholder manner, or making sure that privacy and security guidelines are followed by these age verification systems by making reports available and accessible publicly. I want to make one remark, before I come to the solutions, on what Torsten said, and that actually brings me nicely to my concluding words about solutions. The idea of safe spaces sounds very compelling when we think about protecting children. But we have to bear in mind, and you said it yourself, that safe spaces will again require age verification. So here we come again: how do we look at this layer of technology? To me, the way forward would be to look at this layer of technology not separately from everything else. As a German, I have a huge belief in the ability of the Fraunhofer Institute to create something that would be bulletproof, absolutely foolproof. I believe in it. But the point is that if we try to solve a complex societal problem with technology, we need to look beyond technology. We need to look at who is included, who is excluded, and how. What aspects of complex societal problems can we solve with this technology, and which can we not? This is why we will need complementary solutions. To wrap this up, I will come back to technology. At the Internet Society, we see quite a few examples that can be less risky and more privacy- and security-friendly. One example is adult websites self-labeling with metadata, which helps parents set children’s devices appropriately to block access to this content. So instead of creating privacy and security risks, these tools empower families and parents to protect their children, and the responsibility is not only with the platform; it also includes the parent, so there is a certain autonomy there as well. With that, I think I have already highlighted all the other solutions in my previous interventions, so I don’t want to repeat myself. Thank you.
Regina Filipová Fuchsová: Thank you very much. Niels, would you like to add?
Niels Zagema: Yes, I would like to add, for example, that many people believe the Internet was not really designed for young people, so I think of youth-specific platforms: not only looking at what to ban, but providing alternatives that are really safe for young people, in which they can better navigate the Internet. And I like what Paulo said about digital literacy, because you hear it a lot, but I think we’re addressing two different problems under the umbrella of age verification: social media with more addictive content on the one hand, and more harmful content like adult websites on the other. So we also need to look at what the solution might be for each specific problem. And again, on parental and societal control, I’m a bit hesitant about how parents can influence children. I think children should have more autonomy and the right to develop into adults without the parental narrative, but there is a role for parents and for society in general to say, okay, this is normalized, this is not. As an alternative to age verification, I think we can much better place the blame, or the harm, on other sides of the story, and here I’m mainly talking about social media, not about the harmful websites: really saying, okay, there are alternatives, but are we actually using them? So it fits into a picture that’s much broader than age verification, about how we want to use the Internet and how youth-specific that is. Another point is that youth participation is often mentioned as a good thing, as something that can create trust for young people, but bad youth participation hurts the cause, so I think we should also look critically at how youth participation is done, especially with regard to technical solutions such as age verification, and really ask: is this actually something that young people can agree with? 
So those, I think, are some alternatives and some remarks I wanted to make.
Regina Filipová Fuchsová: Excuse me. Yeah. Yeah. It’s not very inclusive here, at least not obviously for my age. Natalie, can we ask you for your viewpoints? Thank you.
Natálie Terčová: Yeah, thank you so much. I will definitely make use of my research field, digital literacy, with the aspect of parents that usually plays a big role there, and I'm very glad my previous colleagues and speakers mentioned this already. This is something that really goes hand in hand. What we know for a fact from various longitudinal studies is that some of the data we see really does change over time, and this is something we need to think about when we hear all these alarming books and studies. And yes, I am pointing a bit at the opening speech about Haidt. Honestly, in the scientific community we do have some problems with his framing of things, and I'm more than happy to post in the chat later some critiques and reviews of what he says, because causality is not, I'm sorry, correlation is not causality, and this is something we see a lot in his work. But enough of Haidt; this just highlights what I want to say. One thing is that parents do play an important role in enhancing digital literacy, but this mainly works for younger children. As children grow older, and especially among older adolescents, let's say from 15 years on, they rely more on their peers for support and for some lifestyle choices when it comes to accessing various content online. In some of the longitudinal studies we did, parents were trying to, as we say, mediate: pretty much telling their children how to react online, what to access, what not to access, and what types of risks they could encounter. We could see that even though they try, and they really do as much as they can as parents, it does not have much effect on whether the children actually mitigate the risks or enhance their digital literacy levels. There are other aspects in place that play a more important role than parents, especially in this age group.
But what really, really matters is the ethical compass and the values we pass on to our family members, our friends, and generally the kids and young people around us. Because we can see, and this is something that was not studied as much before, that when children encounter harmful online content, they have different outcomes from such encounters. Some even search for it intentionally, and there can be various reasons for that. In the past, we did not differentiate by intentionality: whether it is voluntary or not, whether it is something that just pops up and the kid is simply surprised. Of course, in that case it has negative consequences for the well-being of the child, and so forth. However, there can be various reasons why a child might want to search for specific content. One example I had from in-depth interviews with young people: they told me, oh, I was bullied in school, and I wanted to understand more about why this was happening to me. So I was Googling cyberbullying. I was even Googling topics of self-harm, because this is something that came to my mind, and I just wanted to understand, or maybe I wanted to find a community of people who could understand my situation, so we could discuss these things. So there are so many layers behind how kids interact and what they want to see. At first glance, we might just categorize things as risky or beneficial, but it's not so black and white. So what really, really matters is the values we share with them: whether they have us, the older people, parents, older siblings; whether they trust us; whether they can talk to us whenever they encounter something that frustrates them or makes them scared; whether they really know how to cope with these things. And all these aspects I'm mentioning can be enhanced or learned with the help of digital literacy.
So it really works in a circle. I'm trying to explain it here, but please bear with me; it's complicated, and English is not my first language, so please let me know if this does not make much sense. What we really see is that we can do as much as we want with regulation, but we have to keep in mind that children, or any end users, are not just passive people who enter the online environment, have something happen to them, and all react the same way with the same outcomes. This is really not the case. So I would say the alternative way is more like a toolbox of approaches. One can come from us, the family and the surroundings of the child. Another would definitely be what Paulo already said: all those digital literacy programs. They don't have to come from schools; they can also come from institutions and organizations that care about this, libraries and so forth. I will stop here, and thank you; I hope I did not take too much of your time.
Regina Filipová Fuchsová: Thank you very much, very interesting, also with reference to the concrete research and the statements of children. It was also the aim of this session to involve different stakeholders, including youth participants, to see the different perspectives. We have a few more minutes left. We went through the introductory statements highlighting the shortcomings of age verification systems, discussed quite extensively what risks are associated with them, and then switched to at least outlining some of the ways forward. It looks like we, and I mean us in a broad sense, cannot push away the responsibility for the safety of the online environment for children: we cannot shift it fully onto technology and technical companies, and we cannot shift it away from ourselves as parents. It's not easy. What seems to be the way forward is rather a multi-layered approach, which includes education hand in hand with technological safeguards and social interventions. So there is a lot of work and discussion to do, and this session was actually meant as a warming-up discussion, a working session within one of the dynamic coalitions. There is a main stage session on age verification later this week, I think on Wednesday, where everybody is very much invited to continue the discussion. Before we conclude, I wanted to ask our panelists: out of what was said here and what we heard, what would be the message you would like the audience to take with them? It could be the main challenge, or the main aspect of the way forward. What would you like everyone to keep in mind when leaving the room? Who wants to start? Niels, you look like you are ready.
Niels Zagema: I wasn’t ready, but it was my fault, I was smiling. What I would like as an ending note: with age verification, as we’ve discussed, we’ve pretty much agreed on a lot of the downsides, but it depends on transparency and trust in the system itself. So what I would like to leave you with is that we should not see it as a silver bullet, but put more focus on the broader system. And I also leave this to everyone here in this room: when I hear about age verification, especially in the news or in politics, people talk past each other; they miscommunicate about it because they have different goals in mind. So really be sharp in making sure society knows what we’re talking about, and that it can be a public discussion. I think the Internet should be a place of high trust.
Regina Filipová Fuchsová: Thank you very much. Tania?
Tatiana Tropina: I think my main message would be that we cannot make children, minors, or young adults more secure online by creating huge security risks for everyone, including other groups of people who also benefit from an open, globally connected, and secure Internet. We should never make it a trade-off. We should never frame it as a trade-off. Everybody should be safe online. Everybody should be secure online. Nobody should be excluded from access to the Internet. Thank you.
Regina Filipová Fuchsová: Thank you very much. Natalie?
Natálie Terčová: I would say: don’t underestimate children. They will find ways to work around the regulations and restrictions. They are not just passive consumers; technologies and the Internet are just tools, and it’s up to us, or up to the children, how they will use them. And they have the full right to benefit from technologies and the Internet as much as we do. There are definitely risks offline, but there are also risks online. So let’s be here for them, be their guides, and help them minimize the risks they encounter in their lives while still making the most of the benefits that technologies offer them.
Regina Filipová Fuchsová: Thank you very much. This is definitely also a very important aspect to take into account. We are almost at the end of our session, so if there are one or two comments from the audience or from online, we can still take them up. If not, then thanks a lot to our speakers; it was very interesting. Thank you very much. And let’s take this as an invitation to discuss age verification and the connected issues, not only in the working groups and dynamic coalitions connected with this topic, but also later this week here at EuroDIG. Thank you very much.
Tatiana Tropina
Speech speed
129 words per minute
Speech length
2020 words
Speech time
937 seconds
Age verification is not a silver bullet and creates false sense of security while introducing new vulnerabilities
Explanation
Tropina argues that age verification tools in their current form do not provide holistic protection for minors online but instead create a false sense of security while introducing various privacy, security, and access vulnerabilities. She emphasizes that these tools affect everyone, not just minors, and can make internet users less secure overall.
Evidence
She notes that age verification tools collect sensitive data like IDs, biometrics, and financial accounts, exposing users to security breaches and data misuse. She also mentions that these tools can create barriers for marginalized groups and affect internet interoperability globally.
Major discussion point
Age Verification as a Solution to Protect Minors Online
Topics
Cybersecurity | Human rights | Legal and regulatory
Agreed with
– Niels Zagema
– Paulo Glowacki
– Natálie Terčová
Agreed on
Age verification is not a silver bullet solution
Disagreed with
– Audience
Disagreed on
Feasibility and desirability of technical solutions for age verification
Age verification tools collect sensitive data (IDs, biometrics, financial accounts) creating risks of breaches, misuse, and tracking
Explanation
Tropina details how age verification systems require extensive personal information including government-issued IDs, biometric scans, and financial account information. This data collection creates significant privacy and security risks, including potential breaches, misuse, and user tracking, while also having a chilling effect on internet use.
Evidence
She explains that this data can be sold to third parties, used for tracking, and exposes both users and platforms to security breaches. She notes that people will be less willing to use legitimate services when they know they can be traced.
Major discussion point
Privacy and Security Risks of Age Verification Systems
Topics
Cybersecurity | Human rights | Legal and regulatory
Agreed with
– Paulo Glowacki
– Natálie Terčová
Agreed on
Privacy and security risks of current age verification systems
Disagreed with
– Audience
Disagreed on
Approach to creating safe online spaces – exclusion vs inclusion paradigm
Age verification creates barriers for marginalized groups who lack government IDs, bank accounts, or proper technology access
Explanation
Tropina argues that age verification requirements disproportionately affect already disadvantaged populations who may not have access to required documentation or technology. This includes people without government-issued IDs, bank accounts, or proper webcam technology, as well as those whose foreign IDs may not be readable in certain countries.
Evidence
She provides specific examples: people living in foreign countries with different ID systems, those who cannot provide government-issued ID for legitimate reasons, people without bank accounts, and those without webcams or phones with cameras. She also notes that biometric tools perform poorly on people with non-white skin tones and create challenges for older adults with health conditions like cataracts or those recovering from stroke.
Major discussion point
Access and Inclusion Barriers
Topics
Human rights | Development | Sociocultural
Multi-stakeholder development of international standards needed if age verification tools are implemented
Explanation
Tropina advocates for the development of international standards through multi-stakeholder processes if age verification tools are to be implemented. She argues this is necessary to address the risks and ensure that the implementation doesn’t create a fragmented internet landscape with different barriers in different countries.
Evidence
She mentions that various countries rolling out different age verification programs could create a patchwork of approaches that disrupts the interoperable global nature of the internet. She emphasizes the need for standards developed with equal stakeholder participation.
Major discussion point
Alternative and Complementary Approaches
Topics
Infrastructure | Legal and regulatory | Human rights
Adult websites should self-label with metadata to empower parents to set appropriate device controls
Explanation
Tropina proposes that instead of mandatory age verification, adult websites could self-label their content with metadata, which would allow parents to configure their children’s devices to block access to inappropriate content. This approach empowers families while avoiding the privacy and security risks of age verification systems.
Evidence
She explains that this approach helps parents set children’s devices appropriately to block access to inappropriate content, empowers families to protect their children, includes parents as active participants with autonomy, rather than placing all responsibility on platforms.
Major discussion point
Alternative and Complementary Approaches
Topics
Human rights | Sociocultural | Legal and regulatory
Agreed with
– Paulo Glowacki
– Natálie Terčová
Agreed on
Multi-layered approach combining education, technology, and social interventions is needed
Age verification systems should be independently audited with publicly available reports for researcher access
Explanation
Tropina calls for mandatory independent audits of age verification systems to ensure they comply with privacy and security guidelines. She argues that these audit reports should be made publicly available so researchers can access and analyze them.
Major discussion point
Technology Implementation and Design
Topics
Cybersecurity | Legal and regulatory | Human rights
Niels Zagema
Speech speed
193 words per minute
Speech length
1275 words
Speech time
396 seconds
Young people want safe online environments but not through invasive measures that feel creepy and show lack of trust
Explanation
Zagema argues that while young people do want safe and appropriate online environments, they are concerned about invasive age verification measures that feel creepy and demonstrate a lack of trust in them. He emphasizes that young people prefer solutions that don’t trade one problem for another.
Evidence
He provides a specific example from a classroom discussion where students said that having to scan their face or upload a passport just to watch YouTube would feel invasive and creepy, and doesn’t show trust in young people.
Major discussion point
Age Verification as a Solution to Protect Minors Online
Topics
Human rights | Sociocultural | Children rights
Agreed with
– Tatiana Tropina
– Paulo Glowacki
– Natálie Terčová
Agreed on
Age verification is not a silver bullet solution
Administrative burden of age verification treats people as numbers rather than individuals, similar to bureaucratic government processes
Explanation
Zagema draws a parallel between age verification systems and bureaucratic government processes, arguing that both treat people as numbers in a system rather than as individuals. He sees this as part of a broader movement that dehumanizes interactions with digital systems.
Evidence
He compares it to people going to government offices where they’re treated as numbers because they need a passport or stand in line, suggesting age verification falls into this same pattern of impersonal, bureaucratic treatment.
Major discussion point
Access and Inclusion Barriers
Topics
Human rights | Sociocultural | Legal and regulatory
There’s no universal child – needs vary and age restrictions should be more flexible rather than fitting everyone into rigid categories
Explanation
Zagema argues that children have different needs and developmental stages, making universal age restrictions problematic. He advocates for more flexible approaches that don’t force everyone into the same rigid categories, noting that current age limits like 13 for social media aren’t properly enforced or discussed.
Evidence
He points out that most platforms currently set minimum age at 13, but because it’s not really enforced, no one discusses what the age should actually be. He notes this should depend on individual needs of children rather than universal standards.
Major discussion point
Access and Inclusion Barriers
Topics
Human rights | Children rights | Sociocultural
Focus should be on creating youth-specific safe platforms rather than just restricting access to existing ones
Explanation
Zagema advocates for a more positive approach to youth online safety by creating platforms specifically designed for young people, rather than focusing solely on what to ban or restrict. He argues this addresses the fact that the internet wasn’t originally designed with young people in mind.
Evidence
He mentions that many people believe the internet was not really designed for young people, so there’s a need for youth-specific platforms that provide safe alternatives where young people can better navigate online spaces.
Major discussion point
Alternative and Complementary Approaches
Topics
Sociocultural | Children rights | Infrastructure
Youth participation in design and regulation debates is essential, but must be done critically and meaningfully
Explanation
Zagema emphasizes the importance of involving young people in the development of age verification systems and related regulations, but warns that poor youth participation can actually harm the cause. He advocates for critical evaluation of how youth participation is conducted to ensure it’s meaningful and effective.
Evidence
He notes that bad youth participation can hurt the cause and emphasizes the need to look critically at how it’s done, especially regarding technical solutions like age verification, to ensure young people can actually agree with the outcomes.
Major discussion point
Youth Agency and Participation
Topics
Human rights | Children rights | Sociocultural
Agreed with
– Paulo Glowacki
– Natálie Terčová
Agreed on
Youth participation in design and regulation is essential
Young people’s autonomy and right to develop without excessive parental control should be respected
Explanation
Zagema expresses concern about the extent of parental control in online safety measures, arguing that children should have more autonomy and the right to develop as adults without being overly constrained by parental narratives. He acknowledges a role for parents and society but emphasizes the importance of youth autonomy.
Evidence
He states that while there is a role for parents and society to establish norms about what is and isn’t acceptable, children should have the right to develop as adults without excessive parental narrative control.
Major discussion point
Youth Agency and Participation
Topics
Human rights | Children rights | Sociocultural
Disagreed with
– Natálie Terčová
Disagreed on
Role and extent of parental control in children’s online safety
Paulo Glowacki
Speech speed
176 words per minute
Speech length
1641 words
Speech time
556 seconds
Age verification directly impacts how young people access educational content, entertainment, and civic participation online
Explanation
Glowacki emphasizes that age verification is relevant because it directly affects how young people access various online spaces that are crucial for their development, including educational resources, entertainment, and opportunities for civic engagement. He argues that a large degree of young people’s lives depends on online access.
Evidence
He explains that young people get educational content, entertainment, and even civic participation online, so age verification systems directly impact these essential aspects of their lives.
Major discussion point
Age Verification as a Solution to Protect Minors Online
Topics
Human rights | Children rights | Sociocultural
Current age verification technologies lack transparency and function as black boxes without clear understanding of their algorithms
Explanation
Glowacki raises concerns about the lack of transparency in age verification technologies, particularly AI-based systems, arguing that most of these technologies operate as black boxes where the underlying algorithms and decision-making processes are not understood or accessible for scrutiny.
Evidence
He specifically mentions that AI technologies used in age verification are mostly black boxes where ‘we don’t know how they function, the algorithms, we don’t know how they’re working.’
Major discussion point
Privacy and Security Risks of Age Verification Systems
Topics
Cybersecurity | Legal and regulatory | Infrastructure
Agreed with
– Tatiana Tropina
– Natálie Terčová
Agreed on
Privacy and security risks of current age verification systems
UN Convention on the Rights of the Child balances freedom of expression and access to information with privacy rights
Explanation
Glowacki references the UN Convention on the Rights of the Child as a foundational legal framework that provides guidance on balancing children’s rights to freedom of expression and access to information with their privacy rights. He notes this treaty has broad global support and should inform age verification discussions.
Evidence
He cites Article 13 of the UN Convention which contains rights to freedom of expression and access to information online, and Article 16 which covers the right to privacy. He notes the convention has 140 signatories and mentions the 2021 General Comment 25 on children’s rights in digital environments.
Major discussion point
Legal and Ethical Considerations
Topics
Human rights | Children rights | Legal and regulatory
Proposed EU regulation on child sexual abuse with chat controls represents concerning expansion beyond age verification
Explanation
Glowacki raises concerns about the proposed EU regulation on preventing and combating child sexual abuse, which includes chat control provisions that would go beyond age verification to actually monitor and decrypt children’s communications. He sees this as a problematic expansion of surveillance measures.
Evidence
He mentions the CSA draft regulation currently under discussion at the European level, which contains provisions for chat controls that would involve ‘controlling minors chats and decrypting chats’ which goes beyond age verification.
Major discussion point
Legal and Ethical Considerations
Topics
Cybersecurity | Human rights | Legal and regulatory
Digital literacy education, parental involvement, and youth participation in system design are crucial complementary measures
Explanation
Glowacki advocates for a comprehensive approach that includes digital literacy education, meaningful parental and community involvement, and active youth participation in designing age verification systems and related regulations. He draws from Germany’s experience with youth media protection that combines regulation with education and dialogue.
Evidence
He references Germany’s concept of ‘Jugendmedienschutz’ (youth media protection) which combines regulation with education and dialogue with young people. He also mentions the work of the Euro Youth Committee and Youth Internet Governance Forum Germany in promoting digital literacy and youth participation.
Major discussion point
Alternative and Complementary Approaches
Topics
Sociocultural | Children rights | Human rights
Agreed with
– Niels Zagema
– Natálie Terčová
Agreed on
Youth participation in design and regulation is essential
Over-blocking and under-blocking in content moderation can filter out beneficial content like sex education or mental health resources
Explanation
Glowacki warns about the common problem in content moderation systems where beneficial content gets incorrectly filtered out (over-blocking) or harmful content gets through (under-blocking). He’s particularly concerned about blocking educational content that could be beneficial for people who might not otherwise have access to it.
Evidence
He specifically mentions that useful sex education or mental health content could be blocked, which would be particularly harmful for people who ‘due to their socioeconomic circumstance, not have access to this’ through other means.
Major discussion point
Technology Implementation and Design
Topics
Sociocultural | Human rights | Legal and regulatory
Agreed with
– Tatiana Tropina
– Niels Zagema
– Natálie Terčová
Agreed on
Age verification is not a silver bullet solution
Natálie Terčová
Speech speed
155 words per minute
Speech length
1995 words
Speech time
770 seconds
Age checks can help prevent inappropriate content exposure but cannot solve all issues related to children’s online safety
Explanation
Terčová acknowledges that age verification can serve a useful purpose in preventing young children from accessing inappropriate content, but emphasizes that it cannot address all the complex issues surrounding children’s online safety. She argues that many other factors, particularly family dynamics and socioeconomic status, play crucial roles in shaping children’s online experiences.
Evidence
She provides examples of preventing an eight-year-old from wandering into an 18+ chat room full of intimate content, or ensuring a 10-year-old isn’t targeted with inappropriate ads. She draws from robust international research showing that family role, socioeconomic status, and children’s prerequisites significantly shape their online experiences.
Major discussion point
Age Verification as a Solution to Protect Minors Online
Topics
Children rights | Human rights | Sociocultural
Agreed with
– Tatiana Tropina
– Niels Zagema
– Paulo Glowacki
Agreed on
Age verification is not a silver bullet solution
There’s tension between GDPR’s data minimization principles and age verification systems that demand extensive personal information
Explanation
Terčová highlights the legal contradiction between privacy laws like GDPR that require data minimization and privacy by design, and age verification systems that typically require extensive personal data collection. This creates a challenging legal landscape where protecting children and protecting privacy seem to be in conflict.
Evidence
She explains that GDPR calls for collecting minimum data necessary and privacy by design, yet many age verification systems demand personal information, IDs, and other sensitive data, creating a ‘very big tension between protecting children but also protecting their data and everyone’s data and their privacy.’
Major discussion point
Privacy and Security Risks of Age Verification Systems
Topics
Legal and regulatory | Human rights | Cybersecurity
Agreed with
– Tatiana Tropina
– Paulo Glowacki
Agreed on
Privacy and security risks of current age verification systems
Companies face liability concerns about penalties for letting minors through versus privacy violations from storing sensitive data
Explanation
Terčová explains that companies implementing age verification face a difficult balancing act between potential penalties for failing to prevent minors from accessing restricted content and the risk of violating privacy regulations by collecting and storing sensitive personal data required for age verification.
Evidence
She notes that companies ‘are also worried about liability if they get age verification wrong’ and could ‘face penalties for letting a minor slip through or breach privacy regulations by storing the sensitive data or handling them badly.’
Major discussion point
Legal and Ethical Considerations
Topics
Legal and regulatory | Economic | Human rights
Forcing teens to constantly prove their age may feel invasive and send message that privacy comes second during critical identity development period
Explanation
Terčová raises ethical concerns about the psychological impact of age verification on teenagers who are in a critical period of developing their sense of self and understanding of privacy. She argues that requiring constant age verification could undermine their developing privacy awareness and autonomy.
Evidence
She explains that ‘teens in particular are developing in this period their sense of self and privacy’ and that forcing them to constantly prove their age ‘might feel a bit too invasive and send them a message that the privacy comes second,’ which is ‘a message we definitely should not send.’
Major discussion point
Legal and Ethical Considerations
Topics
Human rights | Children rights | Sociocultural
Enhanced digital literacy and ethical values matter more than parental mediation, especially for older adolescents who rely on peers
Explanation
Terčová presents research findings showing that while parental mediation works for younger children, older adolescents (15+) rely more on peers for support and lifestyle choices. She argues that ethical values and digital literacy are more effective than parental control for this age group, and that the intentionality behind children’s online behavior matters significantly.
Evidence
She cites longitudinal studies showing that parental mediation doesn’t significantly affect risk mitigation or digital literacy levels for older adolescents. She provides an example of a young person who searched for information about cyberbullying and self-harm after being bullied, not out of harmful intent but to understand their situation and find community support.
Major discussion point
Alternative and Complementary Approaches
Topics
Sociocultural | Children rights | Human rights
Agreed with
– Tatiana Tropina
– Paulo Glowacki
Agreed on
Multi-layered approach combining education, technology, and social interventions is needed
Disagreed with
– Niels Zagema
Disagreed on
Role and extent of parental control in children’s online safety
Children are not passive consumers and will find ways around restrictions, so they should be treated as active participants
Explanation
Terčová emphasizes that children and young people are active agents in their online experiences, not passive recipients of whatever happens to them online. She argues that they will find ways to circumvent restrictions and that their responses to online content vary greatly based on their individual circumstances and motivations.
Evidence
She explains that ‘children or any end users, they are not just passive people who just enter the online environment and something happens to them and everyone react the same way and have the same outcomes. This is really not the case.’
Major discussion point
Youth Agency and Participation
Topics
Children rights | Human rights | Sociocultural
Agreed with
– Niels Zagema
– Paulo Glowacki
Agreed on
Youth participation in design and regulation is essential
Children have full right to benefit from internet technologies and should not be underestimated in their capabilities
Explanation
Terčová argues that children have equal rights to benefit from internet technologies and should not be underestimated in their ability to navigate online spaces. She emphasizes that while there are risks both online and offline, the focus should be on supporting and guiding children rather than restricting their access.
Evidence
She states that children ‘do have the full right to benefit from technologies and the Internet as much as we do’ and advocates for being ‘guides for them’ to help them ‘minimize the risks they encounter in their lives, but still make the most of the benefits that also technologies offer for them.’
Major discussion point
Youth Agency and Participation
Topics
Children rights | Human rights | Development
Regina Filipová Fuchsová
Speech speed
124 words per minute
Speech length
1911 words
Speech time
920 seconds
Session aims to examine age verification from multiple stakeholder perspectives given evidence of social media’s impact on youth mental health
Explanation
Fuchsová frames the session as an examination of age verification from various stakeholder viewpoints, motivated by research showing consistent links between social media use and declining youth mental health. She references evidence from researchers across continents showing this pattern independent of local social context.
Evidence
She cites Professor Jonathan Haidt’s book ‘The Anxious Generation’ and research showing that between 2010-2020, suicide rates among adolescents increased significantly, major depressive episodes rose for both girls and boys, and emergency room visits for self-harm rose 188% for girls. She also mentions that unrestricted social media access exposes children to cyberbullying, adult abuse, image-based abuse, and age-inappropriate content.
Major discussion point
Age Verification as a Solution to Protect Minors Online
Topics
Children rights | Human rights | Sociocultural
Audience
Speech speed
134 words per minute
Speech length
727 words
Speech time
324 seconds
Children need to browse internet anonymously, especially when accessing sensitive content, to protect them from site operators with questionable motives
Explanation
The audience member argues that age verification systems that require identification of users are particularly dangerous for children, as they expose children’s identities to website operators who may not have the children’s best interests in mind. They advocate for anonymous browsing capabilities, especially for sensitive content.
Evidence
The speaker (Tapani Tarvainen from Electronic Frontier Finland) explains that ‘if a child tries to access, let’s say, a site with age-inappropriate material, I would rather not have that site know that identity of the child. Because the site managers may not be, well, let’s say, might have some motives that are not ideal for the interest of the children.’
Major discussion point
Privacy and Security Risks of Age Verification Systems
Topics
Cybersecurity | Children rights | Human rights
Age verification should create age-appropriate environments where adults are kept out of children’s spaces, not just keeping children out
Explanation
The audience member (Torsten Krause) argues for a paradigm shift from exclusion-based to inclusion-based age verification, where the focus is on creating safe spaces for children by keeping adults out, rather than simply blocking children from accessing content. This approach emphasizes child participation rather than restriction.
Evidence
Krause provides the analogy that ‘if an adult is going in a kindergarten, a safe space for children. You will see that it is not a person of the same age. Online, we don’t know if we have a chat room for children, which is meant for children, and if there are going adults on because they want to offend.’ He also mentions that the German government has developed a double-blind mechanism with Fraunhofer SIT that works with existing data and provides only yes/no answers without revealing user identity.
Major discussion point
Technology Implementation and Design
Topics
Children rights | Cybersecurity | Infrastructure
Disagreed with
– Tatiana Tropina
Disagreed on
Approach to creating safe online spaces – exclusion vs inclusion paradigm
German experience shows possibility of double-blind mechanisms that provide yes/no answers without revealing user identity
Explanation
The audience member describes Germany’s development of privacy-preserving age verification technology that uses existing data and double-blind mechanisms to verify age without collecting new personal information or revealing user identities to service providers.
Evidence
Krause explains that the German government developed this system ‘together with the Fraunhofer SIT. And a double blind mechanism, working with data that already exists, so there is no new data generated and in the end the service will just get an answer yes or no if the user, which the service don’t know, who is the user, if he should be allowed to go in this space or to get this content or not.’
Major discussion point
Technology Implementation and Design
Topics
Cybersecurity | Infrastructure | Legal and regulatory
Disagreed with
– Tatiana Tropina
Disagreed on
Feasibility and desirability of technical solutions for age verification
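The double-blind mechanism referenced above can be illustrated in miniature: a verifier that knows the user’s age (but not which service is asking) issues a signed token containing only a yes/no answer, and the service checks the token without ever learning the user’s identity. The sketch below is a toy model under stated assumptions — all names are illustrative, an HMAC with a shared key stands in for what a real deployment (such as the Fraunhofer SIT system, whose design is not detailed in this record) would do with proper cryptographic attestations.

```python
import hmac, hashlib, os, json

# Toy model of a "double-blind" age check:
# - the verifier knows the user's age but not the requesting service;
# - the service learns only a signed yes/no, never the user's identity.
VERIFIER_KEY = os.urandom(32)  # stand-in for real key distribution

class AgeVerifier:
    def __init__(self, records):
        # Pre-existing records only (user_id -> age); no new data is generated.
        self.records = records

    def attest(self, user_id, threshold):
        # Issue an anonymous token: it carries the answer and a nonce,
        # but no user identifier. The verifier never sees the service.
        answer = self.records[user_id] >= threshold
        payload = json.dumps({"over": answer, "min_age": threshold,
                              "nonce": os.urandom(16).hex()})
        sig = hmac.new(VERIFIER_KEY, payload.encode(), hashlib.sha256).hexdigest()
        return payload, sig

class Service:
    def admit(self, payload, sig, threshold):
        # Check authenticity, then read only the yes/no answer.
        expected = hmac.new(VERIFIER_KEY, payload.encode(), hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, sig):
            return False  # forged or tampered token
        claim = json.loads(payload)
        return claim["min_age"] == threshold and claim["over"]

verifier = AgeVerifier({"alice": 17, "bob": 21})
service = Service()
print(service.admit(*verifier.attest("bob", 18), 18))    # True: admitted
print(service.admit(*verifier.attest("alice", 18), 18))  # False: under 18
```

The service’s only output is admit/deny; which user presented the token is invisible to it, which is the property the session participants describe as privacy-preserving.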
Moderator
Speech speed
165 words per minute
Speech length
373 words
Speech time
135 seconds
Session establishes rules for participation including full names, hand-raising for questions, video activation when speaking, and prohibition on sharing Zoom links
Explanation
The moderator establishes clear participation guidelines for the remote session to ensure orderly conduct and security. These rules are designed to facilitate meaningful discussion while maintaining session integrity and preventing unauthorized access.
Evidence
Specific rules mentioned include: enter with full name, raise hand using Zoom function to ask questions, switch on video when speaking, state name and affiliation, and do not share Zoom meeting links even with colleagues
Major discussion point
Session Management and Participation Guidelines
Topics
Infrastructure | Legal and regulatory
EuroDIG conference begins with pre-sessions focusing on age verification perspectives from different stakeholders within Dynamic Coalition on Data and Trust
Explanation
The moderator frames this session as part of EuroDIG’s opening day activities, specifically as a collaborative effort between multiple organizations to examine age verification from various stakeholder perspectives. This positioning emphasizes the multi-stakeholder approach to internet governance issues.
Evidence
Session is organized within the Dynamic Coalition on Data and Trust, in cooperation with the Youth Committee of EURID and the Czech IGF, representing different stakeholder groups
Major discussion point
Multi-stakeholder Approach to Internet Governance
Topics
Legal and regulatory | Human rights | Sociocultural
Agreements
Agreement points
Age verification is not a silver bullet solution
Speakers
– Tatiana Tropina
– Niels Zagema
– Paulo Glowacki
– Natálie Terčová
Arguments
Age verification is not a silver bullet and creates false sense of security while introducing new vulnerabilities
Young people want safe online environments but not through invasive measures that feel creepy and show lack of trust
Over-blocking and under-blocking in content moderation can filter out beneficial content like sex education or mental health resources
Age checks can help prevent inappropriate content exposure but cannot solve all issues related to children’s online safety
Summary
All speakers agreed that age verification alone cannot solve the complex issues of child online safety and may create additional problems while providing false security
Topics
Human rights | Children rights | Cybersecurity
Multi-layered approach combining education, technology, and social interventions is needed
Speakers
– Tatiana Tropina
– Paulo Glowacki
– Natálie Terčová
Arguments
Adult websites should self-label with metadata to empower parents to set appropriate device controls
Digital literacy education, parental involvement, and youth participation in system design are crucial complementary measures
Enhanced digital literacy and ethical values matter more than parental mediation, especially for older adolescents who rely on peers
Summary
Speakers reached consensus on the need for a comprehensive approach that combines technological safeguards with education, parental involvement, and digital literacy rather than relying solely on age verification
Topics
Sociocultural | Human rights | Children rights
Youth participation in design and regulation is essential
Speakers
– Niels Zagema
– Paulo Glowacki
– Natálie Terčová
Arguments
Youth participation in design and regulation debates is essential, but must be done critically and meaningfully
Digital literacy education, parental involvement, and youth participation in system design are crucial complementary measures
Children are not passive consumers and will find ways around restrictions, so they should be treated as active participants
Summary
All youth-focused speakers emphasized the critical importance of meaningful youth participation in developing age verification systems and related policies
Topics
Human rights | Children rights | Sociocultural
Privacy and security risks of current age verification systems
Speakers
– Tatiana Tropina
– Paulo Glowacki
– Natálie Terčová
Arguments
Age verification tools collect sensitive data (IDs, biometrics, financial accounts) creating risks of breaches, misuse, and tracking
Current age verification technologies lack transparency and function as black boxes without clear understanding of their algorithms
There’s tension between GDPR’s data minimization principles and age verification systems that demand extensive personal information
Summary
Speakers shared concerns about significant privacy and security vulnerabilities in current age verification implementations, including data collection risks and lack of transparency
Topics
Cybersecurity | Human rights | Legal and regulatory
Similar viewpoints
Both speakers emphasized respecting children’s autonomy and rights to access internet benefits while acknowledging their capabilities as active participants rather than passive recipients
Speakers
– Niels Zagema
– Natálie Terčová
Arguments
Young people’s autonomy and right to develop without excessive parental control should be respected
Children have full right to benefit from internet technologies and should not be underestimated in their capabilities
Topics
Human rights | Children rights | Sociocultural
Both speakers highlighted how age verification systems create exclusionary barriers and dehumanizing experiences, particularly affecting already disadvantaged populations
Speakers
– Tatiana Tropina
– Niels Zagema
Arguments
Age verification creates barriers for marginalized groups who lack government IDs, bank accounts, or proper technology access
Administrative burden of age verification treats people as numbers rather than individuals, similar to bureaucratic government processes
Topics
Human rights | Sociocultural | Legal and regulatory
Both speakers emphasized the importance of balancing children’s rights to access information and privacy, referencing legal frameworks and developmental considerations
Speakers
– Paulo Glowacki
– Natálie Terčová
Arguments
UN Convention on the Rights of the Child balances freedom of expression and access to information with privacy rights
Forcing teens to constantly prove their age may feel invasive and send message that privacy comes second during critical identity development period
Topics
Human rights | Children rights | Legal and regulatory
Unexpected consensus
Need for international standards and coordination
Speakers
– Tatiana Tropina
– Paulo Glowacki
Arguments
Multi-stakeholder development of international standards needed if age verification tools are implemented
UN Convention on the Rights of the Child balances freedom of expression and access to information with privacy rights
Explanation
Despite coming from different perspectives (technical community vs. legal/youth advocacy), both speakers converged on the need for international coordination and standards to address the global nature of internet governance challenges
Topics
Legal and regulatory | Human rights | Infrastructure
Recognition of German experience as valuable case study
Speakers
– Paulo Glowacki
– Audience
Arguments
Digital literacy education, parental involvement, and youth participation in system design are crucial complementary measures
German experience shows possibility of double-blind mechanisms that provide yes/no answers without revealing user identity
Explanation
Unexpected convergence on Germany’s approach as a potential model, with both speakers acknowledging its comprehensive regulatory ecosystem and innovative technical solutions as worth studying
Topics
Infrastructure | Legal and regulatory | Cybersecurity
Trust as fundamental requirement for any age verification system
Speakers
– Tatiana Tropina
– Niels Zagema
– Paulo Glowacki
Arguments
Age verification is not a silver bullet and creates false sense of security while introducing new vulnerabilities
Young people want safe online environments but not through invasive measures that feel creepy and show lack of trust
Digital literacy education, parental involvement, and youth participation in system design are crucial complementary measures
Explanation
Unexpected consensus across technical, youth advocacy, and legal perspectives that trust is the fundamental issue – systems that undermine trust defeat their own purpose regardless of technical capabilities
Topics
Human rights | Sociocultural | Cybersecurity
Overall assessment
Summary
Strong consensus emerged against age verification as a standalone solution, with all speakers agreeing on the need for multi-layered approaches combining education, technology safeguards, and meaningful youth participation. Significant agreement on privacy/security risks and the importance of trust in any implemented systems.
Consensus level
High level of consensus on fundamental principles despite different stakeholder perspectives. This suggests potential for collaborative policy development that balances child protection with rights preservation, though implementation details may still require negotiation.
Differences
Different viewpoints
Role and extent of parental control in children’s online safety
Speakers
– Niels Zagema
– Natálie Terčová
Arguments
Young people’s autonomy and right to develop without excessive parental control should be respected
Enhanced digital literacy and ethical values matter more than parental mediation, especially for older adolescents who rely on peers
Summary
Zagema expresses hesitation about parental influence, emphasizing children’s autonomy and their right to develop without excessive parental narrative control, while Terčová acknowledges that parental mediation works for younger children but argues that older adolescents rely more on peers and that ethical values matter more than parental control.
Topics
Human rights | Children rights | Sociocultural
Approach to creating safe online spaces – exclusion vs inclusion paradigm
Speakers
– Tatiana Tropina
– Audience
Arguments
Age verification tools collect sensitive data (IDs, biometrics, financial accounts) creating risks of breaches, misuse, and tracking
Age verification should create age-appropriate environments where adults are kept out of children’s spaces, not just keeping children out
Summary
Tropina focuses on the risks and problems with current age verification approaches, emphasizing privacy and security concerns, while the audience member (Torsten Krause) advocates for a paradigm shift toward creating safe spaces for children by keeping adults out rather than restricting children’s access.
Topics
Cybersecurity | Children rights | Infrastructure
Feasibility and desirability of technical solutions for age verification
Speakers
– Tatiana Tropina
– Audience
Arguments
Age verification is not a silver bullet and creates false sense of security while introducing new vulnerabilities
German experience shows possibility of double-blind mechanisms that provide yes/no answers without revealing user identity
Summary
Tropina is skeptical about technical age verification solutions, emphasizing their risks and limitations, while the audience member presents Germany’s technical solution as a viable privacy-preserving approach that can work effectively.
Topics
Cybersecurity | Infrastructure | Legal and regulatory
Unexpected differences
Criticism of Jonathan Haidt’s research methodology
Speakers
– Regina Filipová Fuchsová
– Natálie Terčová
Arguments
Session aims to examine age verification from multiple stakeholder perspectives given evidence of social media’s impact on youth mental health
Children have full right to benefit from internet technologies and should not be underestimated in their capabilities
Explanation
While Fuchsová opens the session by citing Haidt’s research as evidence for the need to address social media’s impact on youth mental health, Terčová later explicitly criticizes Haidt’s work, stating that the scientific community has problems with his framing and that ‘correlation is not causality.’ This creates an unexpected tension within the session’s foundational premises.
Topics
Children rights | Human rights | Sociocultural
Degree of optimism about technical solutions
Speakers
– Tatiana Tropina
– Audience
Arguments
Age verification systems should be independently audited with publicly available reports for researcher access
German experience shows possibility of double-blind mechanisms that provide yes/no answers without revealing user identity
Explanation
While both acknowledge the need for better technical implementation, Tropina remains fundamentally skeptical about age verification technology even with improvements, while the audience member presents existing German solutions as proof that privacy-preserving age verification is already achievable. This represents an unexpected split on the technical feasibility question.
Topics
Cybersecurity | Infrastructure | Legal and regulatory
Overall assessment
Summary
The speakers showed remarkable consensus on rejecting age verification as a silver bullet solution and agreeing on the need for multi-layered approaches, but disagreed on the specific balance between technical solutions, parental involvement, youth autonomy, and regulatory frameworks. The most significant disagreements centered on the role of parents versus youth autonomy, and the feasibility of privacy-preserving technical solutions.
Disagreement level
Moderate disagreement with high consensus on core principles but divergent views on implementation strategies. The disagreements are constructive and reflect different stakeholder perspectives rather than fundamental opposition, suggesting that collaborative solutions incorporating multiple viewpoints are possible. The unexpected criticism of the session’s foundational research (Haidt) indicates some underlying tension about the evidence base for regulatory action.
Takeaways
Key takeaways
Age verification is not a silver bullet for protecting minors online and creates a false sense of security while introducing new privacy and security vulnerabilities
Current age verification systems create significant barriers to internet access, particularly affecting marginalized groups, older adults, people with disabilities, and those without government IDs or bank accounts
There is fundamental tension between protecting children online and preserving privacy rights, digital inclusion, and internet accessibility for all users
A multi-layered approach is needed that combines technological safeguards, digital literacy education, parental involvement, and youth participation rather than relying solely on age verification
Children and young people should be treated as active participants with agency rather than passive consumers, and they will find ways around restrictions regardless of implementation
The responsibility for online safety cannot be shifted entirely to technology companies or technical solutions – it requires involvement from families, educators, and society as a whole
International standards developed through multi-stakeholder processes are needed if age verification tools are to be implemented to avoid fragmented approaches that could break internet interoperability
Resolutions and action items
Continue discussion at the main stage EuroDIG session on age verification scheduled for Wednesday
Age verification systems should be independently audited with publicly available reports for researcher access
Develop international standards through multi-stakeholder processes if age verification tools are implemented
Focus on creating youth-specific safe platforms and age-appropriate digital environments
Implement adult website self-labeling with metadata to empower parental controls
Enhance digital literacy programs beyond traditional classroom settings through libraries and community organizations
Unresolved issues
How to balance child protection with privacy rights and digital inclusion without creating trade-offs
Technical implementation of privacy-preserving age verification systems that don’t require identity disclosure
Determining appropriate age thresholds that account for individual developmental differences rather than universal standards
Addressing the fragmented regulatory landscape across different countries and jurisdictions
Resolving the tension between GDPR data minimization principles and age verification data collection requirements
Managing platform liability concerns regarding penalties for age verification failures versus privacy violations
Preventing over-blocking and under-blocking of beneficial content like sex education and mental health resources
Suggested compromises
Implement double-blind age verification mechanisms that provide yes/no answers without revealing user identity, as developed in Germany
Focus on creating safe spaces for children by keeping adults out rather than keeping children out of all spaces
Combine age verification with complementary approaches including digital literacy, parental involvement, and platform responsibility
Develop flexible age verification systems that account for individual needs rather than rigid universal categories
Empower families through self-labeling and metadata systems rather than mandatory platform-based verification
Ensure meaningful youth participation in the design and regulation of age verification systems
Implement transparency requirements and independent auditing of age verification technologies
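The self-labeling compromise listed above has a simple technical shape: a site declares a machine-readable rating in its own markup, and a parental-control tool on the child’s device reads that label and blocks accordingly — no identification of the user is involved. The toy filter below assumes the published RTA (“Restricted To Adults”) meta-tag label as the marker; real parental controls typically enforce this at the browser, OS, or network level rather than in application code.

```python
from html.parser import HTMLParser

# The RTA label is a real-world self-labeling string that adult sites
# place in a <meta name="rating"> tag for parental-control software.
RTA_LABEL = "RTA-5042-1996-1400-1577-RTA"

class RatingScanner(HTMLParser):
    """Scans a page for a self-declared adult-content rating."""
    def __init__(self):
        super().__init__()
        self.adult = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and (a.get("name") or "").lower() == "rating":
            if (a.get("content") or "").upper() in (RTA_LABEL, "ADULT"):
                self.adult = True

def allowed_for_minor(page_html):
    # A device-side control would fetch the page (or its headers) and
    # refuse to render it when the label is present.
    scanner = RatingScanner()
    scanner.feed(page_html)
    return not scanner.adult

print(allowed_for_minor(
    '<html><head><meta name="rating" content="%s"></head></html>' % RTA_LABEL))  # False
print(allowed_for_minor(
    '<html><head><title>Homework help</title></head></html>'))  # True
```

The appeal of this approach in the discussion is that enforcement sits with the family’s device settings, so no platform ever collects IDs or biometrics; its obvious limit is that it depends on sites labeling themselves honestly.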
Thought provoking comments
We should not frame it as a trade-off. We should not frame it as a trade-off between security of minors and privacy, security of users, access. And it should not be a trade-off.
Speaker
Tatiana Tropina
Reason
This comment fundamentally reframes the entire debate by rejecting the commonly accepted premise that protecting children online requires sacrificing privacy and access rights. It challenges the binary thinking that dominates policy discussions and suggests a more nuanced approach is possible.
Impact
This reframing became a central theme throughout the discussion, with other speakers building on this concept. It shifted the conversation from ‘how much privacy should we sacrifice for safety’ to ‘how can we achieve both goals simultaneously,’ elevating the discussion to a more sophisticated level of analysis.
I would rather not have that site know that identity of the child. Because the site managers may not be, well, let’s say, might have some motives that are not ideal for the interest of the children. So especially children need to be able to browse the internet anonymously.
Speaker
Tapani Tarvainen (audience member)
Reason
This comment introduces a crucial paradox that hadn’t been explicitly addressed: age verification systems designed to protect children might actually make them more vulnerable by requiring them to identify themselves to potentially harmful actors. It highlights the counterintuitive risks of the proposed solution.
Impact
This intervention prompted deeper discussion about the fundamental contradictions in age verification approaches and led to exploration of technical alternatives that could verify age without revealing identity. It added a layer of complexity that forced speakers to consider unintended consequences.
I’m wondering if we are on the same kind of level in this discussion… I am in favor of the participation of children, and I think we have to go kind of a shift that age assurance mechanism and tools can be a key tool or a precautionary measure for creating age-appropriate digital environments where not children keep out, but maybe we keep out adults to have safe spaces for children.
Speaker
Torsten Krause (audience member)
Reason
This comment fundamentally inverts the typical framing of age verification from exclusion-based to inclusion-based protection. Instead of keeping children out of adult spaces, it proposes keeping adults out of children’s spaces, shifting from a restrictive to a protective paradigm.
Impact
This perspective shift influenced the latter part of the discussion, with speakers acknowledging this alternative approach and discussing how it might work in practice. It moved the conversation from purely critical analysis to constructive exploration of alternative implementations.
Don’t underestimate children. They will find their way how to work around the regulations and restrictions. They are not just passive consumers… There are definitely risks offline, but there are also risks online. So let’s be here for them, be the guides for them.
Speaker
Natálie Terčová
Reason
This comment challenges the paternalistic assumptions underlying many age verification proposals by recognizing children’s agency and resourcefulness. It reframes the adult role from controller to guide, acknowledging that restrictive approaches may be counterproductive.
Impact
This perspective helped synthesize the discussion’s themes about education, empowerment, and the limitations of technical solutions. It provided a philosophical foundation for the multi-layered approaches discussed throughout the session and influenced the final recommendations.
The problem for me is that they [platforms] are not at risk if they exclude an older person from access to legitimate services, if they exclude disadvantaged groups from access to legitimate services. They are not at risk of any fine. And this feels like a big inequality in terms of what kind of barriers and risks we are creating for our society.
Speaker
Tatiana Tropina
Reason
This observation exposes a critical asymmetry in how regulatory frameworks incentivize platform behavior, highlighting how current approaches may systematically disadvantage already marginalized groups while focusing solely on one type of harm.
Impact
This comment deepened the discussion about unintended consequences and social equity, leading other speakers to consider how age verification systems might exacerbate existing digital divides. It added a social justice dimension that enriched the overall analysis.
Overall assessment
These key comments fundamentally elevated the discussion from a technical debate about implementation details to a sophisticated examination of underlying assumptions, unintended consequences, and alternative paradigms. The most impactful interventions challenged binary thinking (protection vs. privacy), exposed paradoxes (protection systems creating vulnerability), and proposed paradigm shifts (from exclusion to inclusion, from control to guidance). Together, these comments created a more nuanced conversation that acknowledged the complexity of the issue while pointing toward more holistic, rights-respecting solutions. The discussion evolved from identifying problems to exploring fundamental questions about children’s agency, digital rights, and the role of technology in society.
Follow-up questions
How can age verification technologies that preserve anonymity and don’t reveal identity to websites be implemented in practice?
Speaker
Tapani Tarvainen
Explanation
He mentioned theoretical possibilities for verifying age without revealing identity but questioned if any have been actually implemented, highlighting a gap between theory and practice
How can international standards for age verification be developed through multi-stakeholder participation?
Speaker
Tatiana Tropina
Explanation
She emphasized the need for international standards developed in a multi-stakeholder manner to address the risks of fragmented approaches across different countries
How can double-blind age verification mechanisms be scaled and implemented more widely?
Speaker
Torsten Krause
Explanation
He referenced a German government system developed with Fraunhofer SIT that uses existing data without generating new data, but implementation and scalability questions remain
What should be the appropriate minimum age thresholds for different types of online content and platforms?
Speaker
Niels Zagema
Explanation
He noted that current age limits like 13 for social media aren’t enforced and there’s insufficient discussion about what the actual age should be based on children’s varying needs
How can the tension between GDPR’s data minimization requirements and age verification’s data collection needs be resolved?
Speaker
Natálie Terčová
Explanation
She highlighted the legal contradiction between privacy laws calling for minimal data collection and age verification systems requiring extensive personal information
What are the long-term effects of different parental mediation strategies on children’s digital literacy and risk mitigation?
Speaker
Natálie Terčová
Explanation
Her research showed that parental mediation has limited effectiveness for older adolescents, suggesting need for more research on alternative approaches
How can youth participation in age verification system design be implemented effectively without tokenism?
Speaker
Niels Zagema
Explanation
He emphasized that bad youth participation can hurt the cause and called for critical evaluation of how young people are meaningfully involved in technical solutions
What are the liability frameworks for companies regarding age verification errors and privacy breaches?
Speaker
Natálie Terčová
Explanation
She noted companies face penalties for letting minors through but questioned consequences for excluding legitimate users, highlighting an imbalance in accountability
How do age verification technologies perform across different demographic groups and what are the bias implications?
Speaker
Tatiana Tropina and Natálie Terčová
Explanation
Both speakers mentioned performance issues with different skin tones, medical conditions, and age ranges, but more comprehensive research on bias is needed
What constitutes effective digital literacy programs that actually reduce online risks for different age groups?
Speaker
Multiple speakers
Explanation
While digital literacy was frequently mentioned as an alternative, specific research on what makes these programs effective was identified as needed
Disclaimer: This is not an official session record. DiploAI generates these resources from audiovisual recordings, and they are presented as-is, including potential errors. Due to logistical challenges, such as discrepancies in audio/video or transcripts, names may be misspelled. We strive for accuracy to the best of our ability.