Lightning Talk #109 Ensuring the Personal Integrity of Minors Online
27 Jun 2025 10:20h - 10:50h
Session at a glance
Summary
This discussion focused on protecting the personal integrity of minors in digital environments, presented by Lea Peters from ECPAT Germany and Michael Terhorst from Germany’s Federal Office for the Enforcement of Children’s Rights and Digital Services (KIT). Peters defined personal integrity as the quality of being honest and having strong moral principles, emphasizing how digital environments significantly influence its development in children and youth. She presented alarming statistics showing that over 300 million children globally have been affected by online sexual exploitation and abuse in the past year, with one in eight children experiencing online solicitation or non-consensual sharing of sexual content.
The speakers highlighted that exposure to such content can desensitize children and negatively impact their personal integrity development. Rather than supporting social media bans for minors, they advocated for creating safer digital spaces that respect children’s right to participation. Terhorst explained Germany’s three-step enforcement approach under the Digital Services Act: analyzing platform functionalities, identifying associated risks, and checking existing precautionary measures. Their “dialogic regulation” method involves collaborating with platforms to implement solutions rather than immediately imposing penalties.
Key protective measures discussed included secure default settings, age assurance mechanisms, restricted contact functionalities, and improved moderation systems. The speakers emphasized that effective protection requires combining multiple measures tailored to specific age groups and platforms. They acknowledged challenges in age verification technology and the need for upcoming EU guidelines to strengthen enforcement. The discussion concluded with audience questions about platform cooperation, regulatory challenges in different countries, and ensuring that protective measures don’t inadvertently restrict access to legitimate educational content.
Key points
**Major Discussion Points:**
– **Definition and importance of personal integrity for minors online** – Lea Peters explained personal integrity as being honest and having strong moral principles, emphasizing how digital environments significantly impact its development in children and youth, with German legislators now including it as a protection goal in child and youth media law.
– **Scale and impact of online child sexual exploitation** – The speakers presented alarming statistics showing over 300 million children globally affected by online sexual abuse and exploitation in the last 12 months, with one in eight children experiencing online solicitation or non-consensual sharing of sexual content.
– **Regulatory enforcement approach through dialogic regulation** – Michael Terhorst described Germany’s three-step enforcement process under the Digital Services Act: analyzing platform functionalities, identifying associated risks, and checking existing precautionary measures, followed by collaborative dialogue with providers rather than immediate penalties.
– **Technical safeguarding measures and age assurance challenges** – Discussion of specific protective measures including secure default settings, contact restrictions, content moderation, and the complex challenge of implementing effective age verification systems that balance privacy, functionality, and user-friendliness.
– **Platform cooperation and global enforcement challenges** – The speakers addressed varying levels of cooperation from different platforms, with smaller platforms often more willing to comply, while noting significant challenges in enforcing regulations on platforms outside the EU and the revenue conflicts that arise when companies profit from harmful content.
**Overall Purpose:**
The discussion aimed to educate attendees about protecting minors’ personal integrity in digital environments, presenting both the scope of online risks facing children and the regulatory approaches being implemented in Germany to create safer online spaces through collaborative enforcement of the Digital Services Act.
**Overall Tone:**
The discussion maintained a professional, informative tone throughout, with speakers presenting serious statistics and challenges in a measured way. The tone remained collaborative and solution-focused, particularly when discussing the “dialogic regulation” approach. During the Q&A session, the tone became more conversational and supportive, with speakers offering practical advice and acknowledging the complexities faced by different stakeholders in various global contexts.
Speakers
– **Lea Peters**: Policy Specialist for Digital Child Protection at ECPAT Germany. ECPAT Germany is a children’s rights organization working for the protection of minors from sexualized violence, exploitation and human trafficking.
– **Michael Terhorst**: Head of the Federal Office for the Enforcement of Children’s Rights and Digital Services (KIT). Responsible in Germany for enforcing precautionary measures related to the Digital Services Act, especially Article 28.
– **Audience**: Multiple audience members who asked questions during the discussion. Specific roles/expertise not mentioned, though one identified as working for an NGO called SEED in Kurdistan, Iraq, leading a project on online child safety, and another identified as a digital native.
Additional speakers:
None – all speakers in the transcript were included in the provided speaker list.
Full session report
# Protecting Personal Integrity of Minors in Digital Environments: Discussion Report
## Introduction and Context
This discussion focused on protecting minors’ personal integrity in digital environments, featuring two German stakeholders in child protection policy. Lea Peters, Policy Specialist for Digital Child Protection at ECPAT Germany, provided the advocacy perspective, while Michael Terhorst, Head of the Federal Office for the Enforcement of Children’s Rights and Digital Services (KIT), discussed Germany’s regulatory enforcement approach under the Digital Services Act, particularly Article 28.
## Defining Personal Integrity in Digital Contexts
Peters opened by defining personal integrity as “the quality of being honest and having strong moral principles.” She emphasized that “Personal integrity though is nothing you are just born with or that’s being formed in a vacuum. It is influenced by our family, our social circle, the culture we grew up in and the experiences we make. Digital environments and technology have become an integral part of our lives. It is nowadays a huge factor in the development of personal integrity.”
Peters noted that German legislators have incorporated the personal integrity of children and youth as a protection goal in child and youth media law, reflecting a shift from merely preventing harm to actively fostering positive character development.
## The Scale of Online Child Sexual Exploitation
Peters presented concerning statistics: over 300 million children globally have been affected by online child sexual abuse and exploitation within the past 12 months, with one in eight children globally subjected to online solicitation and non-consensual sharing of sexual images during the same period. Citing the more than 30 million incidents analyzed in 2024 by the US National Center for Missing and Exploited Children, she noted that 84% of those incidents fell outside the jurisdiction of the United States.
Peters observed that police statistics show the average age of offenders is becoming younger, and every case now has a digital component. This digital dimension creates challenges for personal integrity development, as exposure to sexualized violence online can desensitize children and negatively impact their moral development.
## Germany’s Dialogic Regulation Approach
Terhorst introduced “dialogic regulation,” a departure from traditional punitive enforcement models. Rather than immediately imposing penalties, KIT employs a three-step analytical process followed by collaborative engagement with providers.
“So our idea is we have something called dialogic regulation,” Terhorst explained. “So we don’t just, yeah, send a letter saying, okay, you have to, I don’t know, pay €5 million because your platform isn’t safe. We get in touch with the provider and say, okay, we found some deficits on your platform.”
The three-step analysis involves examining platform functionalities, identifying risks for minors, and checking existing precautionary measures. Terhorst noted that “The providers don’t have to take those specific ideas. We present them. They can do something on their own. As long as it works, we’re good.”
KIT has opened between 30 and 40 cases in Germany so far. Terhorst observed that smaller platforms often actively seek guidance, while larger platforms typically wait for official guidelines before implementing changes.
## Technical Safeguards and Age Assurance Challenges
Terhorst emphasized that effective protection requires multiple precautionary measures including secure default settings, age assurance mechanisms, restricted contact functionalities, and improved moderation systems.
Age assurance emerged as a particularly complex challenge. Terhorst illustrated this with an example: “when you have a name like Butterfly13, you don’t know if Butterfly13 is a 13-year-old girl or a 46-year-old man. So that just doesn’t make it easier.”
Current technology faces significant limitations. The EU's digital identity wallet systems, the "mini wallet" now being introduced and the EUDI wallet planned for autumn 2026, will initially only verify whether users are 18 or older, lacking the granular age verification necessary to tailor safety measures to different minor age groups.
## Platform Cooperation and Transparency Challenges
Peters highlighted the need for greater transparency in algorithmic processes and meaningful participation from civil society organizations in developing safety measures. She mentioned that consultation feedback indicated “there wasn’t enough emphasis on transparency and including civil society organizations.”
Terhorst noted varying levels of engagement from different platforms, with smaller platforms showing more willingness to comply and seek guidance compared to larger platforms. He acknowledged that regulatory authorities face significant limitations when dealing with platforms operating outside the EU.
## Economic Dimensions and Implementation Challenges
During the audience discussion, Michael, an NGO worker from Kurdistan, Iraq, raised concerns about economic incentives that perpetuate harmful content online. He noted that tech companies derive substantial revenue from harmful sites, citing Ethiopia as an example where such sites generate 6% of tech company revenues. “These tech companies, they get their revenue, most of their revenues from these sites,” he explained. “So it also affects the government because these tech companies are paying taxes to the government, right?”
Peters responded by suggesting that economic arguments could be reframed for advocacy purposes, recommending studies that demonstrate the negative economic impact of sexualized violence on national economies, noting that “people who suffer from, for example, PTSD, they often have difficulties in participating 100% later on in their professional life.”
## Balancing Protection with Rights and Access
An audience member identifying as a digital native raised concerns about potential overreach in safety measures, particularly regarding access to sex education and support for LGBTQI+ communities. Peters acknowledged these concerns while emphasizing that Germany’s legal foundations help prevent discriminatory implementation of protective measures.
## Planned Actions and Follow-Up
The EU Commission guidelines for Article 28 of the Digital Services Act are expected to be published within weeks. KIT will continue its three-step analysis process for ongoing platform cases. A specific follow-up discussion was planned between the presenters and the Iraqi participant to share regulatory strategies for telecom companies.
## Conclusion
The discussion examined the complex challenges of protecting children in digital environments while respecting their rights. The concept of personal integrity as shaped by digital experiences provided a framework for understanding child protection that extends beyond preventing immediate harm. Germany’s dialogic regulation approach offers a collaborative enforcement model, though challenges remain in age assurance technology, cross-jurisdictional enforcement, and balancing economic incentives with child protection goals. The upcoming EU guidelines may provide clearer frameworks for implementation and enforcement.
Session transcript
Lea Peters: Good morning, thank you everyone for being here and taking part in our talk on personal integrity and minors online. My name is Lea Peters, I work as Policy Specialist for Digital Child Protection at ECPAT Germany. ECPAT Germany is a children's rights organization that is working for the protection of minors from sexualized violence, exploitation and human trafficking.
Michael Terhorst: Good morning also from my side, my name is Michael Terhorst, I'm head of the Federal Office for the Enforcement of Children's Rights and Digital Services. A quite long name, KIT is the short form, so that might be easier to remember. Today we're going to start with Lea, who's going to present the idea, define personal integrity and show some potential risks. After that I'm going to continue by showing some examples of how we might battle those risks. So yeah, thanks for being here.
Lea Peters: Exactly, when you read the title you might have asked yourself, but what is personal integrity? Well, according to the Cambridge Dictionary, personal integrity is the quality of an individual of being honest and having strong moral principles. So this means aligning behavior with values, even when faced with challenges or temptations. It includes truthfulness, being reliable and taking responsibility for your actions. Personal integrity though is nothing you are just born with or that's being formed in a vacuum. It is influenced by our family, our social circle, the culture we grew up in and the experiences we make. Digital environments and technology have become an integral part of our lives. It is nowadays a huge factor in the development of personal integrity. However it also poses new risks and challenges that we are faced with, thus also posing risks and challenges to the development of personal integrity, especially of minors. Those new risks can have a negative impact on development, especially for minors who are still at an earlier developmental stage. In response to those new risks, the German legislators have included the personal integrity of children and youth as a protection goal of child and youth media law. Thus they are making the development and socialization of children and youth a key factor. So when looking at sexualized violence and exploitation online, we have to look at how this can also affect the development of the personal integrity of minors. Studies show that online child sexual exploitation and abuse is prevalent in every country where it is measured. Globally, more than 300 million children and youth have been affected by online child sexual abuse and exploitation in the last 12 months. One in eight children globally has been subjected to online solicitation in the last 12 months. This includes unwanted sexual talk like non-consensual sexting, unwanted sexual questions or unwanted sexual requests by adults or other youth. Also, one in eight children have experienced the non-consensual taking, sharing and/or exposure to sexual images and videos in the last 12 months. When looking at reports, we see that in 2024 the US National Center for Missing and Exploited Children analyzed more than 30 million incidents of sexualized violence and exploitation against children and youth. 84% of these incidents are outside the jurisdiction of the United States, emphasizing that this problem is global and one we have to deal with and find solutions to in every country, but also through international cooperation. When looking at the offender side, we see that police statistics show a demographic change over the past years, with the average age of offenders becoming younger and younger. In general, every case of online sexual violence and exploitation of children has a digital component, be it because the child has been contacted through social media or messaging apps beforehand, or because the abuse and exploitation is being filmed or photographed. Being confronted with sexualized violence is not merely uncomfortable for children and youth. It is deeply disturbing and in many cases traumatic. And like we heard yesterday in the high-level panel in the morning, the CEO of the Five Rights Foundation said that, without intention, in less than 15 clicks you find images of sexual abuse and sexual violence against children online.
Seeing this so many times online can desensitize children and youth and have a negative impact on their personal integrity, because it negatively affects the values they see, the positive behavior they are exposed to, as well as respect in interacting with each other. Due to this, it is important to create digital environments and technologies based on the rights and well-being of children and youth. It is our duty to do better and deliver on this, and it is not optional. Safe digital environments have a positive impact on children and youth's personal integrity, since they create more aware, knowledgeable and respectful societies that benefit from the positive sides of technology. This also means safer digital experiences for everyone, not just for children and youth.
Michael Terhorst: Thank you. So, after what you just said, there is a huge and immense necessity to protect the personal integrity of minors. The variety of ideas of how to do that is also quite great. There are some voices saying that enough is already being done. We hear from some people that, when we look at social media platforms for example, providers already do a lot to protect the personal integrity of minors. Those voices are from the providers themselves, so maybe it is not really objective. Because when we look at the risks arising out of the different functionalities we see on those platforms, there is the need to do something. So, what should we do? There are some ideas of a social media ban. We see that in Australia, we see those ideas in other countries, also in the European Union: the idea to ban young persons especially from social media platforms. The Digital Services Act, the DSA, is taking into account the rights of the child, the right to participate, the right of participation. So, we now have to work together to make sure that we create safe online spaces for children and young persons, to guarantee safe usage of those platforms which are important for them, which are part of their everyday life. So, how do we do that? It is not that easy, as you can imagine, because there is no one-size-fits-all. There are different platforms, different functionalities, and so different risks arising out of those different functionalities. So we have to find individual solutions for every platform. That's a lot of work to do, but that's something we have to do, because otherwise it just doesn't work, and the status quo is not acceptable. So what we do at KIT: we are responsible in Germany for enforcing precautionary measures when it comes to the Digital Services Act, especially Article 28. What we do in Germany is that we have three steps. The first step is we do an analysis of those platforms. We look at the functionalities. We just try everything you can do there, from going live to comment sections and all that, to see what functionalities are given on those platforms. Step two is looking at the different risks which are connected to those functionalities. Step three is checking whether those risks are already being covered by precautionary measures which might already be implemented. And if all those risks are already being covered, we are good. It's fine. There's nothing the providers have to do, but there has been no case where we came to the conclusion that nothing has to be done. So there's always something to do, and there's always a lot to do to create safe online spaces for children and young persons. So our idea is we have something called dialogic regulation. So we don't just, yeah, send a letter saying, okay, you have to, I don't know, pay €5 million because your platform isn't safe. We get in touch with the provider and say, okay, we found some deficits on your platform because there are functionalities which lead to the exposure of children and young persons to different risks. So something has to be done. We don't just say that. We already give some ideas, some possible solutions to counter those risks. And the providers don't have to take those specific ideas. We present them. They can do something on their own. As long as it works, we're good. And we started this about a year ago on the legal basis of the DSA. You mentioned the German Youth Protection Act.
We already had this idea in 2021, the Youth Protection Act. So we had already some connections to some providers which helped us in the past and still now. So we already see some slight improvements. But as you might know, Article 28 DSA needs guidelines because you just need guidelines to really have something when you talk to the providers to show what needs to be done, what level of security has to be guaranteed. So when we talk to the providers, we give them possible solutions. We give them some time. And if it works, it’s good. If they don’t comply, then okay, the next steps, a hard enforcement has to be started because otherwise it just wouldn’t work. The guidelines from the EU Commission will be published in the next couple of weeks. So we’re happy for this next step because that makes our enforcement much easier, as you can imagine. Okay, so I talked a lot about possible precautionary measures. And here are some examples. So you can see, for example, moderation or registration, age assurance, all that. That list is not exhaustive because it can’t be. There are new functionalities like every day. There are new platforms. There are new ideas of how to create spaces for young persons, how to connect people to the different platforms. So new functionalities lead to new risks. New risks lead to the necessity of the implementation of new precautionary measures to guarantee a safe usage for children and young persons. So we are working every day, not just at the status quo, but at possible new functionalities in the future and the risks arising out of those functionalities. So when we take a brief look, you see some examples here. There are other precautionary measures already mentioned in the draft of the guidelines to Article 28. But, yeah, let’s take a look at the secure default settings, for example, because they’re always like the heart or the basis of the whole system of precautionary measures. Because, as you can imagine, when you see all those measures, it’s not like one measure which leads to a safe space. It’s always a combination of different measures because otherwise it just wouldn’t work. So the secure default settings. Children or young persons should not be able to be found when you do a Google search. Their profile should not just be open or be public. It should be private by default. So you see some examples here. Restriction, especially regarding contact. So limitation of contact functionalities. You already mentioned all those risks for the sexual integrity, especially this cyber grooming, this sexual extortion. There are so many risks. And those risks have to be mitigated. And you can do that by limiting the possibilities of communication, for example, and to create safe spaces by, for example, usage of pseudonyms or, yeah, just to make sure that not like real names, addresses, locations, all that are being published. So it is really important for us and that’s like a key factor that young persons cannot be directly contacted by like adults because in very many cases you just have no idea how old someone is. And sometimes even the providers, in some cases they do know, but they just tell you that they have no idea. But actually they do know how old you are. But in many cases they just don’t. And when you have pseudonyms, on the one hand, it’s good to protect like the individual, but on the other hand when you have a name like Butterfly13, you don’t know if Butterfly13 is a 13-year-old girl or a 46-year-old man. So that just doesn’t make it easier. 
So what we need is age assurance, because with age assurance you can make sure that, yeah, a specific, let's call it age bracket is given. So, for example, if you have 18 plus or an age bracket between 13 and 15, when we look at the default settings, those can be tailored to the specific age groups. So, for example, limitations to communication between 13 and 16, but when you're 16 or 17, maybe there are some default settings which don't need to be as strict as when you're younger. So it has to be tailored to the specific risk and to the specific age groups. And it is also very important, as I already mentioned with Butterfly13 possibly being a 46-year-old man, that in those cases age verification can be used not to exclude children and young persons from, for example, content which might harm them, such as pornography, but to include them and to create safe spaces. So, as you can see, it's also necessary for the precautionary measures, for the secure default settings. But also, when I go back to all those measures, when you look at, for example, moderation or reporting mechanisms, it is important that children and young persons really understand what's happening when they report something. It's important that the language used by the providers when you try to report something can be understood by children and young persons. So they just need to know roughly how old you are. The providers don't need to know if your birthday is, like, the 5th of December, but they need to know if you're between 13 and 15 or 16 to 17, whatever, and that makes it really necessary to implement age assurance. There are different kinds. There are age verification systems, there's age estimation, there are other things being used. All of them have one thing in common: it is always a balancing act between privacy on the one hand and functionality on the other. It also has to be user-friendly. There are so many aspects when it comes to age assurance. It's a really big thing and everyone's talking about it because it's so complicated, and because it's basically like balancing 20 different interests, and it's really hard to find something that really works.
So the EU Commission is now publishing the first step, the so-called mini wallet, and in about a year, I think it's planned for autumn 2026, the so-called EUDI wallet comes in addition to the mini wallet. With both the mini wallet and the EUDI wallet, at the beginning it will only be possible to verify your age if you're 18 plus, but there are already some ideas, and we are really trying to push as much as we can, so that all those apps or programs like the EUDI wallet will make it possible to identify if someone is, let's say, between 13 and 15, to tailor the default settings and all those other precautionary measures and to create safe spaces. Because if those age assurance mechanisms don't work, it just gets really dangerous, because you trust them, and if you trust those measures and they don't work, when you think that you are in a safe space and you let your kids sit in a so-called safe space, but those mechanisms don't work, it can be really harmful, and all those risks described earlier can really happen, and that's something we really don't want. Okay, so our basic idea is not to exclude children but to include them, and to have a digital environment offering significantly more opportunities than risks. That's not easy, because what we are doing is just one small piece. We all have to work together, also on media literacy and all that; we have to follow a holistic approach, because otherwise it's impossible to create safe spaces. So let's work together, let's do this together, and if you have any questions, or maybe online if you have any questions or want to make a statement, we'll be open for everything. Thanks for your attention. Are there any questions, ideas, statements?
Audience: Yes, thank you for the presentation. I'd love to know, to the degree that you can share, how it's been interacting with the platforms in the sort of co-regulatory model that you described. Are they open to speaking with you? I assume there are differences between different platforms. I'd be really curious to hear about your experience.
Michael Terhorst: Yeah, thank you for your question. It's actually a pretty good question, because sometimes it's a bit surprising for us. So we started this idea of dialogic regulation because it was in the Youth Protection Act in Germany, and there are a lot of providers coming to us, also providers that were actually not on our list, and they're coming to us and saying, okay, hey, it's us, maybe you can look at our platform, because we just want to comply with the law, we just want our platform to be safe. But that's a small part of the providers. We have opened between 30 and 40 cases in Germany so far. We can act in Germany and outside the EU, because in the other member states there are other authorities responsible, and when it comes to very large online platforms, the so-called VLOPs, the EU Commission is responsible for the regulation. But when we look at those platforms in Germany or outside Germany, it's interesting, because in Germany most providers are more or less willing to comply and are also interested in chatting with us, and that really works. So the first one or two meetings, I think the communication is kind of good. Also with the VLOPs: they come to us, we talk to each other, we keep each other updated, talk about best practices and so on. That's good. There's still a lot to do, but this actually works pretty well. After that, when we get to the next steps, when we say, okay, you have to change this or that, then it's like, okay, thank you, but let's wait for the guidelines. Because without the guidelines to Article 28, that's what I mentioned in the beginning, a hard enforcement is almost impossible, because they just say, okay, Article 28 needs to be filled with more information to really show what it means. So, yeah, I think to answer your question, in two months I can say a lot more about whether it really works that well. With smaller platforms, we are actually happy that most of them are willing to comply. But there is this block I mentioned, the platforms outside the EU. It's not that easy for us, as you can imagine. And also, we have to be honest, law enforcement outside the EU is not that easy for us, when providers are all around the globe and we want to enforce something. When you want to enforce something in Germany, you send a letter and they say, okay, we have to pay, because otherwise... But when you send this letter to, I don't know, any other state in the world, it's not that easy. But yeah, I hope maybe in a year, at the next IGF, you ask the same question and I can say we are good, we now have a safe space for young persons, or at least a little bit safer.
Lea Peters: Maybe a little additional comment from the civil society perspective. I think what we also see missing is transparency from platforms, also towards the wider public, especially when it comes to algorithmic usage on platforms and things like that. This is also one thing we mentioned in the consultation in the guideline process for the DSA, that there wasn't enough emphasis on this transparency and on including civil society organizations and academic experts, as well as having meaningful child and survivor participation. This was something that we flagged, because we see it is very important, also already in the developmental stages of new features and measures that are being implemented. So there is a lot of room for improvement. Thank you. Any other comments or questions?
Audience: Thank you, all right. Thank you. My name is Michael. I come from Iraq. I'm working for an NGO called SEED in Kurdistan, and I'm leading a project on online child safety by Safe Online there, and we're trying to build a government system there to be able to respond to OCSEA, or with the new name, TFCSEA. One of the greatest learnings here when I come to IGF is the importance of regulating the tech companies. I've done this before in Ethiopia, and we were able to ban pornographic websites there in collaboration with the telecom company there. But the greatest challenge is, when we try to ban those websites, these tech companies, they get their revenue, most of their revenues, from these sites. For example, in Ethiopia, like 6% of their revenues come from these harmful sites. So it's one of the greatest challenges for us, and we're trying to do that in Iraq. So it also affects the government, because these tech companies are paying taxes to the government, right? So do you have any advice regarding this, how to regulate specifically these telecom companies in this kind of environment?
Michael Terhorst: So first of all, love your name, also Michael. But that's a pretty good question. I mean, as I can imagine, it's a completely different kind of legal system, so it's actually kind of hard for me to give advice, because we have another legal basis we work from. Maybe, so, we made a lot of positive experiences when it comes to dialogue. So just communicating and telling them, okay, you don't want to have bad press. Sometimes bad press isn't that bad, but in those cases there are a lot of providers who are interested in having a good picture of themselves in the press, online, wherever. So our idea is we get into the dialogue, and if they comply, then we don't advertise those pages, but we say, okay, we report about the experiences we made with those providers. And we made press statements and so on, where we say, okay, there have been some improvements on various platforms. So other platforms come to us and say, okay, actually, we also want to be mentioned in a positive way by you, is it possible that we get in touch, that we talk, and so on? So maybe that could be an idea. But you mentioned pornographic platforms. Pornographic platforms, yeah, that's content regulation. And most precautionary measures I talked about are more about risks resulting out of interaction. So it's also sometimes content, but it's very often risk resulting out of communication, when it comes to cyber grooming, for example, or when it comes to extremism. Sometimes it's a mixture of both, because when you have extremist content, on the other hand, there is this communication where they try to get especially young people into their hands. So I think I have to think about it more. Maybe we can chat later and share some ideas, and you can explain to me a bit more about your situation, and we will see what we can do. Okay, thank you.
Lea Peters: Yeah, and I mean, also, if you have the capacity to push from both sides, to push the private sector but also the government, it's always useful to also use studies that show the negative impact of risks like sexualized violence on the overall economy of the country or the region you live in. Studies show that online sexualized violence has the same traumatic outcomes for survivors and victims as offline sexualized violence, and for offline sexualized violence there are studies on how this negatively impacts a country's economy. So this might also be a route you can go, and it might also be something that the companies or the private sector are interested in seeing, because people who suffer from, for example, PTSD often have difficulties in participating 100% later on in their professional life and things like that. So this could also have a negative impact on the future workforce they want to build. So this might be an additional angle you could try to use.
Michael Terhorst: Thank you. Do we have time for one last question? A really short question, sorry. And a short answer. Yeah, I’m sorry.
Audience: First I just want to say, as a digital native, I truly understand how vital the work you’re doing is. But I’m curious, in terms of the work that’s been done so far, a lot of the times access to education for children has been barred because of different countries’ views on what education should be. And I’m curious, do you all have any measures in place to prevent that from happening when implementing these protective measures?
Lea Peters: Yeah, I mean, so we are both based in Germany, KIT and ECPAT Germany. So we work on that legal basis, and there is already in the legal basis a kind of strong foundation that ensures that this is not going to happen. So for example, when you look at the development of sexuality, there are different laws when you're a teen, so when you're above 14, and when you are below 14. So we are working on those measures, and they are already kind of trying to counteract that this is happening, that it has negative impacts, for example, on sex education and things like that, which are quite important, as well as seeing that this is not limiting access for vulnerable groups like the LGBTQI+ community and other groups like that. So there are kind of strong safeguards. We are always pushing to keep it that way, because there's also a push from the political side to restrict this more again. But yeah, as a civil society organization, ECPAT Germany, together with others, we are trying to see that this is not happening. We keep it in our perspective and are pushing for that. Yeah. Thank you.
Michael Terhorst: That was a perfect answer. Nothing to add. So thank you. Thank you, Leah. Thank you, everyone.
Lea Peters: Thank you.
Lea Peters
Speech speed
132 words per minute
Speech length
1304 words
Speech time
590 seconds
Definition and Importance of Personal Integrity for Minors Online
Explanation
Personal integrity is defined as the quality of being honest and having strong moral principles, which includes aligning behavior with values even when faced with challenges. This integrity is not innate but is influenced by family, social circle, culture, and experiences, with digital environments now playing a crucial role in its development.
Evidence
Cambridge Dictionary definition cited; German legislators have included personal integrity of children and youth as a protection goal in child and youth media law
Major discussion point
Personal integrity development in digital environments
Topics
Human rights | Sociocultural
Scale and Impact of Online Child Sexual Exploitation
Explanation
The scale of online child sexual exploitation is massive and global, affecting hundreds of millions of children annually. This exposure can be deeply traumatic and has lasting negative effects on children’s development and personal integrity by desensitizing them to violence and negatively impacting their values.
Evidence
Over 300 million children globally affected in last 12 months; one in eight children subjected to online solicitation; US National Center for Missing and Exploited Children analyzed 30+ million incidents in 2024; police statistics show younger average age of offenders; CEO of Five Rights Foundation stated that sexual abuse images can be found in less than 15 clicks
Major discussion point
Global prevalence and impact of online child sexual exploitation
Topics
Cybersecurity | Human rights
Platform Cooperation and Transparency Challenges
Explanation
Platforms lack sufficient transparency regarding algorithmic usage and decision-making processes, which is crucial for effective child protection. There is also insufficient meaningful participation from civil society organizations, academic experts, and child survivors in the development and implementation of safety measures.
Evidence
Consultation feedback on DSA guidelines process; emphasis on need for transparency and civil society inclusion
Major discussion point
Need for transparency and stakeholder participation
Topics
Legal and regulatory | Human rights
Agreed with
– Michael Terhorst
Agreed on
Platform transparency and stakeholder participation are insufficient
Implementation Challenges and Solutions
Explanation
Economic arguments can be effective in convincing companies and governments to prioritize child safety, as online sexual violence has the same traumatic outcomes as offline violence. The economic impact includes reduced workforce participation due to PTSD and other long-term effects on survivors.
Evidence
Studies showing negative economic impact of sexualized violence; comparison between online and offline violence outcomes; impact on future workforce participation
Major discussion point
Economic incentives for child protection
Topics
Economic | Human rights
Agreed with
– Audience
Agreed on
Economic incentives can be effective for promoting child safety
Michael Terhorst
Speech speed
149 words per minute
Speech length
2967 words
Speech time
1189 seconds
Regulatory Approaches and Enforcement Mechanisms
Explanation
Rather than implementing social media bans, the focus should be on creating safe online spaces for children while respecting their right to participation. KIT uses a systematic three-step approach to analyze platforms and implement dialogic regulation, engaging with providers collaboratively rather than punitively.
Evidence
Digital Services Act Article 28 enforcement; three-step analysis process implemented; dialogic regulation approach used; 30-40 cases opened in Germany; EU Commission guidelines to be published soon
Major discussion point
Collaborative regulatory enforcement strategies
Topics
Legal and regulatory | Human rights
Agreed with
– Lea Peters
Agreed on
Need for collaborative rather than punitive regulatory approaches
Technical Safeguards and Precautionary Measures
Explanation
Effective child protection requires multiple technical measures working in combination, with secure default settings as the foundation. Age assurance is particularly crucial for tailoring safety measures appropriately, though it presents complex challenges in balancing privacy, functionality, and user-friendliness.
Evidence
Examples of secure default settings (private profiles, restricted contact); age assurance systems mentioned; EU mini wallet and EUDI wallet development; combination of measures including moderation, registration, reporting mechanisms
Major discussion point
Technical implementation of child safety measures
Topics
Cybersecurity | Legal and regulatory
Platform Cooperation and Transparency Challenges
Explanation
There is significant variation in platform cooperation, with smaller platforms often being more willing to engage and comply with safety measures. However, enforcement becomes much more challenging when dealing with platforms outside the EU, and larger platforms often delay action until official guidelines are published.
Evidence
Smaller platforms actively seeking compliance guidance; enforcement difficulties outside EU jurisdiction; waiting for Article 28 guidelines; positive press coverage as incentive
Major discussion point
Challenges in platform cooperation and enforcement
Topics
Legal and regulatory | Economic
Agreed with
– Lea Peters
Agreed on
Platform transparency and stakeholder participation are insufficient
Audience
Speech speed
135 words per minute
Speech length
328 words
Speech time
145 seconds
Implementation Challenges and Solutions
Explanation
Tech companies and governments face conflicts of interest when regulating harmful content because these companies derive significant revenue from such sites and pay taxes to governments. There are also concerns that protective measures might inadvertently restrict access to legitimate education, particularly affecting vulnerable groups.
Evidence
Example from Ethiopia where 6% of telecom revenue came from harmful sites; mention of tax revenue to governments; concerns about educational access restrictions
Major discussion point
Economic and access challenges in regulation
Topics
Economic | Legal and regulatory | Human rights
Agreed with
– Lea Peters
Agreed on
Economic incentives can be effective for promoting child safety
Agreements
Agreement points
Need for collaborative rather than punitive regulatory approaches
Speakers
– Lea Peters
– Michael Terhorst
Arguments
Regulatory Approaches and Enforcement Mechanisms
Platform Cooperation and Transparency Challenges
Summary
Both speakers advocate for working together with platforms through dialogue and collaboration rather than simply imposing bans or punitive measures. They emphasize the importance of creating safe online spaces while respecting children’s rights to participation.
Topics
Legal and regulatory | Human rights
Platform transparency and stakeholder participation are insufficient
Speakers
– Lea Peters
– Michael Terhorst
Arguments
Platform Cooperation and Transparency Challenges
Platform Cooperation and Transparency Challenges
Summary
Both speakers agree that platforms lack sufficient transparency in their operations and that there is inadequate meaningful participation from civil society organizations, academic experts, and affected communities in developing safety measures.
Topics
Legal and regulatory | Human rights
Economic incentives can be effective for promoting child safety
Speakers
– Lea Peters
– Audience
Arguments
Implementation Challenges and Solutions
Implementation Challenges and Solutions
Summary
Both acknowledge that economic arguments and considerations of financial impact can be powerful tools for convincing companies and governments to prioritize child protection measures, though they also recognize the challenges this creates when harmful content generates revenue.
Topics
Economic | Human rights
Similar viewpoints
Both speakers emphasize that protecting children online requires comprehensive, multi-faceted approaches that address the massive scale of online child sexual exploitation through technical safeguards, regulatory measures, and collaborative efforts.
Speakers
– Lea Peters
– Michael Terhorst
Arguments
Scale and Impact of Online Child Sexual Exploitation
Technical Safeguards and Precautionary Measures
Topics
Cybersecurity | Human rights | Legal and regulatory
Both acknowledge the significant challenges in enforcing child safety measures, particularly when dealing with platforms outside the EU and when economic interests conflict with safety objectives.
Speakers
– Michael Terhorst
– Audience
Arguments
Platform Cooperation and Transparency Challenges
Implementation Challenges and Solutions
Topics
Legal and regulatory | Economic
Unexpected consensus
Balancing child protection with educational access and rights
Speakers
– Lea Peters
– Audience
Arguments
Implementation Challenges and Solutions
Implementation Challenges and Solutions
Explanation
Despite coming from different perspectives (regulatory/advocacy vs. digital native), both speakers show unexpected consensus on the importance of ensuring that protective measures don’t inadvertently restrict legitimate educational access or harm vulnerable groups like the LGBTQI+ community. This demonstrates sophisticated understanding of the complexity of child protection.
Topics
Human rights | Legal and regulatory
Recognition of economic barriers to effective regulation
Speakers
– Lea Peters
– Michael Terhorst
– Audience
Arguments
Implementation Challenges and Solutions
Platform Cooperation and Transparency Challenges
Implementation Challenges and Solutions
Explanation
All speakers, despite their different roles (civil society, government regulator, and practitioner), show unexpected consensus in acknowledging and discussing the economic realities that complicate child protection efforts, including revenue dependencies and tax implications.
Topics
Economic | Legal and regulatory
Overall assessment
Summary
The speakers demonstrate strong consensus on the need for collaborative, multi-stakeholder approaches to child protection online, the importance of transparency and meaningful participation, and the recognition that economic factors significantly impact regulatory effectiveness.
Consensus level
High level of consensus with sophisticated understanding of implementation challenges. The agreement spans across different stakeholder perspectives (government, civil society, practitioners) and suggests a mature, nuanced approach to child protection that balances safety with rights and acknowledges real-world constraints. This consensus provides a strong foundation for collaborative policy development and implementation.
Differences
Different viewpoints
Revenue conflicts in platform regulation
Speakers
– Audience
– Michael Terhorst
Arguments
Tech companies and governments face conflicts of interest when regulating harmful content because these companies derive significant revenue from such sites and pay taxes to governments
Rather than implementing social media bans, the focus should be on creating safe online spaces for children while respecting their right to participation
Summary
The audience member highlighted the fundamental economic conflict where tech companies derive significant revenue (up to 6%) from harmful sites and pay taxes to governments, making regulation challenging. Michael Terhorst acknowledged this difficulty but focused on collaborative dialogue and positive press coverage as incentives, without directly addressing the revenue conflict issue.
Topics
Economic | Legal and regulatory
Unexpected differences
Approach to economic incentives in regulation
Speakers
– Audience
– Lea Peters
Arguments
Tech companies and governments face conflicts of interest when regulating harmful content because these companies derive significant revenue from such sites and pay taxes to governments
Economic arguments can be effective in convincing companies and governments to prioritize child safety, as online sexual violence has the same traumatic outcomes as offline violence
Explanation
This disagreement is unexpected because both parties are concerned with child protection, yet they have fundamentally different views on how economic factors should be leveraged. The audience member sees economic interests as obstacles, while Lea Peters sees them as potential tools for advocacy. This represents a strategic disagreement on whether to view economic factors as barriers or opportunities.
Topics
Economic | Human rights
Overall assessment
Summary
The discussion shows relatively low levels of fundamental disagreement, with most participants aligned on the core goal of protecting children online. The main areas of disagreement center around economic incentives and enforcement strategies rather than the underlying principles.
Disagreement level
Low to moderate disagreement level. The disagreements are primarily strategic and methodological rather than philosophical, focusing on how to achieve shared goals rather than questioning the goals themselves. This suggests good potential for collaborative solutions, though the economic conflict issue raised by the audience member represents a significant structural challenge that wasn’t fully resolved in the discussion.
Partial agreements
Takeaways
Key takeaways
Personal integrity development in minors is significantly impacted by digital environments, with over 300 million children globally affected by online sexual abuse and exploitation in the last 12 months
Creating safe online spaces through regulation and cooperation is more effective than blanket social media bans, requiring tailored solutions for different platforms and functionalities
Dialogic regulation (collaborative engagement with platforms) shows promise, with many smaller platforms actively seeking compliance guidance, though larger platforms often wait for official guidelines
Technical safeguards must work in combination – secure default settings, age assurance, content moderation, and reporting mechanisms are all necessary components of effective child protection
Age assurance technology is crucial but complex, requiring balance between privacy, functionality, and user-friendliness, with EU wallet systems planned but initially limited to 18+ verification
Economic arguments about the negative workforce impact of online violence can be effective tools for convincing both private sector and governments to prioritize child protection
Resolutions and action items
EU Commission guidelines for Article 28 DSA to be published in the coming weeks to facilitate easier enforcement
Continued advocacy for EU wallet systems to include younger age bracket verification beyond just 18+ users
KIT to continue three-step analysis process (functionality review, risk assessment, precautionary measure evaluation) for 30-40 ongoing platform cases
Follow-up discussion planned between presenters and audience member from Iraq to share specific regulatory strategies for telecom companies
Continued push from civil society organizations to maintain legal safeguards preventing discrimination against vulnerable groups while implementing protective measures
Unresolved issues
Enforcement challenges for platforms operating outside the EU remain difficult to address effectively
Revenue conflicts where tech companies and governments benefit financially from harmful content sites create ongoing regulatory obstacles
Lack of transparency from platforms regarding algorithmic usage and insufficient meaningful participation from civil society and survivors in development processes
Age assurance technology limitations – current systems cannot effectively distinguish between different minor age brackets (e.g., 13-15 vs 16-17)
Balancing protective measures with access to education and preventing discrimination against vulnerable groups like LGBTQI+ communities remains an ongoing challenge
The ‘Butterfly13’ problem – difficulty determining actual user ages when pseudonyms are used, creating safety risks
Suggested compromises
Dialogic regulation approach that engages platforms collaboratively rather than imposing immediate penalties, allowing providers to propose their own solutions as long as they effectively address identified risks
Graduated default settings based on age brackets rather than one-size-fits-all restrictions, with less strict measures for older teens
Using positive press coverage and public recognition to incentivize platform compliance rather than relying solely on punitive measures
Combining multiple precautionary measures in tailored combinations for different platforms rather than requiring identical solutions across all services
Holistic approach incorporating media literacy education alongside technical safeguards and regulatory measures
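A minimal illustrative sketch of the graduated-defaults compromise follows, written in Python. The age brackets, setting names, and fallback behaviour are hypothetical assumptions added for illustration; the session did not prescribe any specific implementation.

    # Illustrative sketch only: hypothetical age brackets and setting names,
    # not an implementation discussed in the session.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class DefaultSettings:
        profile_private: bool          # profile hidden from non-contacts
        direct_messages_from: str      # "nobody", "contacts", or "everyone"
        discoverable_in_search: bool   # account suggested to strangers
        location_sharing: bool         # precise location attached to posts

    # Stricter defaults for younger users, gradually relaxed for older teens.
    DEFAULTS_BY_AGE_BRACKET = {
        "under_13": DefaultSettings(True, "nobody", False, False),
        "13_15":    DefaultSettings(True, "contacts", False, False),
        "16_17":    DefaultSettings(True, "contacts", True, False),
        "18_plus":  DefaultSettings(False, "everyone", True, True),
    }

    def defaults_for(age_bracket: str) -> DefaultSettings:
        """Return the pre-set defaults for a verified age bracket.

        Unknown or unverified brackets fall back to the strictest settings,
        mirroring the secure-default-settings principle discussed in the session.
        """
        return DEFAULTS_BY_AGE_BRACKET.get(age_bracket, DEFAULTS_BY_AGE_BRACKET["under_13"])

    if __name__ == "__main__":
        print(defaults_for("13_15"))

In this sketch, the platform only needs a verified age bracket (not an exact age) to apply tailored defaults, which is consistent with the age-assurance discussion above while leaving the concrete settings to each provider.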
Thought provoking comments
Personal integrity though is nothing you are just born with or that’s being formed in a vacuum. It is influenced by our family, our social circle, the culture we grew up in and the experiences we make. Digital environments and technology have become an integral part of our lives. It is nowadays a huge factor in the development of personal integrity.
Speaker
Lea Peters
Reason
This comment is insightful because it reframes personal integrity from a static moral concept to a dynamic, socially constructed quality that is actively shaped by digital experiences. It establishes the theoretical foundation for why digital child protection is not just about preventing harm, but about fostering positive character development in the digital age.
Impact
This comment set the conceptual framework for the entire discussion, shifting it from a purely regulatory perspective to one that considers the developmental psychology of children in digital spaces. It provided the philosophical justification for all subsequent policy recommendations.
So our idea is we have something called dialogic regulation. So we don’t just, yeah, send a letter saying, okay, you have to, I don’t know, pay €5 million because your platform isn’t safe. We get in touch with the provider and say, okay, we found some deficits on your platform… And the providers don’t have to take those specific ideas. We present them. They can do something on their own. As long as it works, we’re good.
Speaker
Michael Terhorst
Reason
This introduces a novel regulatory approach that challenges the traditional adversarial model of enforcement. It’s thought-provoking because it suggests that collaborative regulation might be more effective than punitive measures, representing a paradigm shift in how governments can work with tech companies.
Impact
This comment fundamentally changed the discussion from theoretical policy to practical implementation, demonstrating how regulatory innovation can bridge the gap between child protection goals and industry cooperation. It prompted audience questions about the effectiveness of this approach and became a central theme in the Q&A session.
when you have a name like Butterfly13, you don’t know if Butterfly13 is a 13-year-old girl or a 46-year-old man. So that just doesn’t make it easier… what we need is age assurance because with age assurance you can make sure that, yeah, a specific, let’s call it age bracket is given.
Speaker
Michael Terhorst
Reason
This comment brilliantly illustrates the fundamental paradox of online child protection – the very anonymity features designed to protect children can also enable predators. It’s insightful because it shows how traditional privacy protections can conflict with safety measures, requiring innovative solutions.
Impact
This vivid example shifted the discussion toward the technical complexities of age verification and the delicate balance between privacy and safety. It made abstract policy concepts concrete and relatable, helping the audience understand why simple solutions don’t work in digital child protection.
these tech companies, they get their revenue, most of their revenues from these sites. For example, in Ethiopia, like 6% of their revenues from these sites, harmful sites. So it’s one of the greatest challenges for us… So it also affects the government because these tech companies are paying taxes to the government, right?
Speaker
Audience member Michael from Iraq
Reason
This comment is deeply insightful because it exposes the economic incentive structures that perpetuate harmful content online. It reveals how financial dependencies create systemic barriers to child protection that go beyond technical or regulatory solutions, highlighting the intersection of economics, governance, and child safety.
Impact
This comment dramatically expanded the scope of the discussion from technical implementation to systemic economic challenges. It forced the speakers to acknowledge that their European regulatory model may not be universally applicable and prompted a more nuanced discussion about different approaches needed in different economic and political contexts.
if you have capacity to kind of push from both sides or push the private sector, but also the government, it’s always useful to also use studies that show the negative impact of risks like sexualized violence on the overall economy of the country… people who suffer from, for example, PTSD, they often have difficulties in participating 100% later on in their professional life
Speaker
Lea Peters
Reason
This comment is thought-provoking because it reframes child protection from a moral imperative to an economic argument, suggesting that protecting children online is not just ethically right but economically rational. It demonstrates strategic thinking about how to build coalitions for child protection across different stakeholder interests.
Impact
This response showed how advocates can adapt their arguments to different contexts and audiences. It shifted the conversation toward practical advocacy strategies and demonstrated how child protection arguments can be tailored to resonate with economic and political decision-makers who might not be moved by moral arguments alone.
Overall assessment
These key comments shaped the discussion by progressively expanding its scope and complexity. The conversation began with theoretical foundations (personal integrity as socially constructed), moved through innovative regulatory approaches (dialogic regulation), confronted technical paradoxes (anonymity vs. safety), and ultimately grappled with systemic economic and political challenges. The most impactful comments were those that revealed underlying tensions and complexities rather than offering simple solutions – such as the economic incentives that perpetuate harmful content and the privacy-safety paradox in online spaces. The discussion evolved from a presentation of best practices to a more nuanced exploration of how context, economics, and power structures shape the possibilities for child protection online. The audience questions, particularly from the Iraqi participant, were crucial in challenging the speakers to consider the limitations and cultural specificity of their European regulatory model.
Follow-up questions
How effective will the dialogic regulation approach be once EU Commission guidelines are published and implemented?
Speaker
Michael Terhorst
Explanation
Terhorst noted that in two months he would be able to say much more about whether the approach works well, indicating the need for a follow-up assessment once the guidelines have been implemented
How can telecom companies be regulated when they derive significant revenue from harmful sites, especially in developing countries?
Speaker
Audience member Michael from Iraq
Explanation
This represents a complex challenge where economic interests of telecom companies and government tax revenue conflict with child protection goals, requiring further research on regulatory approaches
What are effective strategies for regulating tech companies in different legal systems outside the EU?
Speaker
Audience member Michael from Iraq
Explanation
The speakers acknowledged this is challenging and suggested further discussion, indicating need for research on cross-jurisdictional regulatory approaches
How can age assurance mechanisms for specific age brackets (13-15, 16-17) be effectively implemented in the EU Digital Identity Wallet?
Speaker
Michael Terhorst
Explanation
Current systems only verify 18+ status, but tailored safety measures require more granular age verification, which is still under development
How can transparency from platforms regarding algorithmic usage be improved and civil society participation be meaningfully included?
Speaker
Lea Peters
Explanation
This was identified as a gap in current DSA guidelines that needs further development to ensure proper oversight and accountability
What economic impact studies exist showing how online sexualized violence affects national economies?
Speaker
Lea Peters
Explanation
Peters suggested this as a potential advocacy tool but implied more research is needed to quantify these economic impacts for policy arguments
How can protective measures be designed to prevent restricting access to legitimate education, particularly for vulnerable groups?
Speaker
Audience member (digital native)
Explanation
This highlights the need for ongoing research into balancing child protection with educational access and the rights of marginalized communities
Disclaimer: This is not an official session record. DiploAI generates these resources from audiovisual recordings, and they are presented as-is, including potential errors. Due to logistical challenges, such as discrepancies in audio/video or transcripts, names may be misspelled. We strive for accuracy to the best of our ability.