High Level Session 4: Securing Child Safety in the Age of the Algorithms

26 Jun 2025 09:30h - 11:00h


Session at a glance

Summary

This discussion at the 20th Internet Governance Forum focused on ensuring child security in the age of algorithms, examining how algorithmic systems impact children’s digital experiences and well-being. The panel, moderated by Shivanee Thapa and featuring government ministers, tech company representatives, and child rights advocates, addressed the urgent need to protect children from algorithmic harms while preserving the benefits of digital technology.


Leanda Barrington-Leach from the Five Rights Foundation opened by highlighting alarming statistics: half of surveyed children feel addicted to the internet, nearly two-thirds feel unsafe online, and three-quarters encounter disturbing content. She emphasized that children’s negative digital experiences result from design priorities focused on maximizing engagement, time spent, and revenue rather than child welfare. The discussion revealed that algorithms can lead children from innocent searches to harmful content in just a few clicks, with platforms deliberately targeting children as a “golden demographic.”


Government representatives stressed the need for stronger regulation and enforcement. Norway’s Minister Karianne Tung announced initiatives including age limits for social media and banning phones in classrooms, while Sierra Leone’s Minister Salima Bah raised concerns about cultural erasure through algorithms trained on non-representative datasets. The European Commission’s Thibaut Kleiner highlighted the Digital Services Act as an example of effective regulation that holds platforms accountable.


Tech company representatives from TikTok and Roblox described their safety-by-design approaches, including private accounts by default for younger users, content filtering, and youth councils for feedback. However, advocates emphasized that self-regulation has failed and meaningful change requires enforceable standards. The panel concluded that protecting children online requires coordinated action across governments, platforms, civil society, and international organizations, with children’s voices central to designing systems that serve their best interests rather than commercial imperatives.


Key points

## Major Discussion Points:


– **Algorithmic Risks to Children**: The panel extensively discussed how algorithms designed for engagement and revenue maximization expose children to harmful content, addiction patterns, mental health issues, and manipulative design features like infinite scroll, autoplay, and targeted recommendations that can lead children from innocent searches to harmful content in just a few clicks.


– **Child-Centric Design and Rights**: Participants emphasized the need to redesign digital platforms with children’s rights at the core, implementing safety-by-design principles, privacy protections by default, and meaningful inclusion of children’s voices in platform development rather than treating child safety as an afterthought.


– **Regulatory Frameworks and Enforcement**: The discussion highlighted various regulatory approaches, particularly the EU’s Digital Services Act, age-appropriate design codes, and the need for robust age verification systems, with emphasis on the importance of enforcement rather than just policy creation.


– **Multi-Stakeholder Collaboration**: All panelists stressed that protecting children online requires coordinated efforts between governments, tech companies, international organizations, civil society, and families, with no single actor able to solve the problem alone.


– **Global Equity and Cultural Considerations**: The conversation addressed how algorithmic systems often reflect biases and may contribute to cultural erasure, particularly affecting children in the Global South who may not see their cultures and values represented in algorithm-curated content.


## Overall Purpose:


The discussion aimed to address child safety in the digital age, specifically focusing on how algorithmic systems impact children’s online experiences. The session sought to identify emerging risks, reimagine child-centric digital ecosystems, and explore accountability mechanisms across governments, platforms, and communities to better protect children online.


## Overall Tone:


The discussion maintained a serious, urgent, and collaborative tone throughout. It began with a stark presentation of the risks and harms children face online, establishing a sense of moral imperative. The tone remained constructive and solution-oriented as panelists shared their perspectives and initiatives, with participants demonstrating mutual respect and shared commitment to child protection. The conversation concluded on an inspirational note, with Minister Tung’s call to “stand on the children’s side,” reinforcing the collective responsibility and moral dimension of the issue. Despite addressing challenging topics, the overall atmosphere was one of determined optimism about the possibility of creating safer digital spaces for children through coordinated action.


Speakers

– **Shivanee Thapa** – Senior News Editor, Nepal Television; Moderator of the session


– **Karianne Tung** – Honorable Minister of Digitalization and Public Governance of Norway


– **Salima Bah** – Minister of Science, Technology and Innovation, Sierra Leone


– **Leanda Barrington-Leach** – Executive Director, Five Rights Foundation


– **Christine Grahn** – Head of Government Relations and Public Policy, TikTok Europe


– **Thibaut Kleiner** – Director for Future Networks, DG Connect, European Commission


– **Emily Yu** – Policy Senior Director, Policy Development, Roblox


– **Thomas Davin** – Director of Innovation, UNICEF


Additional speakers:


None identified beyond the provided speaker names list.


Full session report

# Ensuring Child Security in the Age of Algorithms: A Report from the 20th Internet Governance Forum


## Executive Summary


The 20th Internet Governance Forum hosted a panel discussion on ensuring child security in the age of algorithms, moderated by Shivanee Thapa, Senior News Editor at Nepal Television. The panel brought together government ministers, technology company representatives, child rights advocates, and international organisation leaders to address the urgent need to protect children from algorithmic harms. The discussion highlighted alarming statistics about children’s digital experiences and emphasized the need to move beyond individual responsibility toward platform accountability and regulatory intervention.


## Opening Context and Problem Definition


The session began with Leanda Barrington-Leach from the Five Rights Foundation sharing a powerful question from a child: “Why won’t adults stand up for children? You watch everything we do online, you nag us to get off our devices, even though you stay firmly glued to yours, and now you just want to outright ban us. When are you going to stop making out that we are the problem instead of the system? Why don’t you stand up for us?”


Barrington-Leach presented concerning evidence of the current crisis: half of surveyed children feel addicted to the internet, nearly two-thirds feel unsafe online, and three-quarters encounter disturbing content. She provided concrete examples of algorithmic harm, explaining how children can progress from innocent searches to harmful content in just a few clicks: from slime to pornography in a single click, from trampolining to pro-anorexia content in three clicks, and to nudges toward self-harm within just 15 clicks. She argued that these outcomes represent “a feature, not a bug, of the system.”


## Current Risks and Systemic Challenges


The panel identified multiple dimensions of harm that current algorithmic systems inflict upon children. Beyond immediate safety concerns, speakers highlighted how algorithms designed for engagement and revenue maximisation expose children to addiction patterns, mental health issues, sleep deprivation, and social isolation. The design features that drive these harms include infinite scroll, autoplay, and targeted recommendations that prioritise negative or extreme content.


Thomas Davin from UNICEF framed these issues as a public health crisis, comparable to tobacco or alcohol regulation. He expressed particular concern about neuroplasticity impacts from screen time affecting children’s brain development. Davin also highlighted a societal risk: “We are at risk of losing the notion of the concept of truth. As those algorithms bring those children into more and more of things they believe to be true, they are more and more certain of that truth and of holding that truth, and they are more and more reluctant to actually connect or open up to others who may say, well that is not my truth.”


Salima Bah, Minister of Science, Technology and Innovation for Sierra Leone, introduced concerns about cultural erasure, explaining that algorithms trained on datasets that don’t reflect African diversity and values pose risks to cultural identity: “One of the most concerning things for us as well is the potential for cultural erasure with algorithm recommendations because we understand that these algorithms are trained on data sets that potentially don’t reflect our diversity or the diversity of our societies or our realities.”


## Challenging the “Digital Native” Myth


Thibaut Kleiner from the European Commission’s DG Connect challenged the “digital native” assumption: “I think we should stop using the term digital natives when we speak about children, because sometimes, you know, you get this idea that you can leave the children with the technology and they are very savvy and they can get their way out, and we don’t understand the technology so well, whereas they do. Actually, the studies we conduct in the EU show that there is a very superficial understanding of the technology among children.”


This reframed children as vulnerable rather than naturally tech-savvy, requiring protection rather than independence. Barrington-Leach built upon this by challenging common “solutions” that shift responsibility away from platforms: “Don’t please put the burden back on parents via parental controls. For example they’re not working, we know they’re not working or on to children we heard they are not digital natives, digital literacy is not a silver bullet.”


## Government Regulatory Approaches


Government representatives outlined various regulatory approaches being implemented globally. Karianne Tung, Norway’s Minister of Digitalisation and Public Governance, announced significant initiatives including a 15-year age limit for social media platforms and banning phones in classrooms. She emphasised three essential principles: age verification, age-appropriate design, and banning behavioural advertising for children.


Kleiner highlighted the European Union’s Digital Services Act as an example of effective regulation that provides platforms with clear responsibilities for child protection backed by enforcement mechanisms. He noted concrete successes, such as TikTok’s withdrawal of an addictive TikTok Lite feature following regulatory intervention.


Bah described Sierra Leone’s proactive approach, working with local startups on safety-by-design principles while developing comprehensive online safety legislation specifically focused on children. She noted that 15% of Sierra Leone’s internet traffic flows through TikTok, highlighting the platform’s massive impact and the need for safety collaboration.


## Platform Responses and Industry Perspectives


Technology company representatives described their safety approaches. Christine Grahn from TikTok Europe outlined several protective measures: private accounts by default for users under 16, no direct messages until age 16, screen time limits, and content filtering systems. She also highlighted TikTok’s global youth council with representatives from 15 countries providing input to senior leadership.


Emily Yu from Roblox described their “Trust by Design” programme, which integrates children’s rights into product requirements from the start. She explained their approach of focusing on discoverability rather than personalised content limitation, combined with parental controls and content labelling systems. Roblox also operates a teen council giving teenagers worldwide opportunities to provide feedback on policies and platform direction.


However, these industry responses faced scrutiny. Barrington-Leach argued that current platform business models remain fundamentally incompatible with child safety, as they prioritise revenue through maximising time spent, reach, and activity rather than child welfare.


## The Need for Meaningful Child Participation


A significant theme was the need for meaningful child participation in digital governance decisions. Barrington-Leach observed: “Children need to be at the policy table. There are no children in this room today and generally they are completely left out.”


Platform representatives highlighted their youth engagement mechanisms, with both TikTok and Roblox describing formal youth councils. However, advocates emphasised that true participation requires more than consultation exercises. Davin stressed that “children’s voices must lead to visible action and change, not just consultation exercises.”


## Global South Perspectives


Minister Bah’s contributions highlighted how Global South perspectives reveal overlooked dimensions of digital harm. Beyond cultural erasure concerns, she emphasised that digital platforms serve as public goods bridging digital divides and exposing young people to opportunities beyond their immediate realities. This created a complex balance between recognising platforms’ positive potential and ensuring safety protections.


## Technical Standards and Innovation


The discussion addressed the need for technical solutions and standards to support child protection. Kleiner emphasised the need for robust age verification mechanisms and technical standards providing practical guidelines for innovators. He noted that the EU plans to introduce a robust age verification app and publish child protection guidelines.


Davin highlighted UNICEF’s digital child rights impact assessment, which provides participatory tools for companies to understand their impacts on children. He also emphasised the need for transparency in algorithmic operations.


Barrington-Leach made a crucial point about the engineered nature of digital systems: “The digital world is 100 per cent human-engineered. It can be optimised for good just as easily as it can for bad.”


## Areas of Consensus and Disagreement


Despite representing different sectors, panelists demonstrated consensus on several key points: the need for children’s meaningful participation, safety by design principles, multi-stakeholder collaboration, acknowledgment of current system failures, and the effectiveness of well-designed regulation.


However, significant disagreements emerged around the effectiveness of parental controls, the compatibility of current business models with child safety, and what constitutes meaningful participation versus tokenistic consultation.


## Future Directions


The discussion identified several areas requiring further investigation: neuroplasticity impacts from screen time, economic costs of inaction, technical challenges around age verification, and cultural representation in algorithmic systems.


Concrete commitments emerged: Norway will continue implementing its age limits and classroom phone bans, the EU will publish child protection guidelines, Sierra Leone will finalise its online safety legislation, and technology companies committed to continuing development of youth engagement mechanisms.


## Conclusion


The discussion concluded with Minister Tung’s call to “stand on the children’s side,” reinforcing the collective responsibility for child protection online. The panel demonstrated that while significant challenges remain, there is growing recognition among stakeholders about the need for action.


The session’s emphasis on treating digital harm as a public health issue, combined with recognition of children’s rights to meaningful participation, provides a framework for moving forward. Barrington-Leach’s observation that digital systems are 100 per cent human-engineered and can be optimised for good just as easily as for bad underscored that current harms result from design choices rather than inevitable technological outcomes, suggesting that change is both necessary and possible.


Session transcript

Shivanee Thapa: Please welcome to the stage, the moderator, Ms. Shivanee Thapa, Senior News Editor, Nepal Television. A very, very good morning to all the distinguished dignitaries, colleagues, participants present here. Thank you. Thank you so, so, so much for making it here this morning, despite the musical blast we’ve had last night. Thanks to the Norwegian government and the entire people who could put up all those good things together and for this very, very wonderful management all through the proceedings herein. With that said, I’m Shivanee Thapa, ladies and gentlemen. I am a senior journalist working with the state media in Nepal Television and feeling so, so profoundly privileged to be a part of this very important session at the 20th IGF here in Lillestrøm. To our online participants as well, a very warm welcome. Just a quick mention that we are at this session discussing one of the most pertinent issues of our times, an issue which no longer is an emerging risk. It is urgent, it is complex, and it is deeply, deeply personal. That is ensuring child security in the age of algorithms, ladies and gentlemen. And we all know algorithms are not passive tools. We know that quite well, right? They are in fact very, very active architects of children’s digital experiences, shaping not only what children see, but what we all see, how long we stay on our screens and, increasingly, how we feel, and today, as we discuss child security and safety, what these children see. As a practicing journalist, I’m very much committed to, you know, covering issues of public interest, and as a mother of a teen, my 14-year-old Vivaan is seated somewhere amongst you, and both professionally and personally, I see so clearly that what’s at stake at the moment is not just screen time. It is childhood itself. Ladies and gentlemen, therefore, this session today that we’ve tailored for you will unfold in three key arcs. We’ll hear from experts on the emerging risks first, linked to algorithmic curation. We’ll then reimagine what a child-centric digital ecosystem could look like from our perspectives or their expert perspectives, and finally we will make an attempt to explore what accountability must mean in this particular space from governments, from platforms, and from all of us as a community of practice. I will be joined in this conversation by very, very distinguished personalities and stalwarts who have been at the reins of getting things rolling in that particular domain. So ladies and gentlemen, at this panel today, I’ll be joined in by the Honorable Minister of Digitalization and Public Governance of Norway, Ms. Karianne Tung, the Minister of STI Sierra Leone, Ms. Salima Bah, the Executive Director, Five Rights Foundation, Ms. Leanda Barrington-Leach, Head of Government Relations and Public Policy, TikTok Europe, Ms. Christine Grahn, Director for Future Networks, DG Connect European Commission, Mr. Thibaut Kleiner, Policy Senior Director, Policy Development, Roblox, Ms. Emily Yu, and last but not the least, Director of Innovation, UNICEF, Mr. Thomas Davin. Ladies and gentlemen, let’s give a round of applause as I welcome on the stage my very distinguished members of the panel. Such a pleasure it is, Madam Minister, warm welcome, a very warm welcome. Please take your seats. Such a privilege it is for me to be placed across you at this, I mean, as we discuss a very, very pertinent question of our times and, of course, for the future.
And with our distinguished panelists joining us here, all set, all poised to engage in one of, as I said, one of the most defining challenges of our times, ladies and gentlemen, first and foremost, to anchor this session with clarity and purpose, I have the pleasure to invite Ms. Leanda Barrington-Leach, the Executive Director of the Five Rights Foundation, to open the conversation, to get it rolling, in fact, through this opening presentation. You have five minutes.


Leanda Barrington-Leach: Good morning. At Five Rights Foundation, we speak on behalf of children around the world. On behalf of children, thank you for inviting us and thank you for being here. Speaking of the digital world, a child not long ago asked me, why won’t adults stand up for children? You watch everything we do online, you nag us to get off our devices, even though you stay firmly glued to yours, and now you just want to outright ban us. When are you going to stop making out that we are the problem instead of the system? Why don’t you stand up for us? She had a point. Digital devices are today one of the leading causes of family disputes. Google’s head of families recently said that parents are spending upwards of 4 to 12 hours a week trying to manage their children’s online usage. Day after day, they are fighting and losing. Children also are fighting and losing. They are losing their control, their sleep, their ability to make connections, to pay attention and to think critically. They are losing their health, sometimes even their lives. I am not exaggerating. I am not generalizing. Consistently, around half of children surveyed say they feel addicted to the internet. Nearly two-thirds say they often or sometimes feel unsafe online. More than three-quarters say they encounter content they find disturbing, sexual content, violence, hate. A quarter to a third are bullied online. Half experience sexual harms, a quarter sextortion. Rates of ADHD, depression, eating disorders, child sexual abuse, and suicide are going through the roof. The acceleration of AI is now set to supercharge these risks and harms. But it does not have to be this way. Children’s digital experience is not a result of the technology itself, but it does reflect the priorities of those who own, build, and deploy it, including AI. To change children’s experience, those priorities must change. Today, most of the services where children spend most of their time are designed with three primary purposes, all geared towards revenue generation. Maximize time spent, maximize reach, and maximize activity. Typical design features used to reach these objectives include push notifications, infinite scrolls, autoplay, likes, in-game purchases, random rewards, making it easy to share and easy to connect with friend or follower suggestions. Five Rights’ pathways research showed that social media accounts registered as children were all subject to messaging from strangers and illegal or harmful content within hours of being set up. While these accounts were targeted with advertisements for games, sweets, and such like specifically for kids, they were at the same time also recommended harmful content from sexualized to pro-suicide material by algorithms that weight negative or extreme content five times higher than neutral or positive content. Children can go from a simple search for slime to porn in just a single click, or from trampolining to pro-anorexia in just three clicks, and be nudged to self-harm in 15 clicks. It is clear that the problem is a feature, not a bug, of the system. Indeed, whistleblower reports and leaked internal documents show how time and again tech companies are aware of the harm they are causing children and choosing to do it anyway. One platform that I shall not name as it is no particular outlier sees children as the golden demographic and looks to hook them young. 
Digital research notes that it takes the equivalent of 35 minutes on the app to form a habit, and concludes, I quote, compulsive usage correlates with a slew of negative mental health effects like the loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, and increased anxiety, as well as interfering with essential personal responsibilities like sufficient sleep, work or school responsibilities, and connecting with loved ones. It does not need to be this way. The digital world is 100 per cent human-engineered. It can be optimised for good just as easily as it can for bad. It should be optimised for children by design and default. Children’s rights as set out in the most ratified human rights treaty must be respected, including by the tech sector. Children have the right to safety, to privacy, to freedom from commercial exploitation, to play, to participation, and meaningful agency. The best interests of the child must come first. Tech exceptionalism has to end, and like all industries, the sector must abide by established laws, reflect societal norms, and be held accountable for its impact by democratic oversight. How to do this is already well established. The General Comment 25 to the Convention sets out what states and businesses need to do. The age-appropriate design code principles embedded in law from the UK to Indonesia, the EU to California, set out enforceable regulatory requirements. Technical standards and protocols provide detailed practical guidelines for innovators. Rights-respecting responsible innovation for children is perfectly feasible, but it means forfeiting some awfully profitable practices. Resistance is, therefore, unsurprisingly fierce. Perhaps you have witnessed some of it on this stage over the last few days. The arguments range from scoffing at moral panic to claiming that AI moves too fast to be regulated. The tactics range from never-ending legal battles to threats to shut down critical services. The unavoidable question, therefore, is who will set the rules and decide what good looks like? Will we continue to allow private, for-profit tech corporations to hold increasing power over all areas of public and private life, unanswerable to governments and citizens? Children are asking you to stand up for them, for their rights and a better, fairer future. So I ask you today. Innovators, will you stand up for children? Policymakers and regulators, will you stand up for children? Citizens, fellow human beings, will you stand up for children? If we stand together, with and for children, we can and will build the digital world they deserve. Thank you.


Shivanee Thapa: Thank you. Thank you so much, Leanda. Thank you for grounding us so, so powerfully. And with this, as we believe, the tone for the conversation has been set so, so right. Thank you. Let us now open the conversation to the full panel. And we’ve just heard from Leanda, from this very opening presentation, how algorithmic curation designed to personalize sometimes can unintentionally shape how children think, feel and interact. Now, based on the current evidence and lived realities, what sort of emerging risks should draw our collective attention, from regulators, tech platforms and families alike, is, I believe, the very first question we want to contemplate during this session. I’d want to begin with Madam Minister. Madam Minister, from Norway’s policy vantage point, what do you see as the most pressing risks children face in algorithm-governed digital spaces today?


Karianne Tung: Thank you, moderator. And I also would like to take the opportunity to say thank you to Leanda for bringing the children’s voice to the stage. That is really important. And thank you also for addressing this really important question and one of the most pressing issues of our time. That’s child safety in the age of the algorithms. As technology continues to shape children’s lives, from the videos they watch to the games they play and the information they consume, we must really ask ourselves now, are we doing enough to protect them? And I don’t think we are doing enough because algorithms, they are powerful tools for personalization and engagement, but they do also expose children to harmful content, to bias and to manipulation. They can shape behavior, they can influence choices and they do serious damage when it comes to mental issues, body issues, as also mentioned by Leanda here. So just let me be clear on one thing. Protecting children online is not about limiting their freedom. It is about empowering them to navigate the digital world safely, confidently and with dignity. And it is about ensuring that technology serves their personal growth and not the other way around. So in my opinion, the platforms need to take more responsibility for taking down content that is damaging and prohibited and they also need to secure appropriate age verification and that’s something we are working on in Norway right now so that we can protect our children better.


Shivanee Thapa: Well, thank you, Madam Minister. May I turn to Minister Bah, from Sierra Leone’s context, Madam Minister, and the broader global reflections as well which have been reverberating in many other forums under this 20th IGF. What risks concern you most as we consider the intersection of technology, childhood and regulation?


Salima Bah: Thank you, thank you so much. And may I use this opportunity again to thank the Government of Norway for hosting us. They have been fantastic hosts, and the IGF for putting on this session, and also to all the panelists. Really from the Government of Sierra Leone’s perspective, I think it’s such a critical area when we think about the engagement of children online and how the algorithm recommendations are set and the impact that they have. What’s been mentioned across the board is also of great concern to us when we think about addictive algorithms and the potential linkages we’ll see to self-harm and where that leads to with young people. When we think about manipulative recommendations and also, we think children get desensitized because as you’re scrolling through one minute, you see violence. The next minute, you’re exposed to sexual content. So the repetition of that and the almost normalization of that, where as a government from our region, we’re concerned about how sometimes those are linked to the increase in violence that we see with young people as well and definitely concerned about how we protect against those. But I would say maybe from our part, maybe it’s slightly unique but maybe it applies as well across other regions as well. One of the most concerning things for us as well is the potential for cultural erasure with algorithm recommendations because we understand that these algorithms are trained on data sets that potentially don’t reflect our diversity or the diversity of our societies or our realities. And that means children in Africa from our regions are growing up in an environment or through these where they’re exposed to languages, to expression and just identity that don’t reflect them. We’re not saying you don’t want them to see others but it’s the fact that they might not be seeing something that reflects our culture and our societies and our identity and our values as well. So with that, there’s a potential for cultural erasure and then just the adoption of cultures from elsewhere and sometimes the under-representation from our regions as well. So as a society from our region, that’s something we’re also really concerned about is how do we, as much as we know the internet and social media is good for exposing young people and to different other societies, but we also want to make sure that they can also see their own society reflected on these platforms as well.


Shivanee Thapa: Right. Let me quickly move to Ms. Christine Grahn. With your role at TikTok, how does the platform interpret and respond to the evolving risks posed by algorithmic systems for young users?


Christine Grahn: Thank you, first of all, to Minister Tung and the government of Norway for hosting us and to IGF for inviting us to this very, very important conversation. We as a platform think it’s really crucial that we show up and that we engage in these conversations. And if you will allow me, I would like to just take one step back and mention that TikTok is a place that people come to because it’s fun, because you can figure out what book to read over summer, where you can plan your next trip or where you can discover new artists. And also a place where our community come to express themselves and can be creative. And this really wouldn’t happen if it weren’t also a safe and inclusive place. So all of the work that we do in terms of keeping the platform safe, we do for our community and to be able to meet our mission, which is to inspire creativity and bring joy. And coming to your question, I mean, I think we all agree that this is a highly complex area. It’s very, very fast evolving. It’s difficult to be a parent. Adolescence can be hard. Every child is different. And I think the only way we can address this is with kind of a fact-based and constructive debate. And to address the risk properly, we need to define what the risks are, and we do that mainly via research. And research does support, to the Minister’s point, that there are a lot of very positive aspects with spending time online also for young people. Also, there is research that confirms that those who are at the biggest risks are ones that are already exposed to risks in their everyday life, so if you belong to the most vulnerable part of the community or society, you’re also more exposed to risk online. So platforms like ours, we need to, I mean we have a huge responsibility in general to keep our platform, our community safe, but in particular when it comes to these most vulnerable groups. And we do this in many different ways. We do it by design, so when you create a TikTok account as a 13-year-old, you will have a kind of phased introduction to the platform. So we don’t give access to direct messages, for example, until you reach the age of 16. When you’re under 16, your account is set to private by default, and we have screen time limits in place for everyone who’s under 18. We have content levels to limit exposure to content that might not be suitable for younger audiences, and we interrupt disruptive patterns. But I also think that partnerships are key here. We have a huge network around the world, and I really, really agree with the Minister on the importance of being a global platform for our global community, where we can also support civil society organizations to be able to reach these vulnerable communities on our platform and make sure that they speak in a voice that this community truly can integrate. And we also support the teens, so when they are in need of help, it should be frictionless and easy to reach out to experts to get that support that they might be in need of. There’s always more to do, and as I said, we’re showing up, we’re here to always do better, but I think an important starting point is that we keep the conversation truly fact-based and work to find solutions together.


Shivanee Thapa: Right, thank you. Let me turn to Mr. Kleiner. What would be your perspectives from a regulatory and policy perspective at the EU level?


Thibaut Kleiner: So, thank you for organizing this very important panel, and I think also, yes, hearing the voice of children I think is very important here, and I would say that maybe we should stop using the term digital natives when we speak about children, because sometimes, you know, you get this idea that you can leave the children with the technology and they are very savvy and they can, you know, get their way out, and we don’t understand the technology so well, whereas they do. Actually, the studies we conduct in the EU show that there is a very superficial understanding of the technology among children. They can use the apps, but they don’t understand what is underneath. They can be tricked, and what we see is really that it is not something you can totally brush aside. You know, we have to take responsibility, and I think that in the EU, this is precisely what we are trying to do. We recognize the benefits and we try to empower children, but we want to make sure that the technology providers take their responsibility, and not just leave this as a problem for society and for parents, because if you look at algorithms, they are doing serious harm, and we have introduced in the EU the Digital Services Act, so it’s hard regulation that precisely aims to give this responsibility to platforms in partnership, because we want them to precisely develop also better ways to know what age the children have online. You know, this is something that today you can infer, but actually we want to do more. We want to have age verification through a very serious mechanism, and we will introduce actually an app in the coming months so that you can really verify the age of minors online. Secondly, we are really taking this very seriously, and we don’t hesitate to actually open proceedings when we find that the platforms are not delivering on their responsibilities under the Digital Services Act. We have already eight open cases, four with adult content platforms that are not sufficiently protecting children, but also with the likes of Meta, of TikTok and Facebook, and I think this is important, because we don’t want just to have rules that are not implemented. So at the end of the day, I think that today is an important moment, because we have the opportunity to keep the good parts of what is on offer, but to also collectively act on the risks, and in that context we are launching now a very important effort on guidelines, also for the platforms. The consultation just closed, it will be published hopefully by summer, as well as an inquiry to precisely identify these risks and make sure that we take measures to address them.


Shivanee Thapa: Right, I think this is the right moment for me to turn to Ms Emily Yu. Roblox, with its immersive and very, very interactive environment, how do you assess the unique vulnerabilities that algorithms might introduce for children on your platform?


Emily Yu: So safety is at the heart of everything we pretty much do at Roblox, and we evaluate every upcoming product feature with a safety by design perspective, making safety a fundamental requirement for everything that we do. We’ve, for example, launched last November a set of pretty robust parental controls that include screen time limitations that parents can set. We’ve also introduced content ratings or content labeling within our systems so that parents have awareness as to what an experience holds, and they can obviously permit or not permit their child to enter that experience. We have parental control permissions in place so that if a child decides, I’m interested in participating in this experience, they receive direct permission from their parents in doing so. And with regards to algorithms used for recommendation systems on the platform, our focus there is more on discoverability rather than limiting the content that is seen by the child based on personalization. There are millions of experiences on the platform, and what we prefer to surface are higher quality experiences and newer emerging experiences that our audience may be interested in.


Shivanee Thapa: Thank you. Mr Thomas Davin, UNICEF has a global overview of both systemic and child-specific impacts. What does the current research suggest we should be most alert to?


Thomas Davin: I’m afraid Leanda stole my thunder. She covered so perfectly the risks, and I want to maybe start by saying from a UNICEF lens, we do look at technology and AI also very positively. There’s so much we can do on learning outcomes, on health outcomes, on climate outcomes, and we’re very much bullish about trying to understand how we leverage that technology for good. But there are a number of risks, and maybe I’ll touch on what Leanda touched on to maybe re-emphasize some of the longer issues that are in our view really societal issues that some of the technologies are currently underpinning today. So one is mental health, clearly, with so many areas of mental health around nutrition, around self-harm, around so many of these issues. Addiction as a part of that significant growth, explosive growth of addiction. I would say in a somewhat genderized manner, you have boys that largely go into gaming addiction, and girls tend to do more social media addiction. So it’s not exactly the same resolution mechanism either. The other one connected to that is social isolation, with again significant potential societal costs and consequences to that. The third one I would mention is maybe a little bit philosophical, but is we are at risk of losing the notion of the concept of truth. As those algorithms bring those children into more and more of things they believe to be true, they are more and more certain of that truth and of holding that truth, and they are more and more reluctant to actually connect or open up to others who may say, well that is not my truth. And so we are to some extent jeopardizing that whole concept of what is truth, what is fact, because everybody including adults are being fed by what really we are connected to or resonating with. Maybe one issue that Leanda maybe didn’t touch on but will be interesting and important for us as society to really dig deep into is the impact on neuroplasticity. And so what is going to be the impact of that screen time and the fast-paced connection on really children’s brains and their ability in different ways. And we don’t really fully understand that yet, and we believe it should be a priority in terms of research and longitudinal studies to understand maybe there are some gains we are not certain, but there’s probably also some things we are losing that also plays out in adults. I don’t know about you, but I read less and less because my brain is actually less and less patient about longer-term focus, and that certainly is true for my own children. I have two teenagers. The last bit I would talk about is maybe agency, that we have a risk of really children feeling less and less able to have voice and agency on how those technologies affect them, impact them, and maybe direct some of what they have access to or what they can say. So overall I think we need to treat this really as a public health issue. This is an area that we don’t fully understand, this is an area that will have societal consequences and very likely significant economic costs if we don’t manage it appropriately.


Shivanee Thapa: Thank you, thank you so much. So what we are hearing across sectors represented in this panel is of course a lot of sense of optimism and commitment in your actions and in your thoughts, and also a very prominent shared concern that opaque algorithmic systems are influencing certainly children’s mental health, social well-being, development, and you know so many aspects in ways that certainly demand more coordinated attention and action. We believe that is what we heard in this round of conversation. Now let us pivot to identifying the risks, I mean like from identifying the risks to imagining solutions. So if we were to re-engineer or if we were to redesign the social media environment with children’s well-being at its core, centrally placed and not as an afterthought but as a foundational principle, what would that look like? More importantly, how can we ensure that young people are not just consulted but meaningfully involved in shaping systems that govern their digital lives? May I begin from Ms. Leanda?


Leanda Barrington-Leach: Thank you so much. Yes, the first thing I would say is the answer to your second question, how can we ensure that young people are meaningfully included? Children need to be at the policy table. There are no children in this room today and generally they are completely left out. I’d also say that I am talking about children and we often replace children with young people and often that can be like 25 to 35 year olds. Of course it’s useful to hear their voice too but that doesn’t mean that we can ignore the youngest. The second question about you know how can we ensure that we put, what are the right principles to put at the centre? I think I mentioned a few and so have others, so the first is really privacy and safety by design and default. What that means a lot of the time is turning things off, making sure that children’s experience you know stays private and they have real agency and choice. A lot of this is about process. I left at the entrance some copies of the children and AI design code. What you’ll see is it’s a very technical protocol, a step-by-step guide to asking the right questions, consulting the right experts and then testing and documenting who decided based on what criteria. A lot of the times asking the right questions is the most important thing rather than other people outside prescribing potentially what is the right answer and this is a process which leads to risk assessment and risk mitigation as I say by design and default. A few just guiding principles, don’t shut children out and don’t please put the burden back on parents via parental controls. For example they’re not working, we know they’re not working or on to children we heard they are not digital natives, digital literacy is not a silver bullet. So age verification, parental controls, controlling content and digital literacy these are not the solution.


Shivanee Thapa: Let me turn to Christine. Christine, given, you know, TikTok’s massive engagement among youth audiences, the question of embedding child-centric values into your algorithmic infrastructure certainly is not theoretical, it’s operational. So what would be your inputs to this question or this thought of reimagining?


Christine Grahn: Well we very much agree with the concept of safety by design and to Leanda’s point turning things off as a default for the younger kind of segments of users. So I mentioned a few examples as an answer to your previous question around not allowing access to direct messages, keeping the younger teens’ accounts private by default and having these settings off as a starting point and then kind of over time as they grow older introduce them to more and more features on the platform. So we very much agree with that kind of base concept and we also agree with the importance of listening to the community and also the younger users on our platform and actually last year I think as a first platform we introduced a global youth council. We have representatives from 15 countries around the world, Brazil, Nigeria, Poland, Indonesia just to mention a few and it’s a forum where they can in a setting kind of created for them share their views with us directly but also and maybe you know even more importantly here in this conversation it’s a way for our most senior leaders to also hear from the youth community directly. So at our first global youth council, our CEO was present. I’m going to be present at one that we have in a couple of weeks and I think this is a really really important platform for that conversation. We also work with researchers to kind of indirectly hear from younger people via research and we don’t just listen we also change and I think that’s an important kind of next step because otherwise there’s no point in these listening exercises right. To just give one concrete example we worked with a British NGO Internet Matters that spoke to both teens and parents that told us that authenticity and belonging are very very kind of core parts of their online experience and in response we made some global changes to age restrict certain beauty filters that would alter their appearances so that they could feel that authenticity on the platform. So that’s a kind of a very concrete example of what we do with what it is that we hear. We also find other ways to listen to our community. We know that teens come to TikTok to learn and we want to obviously encourage that. We actually see reading and polling go up in the younger segments of society so we have huge projects around BookTok to really really encourage that. We also make product choices, conscious product choices to capture this educational interest so we have rolled out a STEM feed which captures science, technology, math and engineering content, fact-checked so it’s kind of pre-vetted, and it’s on by default for everyone who’s under 18 so it’s sitting next to your For You feed and actually we see on the numbers that this is very appreciated so we really really do and we have every interest if you think of it to listen to the community and really adapt.


Shivanee Thapa: Right, that’s incredible. Let me turn to Minister Tung now from a policy and you know governance standpoint, Madam Minister. Building child-centric systems perhaps requires not just principles but mechanisms of inclusion, right? So what frameworks do you think can guarantee their voices are heard in the process?


Karianne Tung: Thank you. I think it’s necessary to change the logic behind the platforms. We need to get away from addictive designs and we have to implement models that really protect the children, I believe. In Norway, this might sound easy or natural or whatever, but in Norway we had a white paper where we said that children aren’t goods or commodities, because I think that’s the main issue, that children are seen as goods on these platforms, and we need to get away from that point of view. And we need to be sure that there are some principles that are the foundations for the algorithms on the different platforms. That is about making sure that we are standing on the UN Convention on the Rights of the Child. We need to be sure that we have openness and that the platforms are transparent and understandable for everyone. You have to understand how you can choose your content both as a parent but also as a child, and we need to have age-appropriate design, and also to ban behavioral advertisement on the platforms as well, and we need to stop profiling children on the platform because they are not mature enough to take these good decisions on behalf of themselves. And the children’s well-being has to be the thing that we put first when we are letting our children live their lives on the social media platforms, because as I said in the beginning, I really believe the platforms can also empower the children. And then we need to hear the children’s voice, and the children are screaming out, please protect us. We have to listen to the children, people.


Shivanee Thapa: So this certainly shows there’s a very very clear consensus amongst us that child-centric design certainly begins with intention, but it must also be followed by inclusive design processes, in fact, and reflect the lived realities and voices of young users themselves, right. So with this let me hop on to the other question, you know, as regulation gains momentum globally, right, because many governments are now exploring or implementing regulations aimed at protecting children online. We just heard some very very grave concerns that were reflected even during the course of this discussion. The challenge right now, I believe, lies in translating policy into platform level action. So we ask, what are the promising policy approaches and what role should companies play in proactively aligning their design choices with children’s rights? May I turn to Madam Minister Ms. Salima Bah from Sierra Leone, how do you see this from your regional and national vantage point?


Salima Bah: Thank you, definitely I think the clamor for regulations when it comes to just the online space even for adults to be honest and definitely for children I think is a mad scramble going around government, policymakers, everybody’s looking across to see what’s worked in other places, what can I adapt, and I think it’s a testament to just the ever evolving nature of technology, so you feel as if you have one solution, I think at some point everybody thought the parental controls were the greatest things and now we see that changing in behavior, so I think absolutely we see that and that’s definitely true for our region as well, even though I think our region actually has been a bit slow to this party, if I could call it that, this child protection party, and I think it might be as a result of we were slower to the internet age. I think we were grappling with issues of connectivity, we were grappling with issues of affordability of that, we were grappling with issues of inaccessibility of devices, but obviously now we see there’s the growing expansion of the internet in our region, there’s the growing access to devices and also our youthful bulge is growing up, so now we’re also feeling the effects, with more African children now readily having access to these platforms and to these devices we’re seeing the impact in terms of a rise in cyber bullying and so many other factors, even though we take the good parts of course, which we’re not saying it’s all bad, but we’re also seeing some of the negative effects also starting to impact us and the conversations around okay what do we do to ensure safety. So in Sierra Leone as well one of the things we’re looking at is an online safety legislation specifically looking at children and how we ensure that, especially when we look at maybe some of the negative impacts that are not really within our regions yet but we’re trying to see how we regulate in anticipation, because we’ve seen how that has impacted other areas specifically. So definitely as we’re doing these regulations we’re looking at potential best examples, see what other countries have done, the GDPR, we look at that, we look at so many other regions just to kind of see what’s worked or hasn’t worked and how can we adapt some of these for our own specific needs as well. But I think definitely one of the key aspects of it as well is I think we understand that regulations alone, policies standing alone, really won’t get you there. I really think you need to work with the companies, not just the big tech companies by the way, we also need to work with the startups as well, right, because I think maybe with the big techs it’s one of those things where we’re a bit more reactive, now all of a sudden we’re reactive, the platforms were already being designed and now we’re reacting to see how they can introduce these safety measures in, but I think with the startups, the edtech startups coming up or so many other startups coming up, it’s about how we work with them now at this stage so that we ensure, as everybody’s been speaking about, that it is in the initial design and it’s not an afterthought, something they think of later. And we’re seeing good uptake to be honest within our region. I’ll highlight an example, there’s a bunch of young people in Sierra Leone who are developing edtech solutions with an AI component, and to my surprise the other day I was having a conversation with them and they’re telling me how the 
solution they built, first of all the AI is a bit homegrown, so they don’t tap into the global AI data sets, because they wanted to make sure that the children using it are getting information from our society and our cultures and what is in our education system, so I thought that was great. But one of the really interesting things, so the platform is like a learning management tool and it has this ask feature, called like a learning buddy, where children can go on a chat box and have a conversation with each other and learn from each other, and what they’ve done is introduced an AI which makes sure that the only thing you can talk about is education related. So we see some of those solutions already starting and I think it’s about how as government and policymakers we work with companies to make sure that in the design of these solutions we’re addressing the issue from there.


Shivanee Thapa: Thank you. Let me turn to Mr. Thomas Davin. How can regulatory and corporate efforts be better aligned to ensure that digital products respect and reflect children’s rights?


Thomas Davin: So we see quite a lot of progress depending on nations on some of these regulatory systems. I guess many of them have started from looking at and tackling the most egregious issues for children, right, so child prostitution, grooming online, etc. The transition to understanding how other aspects of technology can be harmful is a little bit slower and quite often many nations are regulating technology in general without actually a specific angle on children. Some are doing that faster than others. Part of what we see is really effective is an alliance of the regulatory approach and to some extent the monitoring approach of is this working, is this implemented, what are we doing if it’s not implemented, which quite often remains a little bit vague. So what happens if it’s not by design disengagement for children, what happens if there is a sense of addictive behavior, behavior hold built into the system, what do we then do? It’s not always clear from a regulatory platform systems and so trying to get to that stage and saying this is what happens. Having elements to also guide companies, and Minister Bah just mentioned this, is many of the companies are actually willing but maybe they are uncertain, so UNICEF developed a digital child rights impact assessment which is a tool, one of the tools to understand what will happen, what can happen, and that is a participatory process so we bring in different elements of societies, children themselves, adolescents, young people to also speak to what they feel are right design approaches, wrong design approaches, and then again questioning what then happens once you have those voices, and Christine spoke to that, that’s quite important because many of the children we speak to say you ask us for views and then nothing happens and so that’s quite often where we feel we need to really get to that stage where once you engage children it has to be meaningful, if you want it to be meaningful it means action needs to be taken and needs to be visibly taken, so visibly monitored, transparency is quite important. Another element that we see really powerful is when companies agree to kind of allow anybody to look under the hood, in other words to understand this is how the algorithm functions and once we are going to do this, this is what will change and you can then monitor that together as a society again to try to understand whether there is a sense of progress on issues we together identified. One of the interesting, so I think Minister Tung didn’t really go into it fully but we think the white paper on digital upbringing in Norway is quite interesting. I think again together with regulatory we need to look at this as a public health issue which means everybody needs to be on it, parents need to understand it better, many parents once you start having conversation with your teenager, if your teenager has had a screen for the last five years it’s a little bit late and indeed it means you’re going to fight probably a losing battle, and again having two teenagers I know what that means, and so really guiding parents and trying to understand what are maybe steps that you need to have in mind is going to be quite important. 
On bringing in children: there is an interesting initiative in Scotland where the Children's Parliament acts as a mirror to the adult Parliament on AI specifically, together with the Alan Turing Institute and the Scottish AI Alliance. They are essentially going into various use cases of AI and bringing back to Parliament issues and recommendations on legislative pathways to tackle them. So there are multiple areas of work we can bring together, and part of what we are trying to do is bring that knowledge back and offer it as a panel of options for different countries and societies to pick up.


Shivanee Thapa: Building on these insights, may I turn to Ms. Emily Yu. How is Roblox integrating child rights thinking into its design decisions, even ahead of regulation?


Emily Yu: Yes, again, we have a program called Trust by Design, in which we take, at the requirement level, the fundamental rights of children and work out how to incorporate them into the product features that we will later publish to the platform. We also launched a teen council earlier this year, where we get a lot of feedback from teenagers throughout the world: if we have any additional or updated policies, we hear their feedback on that, and we find out from the teens themselves what they are interested in and what they want to move forward with. I think what's really interesting about the Roblox platform as well is that it's become a space in which children and teenagers are able to express themselves in areas where they maybe normally aren't able to do so in their real-world experiences. As an example, we've heard from a number of vulnerable groups on the Roblox platform who say that they have the ability to communicate with others like them in a way they wouldn't normally have in the real-world space, and so we are really supportive of trying to foster and enable that form of communication and play, while at the same time maintaining privacy and protections to keep everyone safe.


Shivanee Thapa: Mr. Kleiner, let me turn to you. Which regulatory tools or frameworks do you believe are moving that needle most effectively in holding platforms accountable to children’s rights?


Thibaut Kleiner: So I would say that, first of all, regulation works. We've had very concrete examples; just think about a recent case we opened against TikTok, where we found that TikTok Lite included some addictive features, and I think we could reach a positive result because this was withdrawn from the platform. Generally speaking, this is very much our experience: if you design legislation and you enforce it properly, it works. You cannot just count on the goodwill of companies that are making profits to change their features unless they have real pressure also coming from the regulators. But we want to do that very effectively, and that's why we are going to publish these guidelines to improve child protection online. I can tell you that there are several elements we want to focus on, like age assurance methods to restrict access to age-inappropriate content. This is very much the case for adult content websites; we have four open procedures against them in the EU. But there are also elements like setting accounts as private by default, reducing the risk of unsolicited contact by a stranger, making sure that recommender systems also reduce the risks for children, especially these rabbit holes where you come into contact with harmful content, or elements linked to the possibility for children to take control, to block and mute users and to ensure that they cannot be added to groups without their own agreement. So there is a series of measures we can take and, through regulation, enforce. Another very important element is that we need to be serious about all this. It's not enough to just pay lip service to the safety of children, and that's why we are also about to introduce robust mechanisms to identify the age of users online. You need to know when somebody is a minor, because otherwise you are exposing them to potentially dangerous content. That's why I think it's great what Norway is doing with this white paper; there are a lot of very positive elements. But, if I may, I would invite Norway to actually adopt the Digital Services Act, and I invite more countries around the world to adopt similar provisions to the Digital Services Act, because this is today the only way to change the reality and to force platforms and content providers to take this issue seriously. And if I may, as a last point: we are here at the IGF, and what is a bit sad is that we don't see enough of the emergence of online services or digital products that really address children as an audience, and maybe that's the challenge for this community. We need to develop the right content and make it also a business model for some companies. This is not at all the case today, and I think this is also why we are failing, because we are just trying to fix something which was not made for children.


Shivanee Thapa: Thank you so much. What emerges here is certainly the growing interplay between regulation and responsibility. Policy can set the floor, but true impact comes when companies integrate children's rights into their design ethos from the very outset. Building on this: ensuring that digital platforms serve the public interest, especially when it comes to children, cannot rest on only one actor. It requires deliberate coordination, as I read from the reflections here, across sectors, borders and of course mandates. What concrete steps can tech companies, governments and international bodies take together to make this shared responsibility a reality? This is the question with which I would first turn to Madam Minister Tung.


Karianne Tung: Thank you, and I think that we need to act in a more coordinated way, because no one can solve this problem alone. I think we all agree on that one. So first, I believe that international organizations need to be better coordinated. We've had great views here from UNICEF, the European Union and the Five Rights Foundation, and we've been given a lot of important knowledge today, so being better coordinated internationally would be step number one. Step number two, of course, is that governments need to have good regulations and good laws. As we speak, we are implementing the DSA in Norway; we are sending out a law proposal because I really believe in the DSA. It is a good regulation for the European continent, keeping children safe. And number three, the tech companies also need to take more responsibility to make their platforms safer for children. So if governments do their part, international organizations do their part and tech companies do their part, I think we will take a huge step forward. I also want to compliment the tech companies for taking important steps since we started to discuss this topic. But we need to do more.


Shivanee Thapa: Thank you. I’ll come back to Minister Bah shortly after. Let me turn to Mr Thomas Davin. How can international agencies like UNICEF act as the connectors across policy platform and the civil society to institutionalize this responsibility?


Thomas Davin: So it's not an easy act, because I think we face a reality, and several panel members spoke to that, of companies looking for the bottom line, and the bottom line is more money, which means more people on the platform. It means that when we talk about addictive behaviour, it's part of what brings and keeps people on the platform. So we are in the middle; international organizations do not really have power over this. Part of what we are really trying to do is talk about what an incentivized regulatory framework looks like, one that enables these companies to feel that if they do it well, it's good for the bottom line as well, and if they don't do it well, there are consequences, as the EU mentioned. So we are looking into this, but looking at it from a win-win perspective as much as we can. Part of what we are trying to build a little bit deeper, where we don't have enough data, is an understanding of the cost of inaction. We are very clear about the cost of inaction on smoking, on alcohol, on drugs, on sugar, but not so much on technology. What is going to be the economic cost, societally, and the cost in GDP? So we're trying to be better at telling that story. We're also trying to bring best practices in ways that enable other countries to benefit from Norway's experience and from Scotland's experience about what works, and to look as well, from an education-system perspective, at having children as actors in their own lives. We did talk about digital literacy and the notion of digital natives. We fully agree that they need to be better equipped to understand how those technologies actually function, and to be able to make informed choices. Even from quite a young age, once they start understanding all of those metrics, they are better at managing their own risks in many, many ways. And again, empowering parents to have that knowledge of how to help their child, in the same way that you say, if you're in the playground, you do not go with an adult that you do not know, even if he or she says they're coming from your parents, you have a safe password or something like that. Having parents able to have those conversations with their children, helping them understand the risks of those platforms, and the limits that we set for ourselves as a family and that you should set for yourself as a user, all of this is important. But again, it's going to take an acknowledgement that this is not just a new thing; it is a public health issue, and we need to treat it as such.


Shivanee Thapa: Thank you. Let me turn to Ms. Yu. As I ask you to add your perspective to this central question, I'd also be happy if you would share your thoughts on the partnership or co-regulatory approaches that have shown real potential in advancing child protection at the platform level at Roblox.


Emily Yu: In recent years, one of the regulations that I found to be quite effective was the age-appropriate design code from the UK. I appreciated the fact that, especially with respect to transparency towards children and upholding children's rights, it did, I think, a very good job. With regard to trying to solve this problem, I think there are a number of things a company can do. One is, of course, multi-stakeholder engagement and getting involved in industry working groups to solve problems. For example, we're involved with the Tech Coalition and their Lantern program, in which we share signals amongst companies to take action against child sexual exploitation and abuse, and we are involved in a number of other working groups as well with child safety in mind. Another is, of course, safety by design: having safety as a fundamental requirement for all product features that eventually get published on the platform. Given that our audience is predominantly under the age of 18, we feel it's critical that we take care of them, and that we have a responsibility to take care of them, on the platform. Also, engaging with youth, getting their opinions and feedback, and understanding what they're experiencing on the platform and how we can resolve any pain points or issues they may be dealing with, I think that's also very critical. And then finally, developing education, not only to explain what Roblox is, but also to educate both parents and children to ensure their safety, protection and privacy on the platform.


Shivanee Thapa: Ms. Leanda, what does meaningful collaboration look like from a child rights advocacy standpoint and where are the current gaps?


Leanda Barrington-Leach: Thank you. Well, I think there are lots of gaps. Firstly, I'll echo a few points already made: governments have an absolutely critical role. Self-regulation has not worked, and good regulation does work, and I appreciate the endorsement also given to the age-appropriate design code. I really applaud the European Commission for the work the EU is doing here, and the African Union has also done some good work. So governments need to regulate, they need to implement and, most importantly, they need to enforce, and this really does take resources and political will. They also need to invest in broader awareness raising and capacity building, in particular of civil society. At the moment, the traditional groups who support communities and ensure oversight are not aware and do not have the knowledge and capacity to work in this area, so capacity building of civil society is something that urgently requires resources and support. International bodies need to continue to promote coherence. This is a global problem; we need global solutions and global standards, and no one must be left behind. Industry and technical bodies have a really important role to play, not only in developing technical standards and certification mechanisms but also in investing in educating industry professionals, in particular engineers; like civil society, they really need support in this area to understand children's rights and needs. And finally, tech companies need to play ball, I mean really, meaningfully play ball. They need to get their act together and do what they do best: innovate to respond to demand, understanding that in this case the demand is for products and services that will genuinely benefit children.


Shivanee Thapa: Let me get back to Mr. Kleiner. How can regulatory frameworks be designed to encourage, and not just enforce, joint responsibility among these stakeholders?


Thibaut Kleiner: So I think that regulation, as I was saying, works if you implement it properly, and this is where we also want to have a constant dialogue between the European Commission and the platforms that are subject to the Digital Services Act, but also to the Audiovisual Media Services Directive, which also contains provisions to protect children. We also want to have this way to measure the age of users online; we think this is essential to be realistic about what is happening. But generally speaking, this requires constant discussion and debate. That's why we have opened this consultation on the guidelines, under Article 28 of the Digital Services Act, that we intend to publish by the end of this summer, but we are also now conducting an inquiry precisely on mental health, and we are preparing some initiatives around cyberbullying to be specific about this issue. More generally, what I would say is that this is not a one-off; it is a constant effort, because the technology is evolving all the time and you have new issues and new problems. One year is a long time when you are 12 years old, which also means that you need to be very fast and very agile in reacting to new developments. This is why we also have in the EU the so-called Safer Internet Centres. They are locally established, and we have had more than 35 million unique visitors to these websites, which shows the magnitude of the issue we are talking about. Families and children want to know more; they want to be not only educated but also supported to address very concrete issues. Conversations in families are difficult around this. Try indeed to block access to games or to social media with your children; I can tell you it's not easy, and that's where I think collectively we need to provide these resources. So again, these Safer Internet Centres are one of the success stories of our efforts to make a better internet for kids, as we say. But for me the end message is that this cannot be something we just leave to parents or to platforms or to governments. We have to work together.


Shivanee Thapa: Turning to Ms. Christine: from your experience at TikTok, what shared standards or governance models have helped bridge the public-private divide in protecting young users?


Christine Grahn: So as a company subject to, amongst other regulations, the Digital Services Act, we actually appreciate it, because it provides a level playing field. It also provides a forum for this ongoing conversation, not just around the regulation but also with civil society and other actors, and I truly believe that we have a better chance of being successful if we work together. It's not about shying away from responsibility or passing it on to someone else; it's about efficiency. Let me illustrate that with an example. I'm from Sweden, and sadly we've seen a development over the last few years where teenagers, young teenagers even, are pulled into criminality by gangs. As an isolated player, as a platform in this instance, we can make sure we have policies in place, we can enforce those policies and we can have channels with law enforcement authorities, but this is not going to address the root cause; at best it is a mitigation once something has already gone wrong. So when BRIS, a child safety NGO in Sweden, was tasked by the Swedish government with increasing the support available for teenagers generally, but in particular those at risk of being pulled into this environment, we decided to partner with them, and we did so with full force. We found creators they were not previously in touch with who could be the kind of voice that would speak to those they wanted to reach, we found creative agencies that could help them with the expression that would really speak to these teenagers, and we helped them with the strategic campaigns. And, unfortunately, given the topic we're talking about, it was a success; there was a need for this. We saw, or rather they saw, a 60% increase in calls to their helpline. The campaign itself had 26 million video views and 2.7 million unique viewers, and this is in a country of around 10 million inhabitants with about 3.2 million users on TikTok, so there was clearly a need, and we could really help them reach out to this group. We also did that in a very close dialogue, led by the Minister of Justice in Sweden, that gathered public authorities, civil society organizations and some of the platforms, so we have that forum to continue to build and iterate, and it has now grown into a Nordic initiative, so it's also something we're trying to address collectively at a Nordic level. I also think there is a need for mutual understanding and transparency when we talk about these issues. As a government, if you will allow me, I think it's actually quite important to use platforms like TikTok to truly understand what it is that you're trying to regulate and how you can facilitate collaboration. On our end, we're highly committed to transparency. I invite everyone to spend some time in our transparency centre, where we talk very openly about, and I believe we are still the only platform to do so, the number of accounts that we've removed because we suspect they're underage. We talk about our success in enforcing our policies, we talk about how the algorithm works and we talk about our products in more detail, to really get further into creating that understanding. I think that's really important when it comes to building that kind of trusting relationship, not just with our community but also with the society around us, of which we are a very integrated part at this point.


Shivanee Thapa: As we engage in this conversation, I think it's important we pay heed to the fact that there is a huge divide between worlds: a digital divide, and a divide in the apparatus surrounding it, which is so central to empowering the digital world. I'm sure that's one of the biggest challenges for governments, companies and other players to deal with. So, with that, let me turn to Minister Bah. How can countries from the Global South be meaningfully included in global coordination efforts, ensuring that solutions are both inclusive and equitable?


Salima Bah: Thank you, thank you. I'd really like to re-echo what a lot of my colleagues have talked about, which is really critical: the need to collaborate with international organizations, with civil society and with the tech companies; it really is a whole-of-stakeholder approach. It can't be the job of government alone or of the companies themselves; it takes all the stakeholders to come on board and work together. And maybe one of the biggest points to make is that we do believe these digital platforms are inherently public interest goods that play a significant role in the advancement of society, in terms of bridging that digital divide and making sure everybody has access to the same opportunities. Because of these platforms, we see young people now aspiring beyond what is in their immediate societies and realities, and we have to acknowledge that as government we use these platforms too: the whole public sphere, which used to be controlled by traditional media, is now a space where all of our government sectors have social media accounts, and that's how we communicate directly with people and with children, so I think it's effective that way. I was saying this to somebody the other day; it's a surprising stat: 15 percent of the traffic in Sierra Leone goes through TikTok, so we think that's huge in terms of the impact it has. We're already working with some of these companies, for example TikTok, where we're working with their Africa office to look at how we ensure these platforms are safe, because we know they serve a huge public good, but, as we've been discussing, we also have to ensure they're safe. We're rolling out trainings, we're rolling out capacity-building initiatives, and we're rolling out platforms for how we can more efficiently flag some of the harmful content that we see. Maybe one of the problems we've faced is that it felt like a long, drawn-out process, and it was a bit unclear how we report some of the content that we see, because within our region we sometimes understand the context better than somebody from outside, so we're working with them to get that right. But maybe a final point: even within governments, there's a huge need for collaboration amongst ourselves, in terms of understanding what we're doing. For example, in this space you're dealing with a ministry such as mine, which is responsible for the digital economy, but then obviously there's a ministry of gender and children's affairs specifically, so we have to make sure things are aligned that way. You're also talking, potentially, about the ministry of education and how we ensure that, within the education system, the digital platforms that are available and can improve learning outcomes are doing that, for example with the advent of the ChatGPTs and the AIs of the world. We have the ministry of information, and we're working with them on issues such as deepfakes; deepfakes are now a significant thing, and we have to ensure that people recognize these are deepfakes, especially in a society that might not be as digitally literate, where some people don't even understand there is such a thing as a deepfake; sometimes something is put out on the social media platforms and people believe it is true, and our government has to be flexible enough to respond quickly and to allay fears. You have the parliament also, which you have to get on board. So I think the underlying message is that there's a lot of cross-collaboration that needs to happen, right across the board. Thank you.


Shivanee Thapa: Thank you, Madam Minister. And now, as we approach the very close of this very important conversation, I'm sure 90 minutes doesn't do justice to the huge topic we've come here for, but I'm sure it adds a stone, or a pebble, to a very silent pond, that's how I would like to put it. As we come to the concluding moments of this session, I want to invite each of our panelists, given the time, in less than 60 seconds each, to share your final takeaway or your call to action. Can we begin with Mr. Thomas Davin?


Thomas Davin: I was afraid you were starting with me; I would have preferred to go last, I'm not quite ready. I think the potential of technology is immense when we look at the outcomes for children: through these platforms we've talked about, and all the gaming, children learn so much, they get exposed to so much that they might otherwise not be. The potential is fantastic; the risk is very genuine. And I think what we are seeing is that we need to take this with an urgency that we have not yet completely seized across the globe, with the EU maybe slightly ahead. In doing that, we need to build on the leaders, those who are leaning in, whether it is TikTok and the examples we've heard, or what Roblox is doing, for best practices, so that we push forward in building the world we want, with children at the heart and with their voices, in meaningful collaboration, impacting decisions on designs and shifts in designs as they keep voicing what works for them and what may not.


Shivanee Thapa: Thank you, Ms Emily Yu.


Emily Yu: I would definitely say that we want to keep children at the heart of what we do and take their viewpoints and opinions into consideration. In addition, I think that, from a historical perspective, there has perhaps been some tension between safety and innovation, and we find that we can in fact innovate while keeping safety fundamental. So we're trying our best not only to entertain children around the world and allow them to connect with one another, but also to keep them safe and secure and their information private.


Shivanee Thapa: Thank you, Mr Kleiner.


Thibaut Kleiner: I would really like to thank the Government of Norway for hosting us here at the IGF. It has already been a fantastic conference, and I think this panel was very rich and full of very important conclusions. So what I would really like to do in closing is make a call, because it's not enough to understand the issues. Now we really need, collectively, to take this into our hands, which means look at the regulation the EU has introduced, try to expand it around the globe, have real measures for age verification in place that work, and make sure that we really make the Internet a better place for children by having age-appropriate content, but also services that are really targeting children, because technology is a wonderful opportunity and our kids are great.


Shivanee Thapa: Thank you. Miss Christine Grahn.


Christine Grahn: Yeah, thank you. So we want to continue to be a place that people come to have fun and explore and learn, right? And as I’ve underlined throughout this conversation, safety is a prerequisite for that. Now we don’t always get everything right. We put in a huge amount of work to keep our platform safe, and we are industry-leading in many ways in this space, but we want to continue to listen and learn and iterate and rise to the challenge. And I’m sure that all of you that have taken time to listen to us today have a lot of very good input as well, so I would like to invite all of you to come to our booth as well and continue this conversation so that we can engage more.


Shivanee Thapa: Thank you. Miss Leanda.


Leanda Barrington-Leach: So I'd reiterate what some have said, that there is huge potential, huge potential, and children are very optimistic, thankfully. We know what needs to be done, and as Emily also said, we can do it. But just a word of caution: do not be naive. Big change needs to happen. We cannot allow children's futures to be at the mercy of commercial imperatives, which at the moment they are. This is not going to be easy to wrestle back. So please keep up the good work. Please implement, please enforce, please regulate. We can help. And please listen to children, and please stand up for them.


Shivanee Thapa: Thank you so much. Minister Bah.


Salima Bah: Thank you. Thank you so much. We definitely echo a lot of what has been said in terms of the huge potential that these technologies and platforms bring. As we mentioned, specifically for regions such as ours, there's actually research on the outcomes for a young person who grows up with access to these platforms versus one who grows up without, so we understand that potential. And maybe another topic we haven't talked about today, but which I think is also huge, is the potential economic benefits that are accessible to young people if they can fully participate on these platforms as well. But obviously we also have to ensure that they're safe whilst using these platforms. Maybe another point, as a call to action, is that responsible use of these platforms might be another conversation to have at another time. So really the call to action, again to all stakeholders and specifically to the companies, is to come on board with us and treat this issue as critical. And I know sometimes, when we say Africa, there's a tendency to have just one or two representatives, but it's really a diverse region, and we hope to see that reflected in the engagement across the board.


Shivanee Thapa: And finally, Minister Tung.


Karianne Tung: Thank you. And first I want to say thank you to the panellists, because I think this session shows the importance of the IGF and the multi-stakeholder model. It brings us together on the same floor, able to have tough conversations in order to move forward. We need to protect children better than we are doing today. As a minister of digitalisation, I'm a tech optimist. I really believe technology can help us solve huge societal challenges, help us close the digital gap and also create new opportunities. But we have to put the ethical perspective first. We have to put children first. We have to put citizens first, because it's also about parents and adults as well. So in Norway, we are doing a lot of things. We have the white paper on growing up digitally safe. We are now trying to implement an age limit of 15 for social media platforms. We work together with Google and Microsoft to put personal privacy first when children are using these platforms in school. And we have banned mobile phones from the classroom. So we do a lot of things, and we do them together with the companies. I want to continue to work together, because we can't do this alone. In the end, for me, it is a question of looking back, I don't know, 10 or 20 years from now, and asking myself: which side of history do I want to stand on? And I want to stand on the children's side. And I invite all of you to be on the same side.


Shivanee Thapa: Wow. We couldn't have ended on a better note than this. I reiterate: we were engaged in discussion on a topic which is no longer an emerging risk. It's too urgent, too complex, and too personal, I believe, to each one of us. And what we've heard today from this very distinguished panel makes it very clear that protecting children in this digital age, in the age of algorithms, is not just a technical challenge for us. It is a moral imperative. And the future of digital governance must, as I could glean from the essence of this distinguished panel, be built with and not just for our young citizens. Thank you so much to all our distinguished panelists for the great leadership and the great undertakings in your own respective niches, and for the great value you have brought to this panel with your gracious presence. A special thanks to the Government of Norway and to Madam Minister for this great initiative and for the great takeaways we are all carrying home from the Lillestrøm conference. With this, I thank the members of the audience for your presence and your engagement as well. I now rest my microphone and close the session, as I invite my distinguished panelists to kindly step forward for a group photo. Thank you.



Leanda Barrington-Leach

Speech speed

157 words per minute

Speech length

1716 words

Speech time

654 seconds

Children face addiction, loss of control, sleep deprivation, and inability to make connections due to algorithmic design

Explanation

Barrington-Leach argues that children are losing fundamental aspects of healthy development including their ability to control their usage, get adequate sleep, form meaningful connections, pay attention, and think critically due to how algorithms are designed.


Evidence

Consistently around half of children surveyed say they feel addicted to the internet


Major discussion point

Current Risks and Harms to Children from Algorithmic Systems


Topics

Human rights | Sociocultural


Agreed with

– Karianne Tung
– Thomas Davin
– Shivanee Thapa

Agreed on

Current algorithmic systems pose significant risks to children’s mental health and development


Half of surveyed children feel addicted to internet, two-thirds feel unsafe online, three-quarters encounter disturbing content

Explanation

Barrington-Leach presents statistical evidence showing the widespread nature of harm children experience online, including feelings of addiction, safety concerns, and exposure to inappropriate content.


Evidence

Nearly two-thirds say they often or sometimes feel unsafe online. More than three-quarters say they encounter content they find disturbing, sexual content, violence, hate. A quarter to a third are bullied online. Half experience sexual harms, a quarter sextortion.


Major discussion point

Current Risks and Harms to Children from Algorithmic Systems


Topics

Human rights | Cybersecurity


Current platforms prioritize revenue through maximizing time spent, reach, and activity rather than child welfare

Explanation

Barrington-Leach argues that the fundamental business model of most platforms where children spend time is designed around revenue generation rather than child well-being, creating inherent conflicts with child safety.


Evidence

Today, most of the services where children spend most of their time are designed with three primary purposes, all geared towards revenue generation. Maximize time spent, maximize reach, and maximize activity.


Major discussion point

Platform Design and Business Model Problems


Topics

Economic | Human rights


Disagreed with

– Christine Grahn
– Thomas Davin

Disagreed on

Platform business model compatibility with child safety


Algorithms weight negative or extreme content five times higher than neutral or positive content

Explanation

Barrington-Leach reveals that algorithmic systems are specifically designed to promote negative and extreme content over neutral or positive content, which directly harms children’s online experience.


Evidence

algorithms that weight negative or extreme content five times higher than neutral or positive content


Major discussion point

Platform Design and Business Model Problems


Topics

Sociocultural | Human rights


Children can go from innocent searches to harmful content in just a few clicks due to algorithmic recommendations

Explanation

Barrington-Leach demonstrates how algorithmic recommendation systems create dangerous pathways that can quickly lead children from benign content to harmful material with minimal user interaction.


Evidence

Children can go from a simple search for slime to porn in just a single click, or from trampolining to pro-anorexia in just three clicks, and nudge to self-harm in 15 clicks.


Major discussion point

Platform Design and Business Model Problems


Topics

Human rights | Cybersecurity


Tech companies are aware of harm they cause children but choose profits over protection

Explanation

Barrington-Leach argues that internal documents and whistleblower reports show tech companies have knowledge of the harm their platforms cause to children but deliberately continue harmful practices for financial gain.


Evidence

whistleblower reports and leaked internal documents show how time and again tech companies are aware of the harm they are causing children and choosing to do it anyway. One platform sees children as the golden demographic and looks to hook them young.


Major discussion point

Platform Design and Business Model Problems


Topics

Economic | Human rights


Disagreed with

– Christine Grahn
– Thomas Davin

Disagreed on

Platform business model compatibility with child safety


Safety must be designed by default with privacy protections, turning harmful features off for children

Explanation

Barrington-Leach advocates for a fundamental shift in platform design where safety and privacy protections are the default state for children, requiring active choices to enable potentially harmful features rather than requiring children to opt out.


Evidence

privacy and safety by design and default. What that means a lot of the time is turning things off, making sure that children’s experience stays private and they have real agency and choice.


Major discussion point

Child-Centric Design Solutions and Best Practices


Topics

Human rights | Legal and regulatory


Agreed with

– Karianne Tung
– Christine Grahn
– Emily Yu

Agreed on

Safety by design and default is essential for child protection


Age-appropriate design code principles from UK to Indonesia provide enforceable regulatory requirements

Explanation

Barrington-Leach points to the global adoption of age-appropriate design codes as evidence that enforceable regulatory frameworks for child protection online are both feasible and spreading internationally.


Evidence

The age-appropriate design code principles embedded in law from the UK to Indonesia, the EU to California, set out enforceable regulatory requirements.


Major discussion point

Regulatory Frameworks and Policy Approaches


Topics

Legal and regulatory | Human rights


Agreed with

– Thibaut Kleiner
– Karianne Tung

Agreed on

Regulation works when properly designed and enforced


Children must be included at policy tables rather than being completely left out of decisions affecting them

Explanation

Barrington-Leach emphasizes that meaningful child participation requires their physical presence and voice in policy-making processes, not just consultation or representation by adults.


Evidence

There are no children in this room today and generally they are completely left out. I’d also say that I am talking about children and we often replace children with young people and often that can be like 25 to 35 year olds.


Major discussion point

Children’s Rights and Meaningful Participation


Topics

Human rights | Legal and regulatory


Agreed with

– Christine Grahn
– Emily Yu
– Thomas Davin
– Shivanee Thapa

Agreed on

Children need meaningful participation in digital governance decisions


Digital world is 100% human-engineered and can be optimized for good just as easily as for bad

Explanation

Barrington-Leach argues that since digital systems are entirely created by humans, there are no technical barriers preventing the creation of child-friendly digital environments – only choices about priorities and values.


Evidence

The digital world is 100 per cent human-engineered. It can be optimised for good just as easily as it can for bad.


Major discussion point

Technical Standards and Innovation Approaches


Topics

Infrastructure | Human rights


Meaningful collaboration requires governments to regulate and enforce, while companies must genuinely integrate children’s rights

Explanation

Barrington-Leach outlines that effective collaboration requires each stakeholder to fulfill their specific role – governments must create and enforce regulations while companies must genuinely innovate to meet demand for child-beneficial products.


Evidence

Self-regulation has not worked and good regulation does work. Industry and technical bodies have a really important role to play not only in developing technical standards and certification mechanisms but also in investing in educating industry professionals.


Major discussion point

Multi-Stakeholder Collaboration and Shared Responsibility


Topics

Legal and regulatory | Human rights



Karianne Tung

Speech speed

138 words per minute

Speech length

1021 words

Speech time

442 seconds

Algorithms expose children to harmful content, bias, manipulation and cause serious mental health and body image issues

Explanation

Minister Tung argues that while algorithms are powerful tools for personalization, they create significant risks for children including exposure to harmful content and manipulation that damages their mental health and self-image.


Evidence

algorithms, they are powerful tools for personalization and engagement, but they do also expose children to harmful content, to bias and to manipulation. They can shape behavior, they can influence choices and they do serious damages when it comes to mental issues, body issues


Major discussion point

Current Risks and Harms to Children from Algorithmic Systems


Topics

Human rights | Cybersecurity


Agreed with

– Leanda Barrington-Leach
– Thomas Davin
– Shivanee Thapa

Agreed on

Current algorithmic systems pose significant risks to children’s mental health and development


Age verification, appropriate design, and banning behavioral advertising for children are essential principles

Explanation

Minister Tung outlines specific policy measures that Norway considers fundamental for child protection online, including robust age verification systems, child-appropriate platform design, and prohibiting targeted advertising to children.


Evidence

we need to have age-appropriate design and that’s why and also to ban behavioral advertisement on the platform as well and we need to stop profiling children on the platform because they are not mature enough to take these good decisions on behalf of themselves


Major discussion point

Child-Centric Design Solutions and Best Practices


Topics

Legal and regulatory | Human rights


Agreed with

– Leanda Barrington-Leach
– Christine Grahn
– Emily Yu

Agreed on

Safety by design and default is essential for child protection


Norway is implementing 15-year age limit for social media platforms and working on better age verification systems

Explanation

Minister Tung describes Norway’s specific policy approach of setting a higher age limit for social media access and developing more robust systems to verify users’ ages to better protect children.


Evidence

We are now trying to implement an age limit of 15 years old to the social media platform. We work together with Google and Microsoft to put personal privacy first when children are using this platform in school. And we have banned mobile phones from the classroom.


Major discussion point

Regulatory Frameworks and Policy Approaches


Topics

Legal and regulatory | Human rights


Agreed with

– Leanda Barrington-Leach
– Thibaut Kleiner

Agreed on

Regulation works when properly designed and enforced


International organizations, governments, and tech companies must coordinate better as no single actor can solve the problem alone

Explanation

Minister Tung emphasizes that protecting children online requires coordinated action across different types of stakeholders, with each playing their specific role in a comprehensive approach.


Evidence

I think that we need to act more coordinated because no one can solve this problem alone. So first I believe that international organizations need to be better coordinated. Step number two of course, the governments need to have good regulations, good laws. And number three, the tech companies need also to take more responsibility


Major discussion point

Multi-Stakeholder Collaboration and Shared Responsibility


Topics

Legal and regulatory | Human rights


Agreed with

– Salima Bah
– Thomas Davin
– Thibaut Kleiner
– Shivanee Thapa

Agreed on

Multi-stakeholder collaboration is necessary as no single actor can solve child protection alone



Salima Bah

Speech speed

160 words per minute

Speech length

2174 words

Speech time

814 seconds

Cultural erasure through algorithms trained on datasets that don’t reflect African diversity and values

Explanation

Minister Bah argues that algorithmic systems trained on non-representative datasets pose a unique risk to African children by exposing them primarily to content that doesn’t reflect their own cultures, languages, and values, potentially leading to cultural erasure.


Evidence

these algorithms are trained on data sets that potentially don’t reflect our diversity or the diversity of our societies or our realities. And that means children in Africa from our regions are growing up in an environment or through these where they’re exposed to languages, to expression and just identity that don’t reflect them.


Major discussion point

Current Risks and Harms to Children from Algorithmic Systems


Topics

Sociocultural | Human rights


Sierra Leone is developing online safety legislation specifically for children, working proactively with startups on safety by design

Explanation

Minister Bah describes Sierra Leone’s approach of creating dedicated child protection legislation while also working proactively with emerging tech companies to embed safety considerations from the design phase rather than retrofitting solutions.


Evidence

in Sierra Leone as well one of the things we’re looking at is an online safety legislation specifically looking at children. there’s a bunch of young people in Sierra Leone who are developing edtech solutions with AI component and to my surprise the other day I was having a conversation with them and they’re telling me how the solution they built first of all the AI is a bit homegrown so they don’t tap into the global AI data sets because they wanted to make sure that the children using it getting information from our society and our cultures


Major discussion point

Regulatory Frameworks and Policy Approaches


Topics

Legal and regulatory | Development


Cross-government collaboration needed between ministries handling digital economy, education, gender/children affairs, and information

Explanation

Minister Bah emphasizes that effective child protection online requires coordination across multiple government departments, as the issue touches on digital policy, education, child welfare, and information management simultaneously.


Evidence

even within governments though I think there’s a huge need for collaboration even amongst ourselves in terms of understanding what we’re doing for example you talk about this space you’re talking about dealing with a ministry such as myself that is responsible for the digital economy but then obviously there’s a ministry of gender and children affairs specifically so we have to make sure things are aligned that way you’re talking about also and potentially the ministry of education


Major discussion point

Multi-Stakeholder Collaboration and Shared Responsibility


Topics

Legal and regulatory | Development


Agreed with

– Karianne Tung
– Thomas Davin
– Thibaut Kleiner
– Shivanee Thapa

Agreed on

Multi-stakeholder collaboration is necessary as no single actor can solve child protection alone


African children need to see their own cultures and societies reflected on platforms, not just exposure to other cultures

Explanation

Minister Bah argues that while global exposure is valuable, it’s crucial that African children also see content that reflects their own cultural contexts and identities to prevent cultural displacement and maintain cultural diversity.


Evidence

We’re not saying you don’t want them to see others but it’s the fact that they might not be seeing something that reflects our culture and our societies and our identity and our values as well. So with that, there’s a potential for cultural erasure and then just the adoption of cultures from elsewhere


Major discussion point

Global South Perspectives and Digital Divide


Topics

Sociocultural | Development


Digital platforms serve as public goods bridging digital divides and exposing young people to opportunities beyond immediate realities

Explanation

Minister Bah acknowledges the positive role of digital platforms in providing access to opportunities and information that wouldn’t otherwise be available to young people in her region, treating these platforms as essential public infrastructure.


Evidence

we do believe that these digital platforms inherently are public interest and goods, play a significant role in the advancement of society in terms of just exposing when we talk about bridging that digital divide and making sure everybody has access to the same opportunities and because of these platforms we see young people now aspiring to beyond what is in their immediate societies and realities


Major discussion point

Global South Perspectives and Digital Divide


Topics

Development | Economic


15% of Sierra Leone’s internet traffic goes through TikTok, showing massive impact requiring safety collaboration

Explanation

Minister Bah provides concrete data showing the significant role that major platforms play in her country’s digital ecosystem, emphasizing why safety collaboration with these platforms is essential rather than optional.


Evidence

15 percent of the traffic in Sierra Leone goes through TikTok so we think that’s a huge huge aspect in terms of looking at the impact and it has and we’re already working with some of these specifically for example TikTok, we’re working with the African African-American office in terms of looking at how we ensure that these platforms are safe


Major discussion point

Global South Perspectives and Digital Divide


Topics

Infrastructure | Development


Global South countries need meaningful inclusion in coordination efforts to ensure solutions are inclusive and equitable

Explanation

Minister Bah calls for genuine representation of Global South perspectives in international coordination efforts, emphasizing that the diversity within regions like Africa must be recognized and reflected in global solutions.


Evidence

Maybe, I know sometimes there’s a tendency when we say Africa and then there’s just maybe one representative or two, but it’s really a diverse region and we hope that we see that being reflected in terms of the engagement across board.


Major discussion point

Global South Perspectives and Digital Divide


Topics

Development | Legal and regulatory



Christine Grahn

Speech speed

152 words per minute

Speech length

1882 words

Speech time

742 seconds

TikTok implements safety by design with private accounts by default for under-16s, no direct messages until 16, and screen time limits

Explanation

Grahn describes TikTok’s specific safety measures that are built into the platform’s design for younger users, including privacy settings, communication restrictions, and time management tools that are automatically applied based on age.


Evidence

when you create a TikTok account as a 13-year-old, you will have a kind of a phased introduction to the platform. So we don’t give access to direct messages, for example, until you reach the age of 16. When you’re under 16, your account is set to private by default, and we have screen time limits in place for everyone who’s under 18.


Major discussion point

Child-Centric Design Solutions and Best Practices


Topics

Human rights | Cybersecurity


Agreed with

– Leanda Barrington-Leach
– Karianne Tung
– Emily Yu

Agreed on

Safety by design and default is essential for child protection


TikTok’s global youth council with representatives from 15 countries provides direct input to senior leadership

Explanation

Grahn describes TikTok’s structured approach to including young people’s voices in platform governance through a formal council that gives youth direct access to senior decision-makers and influences platform policies.


Evidence

last year I think as a first platform we introduced a global youth council. We have representatives from 15 countries around the world, Brazil, Nigeria, Poland, Indonesia just to mention a few and it’s a forum where they can in a setting kind of created for them share their views with us directly but also and maybe you know even more importantly here in this conversation it’s a way for our most senior leaders to also hear from the youth community directly.


Major discussion point

Children’s Rights and Meaningful Participation


Topics

Human rights | Sociocultural


Agreed with

– Leanda Barrington-Leach
– Emily Yu
– Thomas Davin
– Shivanee Thapa

Agreed on

Children need meaningful participation in digital governance decisions


Public-private partnerships can effectively address root causes, as shown by TikTok’s collaboration with Swedish safety NGO BRIS

Explanation

Grahn provides a concrete example of how platforms can work with civil society organizations and government to address underlying social problems rather than just mitigating symptoms, showing measurable positive outcomes.


Evidence

when BRIS, which is a minor safety NGO in Sweden, was tasked by the Swedish government to increase the support available for teenagers generally but in particular those at risk for being pulled into this environment, we decided to partner with them and we did so with full force. So we found creators that they were not previously in touch with that could be that kind of voice that would speak to those that they wanted to reach, we found creative agencies that could help them with the expression that would really, really speak to these teenagers, we helped them with the strategic campaigns and I have unfortunately, given the topic that we’re talking about, it was it was a success, there was a need for this, so we saw, or they saw rather, sorry, a 60% increase of calls to their helpline.


Major discussion point

Multi-Stakeholder Collaboration and Shared Responsibility


Topics

Human rights | Sociocultural



Thibaut Kleiner

Speech speed

170 words per minute

Speech length

1530 words

Speech time

539 seconds

Children lack deep understanding of underlying technology despite being able to use apps

Explanation

Kleiner argues against the ‘digital natives’ assumption, explaining that while children can operate digital applications, they don’t understand the underlying mechanisms and can be easily manipulated or tricked by these systems.


Evidence

maybe we should stop using the term digital natives when we speak about children, because sometimes, you know, you get this idea that you can leave the children with the technology and they are very savvy and they can, you know, get their way out, and we don’t understand the technology so well, whereas they do. Actually, the studies we conduct in the EU show that there is a very superficial understanding of the technology among children.


Major discussion point

Current Risks and Harms to Children from Algorithmic Systems


Topics

Human rights | Sociocultural


EU’s Digital Services Act provides hard regulation giving platforms responsibility for child protection with enforcement mechanisms

Explanation

Kleiner describes the EU’s regulatory approach as creating binding legal obligations for platforms to protect children, backed by enforcement actions and penalties for non-compliance.


Evidence

we have introduced in the EU the Digital Services Act, so it’s hard regulation that precisely aims to give this responsibility to platforms in partnership, because we want them to precisely develop also better ways to know what age the children have online. Secondly, we are really taking this very seriously, and we don’t hesitate to actually open proceedings when we find that the platforms are not delivering on their responsibilities under the Digital Services Act. We have already eight open cases


Major discussion point

Regulatory Frameworks and Policy Approaches


Topics

Legal and regulatory | Human rights


Regulation works when properly designed and enforced, as shown by concrete cases like TikTok Lite withdrawal

Explanation

Kleiner provides evidence that regulatory enforcement can achieve concrete results in protecting children, citing a specific case where regulatory action led to the removal of harmful features from a platform.


Evidence

I would say that first of all regulation works. I mean we’ve had very concrete examples just thinking about a recent case we opened against TikTok where we found that in TikTok Lite there was some addictive behavior and I think that we could have a positive result because this was withdrawn from the features of the platform


Major discussion point

Regulatory Frameworks and Policy Approaches


Topics

Legal and regulatory | Human rights


Agreed with

– Leanda Barrington-Leach
– Karianne Tung

Agreed on

Regulation works when properly designed and enforced


Need for robust age verification mechanisms and technical standards providing practical guidelines for innovators

Explanation

Kleiner emphasizes that effective child protection requires reliable methods to determine user age and clear technical standards that guide developers in creating child-safe systems.


Evidence

we are about to introduce also mechanisms that are robust to identify the age of users online. You need to know when somebody is a minor because otherwise you are exposing them to potentially dangerous content


Major discussion point

Technical Standards and Innovation Approaches


Topics

Infrastructure | Legal and regulatory


Agreed with

– Karianne Tung
– Salima Bah
– Thomas Davin
– Shivanee Thapa

Agreed on

Multi-stakeholder collaboration is necessary as no single actor can solve child protection alone


E

Emily Yu

Speech speed

158 words per minute

Speech length

824 words

Speech time

311 seconds

Roblox focuses on discoverability rather than personalized content limitation, with robust parental controls and content labeling

Explanation

Yu explains that Roblox’s algorithmic approach prioritizes helping users discover quality content rather than creating personalized filter bubbles, while providing parents with tools to control and understand their children’s experiences.


Evidence

with regards to algorithms used for recommendation systems on the platform, our focus there is more on discoverability rather than limiting the content that is seen by the child based on personalization. There are millions of experiences on the platform, and what we prefer to surface are higher quality experiences and newer emerging experiences that our audience may be interested in. We’ve launched last November a set of pretty robust parental controls that include screen time limitations that parents can set. We’ve also introduced content ratings or content labeling within our systems


Major discussion point

Child-Centric Design Solutions and Best Practices


Topics

Human rights | Sociocultural


Disagreed with

– Leanda Barrington-Leach
– Christine Grahn

Disagreed on

Effectiveness of parental controls and digital literacy as solutions


Trust by Design program integrates fundamental children’s rights into product requirements from the start

Explanation

Yu describes Roblox’s systematic approach to embedding children’s rights considerations into the earliest stages of product development, ensuring that child protection is built into features rather than added as an afterthought.


Evidence

we have a program called Trust by Design in which we basically take at the requirement level what are fundamental rights of children and how do we end up then incorporating them into the product features that we will later publish to the platform.


Major discussion point

Child-Centric Design Solutions and Best Practices


Topics

Human rights | Legal and regulatory


Agreed with

– Leanda Barrington-Leach
– Karianne Tung
– Christine Grahn

Agreed on

Safety by design and default is essential for child protection


Roblox’s teen council gives teenagers worldwide opportunity to provide feedback on policies and platform direction

Explanation

Yu describes Roblox’s formal mechanism for including teenage voices in platform governance, allowing young users to directly influence policy decisions and platform development.


Evidence

We also have recently launched a teen council as of earlier this year where we get a lot of feedback from teenagers throughout the world in terms of what, you know, if we have any additional or updated policies, what their feedback is on that and find out from the teens themselves what they are interested in and what they want to move forward with.


Major discussion point

Children’s Rights and Meaningful Participation


Topics

Human rights | Sociocultural


Agreed with

– Leanda Barrington-Leach
– Christine Grahn
– Thomas Davin
– Shivanee Thapa

Agreed on

Children need meaningful participation in digital governance decisions


T

Thomas Davin

Speech speed

167 words per minute

Speech length

1908 words

Speech time

683 seconds

Mental health issues, addiction patterns, social isolation, and loss of concept of truth are major societal concerns

Explanation

Davin outlines multiple interconnected risks that algorithmic systems pose to children, including mental health problems, addictive behaviors, social isolation, and the erosion of shared understanding of truth and facts.


Evidence

one is mental health, clearly, with so many areas of mental health around nutrition, around self-harm, around so many of these issues. Addiction as a part of that significant growth, explosive growth of addiction. I would say in a somewhat genderized manner, you have boys that largely go into gaming addiction, and girls tend to do more social media addiction. The other one connected to that is social isolation, with again significant potential societal costs and consequences to that. The third one I would mention is maybe a little bit philosophical, but is we are at risk of losing the notion of the concept of truth.


Major discussion point

Current Risks and Harms to Children from Algorithmic Systems


Topics

Human rights | Sociocultural


Agreed with

– Leanda Barrington-Leach
– Karianne Tung
– Shivanee Thapa

Agreed on

Current algorithmic systems pose significant risks to children’s mental health and development


Neuroplasticity impacts from screen time affecting children’s brain development need more research

Explanation

Davin highlights the need for more research into how digital technology use affects children’s developing brains, noting that the long-term neurological impacts are not yet fully understood but may be significant.


Evidence

Maybe one issue that Leanda maybe didn’t touch on but will be interesting and important for us as a society to really dig deep into is the impact of neuroplasticity. And so what is going to be the impact of that screen time and the fast-paced connection on children’s brains and their abilities in different ways. And we don’t really fully understand that yet, and we believe it should be a priority in terms of research and longitudinal studies


Major discussion point

Current Risks and Harms to Children from Algorithmic Systems


Topics

Human rights | Sociocultural


UNICEF’s digital child rights impact assessment provides participatory tools for companies to understand impacts

Explanation

Davin describes UNICEF’s practical tool that helps companies assess how their products affect children’s rights through a participatory process that includes children’s voices and perspectives.


Evidence

UNICEF developed a digital child rights impact assessment which is a tool, one of the tools to understand what will happen, what can happen, and that is a participatory process, so we bring in different elements of societies, children themselves, adolescents, young people, to also speak to what they feel are right design approaches, wrong design approaches


Major discussion point

Technical Standards and Innovation Approaches


Topics

Human rights | Legal and regulatory


Children’s voices must lead to visible action and change, not just consultation exercises

Explanation

Davin emphasizes that meaningful participation requires that children’s input results in concrete, observable changes to platforms and policies, rather than being merely consultative processes with no follow-through.


Evidence

many of the children we speak to say, you ask us for views and then nothing happens, and so that’s quite often where we feel we need to really get to that stage where, once you engage children, it has to be meaningful; if you want it to be meaningful, it means action needs to be taken and needs to be visibly taken, so visibly monitored; transparency is quite important.


Major discussion point

Children’s Rights and Meaningful Participation


Topics

Human rights | Legal and regulatory


Agreed with

– Leanda Barrington-Leach
– Christine Grahn
– Emily Yu
– Shivanee Thapa

Agreed on

Children need meaningful participation in digital governance decisions


Scotland’s Children’s Parliament working with Alan Turing Institute on AI recommendations shows innovative participation models

Explanation

Davin highlights an innovative model where children have their own parliamentary body that works with technical experts to provide policy recommendations on AI, demonstrating how children can be meaningfully included in complex technical policy discussions.


Evidence

there is an interesting initiative in Scotland where they brought the Children’s Parliament from Scotland to act as a mirror to the adult Parliament on AI specifically, with the Alan Turing Institute and the Scottish AI Coalition; they are essentially going into various use cases of AI and bringing back to Parliament issues and recommendations on legislative pathways to tackle that.


Major discussion point

Children’s Rights and Meaningful Participation


Topics

Human rights | Legal and regulatory


Transparency in algorithmic operations and allowing society to ‘look under the hood’ enables better monitoring

Explanation

Davin argues that meaningful oversight of algorithmic systems requires transparency about how they function, allowing society to monitor whether promised changes are actually implemented and effective.


Evidence

Another element that we see really powerful is when companies agree to kind of allow anybody to look under the hood, in other words to understand this is how the algorithm functions and once we are going to do this, this is what will change and you can then monitor that together as a society again to try to understand whether there is a sense of progress on issues we together identified.


Major discussion point

Technical Standards and Innovation Approaches


Topics

Infrastructure | Human rights


International agencies like UNICEF should focus on incentivized regulatory platforms and understanding economic costs of inaction

Explanation

Davin describes UNICEF’s role in creating regulatory frameworks that provide positive incentives for companies to protect children while also building evidence about the economic costs of failing to address these issues.


Evidence

part of what we are really trying to do is really talk about what’s an incentivized regulatory platform that enables these companies to feel if I do it well, it’s good for the bottom line as well. And if I don’t do it well, there are consequences and the EU mentioned that. I think part of what we are trying to build a little bit deeper into where we don’t have enough data is understanding also what is the cost of inaction. So we are very clear about what’s the cost of inaction on smoking, on alcohol, on drugs, on sugar, not so much on technology.


Major discussion point

Multi-Stakeholder Collaboration and Shared Responsibility


Topics

Economic | Human rights


Agreed with

– Karianne Tung
– Salima Bah
– Thibaut Kleiner
– Shivanee Thapa

Agreed on

Multi-stakeholder collaboration is necessary as no single actor can solve child protection alone


Disagreed with

– Leanda Barrington-Leach
– Christine Grahn

Disagreed on

Platform business model compatibility with child safety


S

Shivanee Thapa

Speech speed

123 words per minute

Speech length

2265 words

Speech time

1099 seconds

Algorithms are active architects of children’s digital experiences, shaping what they see, how long they stay on screens, and how they feel

Explanation

Thapa argues that algorithms are not passive tools but actively construct children’s digital reality by determining content exposure, engagement duration, and emotional responses. She emphasizes that this active role makes algorithms responsible for shaping childhood experiences in fundamental ways.


Evidence

We know that quite well, right? We are in fact very, very active architects of these children’s digital experiences who are shaping what, not only children, but what we see, how long we stay on the screens and increasingly how we feel


Major discussion point

Current Risks and Harms to Children from Algorithmic Systems


Topics

Human rights | Sociocultural


Agreed with

– Leanda Barrington-Leach
– Karianne Tung
– Thomas Davin

Agreed on

Current algorithmic systems pose significant risks to children’s mental health and development


What’s at stake is not just screen time but childhood itself

Explanation

Thapa frames the discussion as fundamentally about protecting the nature and quality of childhood rather than simply managing technology use. She argues that algorithmic systems are threatening core aspects of child development and the childhood experience.


Evidence

As a practicing journalist, I’m very much committed to, you know, covering issues of public interest, and as a mother of a teen, my 14-year-old Vivaan is seated somewhere amongst you, and both professionally and personally, I see so clearly that what’s at stake at the moment is not just screen time. It is childhood itself.


Major discussion point

Current Risks and Harms to Children from Algorithmic Systems


Topics

Human rights | Sociocultural


There is a consensus that opaque algorithmic systems are influencing children’s mental health, social well-being, and development in ways that demand coordinated attention

Explanation

Thapa synthesizes the panel discussion to highlight the shared concern across different sectors about the lack of transparency in algorithmic systems and their broad impact on child development. She emphasizes that this consensus calls for coordinated action rather than isolated efforts.


Evidence

So what we are hearing across sectors represented in this panel is of course a lot of sense of optimism and commitment in your actions and in your thoughts, and also a very prominent shared concern that opaque algorithmic systems are influencing certainly children’s mental health, social well-being, development, and you know so many aspects in ways that certainly demand more coordinated attention and action.


Major discussion point

Multi-Stakeholder Collaboration and Shared Responsibility


Topics

Human rights | Legal and regulatory


Agreed with

– Karianne Tung
– Salima Bah
– Thomas Davin
– Thibaut Kleiner

Agreed on

Multi-stakeholder collaboration is necessary as no single actor can solve child protection alone


Child-centric design must be a foundational principle, not an afterthought, with meaningful involvement of young people in shaping systems

Explanation

Thapa advocates for a fundamental shift in how digital systems are designed, placing children’s well-being at the center from the beginning rather than trying to retrofit protections later. She emphasizes that this requires genuine participation from young people themselves in the design process.


Evidence

So if we were to re-engineer or if we were to redesign the social media environment with children’s well-being at its core, centrally placed and not as an afterthought but as a foundational principle, what would that look like? More importantly, how can we ensure that young people are not just consulted but meaningfully involved in shaping systems that govern their digital lives?


Major discussion point

Child-Centric Design Solutions and Best Practices


Topics

Human rights | Sociocultural


Agreed with

– Leanda Barrington-Leach
– Christine Grahn
– Emily Yu
– Thomas Davin

Agreed on

Children need meaningful participation in digital governance decisions


Protecting children in the digital age is a moral imperative, not just a technical challenge

Explanation

Thapa frames child protection online as fundamentally an ethical issue that goes beyond technical solutions. She argues that society has a moral obligation to protect children in digital spaces, making this a question of values and responsibility rather than just technological capability.


Evidence

And what we’ve heard today from this very distinguished panel is very, very clear that protecting children in this digital age and in the age of algorithms is not just a technical challenge for us. It is certainly a moral imperative, right?


Major discussion point

Multi-Stakeholder Collaboration and Shared Responsibility


Topics

Human rights | Legal and regulatory


The future of digital governance must be built with and not just for young citizens

Explanation

Thapa emphasizes that effective digital governance requires children and young people to be active participants in creating the systems that affect them, rather than passive recipients of adult-designed protections. This represents a shift from paternalistic to participatory approaches to child protection.


Evidence

And the future of digital governance must, as I could reap from the essence of this distinguished panel, be built with and not just for our young citizens.


Major discussion point

Children’s Rights and Meaningful Participation


Topics

Human rights | Legal and regulatory


Agreements

Agreement points

Children need meaningful participation in digital governance decisions

Speakers

– Leanda Barrington-Leach
– Christine Grahn
– Emily Yu
– Thomas Davin
– Shivanee Thapa

Arguments

Children must be included at policy tables rather than being completely left out of decisions affecting them


TikTok’s global youth council with representatives from 15 countries provides direct input to senior leadership


Roblox’s teen council gives teenagers worldwide opportunity to provide feedback on policies and platform direction


Children’s voices must lead to visible action and change, not just consultation exercises


Child-centric design must be a foundational principle, not an afterthought, with meaningful involvement of young people in shaping systems


Summary

All speakers agree that children must be actively involved in decisions about digital systems that affect them, not just consulted. This requires formal mechanisms like youth councils and must result in visible changes to policies and platforms.


Topics

Human rights | Legal and regulatory


Safety by design and default is essential for child protection

Speakers

– Leanda Barrington-Leach
– Karianne Tung
– Christine Grahn
– Emily Yu

Arguments

Safety must be designed by default with privacy protections, turning harmful features off for children


Age verification, appropriate design, and banning behavioral advertising for children are essential principles


TikTok implements safety by design with private accounts by default for under-16s, no direct messages until 16, and screen time limits


Trust by Design program integrates fundamental children’s rights into product requirements from the start


Summary

There is strong consensus that child safety must be built into digital systems from the ground up, with protective features enabled by default rather than requiring children or parents to actively enable them.


Topics

Human rights | Legal and regulatory


Multi-stakeholder collaboration is necessary as no single actor can solve child protection alone

Speakers

– Karianne Tung
– Salima Bah
– Thomas Davin
– Thibaut Kleiner
– Shivanee Thapa

Arguments

International organizations, governments, and tech companies must coordinate better as no single actor can solve the problem alone


Cross-government collaboration needed between ministries handling digital economy, education, gender/children affairs, and information


International agencies like UNICEF should focus on incentivized regulatory platforms and understanding economic costs of inaction


Need for robust age verification mechanisms and technical standards providing practical guidelines for innovators


There is a consensus that opaque algorithmic systems are influencing children’s mental health, social well-being, and development in ways that demand coordinated attention


Summary

All speakers acknowledge that protecting children online requires coordinated efforts across governments, international organizations, tech companies, and civil society, with each stakeholder playing their specific role.


Topics

Legal and regulatory | Human rights


Current algorithmic systems pose significant risks to children’s mental health and development

Speakers

– Leanda Barrington-Leach
– Karianne Tung
– Thomas Davin
– Shivanee Thapa

Arguments

Children face addiction, loss of control, sleep deprivation, and inability to make connections due to algorithmic design


Algorithms expose children to harmful content, bias, manipulation and cause serious mental health and body image issues


Mental health issues, addiction patterns, social isolation, and loss of concept of truth are major societal concerns


Algorithms are active architects of children’s digital experiences, shaping what they see, how long they stay on screens, and how they feel


Summary

There is unanimous agreement that current algorithmic systems are causing serious harm to children’s mental health, social development, and overall well-being through addictive design and exposure to harmful content.


Topics

Human rights | Sociocultural


Regulation works when properly designed and enforced

Speakers

– Leanda Barrington-Leach
– Thibaut Kleiner
– Karianne Tung

Arguments

Age-appropriate design code principles from UK to Indonesia provide enforceable regulatory requirements


Regulation works when properly designed and enforced, as shown by concrete cases like TikTok Lite withdrawal


Norway is implementing 15-year age limit for social media platforms and working on better age verification systems


Summary

Speakers agree that well-designed regulation with proper enforcement mechanisms can effectively protect children online, as demonstrated by successful regulatory interventions.


Topics

Legal and regulatory | Human rights


Similar viewpoints

Both speakers identify the fundamental conflict between profit-driven platform design and child welfare, advocating for regulatory measures that prioritize children’s rights over commercial interests.

Speakers

– Leanda Barrington-Leach
– Karianne Tung

Arguments

Current platforms prioritize revenue through maximizing time spent, reach, and activity rather than child welfare


Age verification, appropriate design, and banning behavioral advertising for children are essential principles


Topics

Economic | Human rights


Both platform representatives describe similar formal mechanisms for including young people’s voices in platform governance through structured youth councils that provide direct input to leadership.

Speakers

– Christine Grahn
– Emily Yu

Arguments

TikTok’s global youth council with representatives from 15 countries provides direct input to senior leadership


Roblox’s teen council gives teenagers worldwide opportunity to provide feedback on policies and platform direction


Topics

Human rights | Sociocultural


Both speakers recognize the positive potential of digital platforms while emphasizing the need for frameworks that balance benefits with protection, particularly for developing regions.

Speakers

– Salima Bah
– Thomas Davin

Arguments

Digital platforms serve as public goods bridging digital divides and exposing young people to opportunities beyond immediate realities


International agencies like UNICEF should focus on incentivized regulatory platforms and understanding economic costs of inaction


Topics

Development | Economic


Both speakers challenge common assumptions – that companies act in good faith and that children are ‘digital natives’ – arguing instead that children are vulnerable and companies prioritize profits over protection.

Speakers

– Leanda Barrington-Leach
– Thibaut Kleiner

Arguments

Tech companies are aware of harm they cause children but choose profits over protection


Children lack deep understanding of underlying technology despite being able to use apps


Topics

Human rights | Sociocultural


Unexpected consensus

Platform representatives acknowledging need for stronger regulation

Speakers

– Christine Grahn
– Emily Yu

Arguments

Public-private partnerships can effectively address root causes, as shown by TikTok’s collaboration with Swedish safety NGO BRIS


Trust by Design program integrates fundamental children’s rights into product requirements from the start


Explanation

It’s unexpected that platform representatives would not only accept but actively support stronger regulatory frameworks and acknowledge the need for fundamental changes to their business models to protect children.


Topics

Legal and regulatory | Human rights


Agreement that parental controls and digital literacy are insufficient as primary solutions

Speakers

– Leanda Barrington-Leach
– Karianne Tung
– Thomas Davin

Arguments

Safety must be designed by default with privacy protections, turning harmful features off for children


Age verification, appropriate design, and banning behavioral advertising for children are essential principles


Children’s voices must lead to visible action and change, not just consultation exercises


Explanation

There’s unexpected consensus that parental controls and digital literacy are insufficient solutions, with speakers agreeing that the burden should not be placed on parents or children but on platforms and regulators.


Topics

Human rights | Legal and regulatory


Global South perspectives being central to child protection discussions

Speakers

– Salima Bah
– Thomas Davin
– Leanda Barrington-Leach

Arguments

Cultural erasure through algorithms trained on datasets that don’t reflect African diversity and values


International agencies like UNICEF should focus on incentivized regulatory platforms and understanding economic costs of inaction


Digital world is 100% human-engineered and can be optimized for good just as easily as for bad


Explanation

Unexpected consensus that Global South perspectives are not just relevant but essential to global child protection efforts, with recognition that cultural diversity and representation are fundamental rights issues.


Topics

Sociocultural | Development


Overall assessment

Summary

The panel demonstrated remarkably high consensus across diverse stakeholders on core principles: children’s meaningful participation in digital governance, safety by design as default, necessity of multi-stakeholder collaboration, serious risks from current algorithmic systems, and effectiveness of proper regulation. Even platform representatives acknowledged need for fundamental changes.


Consensus level

Very high consensus with significant implications – this level of agreement across government, civil society, international organizations, and industry suggests strong momentum for coordinated global action on child protection online. The consensus indicates readiness to move from identifying problems to implementing solutions, with shared understanding of both the urgency and the approaches needed.


Differences

Different viewpoints

Effectiveness of parental controls and digital literacy as solutions

Speakers

– Leanda Barrington-Leach
– Christine Grahn
– Emily Yu

Arguments

Don’t shut children out, and please don’t put the burden back on parents via parental controls, for example; they’re not working, we know they’re not working. Nor onto children; we heard they are not digital natives, and digital literacy is not a silver bullet. So age verification, parental controls, controlling content and digital literacy, these are not the solution.


We’ve launched last November a set of pretty robust parental controls that include screen time limitations that parents can set. We’ve also introduced content ratings or content labeling within our systems so that parents have awareness as to what an experience holds, and they can obviously permit or not permit their child from entering that experience.


Roblox focuses on discoverability rather than personalized content limitation, with robust parental controls and content labeling


Summary

Barrington-Leach argues that parental controls are ineffective and place unfair burden on parents, while platform representatives (Grahn and Yu) present parental controls as important safety tools they have implemented. This represents a fundamental disagreement about whether parental controls are part of the solution or a problematic shifting of responsibility.


Topics

Human rights | Legal and regulatory


Platform business model compatibility with child safety

Speakers

– Leanda Barrington-Leach
– Christine Grahn
– Thomas Davin

Arguments

Current platforms prioritize revenue through maximizing time spent, reach, and activity rather than child welfare


Tech companies are aware of harm they cause children but choose profits over protection


TikTok is a place that people come to because it’s fun, because you can figure out what book to read over summer, where you can plan your next trip or where you can discover new artists. And also a place where our community come to express themselves and can be creative. And this really wouldn’t happen if it weren’t also a safe and inclusive place.


International agencies like UNICEF should focus on incentivized regulatory platforms and understanding economic costs of inaction


Summary

Barrington-Leach argues that current platform business models are fundamentally incompatible with child safety, while Grahn suggests that safety is actually essential for TikTok’s business model. Davin takes a middle position, suggesting that regulatory incentives can align business interests with child protection.


Topics

Economic | Human rights


Unexpected differences

Cultural representation in algorithmic systems

Speakers

– Salima Bah
– Other speakers

Arguments

Cultural erasure through algorithms trained on datasets that don’t reflect African diversity and values


African children need to see their own cultures and societies reflected on platforms, not just exposure to other cultures


Explanation

Minister Bah raised a unique concern about cultural erasure that other speakers did not address, despite the panel’s focus on child protection. This represents an unexpected dimension of algorithmic harm that goes beyond the typical safety concerns discussed by other panelists, highlighting how Global South perspectives can reveal overlooked aspects of digital harm.


Topics

Sociocultural | Development


Neuroplasticity and brain development impacts

Speakers

– Thomas Davin
– Other speakers

Arguments

Neuroplasticity impacts from screen time affecting children’s brain development need more research


Explanation

Davin was the only speaker to raise concerns about the neurological impacts of digital technology on developing brains, which was unexpected given that this represents a fundamental biological dimension of child safety that other speakers did not address, despite their focus on child protection.


Topics

Human rights | Sociocultural


Overall assessment

Summary

The panel showed remarkable consensus on the need for child protection online and the importance of regulation, but disagreed on specific mechanisms and approaches. Key disagreements centered on the effectiveness of parental controls, the compatibility of current business models with child safety, and what constitutes meaningful child participation.


Disagreement level

Low to moderate disagreement level. While speakers agreed on fundamental goals, they differed on implementation strategies and the effectiveness of current approaches. The disagreements were constructive rather than adversarial, with speakers building on each other’s points while advocating for their preferred solutions. This suggests that while there is broad consensus on the problem, there remains significant work to be done in developing unified approaches to solutions.




Takeaways

Key takeaways

Protecting children online requires urgent, coordinated action across governments, tech companies, and international organizations as current algorithmic systems cause documented harm including addiction, mental health issues, and exposure to harmful content


Children must be meaningfully included in policy discussions and platform design decisions, not just consulted – their voices should lead to visible action and change


Safety by design and privacy by default are essential principles – platforms should turn off harmful features for children rather than relying on parental controls or digital literacy as solutions


Regulation works when properly designed and enforced, as demonstrated by the EU’s Digital Services Act and similar frameworks, but requires political will and resources


The current business model prioritizing engagement and revenue over child welfare must change – children cannot be treated as commodities for profit


Global South perspectives must be included in solutions to prevent cultural erasure and ensure platforms reflect diverse societies and values


This is fundamentally a public health issue requiring the same systematic approach used for tobacco, alcohol, and other harmful substances


Technology has immense positive potential for children’s learning and development, but current implementations prioritize commercial interests over child welfare


Resolutions and action items

Norway to continue implementing Digital Services Act and 15-year age limit for social media platforms


EU to publish child protection guidelines by summer 2025, following the recent closure of its consultation


EU to introduce robust age verification app in coming months


Sierra Leone to finalize online safety legislation specifically focused on children


Tech companies to continue developing youth councils and meaningful consultation mechanisms


International organizations to better coordinate efforts and share best practices globally


Platforms to implement transparency measures allowing society to examine algorithmic operations


Investment needed in civil society capacity building to support oversight and advocacy


Continued research required on neuroplasticity impacts and long-term effects of screen time on children’s brain development


Unresolved issues

How to effectively measure and address the economic costs of inaction on children’s digital safety


Developing truly effective age verification systems that balance privacy and protection


Creating sustainable business models for platforms that prioritize child welfare over engagement metrics


Ensuring meaningful participation of children, especially younger children, in policy processes


Addressing the technical challenge of content moderation at scale while respecting cultural diversity


Balancing innovation and safety without stifling technological advancement


Establishing global standards while respecting national sovereignty and cultural differences


Determining appropriate enforcement mechanisms and penalties for non-compliance


Addressing the digital divide to ensure all children benefit from protective measures


Suggested compromises

Multi-stakeholder approach recognizing that no single actor can solve the problem alone – requiring collaboration between governments, companies, and civil society


Co-regulatory frameworks that combine hard regulation with industry self-regulation and technical standards


Graduated approach to platform features based on age, with more protections for younger users and gradual introduction of features as children mature


Public-private partnerships for addressing specific harms, as demonstrated by TikTok’s collaboration with Swedish safety organizations


Transparency requirements that allow oversight without revealing proprietary algorithms


Global coordination on principles while allowing regional adaptation for cultural contexts


Incentive-based regulatory approaches that make child protection profitable for companies


Balancing platform responsibility with parental involvement and child agency


Thought provoking comments

A child not long ago asked me, why won’t adults stand up for children? You watch everything we do online, you nag us to get off our devices, even though you stay firmly glued to yours, and now you just want to outright ban us. When are you going to stop making out that we are the problem instead of the system? Why don’t you stand up for us?

Speaker

Leanda Barrington-Leach


Reason

This comment powerfully reframes the entire discussion by presenting the child’s perspective directly, challenging the adult-centric approach to digital safety. It exposes the hypocrisy in adult behavior and shifts blame from children to systemic issues, making the conversation more authentic and urgent.


Impact

This opening comment set the moral foundation for the entire discussion, with multiple panelists later referencing the need to ‘stand up for children’ and put children at the center of solutions rather than treating them as the problem. It established children’s agency as a key theme throughout the session.


Children can go from a simple search for slime to porn in just a single click, or from trampolining to pro-anorexia in just three clicks, and be nudged to self-harm in 15 clicks. It is clear that the problem is a feature, not a bug, of the system.

Speaker

Leanda Barrington-Leach


Reason

This stark, concrete example transforms abstract concerns about algorithmic harm into visceral, understandable terms. The phrase ‘feature, not a bug’ is particularly powerful as it suggests intentional design rather than accidental harm, challenging the tech industry’s narrative of unintended consequences.


Impact

This comment shifted the discussion from theoretical risks to concrete evidence of harm, forcing other panelists to respond with specific safety measures and design changes. It elevated the urgency of the conversation and made denial of the problem impossible.


One of the most concerning things for us as well is the potential for cultural erasure with algorithm recommendations because we understand that these algorithms are trained on data sets that potentially don’t reflect our diversity or the diversity of our societies or our realities.

Speaker

Salima Bah


Reason

This comment introduced a crucial dimension often overlooked in child safety discussions – the cultural and identity implications of algorithmic curation. It broadened the conversation beyond individual harm to collective cultural impact, highlighting how AI systems can perpetuate global inequalities.


Impact

This perspective added a global south viewpoint that enriched the discussion, prompting other speakers to consider diversity and representation in their responses. It demonstrated that child safety isn’t just about preventing harm but also about preserving cultural identity and ensuring equitable representation.


I think we should stop using the term digital natives when we speak about children, because sometimes, you know, you get this idea that you can leave the children with the technology and they are very savvy and they can, you know, get their way out, and we don’t understand the technology so well, whereas they do. Actually, the studies we conduct in the EU show that there is a very superficial understanding of the technology among children.

Speaker

Thibaut Kleiner


Reason

This comment challenged a fundamental assumption that has shaped policy and parental attitudes for years. By debunking the ‘digital native’ myth, it reframed children as vulnerable rather than naturally tech-savvy, requiring protection rather than independence.


Impact

This insight shifted the conversation away from solutions that place responsibility on children (like digital literacy as a silver bullet) toward systemic protections. It influenced subsequent discussions about the need for design-level changes rather than education-focused solutions.


We are at risk of losing the notion of the concept of truth. As those algorithms bring those children into more and more of the things they believe to be true, they are more and more certain of that truth and of holding that truth, and they are more and more reluctant to actually connect or open up to others who may say, well, that is not my truth.

Speaker

Thomas Davin


Reason

This comment elevated the discussion to a philosophical level, identifying algorithmic curation as a threat to democratic discourse and social cohesion. It connected individual child safety to broader societal implications, showing how personal harms scale to civilizational challenges.


Impact

This observation deepened the conversation by connecting immediate safety concerns to long-term democratic and social implications. It influenced later discussions about the need for diverse content exposure and the societal costs of algorithmic polarization.


Children need to be at the policy table. There are no children in this room today and generally they are completely left out… Please don’t put the burden back on parents via parental controls, for example; they’re not working, we know they’re not working. Nor onto children; we heard they are not digital natives, and digital literacy is not a silver bullet.

Speaker

Leanda Barrington-Leach


Reason

This comment directly challenged the composition and approach of the very panel discussing children’s issues, pointing out the fundamental contradiction of making decisions about children without including them. It also systematically dismantled common ‘solutions’ that shift responsibility away from platforms.


Impact

This critique forced panelists to acknowledge the limitations of current approaches and commit to more meaningful youth engagement. Several speakers subsequently mentioned their youth councils and engagement efforts, showing how this challenge influenced their responses.


We need to treat this as a really as a public health issue. This is an area that we don’t fully understand, this is an area that will have societal consequences and very likely significant economical costs if we don’t manage it appropriately.

Speaker

Thomas Davin


Reason

This reframing of digital harm as a public health issue rather than a technology problem was transformative. It suggested the need for population-level interventions, regulatory approaches similar to tobacco or alcohol, and long-term research on societal impacts.


Impact

This framing influenced multiple subsequent speakers, including Minister Tung who referenced the public health approach in Norway’s white paper. It shifted the discussion from individual solutions to systemic, society-wide interventions and elevated the urgency of coordinated action.


Overall assessment

These key comments fundamentally shaped the discussion by challenging conventional wisdom and reframing the problem. Barrington-Leach’s opening child’s voice set a moral imperative that echoed throughout, while her concrete examples of algorithmic harm made the abstract tangible. Bah’s cultural erasure perspective globalized the conversation beyond Western-centric views. Kleiner’s debunking of ‘digital natives’ and Davin’s public health framing shifted focus from individual responsibility to systemic solutions. Together, these insights elevated the discussion from technical problem-solving to a moral and societal reckoning, forcing all participants to grapple with deeper questions about power, responsibility, and the kind of digital future we want to create for children. The comments created a progression from problem identification to systemic analysis to calls for fundamental change in approach and governance.


Follow-up questions

What is the long-term impact of neuroplasticity changes from screen time on children’s brain development?

Speaker

Thomas Davin


Explanation

UNICEF recognizes this as a critical gap in understanding how fast-paced digital interactions affect children’s cognitive development, attention spans, and learning abilities over time


What are the economic costs of inaction on technology-related harms to children?

Speaker

Thomas Davin


Explanation

While costs of inaction are well-established for smoking, alcohol, and drugs, the societal and GDP costs of technology-related harms to children need quantification to inform policy decisions


How can we develop digital products and services specifically designed for children as a viable business model?

Speaker

Thibaut Kleiner


Explanation

Current platforms are designed for general audiences and then modified for children; there’s a need for services built from the ground up with children as the primary audience


How can algorithm recommendations be designed to reflect cultural diversity and prevent cultural erasure in different regions?

Speaker

Salima Bah


Explanation

Algorithms trained on datasets that don’t reflect regional diversity may lead to cultural erasure, particularly concerning for African children who may not see their own cultures represented


What are the most effective mechanisms for meaningful child participation in technology design and policy-making?

Speaker

Leanda Barrington-Leach


Explanation

While there’s consensus on including children’s voices, specific frameworks and processes for meaningful participation beyond tokenistic consultation need development


How can longitudinal studies be conducted to better understand the relationship between algorithmic exposure and mental health outcomes in children?

Speaker

Thomas Davin


Explanation

Current research shows correlations but more comprehensive longitudinal studies are needed to establish causation and understand long-term impacts


What are the most effective approaches for cross-ministry collaboration within governments on child digital safety?

Speaker

Salima Bah


Explanation

Child digital safety spans multiple government departments (digital economy, education, gender/children affairs, information) requiring coordinated approaches


How can robust age verification mechanisms be implemented globally while protecting privacy?

Speaker

Thibaut Kleiner


Explanation

Effective child protection requires knowing when users are minors, but current age verification methods are inadequate and privacy concerns need addressing


Disclaimer: This is not an official session record. DiploAI generates these resources from audiovisual recordings, and they are presented as-is, including potential errors. Due to logistical challenges, such as discrepancies in audio/video or transcripts, names may be misspelled. We strive for accuracy to the best of our ability.