WS #25 Multistakeholder cooperation for online child protection

17 Dec 2024 08:15h - 09:45h


Session at a Glance

Summary

This discussion focused on protecting children in the digital world, addressing the evolving threat landscape and strategies to mitigate risks. Experts highlighted the rapid pace of technological change and the increasing sophistication of online threats targeting children, including cyberbullying, grooming, and exposure to inappropriate content. They emphasized the need for a multi-stakeholder approach involving governments, tech companies, educators, parents, and children themselves.


Key challenges identified included the online disinhibition effect, the ease of creating deepfakes, and the collection of children’s biometric data through gaming. Participants stressed the importance of age-appropriate digital literacy education and the development of practical cybersecurity skills for children. They also discussed the role of parental controls and the need for open communication between parents and children about online safety.


The discussion touched on regulatory approaches, with some advocating for stricter content moderation and others cautioning against blanket bans on technology use. Experts emphasized the importance of international cooperation in addressing cross-border cyber threats. They also highlighted the need for ongoing research into emerging risks and the development of evidence-based interventions.


Participants agreed that while the threat landscape is likely to worsen, increased awareness and improved digital skills could help children navigate online spaces more securely. The discussion concluded with a call for continued dialogue and collaboration among stakeholders to ensure children’s rights and safety in the digital environment.


Key points

Major discussion points:


– The increasing threats and risks to children in the digital world, including cyberbullying, grooming, and exposure to harmful content


– The need for a multi-stakeholder approach involving government, industry, civil society, and academia to protect children online


– The importance of education and awareness for children, parents, and teachers about online safety


– The challenges of regulating and enforcing child protection measures in the fast-evolving digital landscape


– The role of technology companies in developing tools and solutions to enhance child safety online


The overall purpose of the discussion was to examine the current and future threats to children’s safety in the digital world and explore potential solutions and best practices for protecting children online through collaboration between different stakeholders.


The tone of the discussion was serious and concerned, reflecting the gravity of the issues being discussed. However, it was also constructive and solution-oriented, with speakers offering practical suggestions and examples of initiatives to address the challenges. The tone became more urgent and action-oriented towards the end, with participants emphasizing the need for immediate and coordinated efforts to protect children online.


Speakers

– Gladys O. Yiadom: Moderator


– Melodena Stephens: Professor of innovation and technology governance at Mohammed bin Rashid School of Government in Dubai, UAE


– Elizaveta Belyakova: Chairperson of the Alliance for the Protection of Children in the Digital Environment


– Elmirti Arousafi: Cyber security expert and board member of the Moroccan Centre for Polytechnic Research and Innovation


– Heng Lee: Senior government affairs and public policy manager at Kaspersky


– Andre Gorobets: Representative from Ministry of Education (country not specified)


– Margarita Yurova: Translator for Andre Gorobets


– Anne Mickler: Online moderator


Additional speakers:


– Ethan: Youth ambassador of the One Power Foundation from Hong Kong


– Grace: Representative from Pan-African Youth Ambassadors for Internet Governance, from Uganda


– Jutta Croll: Child rights advocate from the German Digital Opportunities Foundation


Full session report

Protecting Children in the Digital World: A Multi-Stakeholder Approach


This discussion focused on the critical issue of protecting children in the rapidly evolving digital landscape. Experts from various fields, including academia, government, and the technology sector, convened to address the growing threats to children’s safety online and explore potential solutions.


Survey Insights


The session began and ended with a survey of participants’ views on the threat landscape for children in the digital world. Notably, there was a shift in perception by the end of the discussion, with more participants recognizing the severity of online threats to children.


Key Threats and Challenges


The participants highlighted the increasing sophistication of online threats targeting children. Melodena Stephens, a professor of innovation and technology governance, emphasized the alarming rise of deepfakes and the lack of alignment on standards for age-appropriate content. Cyberbullying and emotional harm from online interactions were also identified as significant concerns.


Elmirti Arousafi, a cybersecurity expert, pointed out that the rapid pace of technological change is outpacing regulatory responses, creating a challenging environment for protecting children. Andre Gorobets, representing a Ministry of Education, stressed the transborder nature of online threats, further complicating efforts to safeguard children.


Multi-Stakeholder Collaboration


A recurring theme throughout the discussion was the need for a multi-stakeholder approach to address these complex issues. Melodena Stephens highlighted the crucial role of political will from governments in prioritizing child safety. She also stressed the importance of industry alignment on values and ethics, as well as the role of researchers in studying both the benefits and harms of technology.


Education and Awareness


The experts agreed on the critical importance of education and awareness in protecting children online. Elmirti Arousafi advocated for gamified, interactive curricula to teach children about online safety, making the learning process more engaging and effective. He specifically mentioned the Espace Maroc Cyber Confiance program as an example of such initiatives. Heng Lee, from Kaspersky, shared examples of educational resources such as cybersecurity alphabet books for children.


The discussion also touched on the need for practical guidance for parents on protecting their children online. Elmirti Arousafi emphasized the importance of equipping parents with the knowledge and tools to navigate the digital landscape alongside their children, stressing the importance of building trust between parents and children for effective online safety.


Regulatory Approaches and Challenges


The participants discussed various regulatory approaches to enhancing child safety online, while acknowledging the challenges in implementing and enforcing such measures. Heng Lee highlighted the difficulty of enforcing age restrictions for social media use and suggested the need for dedicated regulatory bodies focused on child online protection.


There was some disagreement on the approach to regulation. While Melodena Stephens emphasized the need for strong political will from governments, Heng Lee suggested a more balanced approach that considers both consumer protection and innovation in the tech industry.


Jutta Croll, a child rights advocate, introduced the UN Convention on the Rights of the Child as a basis for government action on child online protection. She specifically mentioned General Comment No. 25, adopted in 2021, which obliges states that have ratified the convention to implement children’s rights in the digital environment.


Role of Technology Companies


The discussion explored the role of technology companies in developing tools and solutions to enhance child safety online. Heng Lee presented Kaspersky’s Safe Kids as an example of parental control software designed to protect children in the digital space. He detailed features such as GPS tracking, screen time management, and content filtering, emphasizing the app’s ability to help parents guide their children’s online activities.


Youth Involvement and Empowerment


An unexpected point of consensus emerged around the importance of involving young people in developing solutions for online child protection. The potential for youth-led awareness campaigns on online risks was discussed, recognizing children as active participants rather than just passive recipients of protection measures. The Pan-African Youth Ambassadors for Internet Governance expressed interest in raising awareness about online risks among their peers.


A youth ambassador from Hong Kong raised the controversial topic of banning children from using mobile phones or the internet, sparking a discussion about the balance between protection and digital literacy.


Alliance for the Protection of Children in the Digital Environment


Elizaveta Belyakova, Chairperson of the Alliance for the Protection of Children in the Digital Environment, presented the organization’s activities. The Alliance focuses on creating a safer digital environment for children through various initiatives and collaborations.


Thought-Provoking Insights


Several thought-provoking comments deepened the discussion. Heng Lee illustrated the long-term risks of seemingly harmless online interactions by describing how a chatbot could casually collect sensitive information from a child. Melodena Stephens highlighted the misconception that home internet use is always safe, framing it as a literacy issue. She also shared an example of a seven-year-old boy’s approach to identifying strangers in online games, demonstrating children’s potential for developing safety strategies.


An audience member raised the issue of content creation challenges and the need for skilled creators to produce engaging, age-appropriate content for children.


In conclusion, the discussion highlighted the complex and evolving nature of online threats to children, emphasizing the need for a comprehensive, collaborative approach involving governments, industry, civil society, parents, and children themselves. While challenges remain, the experts agreed that increased awareness, improved digital skills, and coordinated efforts could help create a safer online environment for children.


Session Transcript

Gladys O. Yiadom: Can the online moderator share her screen? Full screen, please. Thank you. So the first question is this: How will the threat landscape for children in the digital world develop over the next 3-4 years? Just as a reminder, on-site participants can scan the QR code to participate in the survey. First option: it will increase significantly and lead to increased abuse and cybercrime; the threat situation is getting worse. It will increase significantly, but at the same time, children’s awareness and knowledge of cybersecurity issues and protection against threats in the digital world will also increase. The threat situation will remain more or less the same. It will increase significantly, but better knowledge and well-developed defense skills, as well as better developed digital skills, will ensure that children can operate more securely in the digital world; in this respect, the threat situation will improve. Or, I cannot give an estimation. So I’ll just leave you a couple of minutes to take the survey, so that we can view the results all together. And please, do we have the results? Okay, the results are coming. Just a few seconds, please. I just received them, give me one second to share the graph. Thank you, okay, let me switch the screen, then I’ll be able to share them. There we go, thank you. So we now have the results: 17% of you have responded that it will increase significantly and lead to increased abuse and cybercrime; 33% of you have indicated that it will increase significantly but at the same time, children’s awareness and knowledge (my mic is not clear apparently, can you please fix it, thank you), that it will increase significantly but at the same time, children’s awareness and knowledge of cybersecurity will increase.
So, 32% of you have answered that it will increase significantly but better knowledge and well-developed defense skills will ensure that children can operate more securely in the digital world, and 17% of you cannot give an estimation. So we will run this survey again at the end of the session and see if the results have changed. Thank you very much, Anne. So let’s now start the conversation with our speakers. I am pleased to have Melodena with us today. A brief introduction: Melodena is a professor of innovation and technology governance at the Mohammed bin Rashid School of Government in Dubai, UAE. She has three decades of senior international leadership experience and consults on strategy and policy issues with organizations such as Agile Nations, the Council of Europe and the Dubai Future Foundation. So Melodena, please tell us: in your opinion, what are the main threats, risks and dangers for children in the digital world, and can you illustrate the negative effects with a few examples? And I will kindly ask you to share Melodena’s slides, please. Thank you.


Melodena Stephens: Thank you so much. Let me start with just a little question. If I show you this picture, can you tell me where the threat comes from? So this is what children play with. Post-pandemic we went a lot online, and right now we’re using gaming as an educational medium. For example, I’ll just give you figures: Minecraft has 173 million active users, and if you look at the figures, under-15-year-olds are approximately 20.6 percent, but 15- to 21-year-olds (and this is a little bit funny, we don’t know exactly what that number is) are 43 percent. Let’s take another online game, Roblox: 71.5 million users, and two-thirds are children, right? And this raises these interesting questions, because when I look at this figure, I don’t know who’s a stranger, who’s a bot, what kind of information is being collected. And if children online are using VR sets, and they are, because parents sometimes leave them unsupervised, then basically when they play a game of approximately 20 minutes, there are two million biometric data points being collected. This may not seem like much, but tomorrow this could be information that could be used for security. So I think the challenge that we have right now is the threats we don’t see, because we aren’t having enough discussions. We’re taking a one-sided view: oh, it’s online education and it’s developing digital skills. But we’re not looking at what might be the security concerns. We’re not asking the questions: what happens if children are online too long? And here’s another example that I wanted to show you. This is not working. Next slide, please. Could you put up the next slide? Okay, so everything has gone a little bit off. So if I take something like this, there are a lot of age-appropriate codes being developed right now. But there’s a little bit of a problem. A global online safety survey recently showed that 18- to 19-year-olds scored very high on addiction. So that’s one fourth of them.
And 18% of 13- to 17-year-olds were not far behind. But there are much younger children too: I see babies with iPads who are three and four years old, because parents want to keep them occupied, right? And then we also see that many of these children have profiles put up, either directly or indirectly. So if a parent puts up a child’s profile on Instagram or TikTok, there is data being collected. So we see that there is a problem in this gap. Another interesting thing is that when crimes happen, most likely they are associated with people you know, so it’s friends and family. Imagine a child playing online with a friend. They don’t know this friend physically; the parents have not met them. So you can see how much that threat increases. And then you add the concept of these non-player characters, which are AI bots. And we recently saw a 14-year-old in the US die by suicide because he fell in love with his AI bot. The bot did not ask him to commit suicide. It said, please join me. And the child assumed that meant to take a gun and shoot himself. So we see this little bit of, I wouldn’t say little bit, horrific consequences of unsupervised online time. Another big challenge that I think is really important is that when you look at the standards on how they decide what children are allowed to play, there is no alignment. So I took this picture of the same game across different ratings, and you can see six-year-old, seven-year-old, 10-year-old, 12-year-old. So we don’t have alignment on standards, and there isn’t enough education for parents on this. So if I look at all the online harms, and I’m just going to leave that slide up, you see there’s quite a lot, but I want to talk about things like self-harm, right? We now find that the WHO says the fourth largest cause of death among 15- to 29-year-olds is bullying and cyberbullying.
Cyberbullying often happens online, and we may not recognize it because it does not result in physical harm, but it results in deep emotional harm. So I think there are challenges when we look at how we want to manage this process. I’m going to leave it there, but we’ll leave more for questions afterwards. Thank you.


Gladys O. Yiadom: Thank you so much, Melodena, and what you highlighted is very key; it shows that there is a need to take action to protect children, and this leads us to our next speaker, Elizaveta Belyakova. So, very pleased to have you with us, Elizaveta. Elizaveta is the chairperson of the Alliance for the Protection of Children in the Digital Environment. Elizaveta, I wanted to ask you: you are part of this alliance, so what were the motives for founding it? What are the key stakeholders you work with, and what are the goals of your organisation? Over to you.


Elizaveta Belyakova: Dear colleagues, good afternoon. I speak Russian because my Russian is much better than English. Sorry. Dear colleagues, good afternoon. I am glad to see you all. Thank you very much for the question.


Gladys O. Yiadom: Elizaveta, sorry, I need to interrupt you. I would kindly ask you to speak in English because we do not have interpretation services here. So, I will kindly ask you to speak in English, thank you.


Elizaveta Belyakova: My English is not good, but okay. I am glad to present to you the activity of the Alliance for the Protection of Children in the Digital Environment. The Alliance has been bringing together Russian technology companies to create a safer digital environment for our children. Founded in 2021, the Alliance has been working to address the key challenges of the digital age. The Alliance has become a unique platform that unites the largest companies in Russia: Kaspersky Lab, VTK and many others. One of the most important activities is the creation of a digital literacy portal, a big education portal. This portal provides children, parents and teachers with access to cyber risk materials that help develop skills of protection against digital threats, such as psychological cases, data breaches and others. We have also developed a handbook, Risks of the Digital Environment; this is also a big, interesting project on future risks and others. We also pay special attention to cooperation in international areas. Our activity includes plans with Luvon, or BRICS, and many other organizations. Let me also… sorry, this is really not… okay. Let me also answer a few questions for the lecture today. First, what is the best way to address children, parents and teachers? It is important for children to use games and educational formats to involve them in the education process. Second, how to adapt the learning programs? We believe internet learning programs should be interactive and take into account the psychological characteristics of children, so the difference between games and inclusive parental care. Third, how should the dialogue be stimulated? We see great potential in the organization of regulatory meetings, workshops and round tables, bringing together business, government and civil society. Thank you for your attention, and I look forward to continuing this prospective dialogue. Thank you so much.


Gladys O. Yiadom: Thank you very much, Elizaveta, for your words. The contribution that you are making in this space is very important; thank you for sharing your experience with us. Let me now turn over to Elmirti, who is with us here. Thank you again for being with us, Elmirti. A quick bio: Elmirti Arousafi is a cybersecurity expert and board member of the Moroccan Centre for Polytechnic Research and Innovation, where he plays a key role in advancing cybersecurity initiatives and research. He is also a core contributor to Espace Maroc Cyber Confiance, a national program dedicated to protecting children and vulnerable groups overall in the digital space. My question to you, Elmirti, is this one: in your opinion and experience, what are the main challenges when it comes to protecting children in the digital space?


Elmirti Arousafi: Thank you so much, Gladys. First of all, I would like to thank Kaspersky for the invitation and for creating this opportunity to talk about a very important subject. I was really surprised to see the numbers you showed, Melodena; this issue is really getting worse year after year. And as you kindly introduced, our experience in Morocco with the EMC, Espace Maroc Cyber Confiance, helped us see first-hand how difficult it is to implement a national program in order to raise awareness around those issues. And I would like to share some of them. I really think the issue is much bigger than that, but we can summarize them into maybe three or four challenges. The main challenge, according to what we see, is the speed at which technology evolves and, unfortunately, its misuse as well. So technology advances at speed, but we can see that harmful people will use those advances as well, such as AI and deepfakes and those kinds of technologies, to create more harm, harm that is more complex to combat. So this sophistication really poses a difficulty, both from a regulatory perspective and from a technical perspective. And acting quickly is something we are trying to implement as well, through the implementation of a helpline to help children, for example, remove content online. We also try to go beyond children; we talk about really vulnerable populations. So in order to act quickly, the people we are trying to reach need (and here I come to my second challenge) to be aware, right? To be minimally trained, to be able to actually detect a fraud scenario, I would say, or maybe a cyber threat in the internet realm. So while children are usually digitally savvy, we noticed that they are not well trained in the main threats in the internet and digital worlds. And equally, parents and educators may lack the knowledge, especially the technical knowledge, and the tools to guide them effectively in this endeavor.
So this is the gap we see, and the cybercriminals actually exploit that gap. The third and final challenge would be regulatory inconsistency across borders, and I emphasize the across-borders part. Our experience at EMC is to act locally, but we very quickly found out that we cannot. We are talking about the internet, we are talking about internet giants, we are talking about international platforms, so we have to expand our reach, hence the collaboration with some of our partners like Kaspersky, to actually be in contact with regulatory boards across borders. So we are trying, and in Morocco we have been addressing these challenges through a multi-faceted approach. I will leave some of those remarks for the next questions.


Gladys O. Yiadom: Thank you. Thank you very much, Elmirti, and I think what you shared with us also highlights the need to have this multi-stakeholder approach at the end of the day: having representatives from civil society, of course, including academia and government, which is key. And this leads me to our next speaker, Heng, who will represent the industry. So Heng, again, thank you very much for being with us today. Just a quick intro and then I’ll share my question with you. Heng Lee is Singaporean and a lawyer by training. He is a senior government affairs and public policy manager at Kaspersky. Prior to that he worked at Singapore’s Ministry of Home Affairs as assistant director of technology and data policy, where he studied issues at the intersection of law enforcement and technology, including crimes targeting children and cyberbullying. So Heng, my question to you: why is Kaspersky, as a cybersecurity company, committed to protecting children in the digital space, and what projects has the company initiated in that regard?


Heng Lee: Thank you, thank you, Gladys and my fellow esteemed speakers. Maybe let me answer this in two parts, the why and the how. Firstly, the why. I think Melodena and Elmirti have shared a lot about the threats and challenges of protecting children online. I just want to add one more perspective from the vantage point of a cybersecurity company: why the online nature of this problem has made it particularly difficult, and thus relevant to the entire tech industry. So in the physical world, we have special standards for children, like child safety seats in a car and warning labels about small parts that could be easily swallowed by a toddler. These are tangible things that you can touch and feel, and if there is a bully at the playground, you know who he or she is and you can see what he or she has done. But the online world gives rise to a different set of behaviour, what is called the online disinhibition effect, where individuals behave differently than they would in face-to-face interactions. It is partly because of the anonymity and lack of accountability that cyberspace offers, and partly because consequences aren’t felt as immediately. And children, who often do not have a full appreciation of these consequences, become particularly susceptible to this effect. Since the mischief here has arisen because of tech, the tech industry is naturally well placed to offer many of the solutions to counter it, because tech practitioners have a good acumen for trends and for designing these into workable solutions. Which brings me to the how. So what are some of the projects that Kaspersky as a company has initiated in this space? The first and foremost that I want to share is in parental control. Many cybersecurity companies have come up with solutions in parental control, and so has Kaspersky. Ours is called Safe Kids, and it’s been around for 10 years, having been launched in 2014. This app, when installed on a device, protects children from harmful content. It interacts with search engines and browsers.
It will interact with search engines and browsers. block search requests and once a week parents will receive reports on what their child had searched for on the Internet. This helps them to better understand the child’s interests and to remind them what is suitable for them to search on the Internet and what is not. The app also allows web filtering by enabling parents to block adult content, violent sites and video games. Moving on to usage control, Safe Kids also allows the blocking of inappropriate apps based on the child’s age. Parents can set time limits for the usage of the device by shelling time slots and days off and the device will also be blocked when a time limit is up. It can be switched off at a given time when the child needs to do their homework or be engaged in other screen-free activities. For how all-arounded this solution is, Safe Kids has received awards from the Independent Assessor AV test from Germany and also at the Mobile World Congress in Barcelona. Of course we know that such apps are not completely without controversy so we’ve also issued guidelines on whether the installation of the app should be discussed with children and how. For instance we are suggesting that from the ages of 3 to 6 there is no discussion needed, from 7 to 10 children need to be informed, from 11 to 13 there has to be a discussion and from 14 to 17 there should also be mutual agreement. This is where tech really intersects with policy so Kaspersky also extends into thought leadership and education programs engaging even parents and teachers who need to be equipped with knowledge about cyber threats to make fully informed decisions about what is best for their child. On that front Kaspersky has been conducting events to promote cyber hygiene habits and in 2023 alone we did 107 events reaching out to some 700,000 people around the world. 
What we really sought to do is to share some of our findings on threats and make them actionable for parents and teachers: anti-hacking, protecting children’s privacy, identifying indicators of cyberbullying, and helping children become more resilient as digital devices become the norm. Kaspersky has also published children’s books on good cyber hygiene habits. One of them, launched this year, is called the Cybersecurity Alphabet, where we are teaching A to Z. But here, A is for authentication, B is for backup, C is for CAPTCHA, and D is for digital footprint. It is a profound reflection of the world that we live in today and how fast changes are happening, because I don’t think even adults may know all of these words. So I encourage everyone to download a copy; there are limited physical copies, which I think Gladys has also been distributing at our booth at IGF. Of course, there are still many other initiatives, like a joint study with the UAE government on children’s online habits, and a white paper written with the Singapore Institute of Technology on motivations for safe online behavior. These initiatives allow us, as a cybersecurity company, to contribute to the ecosystem as practitioners with real stories to share, backed by our data. But given the limited time, I won’t be able to go into the details of many of these initiatives. I’ll be happy to share more about these later on in the questions, as well as during the interaction session with the audience. Thank you very much.


Gladys O. Yiadom: Thank you very much, Heng, for this comprehensive overview of the actions led by Kaspersky. As you were saying, there is a need to have this conversation with children from the get-go. So now moving on to the conversation that I want to have with all of you panelists. Elmirti, you referred earlier to risks that are coming up with AI. So I will address this question to Melodena and Heng: AI is often exploited by cybercriminals to harm children via deepfakes, as you mentioned, Elmirti, or text that looks convincingly real. What can the different stakeholder groups do to counter this danger and mitigate the risks? So I’ll kindly ask Melodena to share her insights first, then Heng. Thank you.


Melodena Stephens: So I think the first thing is to know how many pictures it takes to make a deepfake. And you could do that with one. How much voice recording? Fifteen minutes. Then think about where all children’s voices are being recorded or pictures are being put up. Schools put up children’s pictures online because they are like, this is my new class. We use platforms like learning platforms where we record things. But we never ask the questions: what happens to these recordings? Do these recordings stay with the platforms? What are the safety measures? Because there’s constant training; I mean, Zoom is currently training on the recordings that are there. The problem is the large companies may choose, and I don’t know yet because I don’t have clarity, not to share the information, but there are many, many educational apps. Who’s vetting them? And we know apps fail very quickly. You need a minimum user base, around 50 million users, otherwise you’re not going to be successful. When they die, what happens to the data that the teachers have used to keep the kids engaged? And we don’t have answers, because no one’s vetting them; no one’s asking these questions. So I think the problem with deepfakes is that they are very easy to make, and we need whole-of-society responsibility, but we need regulators to get onto this too. We need ministries of education to perhaps vet apps and say, these are approved, these are not approved, and also monitor them, because if they fail, we must make sure their data is not leaked. If you look at the deep web, one of the biggest challenges is child pornography. The children being trafficked there are synthesized: they take real pictures and superimpose them on compromising pictures. Imagine that child growing up and being confronted with a picture that’s a deepfake. What will be the psychological damage for that child years later, when they’re trying to get a job or anything else?
We do not have an idea how this will evolve in the future, but it’s a little bit scary and I am worried for the children.


Gladys O. Yiadom: Likewise, Melodina. Heng, could you share your thoughts on that as well, please?


Heng Lee: Certainly. Let me divide it into a few parts once again: it's really the people, the process, the technology. The people here really need to gain awareness and education as to what kinds of threats children are facing. In terms of policy, we have a very recent example from Australia. I think many of you might have read in the news that Australia is going to become the first country in the world which intends to ban children under the age of 16 from using social media. It is in fact the world's most restrictive regime so far. But there are also questions about how this is going to be enforced. How do you ensure that children under 16 don't have access to social media? There are age limits for alcohol, but that doesn't stop children under the age of 18 or 21, or whichever age limit there is, from getting alcohol in different countries. So enforcement can be a problem. And then, of course, there is technology, where, once again, coming from a tech company, I feel that the value a tech company can contribute is practitioner experience: the understanding of what the latest threats are and how to guard against them. Since we're on the topic of AI, Melodina has shared a lot about how easy it is to create a deepfake: just 15 minutes of voice and just one picture. And AI is actually making it easier to groom children as well. Just imagine: if adults can fall for deepfakes, what more for children? The grooming also comes in the form of conversations where the child could think that he or she is talking to a friend who is playing an online game, but it could well be a bot which has been programmed to gather personal data. And these have long-lasting effects. I'll just cite one example: imagine a bot collecting conversation very casually with a child, asking, what is your blood type?
The child takes it as a very innocuous question and answers it. And this is something that stays on the internet forever, because it is pretty unlikely that the child's blood type is going to change. So on the internet, or in a dark web database, whatever blood type this person has remains there forever. This is actually quite a sobering thought if you think about the kind of damage that online gaming modules, together with AI, can create. So tech companies need to guard against these, to flag them as early as possible whenever they come across new threats. And wearing my previous hat as a regulator from the Singapore Ministry of Home Affairs, I also think that there needs to be an enlightened approach when it comes to regulation, to ensure the balance between consumer protection and innovation. Try not to jump on every new threat that emerges. Start with broad principles and guidelines, rather than a very blunt question like, how do we regulate ChatGPT? So I think that kind of balance from regulators, and contributions from tech companies, are essential to create an ecosystem that's safe for children. Thank you.


Gladys O. Yiadom: Thank you, Heng. Absolutely. And when you're mentioning guidelines, we can see here that it's about an ecosystem, really. So we're talking about children, but there is a need to also address parents and teachers, which leads me to my next question, that I will address to Elisabetta and El Mehdi. What is the best way to address children, parents and teachers? And how can curricula be adapted to create sufficient, appropriate cybersecurity offerings? So Elisabetta, please, I know that you answered some of the question earlier, but perhaps you would like to add some more comments to that? Elisabetta, are you with us? Perhaps let's start with you, El Mehdi, if you can answer this question, please.


Elmehdi Erroussafi: Sure. So if I got the question right, we are talking about how to actually address children, parents, educators, and all the stakeholders. And of course, different targets require different approaches. First of all, let's talk about children. What we noticed is that effective curricula are actually gamified. We want to create interactive experiences while we teach children. We don't want a strictly technical curriculum; we want to get engagement. So remember, we said that one of the challenges is to get children trained, able to spot threats and alarms. The easiest and most effective way would be to create gamified curricula. We tried that through our subsections of EMC, EMC Youth, where we actually created games about the internet and similar interactive experiences to get as many children engaged as possible. Now for parents, the focus should be on practicality. The guidance for parents needs to be ongoing, and it needs to really equip parents to protect their children. This is, again, due to the pace of technology. When we talk with parents, and us included, talking about our generation, we start feeling that technology is actually far ahead of us. So we need practical guidance. We need to understand. I mean, I understood Roblox from my daughter, and it took me some time to understand the threats behind it. So it's really very interesting to see this space and how parents sometimes feel lost. Again, at EMC, we created practical guides for parents. Now, we need also to think about teachers, because we think that peer-based and structured learning is also important when it comes to cybersecurity. Teachers play a crucial role, and they are in constant contact with children. We as an NGO are not with children as often as teachers and parents are.
For teachers, the curriculum would need to be a very specific cybersecurity curriculum that aligns with their teaching objectives, depending on which level we're talking about, which kind of school, et cetera. So again, equip them with foundational cybersecurity skills, and it is essential, again, to make it practical. We don't want to have only theory on that. And I think part of your question is how to adapt the curricula. The modules should actually be part of standard education. Ideally, we want to start from an early age with something that evolves into more complexity as the child grows through the different levels. It can be digital safety weeks, for example, which is one of the initiatives we have been doing, or online hygiene sessions, which we also did with some of the classes. And again, it involves developing age-appropriate resources. So the key is to adapt the curricula to every audience. Thank you.


Gladys O. Yiadom: Thank you very much, El Mehdi. I saw that Elisabetta was with us a couple of seconds ago. Elisabetta, are you back? No, so let us move then to the next question, which I will address to Heng, and to Elisabetta as well if she is back. Heng, how can multistakeholder dialogue and cooperation on online child protection at the national and regional level be stimulated?


Heng Lee: Thank you. Thank you, Gladys, for the question. I think it really needs to start from the recognition that the problem cannot be faced or solved by any single stakeholder, because no one will have all the answers to a problem of this complexity, involving regulators, tech companies, parents, teachers. So it needs to start with that humility and the understanding that it needs to be all hands on deck. There also needs to be a recognition that this problem is not confined to a certain country. Whenever we see something very alarming happening in another country, it could soon be on our shores very quickly. So the urgency of this problem really gives the impetus to put together a dialogue that is very specific to it. And instances like the workshop that we are having today are an example of how we come together to influence policy at a national level, because we have regulators sitting with us today, we have government representatives who can take these ideas back. It's a cross-pollination of ideas, not just from the industry but also from people who have done it before: practitioners, NGOs, and academics like Melodina, giving very good ideas on how we can shape a balanced approach to regulating content on the internet that can protect children. Some examples of thematic discussions in the past would be the World Anti-Bullying Forum, where I think the example I raised just now about online disinhibition has in fact been widely discussed, and there's also the Safer Internet Forum that is run by the EU. I don't know when there's going to be another edition of it, but instances like this really allow people to sit together and learn what has succeeded or failed, especially the instances which have failed, so that we know how to draft laws in a way that avoids these pitfalls.
And finally, coming back to the example I talked about just now, about how health data are becoming very crucial, about how data on blood types could be taken up from game modules: this is something that I think even health authorities from around the world could be interested in. Especially for children, because adults may understand the importance of keeping health data close to themselves, but children may not. They could see it as, oh, I'm just sharing it to see if I'm making a good friend, or they look at it as, I'm just sharing my horoscope, there's nothing wrong with it. So that awareness and education, once again, has to be present across different verticals, healthcare being one of them. So I think different verticals should be involved, and I don't have a comprehensive list of what these verticals may be yet, but as and when there are new challenges targeted at them, they should be involved in this conversation. As a start, that will be a good approach to understanding who it is that we need to gather for these conversations. Thank you.


Gladys O. Yiadom: Thank you. Thank you very much, Heng, for these thoughts. And this gives us the opportunity also to open the floor for some comments and questions. I've seen that we have a request from the Ministry of Education. Could we please give an open mic to Mr. Andre Gorobets? It's online, right? It's online.


Andre Gorobets: Hello.


Gladys O. Yiadom: Hi, we can hear you, sir.


Andre Gorobets: Can you turn on the mic of our translator? I can speak in Russian. And he can say it.


Gladys O. Yiadom: But I can only ask you to speak in English because we do not have interpretation.


Andre Gorobets: Okay. It will be translated online. You can turn on a microphone for her, for Margarita Yurova.


Gladys O. Yiadom: Okay, we have someone in the audience who can translate from Russian.


Margarita Yurova: I will help. I will translate from the audience here.


Gladys O. Yiadom: Please go ahead, sir.


Andre Gorobets: Thank you very much. I support my colleague. The issue of digital education is quite important. It is considered one of the main trends of development in all states.


Margarita Yurova: So, I'm supporting colleagues. The issue of how to deal with digital challenges is one of the key ones, not only for the Russian Federation and the Russian government, but for other governments across the world. It's a very important issue.


Andre Gorobets: We believe that our key goal is to focus on developing new skills, new competences, for our kids to adapt to the new digital challenges. And we believe that new technologies, new technological instruments, need to be instrumental and helpful, and not a showstopper. First of all, the Russian government and the Russian state pay attention to the fact that security issues need to be combined with, and considered alongside, pedagogical issues and the psychological development of our kids. We believe that we need to address the issue holistically: cybersecurity issues and challenges have to go along with psychological and educational goals and tactics. It all has to be addressed as one, holistically. And the most important point that all our colleagues should pay attention to is the interaction between states in the protection of our children, because trans-border threats and trans-border interactions primarily affect our work now, and cooperation needs to be improved to address kids' safety goals. To address kids' safety, we are working on three levels. The first is the technological level, to ensure that the devices kids are using are protected from harms, unwanted content, threats, et cetera. The second is the software level, to ensure that the device is equipped with software for checking content. The highest level is, of course, the content itself. This is what we should pay close attention to, because the content shapes the important moments of our children's development, which we should monitor and, first of all, protect from external harm. Colleagues have already said that ChatGPT and any other artificial intelligence implemented in education poses high risks. Yes, we agree with this.
But we must understand that this new technology can be used, and we are already using it, in our work, including for the development of knowledge and skills, which allows the teacher to build a lesson. But we also need to admit that content generated by ChatGPT can be instrumental in teaching kids, and it can be used for good in the educational process. The role of the government is to equip teachers with these technologies, which can be embedded in a safe way into the educational process. And we all need to work together on the common goal of equipping schools and universities with good, modern technologies for modern education. Thank you.


Audience: What I wanted to address, and I'm sorry, I don't know all the names of the panelists sitting around here, but I'd like to ask the lady, and the sir from Morocco. We found that the modus operandi of perpetrators reaching out to children, whether for radicalization or for grooming, is the same. They use the same pattern for a different crime, meaning that the most vulnerable children, as one of you mentioned, are even more at stake in the online environment. And I must say I'm kind of disappointed, because I was hoping this room would be packed today, because this is one of the most threatening subjects we have online at the moment, and I don't think we're doing enough. We keep talking about it, but at the same time big tech is just looking over our shoulders and not protecting our children enough. So, my question would be, you mentioned it, I believe, but how do you push a regulator or regulatory body to put more regulation in place online to protect our children, if you know that for the perpetrators it's just another easy crime?


Gladys O. Yiadom: Thank you. Melodina, I’ll ask you to answer the question first.


Melodena Stephens: So your question was just what should we do for more regulation, if I'm correct? Okay, so there is, I think, a literacy gap, and this is not only at the societal level but also at regulatory levels. We're not understanding, and even engineers, so I'm also working with big tech, I work with IEEE, even engineers don't understand the consequences of designing code. Startup founders mean to change the world for the better, but the moment you're embedded in 50,000 devices and you don't have the safeguards or the protocols, there is somebody who says, where's my money, because I've invested 50 million; shareholders do the same thing on the stock market. So the issue is really a very large literacy issue. And the scale, and I think you mentioned that too, so I'll just give you this example: it took 68 years for airplanes to reach 50 million customers, but it took Pokemon Go 19 days. There is no regulator in the world that reacts in 19 days. So we are far behind. What we need is a public sector that is thinking 20 years ahead of the private sector, which is not the case right now. So what can we do right now? I think we need to make hard no-go areas. Does a child who's five years old need to be exposed to the internet and have a mobile? Should that child have the right to childhood right now, just exploring simple things, learning how to read, looking at books? I don't think books are bad. When you talk to psychologists, they say reading books is a slow process, but it helps the brain develop differently than online, gamified things do. So we need a lot more research funding going into these areas. We need stricter penalties on what is a crime online. I don't think that is very transparently clear. And it's very difficult to apprehend criminals and make them accountable if they're in another jurisdiction.
So we do need a lot more coordination for the apprehension of criminals, and a lot more transparency on what is criminal. And I want to come back to cyberbullying for this. A lot of children bully children, and it would be considered a crime. They don't know better. They think it's okay to put a face on something else, and it's fine. So I agree with you. Grooming is the same. With AI, it's easy: you're basically mirroring a child. If a child smiles, you smile back at it. The child develops trust. And therefore, if you say, it's bath time, the child will do whatever needs to be done, and the camera's on. So it is very difficult to catch, unless parents have a rule that no kid should be on the computer unsupervised. We assume if it's at home, it's fine. So it's a literacy issue, I think.


Gladys O. Yiadom: Thank you. Mr. El Mehdi, a couple of comments on that, please.


Elmehdi Erroussafi: I would comment from the NGO perspective. Some of the work we do in Morocco is actually with partners and regulators; the telecom regulators and data privacy regulators are actually partners. So one very important point is to have a common goal, to align everyone on a shared objective, because this issue has been approached from different perspectives. We as technical people, as researchers, would look into the technicality of it and try to come up with standards and big compliance items to be checked, whereas regulators are more into policymaking, more into protecting the end customer. So we need this shared goal, and we need to understand each other's pain points. Collaboration is key. We don't need multiple initiatives here and there; we need focus and we need collaboration. I would comment on that point alone because I think we gained some months, maybe not years, but months, by working in a collaborative manner since day one.


Gladys O. Yiadom: Thank you, El Mehdi. And perhaps, since we also have Heng with us and I know that Kaspersky is also working with regulators, could you please share the perspective from the industry?


Heng Lee: Certainly. Maybe I'll just share from my region, which is Asia Pacific. When it comes to regulations and how we respond, as Melodina rightly pointed out, there is no regulator in the world who could respond to a challenge of this magnitude in 19 days. So in my home country of Singapore, different outfits have been set up to respond to new challenges that are emerging. For example, an issue like child safety will straddle a few ministries: in public security, the police might come into it, internet regulators might come into it, the social affairs people might want a hand in it. The latest problem to get such an outfit is the protection from falsehoods: there is a dedicated office called the Protection from Falsehood Office in Singapore. Yes, I can hear you. Can you hear me now?


Gladys O. Yiadom: No, I don’t.


Heng Lee: I am not hearing you very well. It's very choppy on your end. I'm just checking if it's working. Can you say something again? Are you saying anything?


Gladys O. Yiadom: Yes. Sorry, there is some issue. Online you can be heard, so the issue actually comes from onsite. What we will do, while we're fixing this issue, is give the floor to another question, and then I'll come back to you once it is fixed, because here onsite we can't hear you well, and I believe we're having a technical issue with that.


Heng Lee: Do you want me to continue? Because I hear you.


Gladys O. Yiadom: Now it works. Now it works. Please go ahead and then we will take your question afterward. Question from the audience. Please go ahead.


Heng Lee: Sure. I was talking about how different outfits have been set up for specific purposes in Singapore, like the Protection from Falsehood Office and the Anti-Scam Command. So have we come to a time when we need a child protection authority for online content? How does that overlap with existing outfits in the physical space for child protection? And what is the qualitative difference, other than online disinhibition, that warrants such an outfit? These are all problems that we need to think about from a regulatory perspective. It's tempting to answer such questions by saying, let's have a new team of people doing it. But are the people staffing such a new outfit sufficiently equipped with the skillsets to recognize, this is an upcoming trend that needs to be addressed, this is something that will happen whether you regulate it or not? That acumen has to be gained over time, so there is really no easy answer to this, but the shape and form that outfit could take is something that we could probably start thinking about. Thank you.


Gladys O. Yiadom: Thank you, Heng. So we will take one question from the on-site audience, and then we will have a question from the online audience. Could you please give the mic to the young man over there? Thank you. We can hear you. I will kindly ask you to share your name, your organization, and who you address the question to.


Audience: I'm Ethan from Hong Kong, and I'm the youth ambassador of the One Power Foundation. Basically, I'm asking this question to every one of you. In Hong Kong specifically, many parents try to protect their children from the internet by physical means: banning their children from using phones, or banning them from accessing the internet at all. Children, not five years old as you mentioned, but actually 12 years old or below, are not able to use phones because of these parental restrictions. And I'm wondering, because I've only heard perspectives from teachers, parents, and my peers commenting on this way of protecting children, and I want to hear comments from a different perspective. So is banning children from using mobile phones, or banning children from using the internet, an effective or suitable way to keep them from being cyberbullied or unsafe on the internet? Yeah, that's my question.


Gladys O. Yiadom: Thank you. Melodina, please go ahead.


Melodena Stephens: So, I think the technology is here to stay. I don't think banning alone is good enough, because we have to teach people how to use the technology safely. And so it's a dual problem: educate the 8 billion people of the world, 30 percent of whom are still not online. We need to educate everyone online. We need to educate parents, as their children grow and as new technologies come, in how to use them. And we have to educate children. And I want to give you this one example. I was working for a company, and we were trying to prevent bullying and harassment on an online game. A lot of us were adults and parents, so we came together and we looked at cyber tools, we looked at content moderators, we looked at algorithms, we looked at AI bots. But the children were still accessing it, because somebody's parent's friend allowed them to use it. So I don't think banning actually works, correct? Whatever's banned, the children want to find a way to use it. So the kids were still finding ways to do that. And then I decided to interview a little boy who was seven years old. I asked him, how do you know it's a stranger on Minecraft? And we had all these answers as parents, but he said something very simple: if that is my friend, they go to my school, they will know what I ate for lunch. So the first question I ask them is, what did I have for lunch yesterday? What happened in school in this class? And if they don't know that answer, I don't play with them. And I thought this was not something I would ever have come up with myself. So we also need to involve, and I'm very happy you're here, we need to involve the children in this dialogue, because they may have answers, because I'm not playing on that ground, I'm not familiar with that thing. So thank you for asking that.


Gladys O. Yiadom: Thank you, Melodina. El Mehdi, a couple of comments on that?


Elmehdi Erroussafi: Yeah, I think everything has been said. I would just say that this method of forbidding things to children might seem effective, but what we noticed talking to children and parents is that there is a more effective way, which is to actually build trust with your child as a parent or as an educator. The child needs to trust you enough to be able to come forward if he's being harassed or bullied on the internet, without fear of reprisal or punishment. So, open communication is key. Technology also moves very fast, and children will get access to it. Banning access to the device itself might not be effective in our current world. So, open communication is key, and building trust with your child so that everything can be said in an open manner. That's our advice.


Gladys O. Yiadom: Thank you. And finally, Heng, do you have brief comments from an industry perspective?


Heng Lee: Yes, yes. Thank you for the question. I'm very happy to answer. Having lived in Hong Kong myself, I'm going to start with a Cantonese answer, which, in short, means there are trends in the world that we can't really disobey. And like Melodina, I agree that the tech is here to stay, which is why, going back to what I mentioned just now about the guidelines for Safe Kids, when we interact with parents on whether or not to discuss with the children before they install Safe Kids on their children's phones: we're saying that for the younger children, maybe there is no discussion needed. At a certain age, there needs to be some information about why I am doing this. It doesn't have to be very prolonged, just that there is an app that I'm installing to protect you. But at some point, especially by 14 to 17, our guidance changes to: you must get the children's permission before you install the app, because we also recognize that you can probably install the app, but the children are also wise enough to know how to uninstall it from their phone. They could even be reinstalling it every day before they come home, or they could have a different phone altogether for their day-to-day interaction, and a dummy one just to show the parents. So there is really no hard and fast or blunt way of solving the problem itself, but I think constant communication to maintain trust between parents and children, rather than a blanket ban, will be more effective and sustainable. Yeah, thank you.


Gladys O. Yiadom: Thank you. Thank you, Heng, for your answer. Now, perhaps turning to our online audience to check whether there's a question. Anne, please.


Anne Mickler: Yeah, there is one question, and it comes from Jochen Michels, who is the Head of Public Affairs, Europe, at Kaspersky. This question goes to Melodina and El Mehdi, please. The question is: how can the different competencies of the various stakeholders be brought together to ensure better protection, good and sufficient training, and appropriate solutions for children, parents, teachers, and other stakeholders? And should this start at a local level and then be expanded regionally and globally?


Gladys O. Yiadom: Thank you. And I'll kindly ask El Mehdi to answer first.


Elmehdi Erroussafi: Yeah, thank you for the question. So I think that it's a problem for everyone, so again we are emphasizing cooperation. Every stakeholder will bring something to the table. Regulators will state, as I said, common objectives and those big ethical guidelines. NGOs will help to touch the ground; we believe in the NGO work, we think it's very effective. Vendors will also provide technical solutions, technical responses. Educators and academia will provide the research, the oversight, the early warnings. Those years of being ahead that we need actually come from research, and we need to be aiming for the best. We really need to be looking forward and gaining those years ahead. So everyone can collaborate to actually build a common strategy. Acting locally is very important; this is where we touch the ground, and this is where we also take into consideration local points such as the one we just heard. So locally it's very important, but we need to open those channels of communication. One example would be AI regulation. This is a big subject and cannot be local; this is a global issue, and it needs global regulation, global guidelines. So act, again, in the spirit of collaboration. Let me just share that we work with the big tech companies like Meta and TikTok to open this channel of communication and, as a trusted partner, as they call us, to be able to report content to them so that they remove it, maybe quicker than through the regular process. Collaboration. And hopefully we get there with the help of everyone.


Melodena Stephens: I agree with everything you said. I just want to talk about competencies. From government, I want political will. That means the ruthlessness of: I will ensure the child is safe. And what does safe mean? So I want political will from governments. I want alignment on values. What does it mean when I say "for good"? So this alignment on values across all industry. From society, we want a reflection of culture. I have to say that we have different sorts of cultures around the world, and I would like that to be there. For me, for example, a child is not just someone below 14 years of age; I would think a child is 18 or 19 or 20. So maybe the way I look at a child is slightly different than somebody in Europe does. I want those nuances of culture to be there too. And from researchers: researchers tend to publish research either for something or against something. I think we have now evolved enough that I would like them to publish both: some of what is harmful, and what could be some of the goods.


Gladys O. Yiadom: Thank you. Thank you, Melodina, for this answer. Perhaps checking with the audience on site, do you have any questions? We do have questions from the audience. Can someone share a mic, please? So one question. Okay. Please share your name, your organization, and who you address this question to, please.


Audience: Hello. I'm Grace from the Pan-African Youth Ambassadors for Internet Governance. I come from Uganda, where awareness about online risks is very low. So my question is addressed to anyone who can answer. I'd like to run awareness campaigns: how can I collaborate with any of you to run these campaigns and let people know about online child protection?


Gladys O. Yiadom: Thank you. Perhaps I will address this question to Elmehdi, who can share his concrete experience and some of the best practices that Espace Maroc Cyberconfluent has been developing.


Elmehdi Erroussafi: Yeah, sure. I’ll take this question as an opportunity to also share some best practices. I talked about common goals, and clearly here we share common goals, but one very effective way to start such campaigns is to work in small, focused work streams; we call them task forces. We contact every stakeholder partner, whether from the public sector or the private sector, we sit down as a team, we talk, we define a project, we work in project mode, and we build those task forces. If, for example, you would like to build this kind of campaign, I would suggest that you first list the stakeholders in your local area who are able to support the project: for example, regulators, telecom operators, IT companies, and maybe also the ministry of education. Those are the kinds of stakeholders we need to contact for this kind of partnership. Coming from an NGO also gives you a kind of trusted position, because you are volunteering your own time. And as young people, I believe this is very important; this is part of your education, so a portion of your time should go toward that work. Then, as I said, work in small teams. If we want to bring a campaign to our schools, we need someone from the ministry, someone from education to build the content, and some experts. From my experience, I know we are on the same continent and I know how generous African people are: whenever we talk about an opportunity for a project, we get more participants than we initially anticipated. So we end up with a good problem to have, more than enough relevant people in the work stream. Again, we are looking for progress, and I trust that people will appear. At CMRP, we will share our interest as well; we will be very happy to open this collaboration and discussion with other organizations.


Anne Mickler: Sorry to chip in for the online audience: the sound quality got really bad just now, and the online audience can’t hear. Sorry about that.


Heng Lee: Yes, is there anyone speaking on site? I can’t hear you.


Elmehdi Erroussafi: Sometimes, from time to time, we can help. I can’t hear you. [audio breaks up] …it’s becoming such an issue, and it’s a policing thing. So the only way to deal with the issue now is to build on solutions and experiences from the countries where solutions have been found. But what we also notice is on the content creators’ side: because we are calling for local content, and local content creators are not really skilled. There is no sound in Zoom.


Gladys O. Yiadom: They’re solving the issue now. There is no sound in the Zoom room, apparently?


Heng Lee: No, there wasn’t, until just now.


Audience: Yes, I was saying that another part of the issue lies with the content creators, because they are not really skilled and not really aware of educational curricula. What they need is simply to have the solution, some material in hand that they can share. So what I can suggest from this workshop is to give those of us who come from such countries a proper way of bringing awareness to the public and to those children at schools. Because even if the school they attend is not well equipped, they still come across devices elsewhere, maybe at a friend’s place. And the only way to save those children is to bring programs such as the ones we were talking about, built from those kinds of experiences, so that they are aware and willing to say no to those who come to bully them. Yes, this is the kind of collaboration I am asking for, from whoever can bring their expertise.


Gladys O. Yiadom: Thank you very much for sharing that. I believe Elmehdi is a concrete example of the partnership that we have with civil society, so please do not hesitate to talk to him; we also have a booth, and we would be very happy to have you there to discuss further. So we’ll take a last question, please, before we close.


Jutta Croll: Thank you for giving me the floor. My name is Jutta Croll, I’m a child rights advocate from the German Digital Opportunities Foundation. I just wanted to mention that what Melodena said about the age of children pretty much resonates with me, because we do have these cultural nuances, but we also have the UN Convention on the Rights of the Child, which defines everyone under the age of 18 as a child. With regard to all the questions that we’ve been discussing, I would like to refer to the principle of the evolving capacities of the child: it’s not a matter of whether a child is exactly 12, 13 or 16, it’s about their evolving capacities. I also would like to go back to Grace’s question, because in 2021 the United Nations Committee on the Rights of the Child adopted General Comment No. 25 in regard to… This is the document on which you can base all these questions, including whether there will be education. The states that have ratified the UN Convention are also obliged to implement the rights of the child in the digital environment. So when you’re asking for training in digital literacy and for cooperation, you can go to your state government and say: hey, you have ratified the UN Convention and here is the basis, you are obliged to do something for the young generation to make these rights come true in the digital environment. So I think it’s kind of groundbreaking that we have a document from the United Nations on which you can base not only child online safety but all children’s rights to provision and participation. Thank you so much for listening.


Gladys O. Yiadom: Thank you very much for your comments. We’re now reaching the end of our session. Before closing, we will run the survey again. Anne, I’ll kindly ask you to share the slide. It’s the same question, so we can see whether the answers have evolved. I’ll kindly ask the on-site participants to scan the QR code and take part in the survey, and then we’ll review the results afterwards. Just a couple of minutes so that everyone can take the survey, and then Anne, our online moderator, can share the results. It will also be the opportunity to see if the answers have changed after the session. So, we have received the results. Just give me one second to change the screen and show you the results. Can you see the results now? Oh, yes. We can see that the results have changed quite a bit. Now more of you think that the threat will increase significantly and lead to increased abuse and cybercrime, that the situation will get worse, on both the A and the B aspect. But nonetheless, more of you also believe that awareness and knowledge of cybersecurity issues and protection against threats in the digital world will increase. And we’ve moved from 32% to 40% of you who believe that better knowledge and well-developed defence skills, as well as better-developed digital skills, will ensure that children can operate more securely online. And fewer of you are uncertain. So we’re seeing that after the workshop the results have evolved a little. I would now like to thank our speakers for their contributions. Thank you, Heng, thank you, Melodena, thank you, Elmehdi, and also Elizaveta, who was with us. Thanks to the audience for the great contributions that you shared on this topic. And I’d also like to thank Anne, who was our online moderator. Please, let us continue the conversation.
Can we… We will discuss this after… Okay, we will share about that, but let me just conclude the session and then we can take this conversation together. So, I would like to thank you all for participating in the session, and please let us continue the conversation together. Thank you.


M

Melodena Stephens

Speech speed

0 words per minute

Speech length

0 words

Speech time

1 seconds

Increasing sophistication of online threats like deepfakes

Explanation

Online threats to children are becoming more sophisticated, with technologies like deepfakes posing new risks. These threats can be created with minimal data, making them particularly dangerous.


Evidence

A deepfake can be created with just one picture, and a voice recording of only 15 minutes is needed to replicate someone’s voice.


Major Discussion Point

Threats to children in the digital world


Agreed with

Elmehdi Erroussafi


Agreed on

Increasing sophistication of online threats to children


Lack of alignment on standards for age-appropriate content

Explanation

There is a lack of consistency in standards for determining what content is appropriate for children of different ages. This inconsistency makes it difficult to protect children from inappropriate content across different platforms and regions.


Evidence

The speaker showed an example of the same game being rated for different ages (6, 7, 10, 12 years old) across different platforms.


Major Discussion Point

Threats to children in the digital world


Cyberbullying and emotional harm from online interactions

Explanation

Cyberbullying is a significant threat to children’s well-being, often resulting in deep emotional harm. It can be difficult to recognize as it may not result in physical harm, but its impact can be severe.


Evidence

The World Health Organization states that bullying and cyberbullying are the fourth largest cause of death among 15 to 19-year-olds.


Major Discussion Point

Threats to children in the digital world


Need for political will from governments to prioritize child safety

Explanation

Governments need to demonstrate strong political will to ensure children’s safety online. This involves a commitment to taking decisive action to protect children in the digital space.


Major Discussion Point

Role of different stakeholders in protecting children online


Agreed with

Elmehdi Erroussafi


Elizaveta Belyakova


Agreed on

Need for multi-stakeholder collaboration


Differed with

Heng Lee


Differed on

Approach to regulating children’s online safety


Importance of industry alignment on values and ethics

Explanation

There needs to be alignment on values and ethics across all industries involved in the digital space. This alignment is crucial for ensuring consistent protection of children online.


Major Discussion Point

Role of different stakeholders in protecting children online


Agreed with

Elmehdi Erroussafi


Elizaveta Belyakova


Agreed on

Need for multi-stakeholder collaboration


Role of researchers in studying both benefits and harms of technology

Explanation

Researchers should focus on studying both the potential benefits and harms of technology for children. This balanced approach is necessary for a comprehensive understanding of the impact of technology on children.


Major Discussion Point

Role of different stakeholders in protecting children online


E

Elmehdi Erroussafi

Speech speed

143 words per minute

Speech length

953 words

Speech time

397 seconds

Rapid pace of technological change outpacing regulatory responses

Explanation

The speed at which technology evolves makes it difficult for regulators to keep up. This gap between technological advancement and regulatory response creates challenges in protecting children online.


Evidence

The speaker mentioned that it took Pokemon Go only 19 days to reach 50 million users, while regulators cannot react that quickly.


Major Discussion Point

Threats to children in the digital world


Agreed with

Melodena Stephens


Agreed on

Increasing sophistication of online threats to children


Need for gamified, interactive curricula to teach children about online safety

Explanation

Effective curricula for teaching children about online safety should be gamified and interactive. This approach helps engage children and makes the learning process more effective.


Evidence

The speaker mentioned their organization’s creation of games on internet safety to engage children.


Major Discussion Point

Approaches to protecting children online


Agreed with

Heng Lee


Agreed on

Importance of education and awareness


Importance of practical guidance for parents on protecting children online

Explanation

Parents need ongoing, practical guidance on how to protect their children online. This guidance should help parents understand and navigate the rapidly changing digital landscape.


Evidence

The speaker mentioned creating practical guides for parents as part of their organization’s efforts.


Major Discussion Point

Approaches to protecting children online


H

Heng Lee

Speech speed

0 words per minute

Speech length

0 words

Speech time

1 seconds

Development of parental control software like Kaspersky Safe Kids

Explanation

Kaspersky has developed parental control software called Safe Kids to help protect children online. This software allows parents to monitor and control their children’s online activities.


Evidence

The speaker described features of Safe Kids, including web filtering, app blocking, and time limits for device usage.


Major Discussion Point

Approaches to protecting children online


Creation of educational resources like cybersecurity alphabet books for children

Explanation

Kaspersky has created educational resources to teach children about cybersecurity. These resources aim to make complex cybersecurity concepts accessible to children.


Evidence

The speaker mentioned a ‘Cyber Security Alphabet’ book that teaches cybersecurity concepts from A to Z.


Major Discussion Point

Approaches to protecting children online


Agreed with

Elmehdi Erroussafi


Agreed on

Importance of education and awareness


Difficulty of enforcing age restrictions for social media use

Explanation

Enforcing age restrictions for social media use is challenging. Even with legal restrictions, it’s difficult to prevent underage children from accessing these platforms.


Evidence

The speaker referenced Australia’s proposed ban on social media for children under 16, questioning how it would be enforced.


Major Discussion Point

Challenges in regulating children’s online safety


Need for dedicated regulatory bodies focused on child online protection

Explanation

There may be a need for dedicated regulatory bodies specifically focused on child online protection. These bodies could address the unique challenges of protecting children in the digital space.


Evidence

The speaker mentioned examples of specialized offices in Singapore, such as the Protection from Falsehood Office and anti-scam commands.


Major Discussion Point

Challenges in regulating children’s online safety


Balancing consumer protection and innovation in regulations

Explanation

Regulations need to strike a balance between protecting consumers (including children) and allowing for innovation in the tech industry. This balance is crucial for effective and sustainable online safety measures.


Major Discussion Point

Challenges in regulating children’s online safety


Differed with

Melodena Stephens


Differed on

Approach to regulating children’s online safety


E

Elizaveta Belyakova

Speech speed

106 words per minute

Speech length

352 words

Speech time

198 seconds

Collaboration between government, industry and civil society stakeholders

Explanation

Effective protection of children online requires collaboration between various stakeholders including government, industry, and civil society. This multi-stakeholder approach allows for comprehensive solutions to complex online safety issues.


Evidence

The speaker mentioned the Alliance for the Protection of Children in the Digital Environment, which brings together Russian technology companies to address digital challenges for children.


Major Discussion Point

Approaches to protecting children online


Agreed with

Elmehdi Erroussafi


Melodena Stephens


Agreed on

Need for multi-stakeholder collaboration


A

Andre Gorobets

Speech speed

93 words per minute

Speech length

471 words

Speech time

301 seconds

Transborder nature of online threats to children

Explanation

Online threats to children are not confined by national borders. This transnational nature of threats makes it challenging to address them effectively through national regulations alone.


Major Discussion Point

Threats to children in the digital world


U

Unknown speaker

Speech speed

0 words per minute

Speech length

0 words

Speech time

1 seconds

Potential for youth-led awareness campaigns on online risks

Explanation

There is potential for youth-led campaigns to raise awareness about online risks. These campaigns could be particularly effective in regions where awareness about online risks is low.


Evidence

A participant from Uganda asked about how to collaborate on awareness campaigns about online child protection.


Major Discussion Point

Role of different stakeholders in protecting children online


Importance of equipping content creators with child safety knowledge

Explanation

Content creators need to be equipped with knowledge about child safety online. This is particularly important for local content creators who may lack awareness about educational curricula and online safety standards.


Major Discussion Point

Role of different stakeholders in protecting children online


J

Jutta Croll

Speech speed

154 words per minute

Speech length

280 words

Speech time

108 seconds

Using UN Convention on Rights of the Child as basis for government action

Explanation

The UN Convention on the Rights of the Child, particularly General Comment No. 25, provides a basis for government action on children’s rights in the digital environment. This international framework obliges states to implement children’s rights in the digital space.


Evidence

The speaker referenced the UN Convention on the Rights of the Child and General Comment No. 25 adopted in 2021.


Major Discussion Point

Challenges in regulating children’s online safety


Agreements

Agreement Points

Increasing sophistication of online threats to children

speakers

Melodena Stephens


Elmehdi Erroussafi


arguments

Increasing sophistication of online threats like deepfakes


Rapid pace of technological change outpacing regulatory responses


summary

Both speakers highlighted the growing complexity of online threats to children, emphasizing how quickly technology evolves and creates new risks faster than regulators can respond.


Need for multi-stakeholder collaboration

speakers

Elmehdi Erroussafi


Elizaveta Belyakova


Melodena Stephens


arguments

Collaboration between government, industry and civil society stakeholders


Need for political will from governments to prioritize child safety


Importance of industry alignment on values and ethics


summary

The speakers agreed on the importance of collaboration between various stakeholders, including government, industry, and civil society, to effectively address online child protection issues.


Importance of education and awareness

speakers

Elmehdi Erroussafi


Heng Lee


arguments

Need for gamified, interactive curricula to teach children about online safety


Creation of educational resources like cybersecurity alphabet books for children


summary

Both speakers emphasized the need for engaging, age-appropriate educational resources to teach children about online safety and cybersecurity.


Similar Viewpoints

These speakers shared concerns about the challenges in regulating and enforcing age-appropriate content and access for children online, suggesting the need for more focused regulatory efforts.

speakers

Melodena Stephens


Elmehdi Erroussafi


Heng Lee


arguments

Lack of alignment on standards for age-appropriate content


Difficulty of enforcing age restrictions for social media use


Need for dedicated regulatory bodies focused on child online protection


Unexpected Consensus

Involvement of children in developing solutions

speakers

Melodena Stephens


Unknown speaker


arguments

Potential for youth-led awareness campaigns on online risks


explanation

There was an unexpected consensus on the importance of involving young people in developing solutions for online child protection. This approach recognizes children as active participants rather than just passive recipients of protection measures.


Overall Assessment

Summary

The main areas of agreement included the increasing sophistication of online threats to children, the need for multi-stakeholder collaboration, the importance of education and awareness, and the challenges in regulating age-appropriate content online.


Consensus level

There was a moderate to high level of consensus among the speakers on the key issues. This consensus suggests a shared understanding of the challenges in protecting children online and the need for comprehensive, collaborative approaches. However, there were some differences in emphasis and proposed solutions, indicating that while there is agreement on the problems, there may still be diverse views on the most effective ways to address them.


Differences

Different Viewpoints

Approach to regulating children’s online safety

speakers

Melodena Stephens


Heng Lee


arguments

Need for political will from governments to prioritize child safety


Balancing consumer protection and innovation in regulations


summary

While Melodena Stephens emphasizes the need for strong political will from governments to ensure children’s safety online, Heng Lee suggests a more balanced approach that considers both consumer protection and innovation in the tech industry.


Unexpected Differences

Overall Assessment

summary

The main areas of disagreement revolve around the approach to regulating children’s online safety and the balance between protection and innovation.


difference_level

The level of disagreement among the speakers is relatively low. Most speakers agree on the fundamental issues but offer slightly different perspectives on how to address them. This suggests a general consensus on the importance of protecting children online, with variations in proposed strategies and emphases.


Partial Agreements

Partial Agreements

All speakers agree on the increasing sophistication of online threats and the challenges in regulating them. However, they propose different solutions: Melodena emphasizes political will, Elmehdi highlights the need for rapid regulatory responses, and Heng suggests creating dedicated regulatory bodies.

speakers

Melodena Stephens


Elmehdi Erroussafi


Heng Lee


arguments

Increasing sophistication of online threats like deepfakes


Rapid pace of technological change outpacing regulatory responses


Need for dedicated regulatory bodies focused on child online protection


Takeaways

Key Takeaways

The threats to children in the digital world are increasing in sophistication and scale, outpacing regulatory responses.


Protecting children online requires a multi-stakeholder approach involving government, industry, civil society, parents, and children themselves.


There is a need for age-appropriate, interactive education on online safety for children, as well as practical guidance for parents and teachers.


Regulatory approaches need to balance consumer protection with innovation, and consider transborder nature of online threats.


The UN Convention on the Rights of the Child provides a basis for government action on child online safety.


Resolutions and Action Items

Continue dialogue and collaboration between different stakeholders on child online protection


Develop more interactive, gamified curricula to teach children about online safety


Create practical guidance materials for parents on protecting children online


Consider establishing dedicated regulatory bodies focused on child online protection


Use UN Convention on Rights of the Child as basis for government policies on child online safety


Unresolved Issues

How to effectively enforce age restrictions for social media and other online platforms


How to address the rapid pace of technological change in regulatory approaches


How to balance children’s rights to privacy and protection in parental control measures


How to align different cultural perspectives on appropriate content/age limits


How to equip local content creators in developing countries with child safety knowledge


Suggested Compromises

Balancing technology bans with education to help children use technology safely


Adapting parental control measures based on child’s age, with more discussion/permission as child gets older


Focusing on building trust and open communication between parents and children rather than strict bans


Combining local action with global coordination on issues like AI regulation


Thought Provoking Comments

If we have a bot which is casually collecting information in conversation with a child, asking: what is your blood type? The child takes it as a very innocuous question and answers it. And this is something that stays on the internet forever, because it is pretty unlikely that the child’s blood type is going to change.

speaker

Heng Lee


reason

This comment vividly illustrates the long-term risks of seemingly harmless online interactions for children, highlighting how easily sensitive personal information can be collected and potentially misused.


impact

It deepened the discussion on the sophistication of online threats and the need for more comprehensive education on digital safety for children and parents.


We assume if it’s at home, it’s fine. So it’s a literacy issue, I think.

speaker

Melodena Stephens


reason

This succinctly captures a key misconception about online safety and frames it as an educational challenge rather than just a technological one.


impact

It shifted the conversation towards the importance of digital literacy for both children and parents, emphasizing education as a crucial component of online safety.


The child needs to trust you enough, to be able to come forward if he’s being harassed or if he’s being bullied on the Internet without fear of reprisal or without fear of punishment.

speaker

Elmehdi Erroussafi


reason

This comment highlights the importance of trust and open communication in protecting children online, moving beyond just technological solutions.


impact

It broadened the discussion to include the role of parent-child relationships in online safety, emphasizing a more holistic approach to protection.


In 2021, the United Nations Committee on the Rights of the Child adopted General Comment No. 25 in regard to… This is the document on which you can base all these questions, including whether there will be education. The states that have ratified the UN Convention are also obliged to implement the rights of the child in the digital environment.

speaker

Jutta Croll


reason

This comment introduced a crucial legal and policy framework for child protection online, grounding the discussion in international law and state obligations.


impact

It provided a concrete basis for advocacy and action, shifting the conversation towards practical steps that can be taken to protect children online based on established international agreements.


Overall Assessment

These key comments shaped the discussion by broadening its scope from purely technological solutions to a more comprehensive approach encompassing education, trust-building, and legal frameworks. They highlighted the complexity of online child protection, emphasizing the need for collaboration between various stakeholders including tech companies, parents, educators, and policymakers. The discussion evolved from identifying threats to exploring multifaceted strategies for creating a safer online environment for children.


Follow-up Questions

How can we address the gap in literacy about cyber threats and their consequences among regulators, engineers, and startup founders?

speaker

Melodena Stephens


explanation

This gap in understanding leads to inadequate regulations and potentially harmful product designs, making it crucial for protecting children online.


What should be the shape and form of a dedicated child protection authority for online content?

speaker

Heng Lee


explanation

As existing regulatory bodies may not be equipped to handle the unique challenges of online child protection, a specialized authority could be necessary.


How can we involve children more in the dialogue about online safety?

speaker

Melodena Stephens


explanation

Children may have unique insights and solutions that adults might not consider, making their involvement crucial in developing effective protection strategies.


How can we create effective, gamified curricula to teach children about online safety?

speaker

Elmehdi Erroussafi


explanation

Gamified approaches may be more engaging and effective in teaching children about cybersecurity, making this an important area for development.


How can we develop practical guides and ongoing support for parents to help them protect their children online?

speaker

Elmehdi Erroussafi


explanation

Parents often feel lost or overwhelmed by rapidly changing technology, so practical guidance is essential for effective child protection.


How can we better align standards for age-appropriate content across different countries and platforms?

speaker

Melodena Stephens


explanation

The lack of alignment in standards for what is appropriate for children of different ages creates confusion and potential risks.


How can we improve international cooperation for apprehending online criminals who target children?

speaker

Melodena Stephens


explanation

The cross-border nature of online crimes makes international cooperation crucial for effectively combating threats to children.


How can we develop global guidelines for AI regulation, particularly in relation to child safety?

speaker

Elmehdi Erroussafi


explanation

As AI presents both opportunities and risks for children online, global guidelines are necessary to ensure consistent protection across borders.


Disclaimer: This is not an official session record. DiploAI generates these resources from audiovisual recordings, and they are presented as-is, including potential errors. Due to logistical challenges, such as discrepancies in audio/video or transcripts, names may be misspelled. We strive for accuracy to the best of our ability.