WS #14 Children in the Metaverse

16 Dec 2024 10:00h - 11:30h

Session at a Glance

Summary

This discussion focused on children’s rights and safety in the metaverse and other virtual environments. Experts from various fields explored the challenges and opportunities presented by these emerging technologies. The conversation highlighted that children are already active users of virtual worlds: figures cited in the session put the metaverse at around 600 million monthly active users, with 51% under 16 and 31% under 13. Participants discussed the need for age verification and data protection measures, balancing these with data minimization principles.

The discussion touched on existing regulations such as the GDPR and their applicability to virtual environments, while noting the need for more specific governance frameworks for the metaverse. Experts emphasized the importance of involving children in the development of policies and technologies that affect them, as well as the need for child-friendly reporting mechanisms and effective remedies in virtual spaces.

The potential benefits of the metaverse for children’s education, creativity, and advocacy were highlighted, alongside concerns about privacy, safety, and potential exploitation. Participants stressed the importance of digital literacy for both children and adults. The discussion also covered the role of parents and educators in supporting children’s safe engagement with virtual environments.

The Global Digital Compact was presented as a framework for shaping the digital environment, with a focus on protecting children’s rights online. Overall, the discussion emphasized the need for a balanced approach that protects children while also empowering them to benefit from the opportunities presented by virtual worlds and emerging technologies.

Key points

Major discussion points:

– The metaverse and virtual worlds are already inhabited by many children, raising concerns about safety and rights

– There are challenges around age verification, data collection, and protecting children while allowing participation

– Existing regulations such as the GDPR provide some guidance, but there are gaps in regulating virtual reality environments

– Children want to be empowered and involved in shaping policies for the metaverse, not just protected

– The metaverse offers opportunities for education and child advocacy if developed responsibly

The overall purpose of the discussion was to explore how children’s rights and safety can be ensured in virtual reality and metaverse environments, while also leveraging the opportunities these technologies offer for children’s development and participation.

The tone of the discussion was primarily serious and concerned about child protection, but became more optimistic towards the end when discussing the potential benefits and children’s desire to be involved. There was a mix of caution about risks and enthusiasm about possibilities throughout.

Speakers

– Jutta Croll, Chairwoman of the Digital Opportunities Foundation

– Michael Barngrover, Managing Director of XR4Europe

– Deepak Tewari, Founder of Privately, a technology company focused on online child safety

– Sophie Pohle, Advisor at the German Children’s Fund (Deutsches Kinderhilfswerk e.V.)

– Lhajoui Maryem, Projects and Network Lead at the Digital Child Rights Foundation

– Emma Day, Nonresident Fellow at the Atlantic Council’s Digital Forensic Research Lab and a human rights lawyer

– Deepali Liberhan, Global Director of Safety Policy at Meta

– Torsten Krause, Project Consultant, Children’s Rights in the Digital World

– Hazel Bitana, Deputy Regional Executive Director at the Child Rights Coalition Asia

Full session report

The Nature and Scope of Virtual Worlds and the Metaverse

The discussion began with Michael Barngrover, Managing Director of XR4Europe, exploring the broad nature of virtual worlds. He highlighted at least four relevant concepts: virtual reality (VR), mixed reality, 3D gaming, and social media platforms. Barngrover also noted that social media platforms are already functioning as virtual worlds based on user behavior, and discussed the cognitive load of multi-presence in these environments.

Deepak Tewari, founder of Privately, provided concrete statistics, noting that the metaverse currently has 600 million monthly active users, with 51% under the age of 16 and 31% under 13. This revelation underscored the urgency of addressing children’s rights and safety in virtual environments.

Children’s Rights and Safety in Virtual Environments

Sophie Pohle, representing the German Children’s Fund, referenced General Comment 25 as a framework for children’s rights in digital environments, establishing a legal and ethical foundation for the conversation.

Lhajoui Maryem from the Digital Child Rights Foundation emphasized the need for age-appropriate content and safer digital experiences for children. Hazel Bitana from the Child Rights Coalition Asia presented children’s perspectives from the Asia-Pacific region, stressing the importance of child-friendly reporting mechanisms and effective remedies in virtual spaces. She also highlighted children’s desire to be involved in policymaking processes.

Deepali Liberhan outlined Meta’s approach, explaining that the company implements default privacy settings, parental supervision tools, and third-party age verification through YOTI for teens. However, this platform-specific approach was contrasted with calls for broader, more comprehensive safeguards across all virtual environments.

Data Collection, Privacy Concerns, and Age Verification

Emma Day, a human rights lawyer and artificial intelligence ethics specialist, highlighted the unprecedented amount of sensitive data collected in the metaverse as a significant concern. The discussion revealed tensions between effective age verification and data privacy.

Deepak Tewari presented on privacy-preserving age detection technology, arguing that such solutions already exist but face pushback. He suggested implementing age verification at the device or operating system level to minimize data collection. This contrasted with Liberhan’s emphasis on balancing age verification with data minimization principles, highlighting the complexity of implementing safety measures without compromising user privacy.

Regulation and Governance of Virtual Environments

The applicability of existing regulations like GDPR to virtual environments was discussed, with Emma Day noting that while these regulations apply, new challenges emerge in the metaverse context. She advocated for a multi-stakeholder approach to developing governance frameworks.

Torsten Krause mentioned the Global Digital Compact as a framework for protecting children’s rights in digital spaces. He elaborated on its principles and potential impact on international cooperation for digital governance. Krause also emphasized the responsibility of states in implementing child safety policies and standards.

The discussion touched on the Australian law preventing children under 16 from using social media, sparking debate about the effectiveness and implications of such stringent measures.

Opportunities and Risks of the Metaverse for Children

The potential benefits and risks of the metaverse for children were explored. While some speakers highlighted educational opportunities through immersive experiences, audience members raised concerns about addiction and disconnection from reality. Hazel Bitana provided a balanced perspective, acknowledging the potential for creative expression while also warning about the risk of deepening discrimination in virtual spaces.

The importance of avatars in virtual worlds was discussed, highlighting their role in self-expression and identity formation for children and teens.

Unresolved Issues and Future Directions

Several key issues remained unresolved, including effective age verification without excessive data collection, appropriate consequences for virtual harms, and closing digital divides in access to metaverse technologies. The ongoing challenge of balancing innovation with protection in metaverse regulation was emphasized.

Jutta Croll mentioned an upcoming session on age-aware internet of things, while Emma Day noted a panel on governance of edtech, neurotech, and fintech, indicating the breadth of related topics to be explored.

Conclusion

The discussion underscored the complex nature of children’s rights and safety in virtual environments. While there was broad agreement on the importance of protecting children, significant differences emerged in approaches to implementation. The conversation highlighted the need for continued dialogue, research, and multi-stakeholder collaboration to develop effective governance frameworks that balance safety, privacy, and children’s right to participation in the evolving digital landscape.

Session Transcript

Jutta Croll: Welcome to our workshop, Children in the Metaverse. My name is Jutta Croll, I am the chairwoman of the Digital Opportunities Foundation. Welcome to those people who are on site in Riyadh, and also to those who are taking part in our session online. As I said before, my name is Jutta Croll, I’m chairwoman of the German Digital Opportunities Foundation, and I prepared the workshop proposal together with my colleague, Torsten Krause, and with Peter Josius from the Netherlands. Thank you for being here. Yes, just to set the scene: with the availability of artificial intelligence and the creation of virtual worlds, the immersion of digital media in everyday life has reached a new level of development, although the concept of virtual reality dates back 35 years, like the World Wide Web and like the Children’s Convention. So we have a coincidence of certain developments, but nonetheless, virtual reality has only come into our lives during the last several years. And what we already know from 20 years of children using digital devices and being online is that they are the early adopters, and they will also be the first inhabitants of the virtual environment. So that is why we have come together to consider how children’s rights can be ensured in virtual reality and the metaverse, and I’m really happy to have these esteemed speakers around me and also online. I will introduce the speakers to you once it is their turn to speak, and we will begin with Michael Barngrover, who can already be seen in Zoom. Michael is the Managing Director of XR4Europe, which is an industry association with the mission to strengthen collaboration and integration between XR industry stakeholders across the European continent. So he’s coming from the technical side, but I know that he has children’s rights on his mind, and I will hand over to you, Michael. Your slides are already on screen, so please just start your presentation. Thank you.

Michael Barngrover: Excellent. Thank you very much. Just to confirm that everyone hears me all right. Good. Yep. Okay. Then yes, I’m the Managing Director of XR4Europe, and we are an association that supports those who work with XR all over the continent of Europe, both researchers as well as companies and policymakers, helping to make the future of virtual worlds in Europe one that we want to have happen, in which future generations can be healthy and productive. And so we can go to the next slide; I’d like to focus on the scope, so scoping virtual worlds. So the next slide, is that something that I don’t have control over doing? No, I do not. Can you move to the next slide? Yeah, thank you. Okay. So virtual worlds is a challenging term, because it’s very broad and very encompassing. There are at least four broad concepts of virtual worlds that I think are very relevant. The first one is VR, virtual reality: you’re completely immersed, taking place and sharing place, often with others, in a fully digital environment. But of course, it’s not fully digital, because you are there, and you have both a digital and a non-digital existence. Then we have mixed reality, which is becoming more prominent, starting last year with new devices, new headsets that can do this. And almost every device going forward is going to be increasingly capable of mixed reality, which brings the digital and the non-digital together in an integrated experience. That means that in the same way that you are present in this digital environment, this digital medium, you are still present in your physical environment, so you’re having to interact with both. But both of these are built off of another kind of virtual world, which is much more mature and much more common, which is basically traditional 3D gaming, and even 2D gaming. These worlds, like this image here of Fortnite, are places where hundreds of millions, even billions of users are actively engaged, including a lot of children. So this is where the virtual world starts to extend beyond what we think about the future, and really what we’re actually concerned about today, right now. So the lessons we can draw from what makes these environments safe and healthy today should carry over to the future virtual worlds of mixed reality and virtual reality. But even these 3D worlds follow a lot of the same patterns as 2D and non-immersive, non-3D environments, like even social media platforms. Social media platforms are virtual worlds, in that they are populated, they are active, they are interactive, but they are virtual. They don’t have a direct correlate with the physical world, except through us, the users. Can I go to the next slide, please? Because in the next slide, we’re going to talk about, sorry, it’s back a couple of slides, the cognitive load of multi-presence. So a couple of slides before this one. So there is a cognitive load to thinking about and managing your activities and your presence, and thus the activities and presence of others, in multiple worlds: the non-digital world, but also the digital virtual worlds. And when there are many virtual worlds that we are active in, that is an additional load. So already this is, again, something we’re doing right now with social media platforms, but it’s also true of 3D platforms that are persistent.
The traditional gaming platforms are always there. Our friends may be there. When we’re not there, these worlds are still active, and we are still maybe thinking about them, particularly children, who generate a lot of their social equity through their activities on these platforms. As these become 3D immersive spaces, it’s actually even more challenging. There’s more that we have to think about. We have to think about, as we move around in this space, how the actual physical space maps and correlates to the virtual space, for example in a mixed reality environment. The table or the chair, those things are in the virtual environment whether they are represented there or not. They’re in my physical space, so they’re also in my virtual space, even if they’re not virtually represented there. So even from a safety perspective, you have to manage these things. But when we start putting more people in there with us, that becomes more complicated as well. So cognitive load for everyone, but especially children, is something really to be concerned about. Can we go to the next slide, please? The avatar slide. So when we’re talking about being in these worlds with each other, yes, we have this question about how we’re represented, how we’re viewed there. So avatars are a really important topic. I’ll go a little quicker just because we’re getting, I think, a little over time for my part. But avatars come in many different forms, and it’s not possible for any one company to provide you all the ranges of representation here, whether cultural or even age and ideology. So there’s a big question between whether people should be able to make their own avatars and bring them into these virtual worlds, or whether they should limit themselves to the options that are provided to them. And these, of course, have tremendous consequences, particularly for younger people, as they start to spend more time building social equity while using avatars to represent themselves with others. And it’s not just their choice. When they choose an avatar, it does not mean that that’s what they are. They have to then still negotiate with others, with peers, to establish what their identity is in that social environment. So avatars are just a tool towards establishing identity in a virtual world. And then the next slide, please. Just very quickly, once we’re in there and we’re represented, sorry, a couple slides back, there’s a slide about policing. Basically, once we’re in this environment, it is still something of a free-for-all in virtual worlds, because we do not have anything akin to policing or criminal justice. So when crimes happen in these virtual environments right now, it’s very hard to arrive at an acceptable consequence for them, but it’s also very difficult to know what a suitable consequence is. A virtual crime is a crime by definition, but we don’t know how much it’s equivalent to the similar or same crime that may take place outside of the virtual realm. And this is, again, not new to virtual worlds. If we consider social media platforms to be a form of virtual world, then this is something that already we’re trying to legislate and understand. So I’ll close it there, so we can move on to the next speaker, but thank you very much.

Jutta Croll: Thank you, Michael, for giving us a first insight into what virtual worlds will be, can be, or already are. We will now go to Deepak Tewari from Privately, who will tell us a little bit about technology to guide us through the metaverse. Deepak, I can see, sorry, I can see everyone. Okay, we can hear you, and now we can also see you. Would you please introduce yourself, and then we will have a look at whether your slides will work, because there are some technical issues. It’s the first day of the Internet Governance Forum, and we are in a really busy, busy room, and the technicians are doing a great job, but sometimes things go wrong. We will try. It’s all right. Even if there are no slides, I will be able to conduct myself.

Deepak Tewari: I’m very happy to be here addressing all of you. I run a company based out of Lausanne in Switzerland, and the company is called Privately. Privately is a technology company, and for the last decade we have been developing and deploying various technologies to keep minors safe online. This includes technology that identifies threats and risks online, but also the technology of age assurance, which is essentially the technology behind being able to identify whether someone is a child online. Hopefully I can get my slides up; I would have really wanted to show you how it is working and what it is like. But going back to the subject at hand, the metaverse. My previous speaker was mentioning avatars and the state of the metaverse, or virtual worlds in general. And that’s essentially where I’m calling in from; I’m sitting very close to that lake that you see here, and it’s very beautiful. So, very happy to be here. And if you could kindly go to the next slide. So, this summarizes what Privately does: we have various kinds of technology, safeguarding and privacy-enhancing technology, but something which is very pertinent to today’s discussion is the technology behind being able to identify whether a participant online, and especially in the metaverse, is a child. And can we do this? That’s something I’m going to talk about, but as I said, let’s back up: about the metaverse, a couple of notes that I took. As of the latest statistics today, what roughly categorizes as the metaverse has about 600 million monthly active users. And according to some reports that I have seen, particularly from Wired and another agency called Data Reg, 51% of the active users in the metaverse are under the age of 16, 31% are under the age of 13, and 84% are under the age of 18. And you might ask me why that is the case: because most of what we know as the metaverse or virtual worlds is made of gaming environments, Roblox, Minecraft, Fortnite, which explains why such a large number of the participants in there are actually minors. One more interesting piece of data: 31% of US adults have never heard the term metaverse. So that just tells you how big the divide is, and essentially this virtual world that we are talking about is actually full of children. And this has been our experience as well, developing technology, which takes me to the next slide, please. Yes, I don’t know if you can get this working; if you can, then I’ll let you see it, and I’ll comment after. There is some volume behind it, if you can see it. Well, if the volume is not playing, I’ll tell you. First of all, can we get the audio playing on that one? It’s playing, but it’s very faint. Okay, so let me tell you what’s happening. This is actually our technology at work inside a VR headset. First an adult is speaking, and when the VR headset detects that it’s the adult speaking, it gives them access to adult content. As you saw, the advertisements were meant for adults. But after that, if the audio was playing, you would now hear a child speaking. And as soon as the technology detects that it’s a child speaking, the environment changes, and the only content, in this case advertising, that is
shown to the participant is related only to children. So what I wanted to illustrate with this: you will see this time it is the child, so it goes inside, and you’ll see all child-related content. There is a QR code here, so you could actually watch this video on YouTube directly as well. The point we are trying to make here is that technology exists today from companies like Privately, which have developed fully privacy-preserving, GDPR-compliant solutions for detecting the age of users. And as you can imagine, detecting the age of users is the cornerstone of keeping users safe online. The demo that you just saw we actually had implemented for a very, very large social media network. It was in trial; they decided not to pursue it, for reasons best known to them. But as you can imagine, the technology exists today, and we are seeing more and more of these technologies which are privacy-preserving, run in the background, and are able to differentiate in real time, just as we as human beings recognize that this is a child and this is an adult, and they can do so in the background. And in doing so, they can ensure that the platform or the service or the virtual environment is able to deliver an age-appropriate experience. So this is essentially my message. If you need to know more about the tech, those are the QR codes that I have left behind. You could very happily either contact me or look at the codes here at the showroom. You can even test the technologies. So thank you for your time. And that’s me. That’s over now.
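To make the mechanism concrete, here is a minimal Python sketch of the kind of continuous, on-device age gating the demo describes. The classifier estimate_age_band and the smoothing window are invented placeholders for illustration, not Privately’s actual API:

```python
# Minimal sketch of continuous, on-device age gating as described above.
# estimate_age_band() stands in for a hypothetical on-device voice model;
# it is an invented placeholder, not Privately's actual API.
from collections import deque


def estimate_age_band(audio_frame: bytes) -> str:
    """Placeholder for an on-device classifier returning 'child' or 'adult'."""
    raise NotImplementedError


class ContinuousAgeGate:
    """Re-classifies every audio frame, so a speaker change is caught quickly."""

    def __init__(self, window: int = 10) -> None:
        # Smooth over the last `window` frames to avoid flip-flopping on noise.
        self.recent = deque(maxlen=window)

    def on_audio_frame(self, frame: bytes) -> str:
        self.recent.append(estimate_age_band(frame))
        # Majority vote: if most recent frames sound like a child,
        # serve only child-appropriate content (in the demo, advertising).
        if self.recent.count("child") > len(self.recent) // 2:
            return "child_content"
        return "adult_content"
```

The key design point, matching the demo, is that classification runs continuously on the audio stream rather than once at sign-up, so the experience switches within a few frames when the speaker changes.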

Jutta Croll: Thank you, Deepak. Thank you so much for giving us an insight into how children, or maybe all users, might be protected in virtual environments. We now have about seven to eight minutes for questions, either from the floor here on site or from our online participants. Do we have any people who want to come in with their questions? Michael and Deepak will be there to answer your technology-related questions, or any further questions you have so far. Yes, we have a hand raised. You will need to take a microphone.

Audience: Hi, my name is Martina from Slovakia. I’d like to ask you about this solution for the privacy of the children. Is it not true that a child’s voice will eventually change into an adult’s voice?

Jutta Croll: Yes, please Deepak, go ahead.

Deepak Tewari: So, look, this is real time. Maybe to clarify, this is not happening on a one-time basis; this will become a feature of the microphone. Each time you’re speaking, the microphone can detect whether it’s a child or an adult. So it’s not like there is a one-time check done and you are categorized as an adult or a child. This is real time, this is continuous, and this is a feature of the device itself. In a very similar way, we’ve also created age-aware cameras. If you go into shops, the camera looks at you and detects if it’s an underage person, and they will not serve you alcohol or any restricted item. So you have to think of it as being continuous, and not done once forever or for a long time.

Jutta Croll: So I see she’s nodding, so the answer was accepted. As you’ve been speaking about age awareness, it gives me the opportunity to mention that we will have another session on Wednesday at 10.30 on the age-aware internet of things. So perhaps put that in your schedule. We have another question from the floor. The microphone, please.

Audience: What if an adult is using AI to convert his voice from an adult’s to a child’s, to deceive the program and get into the child’s room?

Deepak Tewari: Yes, that is correct. You’re right, there is a threat, because these days there are AI programs and generative AI that can be used for such attacks. You have to go into a little bit of depth on this. There are two kinds of attacks. One is called a presentation attack, where you use an external program and, for example, play a child’s voice. The other is where you actually inject a child’s voice into the program, which is a little more difficult; it’s called an injection attack. There are obviously technologies to detect both, and you could always argue that it’s a battle between how good the detection technology is versus the technology trying to fool or spoof it. But an interesting thing here is that, because this is continuous, in some use cases we’ve seen the technology being used to detect anomalies. If the same person is talking like a child to someone and talking like an adult to someone else, that produces an anomaly in the system, and that could be used to flag that there is a person in that group who is probably malicious. So there are ways and means to detect these things. But like you rightly said, there’s a contest between technologies. Solutions do exist, by the way, to detect spoofed voices and injection attacks.
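As an illustration of the anomaly idea Tewari describes, the following hedged sketch flags an account whose voice classifies as a child in one conversation and as an adult in another. The data model is invented for illustration, not a real detection product:

```python
# Illustrative sketch of the anomaly check described above: an account that
# classifies as a child in one conversation and as an adult in another gets
# flagged for review. The data model here is invented for illustration.
from collections import defaultdict


class AgeAnomalyDetector:
    def __init__(self) -> None:
        # account id -> set of age bands observed across conversations
        self.bands_seen: dict[str, set[str]] = defaultdict(set)

    def record(self, account_id: str, age_band: str) -> bool:
        """Record one classification; return True if the account looks anomalous."""
        self.bands_seen[account_id].add(age_band)
        # A genuine user should classify consistently over time; mixed bands
        # suggest voice spoofing, an injection attack, or account sharing.
        return len(self.bands_seen[account_id]) > 1


# Example: the same account presents as an adult in one chat, a child in another.
detector = AgeAnomalyDetector()
detector.record("user42", "adult")         # returns False, nothing unusual yet
print(detector.record("user42", "child"))  # returns True, flag for review
```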

Jutta Croll: Thank you, Deepak. We have time for one more question, and then we go to our colleague, Sophie.

Audience: This is very important work, especially in a world where governments are now contemplating whether children under 16 should even have access to platforms or not. You talked about working with some big platforms, but is there any ownership of this kind of work at a more strategic level? Because this means fewer consumers of their content. And also, because of the way you register for these big tech platforms, you can just say you are over 16; they don’t ask for your voice or anything that you can detect. So is there any more work by Privately that you can talk about on how to stop the exploitation of children by big tech platforms?

Deepak Tewari: Look, thank you for your question. It is true, I have to say, that we are seeing a big pushback from big platforms, big tech. And I hate to say this, but it is because age assurance comes as a threat to their business model. Imagine you’re advertising to everyone and saying, I’m only showing these ads to adults, but a part of your users are kids. If you just follow the money, then this is a misappropriation of advertising revenues, setting aside all the morality and all the child safety here. So it is not in the interest of big tech to support age verification, which is why you see stalling and fear, uncertainty and doubt being sown by pretty much all the big tech platforms. We are actually even fighting a case in the Supreme Court in the U.S. right now. So we are active everywhere. The technology exists, but big tech does not want to do it directly. And it’s always about liability: who has the liability for this? As long as that is not settled, and as long as there are no strict fines, unfortunately, big tech will keep pushing this away a little bit. So we as a small company are trying to do our bit: challenging in court, doing thought leadership, also going into games, building it and showing people that it’s privacy-preserving, it works, it’s functional. We have certified all of the technology publicly. But I have to say that there is a business model contradiction with big tech. So, yeah, there is a problem.

Jutta Croll: We already have big tech in the room as well, so we’ll come to that question afterwards. But first, I would like to refer to the lady from the floor who referenced children’s rights and their right of access to digital media and digital communication. And that’s the point where Sophie Pohle from the German Children’s Fund comes into play, and she will give us a short introduction to General Comment No. 25. For those of you who have not heard about it, we have the document printed out here; you can take it when you leave the room, and you will also find it on the website www.childrens-rights.digital. Yes, thank you. Sophie, over to you, please.

Sophie Pohle: Thank you, Jutta. I hope everyone can hear me. So, hello and welcome from cold and rainy Germany, from Berlin. I’m Sophie Pohle. I’m from the German Children’s Fund, a children’s rights organization here in Germany, and we have been collaborating with the Digital Opportunities Foundation for years, including the joint coordination of Germany’s consultation process for the GC25, General Comment No. 25. And now I’ll set aside my role as online moderator of our session today to give you a brief overview of General Comment No. 25 as a framework for our discussion. So, let’s start. The UN Committee on the Rights of the Child published General Comment No. 25 in 2021, three and a half years ago, focusing on children’s rights in the digital environment. This document guides state parties on implementing the Convention on the Rights of the Child in digital contexts. It was developed with input from governments, from experts, and also from children, and offers a practical framework to ensure comprehensive and effective measures are in place. Our session today explores the metaverse, a topic which is not directly named in General Comment No. 25. However, the GC highlights that digital innovations significantly influence children’s lives and rights, even when they do not directly engage with the Internet, and by ensuring meaningful, safe and equitable access to digital technologies, GC25 aims to empower children to fully realize their civil, political, cultural, economic and social rights in an evolving digital landscape. Let me briefly introduce the four general principles of the GC25, starting with non-discrimination, which emphasizes ensuring equal access and protection for all children. Second, we have the best interests of the child, prioritizing children’s well-being in the design, regulation and governance of digital environments, including of course virtual worlds or the metaverse. Thirdly, we have the right to life and development, which means ensuring digital spaces support children’s holistic growth and development. And last but not least, we have the principle of respect for the views of the child, which means considering children’s perspectives in digital policymaking and platform design. What are the central statements or key emphases of the GC25? The General Comment recognizes children’s evolving capacities and how user behavior varies with age. It highlights opportunities and different levels of protection needs based on age, and also stresses the responsibility of platforms to offer age-appropriate services for children. The GC25 calls on state parties to support children’s rights to access information and education using digital technologies. It also urges them to ensure that digital technologies enable children’s participation at local, national and international levels, and highlights the importance of child-centric design, which means integrating children’s rights into the development of digital environments, for example with age-appropriate features, content moderation, easily accessible reporting or blocking functions, and so on.
On the one hand we have those opportunities and participation calls, and on the other hand we also have the GC25 calling on state parties to identify risks and threats for children, incorporating their perspectives. It also calls for solid regulatory frameworks, both to establish clear standards and international norms and to implement legal and administrative measures to protect children from digital violence, exploitation and abuse. GC25 also encourages cooperation among stakeholders like governments, industry and civil society to tackle dynamic challenges, and last but not least, it underlines the need to promote digital literacy and safety awareness for children, parents and educators to ensure informed participation. In my last minute I’d like to give a very brief insight into children’s perspectives that were collected during the consultation process on GC25. The principles laid out in GC25 are directly informed by the needs and expectations children voiced globally during the consultation process; I think more than 700 children globally were consulted. And to conclude, I brought some key insights, on a very general level, from young people on how we can better support them in the digital world, before Maryem, after me, takes us through children’s perspectives in much more detail. So what do children want? They want equitable, reliable access to digital technology and connectivity. They also wish for age-appropriate content and safer digital experiences, where platforms protect them against harassment, discrimination and aggression, and rather enable them to participate and express themselves freely. Children themselves demand greater privacy and transparency about data collection practices. They also want more digital literacy education, also for their parents, by the way. And they want recognition of their right to play and leisure, which is also crucial when we talk about the metaverse. So much for now; I think my time is up. That was on a very general level. Thanks a lot for your attention, and I’m happy to answer questions.

Jutta Croll: Thank you, Sophie. Thank you so much. Welcome. Can you hear me? Okay, it’s working. Thank you so much for already touching upon digital literacy, because we will also come to that point and discuss how the metaverse will open up huge opportunities for digital literacy training for children as well as for their parents and other adults. But first, we go to Maryem. Maryem is a very young person; she holds a bachelor’s degree and a master’s degree, so I’m really impressed by you. You’re representing youth on our panel. Please go ahead; your slide is already on the screen.

Lhajoui Maryem: Thank you. Can everyone hear me? Yes. So before I start, thank you, Sophie. And I’m very happy to see so many people in the room, actually. Thank you for being here. So my name is Maryem, and I’m here on behalf of the Digital Child Rights Foundation. We are a youth platform and expertise center based in the Netherlands, but we focus internationally. What we do at Digital Child Rights, our main goals, are to really promote the importance of digital child rights and to include children’s opinions on what is safe for them. We do this in different ways. We create playful tools for children. We also gather most of our information on how they view things through our child rights panel. And for youth, we have many youth ambassadors in the Netherlands, but also in other countries. So we really want to give youth a platform to connect through connection challenges. So Sophie, thank you again for a very clear overview of General Comment 25; you might see some overlap here. At Digital Child Rights, these are our 10 themes, and they are based on General Comment 25. So while the metaverse that we’re talking about offers great opportunities for children and youth to actively participate, with many chances in ways we didn’t know before, at the same time we also have to remain critical and acknowledge that there are challenges. We have to prioritize safety, privacy and fairness in the best interest of the children, and that’s what we really focus on. So when it comes to privacy, it was already mentioned before: who’s collecting my data, where is my data going, how old am I? Safety was also addressed very clearly by my colleague Deepak. When it comes to age verification, can I pretend to be way older than I am, or way younger than I actually am? We know the dangers that come along with that, but what actual action should be taken there, and how do children even view this? Do they even know that it’s possible to meet other people online who, unfortunately, might not have their best interest at heart? So this is where the rules are very important, the rules as outlined in the General Comment, like Sophie explained; so how do children view them, and what are we going to do about this? And then also really important, at Digital Child Rights and for many others, as I also heard, is digital inclusion: can everyone participate? We’re based in the Netherlands and we speak with a lot of youth in the Netherlands, but it’s different when you look at other countries, so can everyone participate? At Digital Child Rights, we actually conducted some interviews at Gaza Camp, Jordan, this year, where we spoke to Palestinian refugees in Jordan about access to the internet, access to the metaverse, what opportunities there are for them in the metaverse, and what’s important to them. So this digital inclusion is very important, especially as it is so closely interlinked with education, where education plays such a vital role. And then also very important is opinion: can everyone give their opinion online? Are children free to say what they want,
and also able to do that within the frameworks of do not bully and non-discrimination, as outlined in the General Comment? So it is very important that it’s equal and that they can give their opinion, and we gather a lot of their opinions. At Digital Child Rights, we’re always looking to connect with other youth platforms. So I do invite all of you sitting here to talk to me afterwards about opportunities to enhance this, to really strengthen the voice of youth and children, not only in this room and not only in the Netherlands or in Europe. That’s also why we are here, and I would love to meet many of you. So the question for us is: what can we do in the best interest of children, and how do we really include them in it? I hope to talk to many of you after this.

Jutta Croll: Yes, but we also now have about 10 minutes for you to take questions, as well as for Sophie. So do we have any questions here in the room, or otherwise in the online chat? Anyone who wants to raise their hand? Yes, okay, Emma, please. It should work.

Audience: Thank you for the presentation. So I think the question was for me, right?

Lhajoui Maryem: Yes, thank you, Emma, for your question. That’s a really good question. We should always ask ourselves how much knowledge is already present in the room when speaking about these important issues. So when it comes to children, we create playful tools to connect with them. There is, what do you call it, Kofta? Like a card game, and my colleague is holding it. Perfect. So there is a card game. It’s a playful way to really talk about what is connected to questions like: can everybody participate, and what does that even mean? And also, with safety: what does safety even mean? And, as we wrote here, it is so that you’re aware that you can ask for help when you’re in danger, and that there are many opportunities and chances for you. We also offer different kinds of workshops. There’s a workshop with masks, where children make their own mask. And this really portrays the age verification question and also, as outlined in the earlier presentation by Michael, I think, the avatar question: who am I even in the online world? Am I a different person than I am in the real world? So we really encourage them to think about this before they then give their opinions. And still, there’s so much to learn. Through these playful tools, and through the connection challenges they do with other platforms, we encourage them to learn more. Then we can learn more from them, and they can also give better opinions. So I hope that answered your question, Emma.

Jutta Croll: We have another question here. And then I would also turn to Sophie to tell us a little bit, if you’re able to do so, about children’s participation in General Comment No. 25, which pretty much relates to what you have said. So we take those two questions. Yes.

Audience: I’m a youth ambassador from Hong Kong, from the One Path Foundation, and I want to ask a question: there are lots of attractions in the metaverse, so how can you prevent children from becoming addicted to those attractions and cut off from reality?

Jutta Croll: Okay, it’s a question with regard to addiction to the metaverse. You’re talking about addiction to the metaverse, right? Yes. Okay, I’m not sure whether this question should go to the panel or whether we should come to it afterwards. Who would you want to pose the question to? Maryem? Okay, go ahead.

Audience: Because just now she said that there are lots of playful tools.

Lhajoui Maryem: question, actually. So there is many, there’s many things in the online world. And I must admit that I also get maybe a bit addicted sometimes. And maybe I can get a bit lost, you know, when you’re scrolling. So what can we do? This is actually also a question for you and for me at the same time. So what can we do to help each other and help our friends that are the same age to really not get lost in that? So what we at Digital Child Rights do is we, so like I said in the beginning, the metaphors, yeah, we can get a bit lost in it. But it’s also that, you know, we’re also going with the time. So it is also a place where there’s many great opportunities as long as we know how to handle it, right? So we really try to make young people aware of the dangers, yeah, the challenges and also the nice things. So we really try to tell you, if anything happens, then there’s always some kind of help offered. And that you are really aware that you have the right to be safe, right? Because Winston, if I tell you, yeah, you should not spend any time on your phone. That’s a bit crazy, right? Maybe. I don’t know. If you want. But we need to regulate it. We can’t spend too much time. But it’s also interlinked. You can learn a lot. There’s also, it can help you with your education. So to answer your question, I’m sorry.

Audience: Hi. I’m from India. I just wanted to ask Maryem and Sophie both: she talked about digital literacy, and you’re talking about programs with children. How do you involve parents and educators when it comes to children in the metaverse? Because it’s crucial to have them on board when we talk about children’s engagement in online spaces.

Jutta Croll: I’ll give that question to Sophie. But only a short answer, Sophie, please. Because we are running out of time.

Sophie Pohle: Yeah, that’s a question we also discussed a lot in Germany for a few years and I think the key is to involve the schools more to reach every child because every child goes to school and there we have the parents too. So that’s key I think and we also need to think about how can we reach those parents who are not already sensitive to these topics because often we see that parents inform themselves when they see there’s a problem but we have a lot of parents also that do not have access to this information and who do not have the resources and we do need to think about more how to involve them and to get to them directly. It’s a very complex question to be honest, very difficult for a short answer. Okay, we will follow up with that as well.

Audience: Hello, it’s not a question, it’s a proposal to the Digital Child Rights Foundation. So I am Sadat Rahman from Bangladesh. We are also working for teenagers in Bangladesh and in Bangladesh we have a helpline 13 to 19 is a cyber teens helpline if in any Bangladeshi teenager facing cyber bullying, cyber harassment, they can call us and we are working with Netherlands also in Child Helpline International and I received the International Children Peace Prize 2020 from Kissright. So I would like to work with Digital Child Rights Foundation. We need mentors. Thank you.

Jutta Croll: Thank you so much. We’ll leave it at that, because we already have our next speaker on my left side, and Hazel is also in the Zoom room, but we will start with Emma Day. She is a human rights lawyer and an artificial intelligence ethics specialist, and also the founder of a consulting company specializing in human rights and technology that works with UN agencies. Emma, over to you, because now we want to talk about regulation. When we set up the workshop proposal, the Australian law keeping children under the age of 16 off social media was not in force, not even under debate. So now we have a different situation, but I’m pretty sure you will be able to address it.

Emma Day: We specialize in human rights and technology, and I’m going to talk to you a bit about the existing legal and regulatory landscape in the metaverse, and maybe where some of the gaps might be. The metaverse is an evolving space, and some of what we’re talking about is a little bit futuristic and may not actually exist yet, but we’re talking about a kind of virtual space where unprecedented amounts of data are collected from child users, and adult users as well. Already today we have apps and websites which collect lots of data about users, about where they’re going online and how they’re navigating their apps, but in the metaverse companies can collect much larger volumes of data and much more sensitive data: things like users’ physiological responses, their movements, and potentially even their brainwave patterns, which may give companies a much deeper insight into users’ thought patterns and behaviors. These can then be used to really target people with marketing, to track people for commercial surveillance, or even be shared with governments for government surveillance. So it’s something that takes us to another level when we’re thinking about data governance. Then we have people’s behavior in the metaverse, both children and adults. If someone says something in the metaverse space, is it the same as posting content online? What laws should apply there? Who is liable for content they post online? If I use an avatar online, am I responsible for its speech? And if my avatar abuses somebody who’s wearing some kind of haptic device, so that they can feel the touch, then how do we deal with that? So there are some questions that we don’t really have answers to from regulators, but there are some regulations that we know apply. For example, the GDPR would still apply in the metaverse. This is the European regulation around data protection, and there are many laws around the world now modeled on the GDPR, which is used for data protection in business. And that means also that the children’s codes which have been developed, like the UK Age Appropriate Design Code and similar codes in other countries, which are guidance on how the GDPR applies to children, would also apply. But it may be difficult within data protection law to determine who is the data controller and who is the data processor. A data controller is the entity responsible for deciding how the data is going to be used and for what purposes, and they are ultimately the most liable and accountable for that. But if you have lots of different actors in the metaverse space who are sharing data between them, it may become quite confusing. And then the data controller, before they process data, particularly from children, should be telling them, giving them a privacy notice. So how many privacy notices can you have in different parts of the metaverse? And say you’re in an experience, maybe a child is walking along a street in a virtual town, and they stop in front of a bakery and they’re looking in the bakery window. Then maybe one of the companies involved can see that maybe they are hungry, and they can target them with some food advertising because they stopped in front of that bakery. So it’s a different scenario from the way we use websites and apps currently. And then, of course, there’s this question of how to determine which users are children, and sometimes not even which users are children and which are adults, but precisely how old a child is for data processing purposes.
Then you also have a body of regulations about online safety. So we have online safety acts; we have this law in Australia preventing children under 16 from using social media, which maybe would also apply to the metaverse. And then in the US there is Section 230 of the Communications Decency Act, which you may have heard of, which provides online platforms with immunity from liability for third-party content. That may change in the US as the metaverse develops; this is a very political topic in the US, and we don’t really know what direction it will go in. So we have a very diverse global regulatory framework and, in fact, not a lot of very specific regulations and not a lot of enforcement currently. But I think the main takeaway for me would be that the common thread globally is human rights and children’s rights. And what we do have is the UN Guiding Principles on Business and Human Rights, which were endorsed by the Human Rights Council in 2011. These are the global authoritative standard for preventing and addressing human rights harms connected to business activity, and they’re aimed at both states and business enterprises. So they call on states to put in place measures to protect children’s rights, including in the digital environment, and also on businesses to respect children’s rights. And this includes tech companies. When we think about tech companies, we need to think also about the safety tech and the age assurance tech; these are also tech companies. So both the platform that is providing the metaverse and also any technologies used within that platform need to carry out risk assessments, where they look at all of the different rights that could be impacted for both children and adults. And in a mixed audience, you need to look at both children and adults and make sure that all of the stakeholders are engaged and consulted, so that something introduced to protect children doesn’t then have an adverse impact on other rights. We need to make sure that all rights online are protected, and the UN Guiding Principles provide a methodology for stakeholder engagement, for risk assessment, and then for identifying how to mitigate those risks in accordance with children’s rights and human rights. So I think that if tech companies carry out their human rights due diligence and do their child rights and human rights impact assessments, then they should in fact be in good shape to comply with regulations that may be coming down the line. I will leave it there.

Jutta Croll: Thank you so much, Emma, for giving us that insight into the situation of regulation so far. We know that it does not address the metaverse at this time, but it will definitely be going that way. And now I’m handing over to Deepali, who has already been addressed in a way, because she’s coming from Meta. I don’t think you will be able to react to all the things that have already been said about service providers like Meta, but I would like to refer to something that Michael said at the beginning: that social media platforms are already virtual worlds, and that they become virtual worlds through our own behavior. That is something your company is working on; could you explain your position a little bit?

Deepali Liberhan: I can. I think it’d be useful to talk a little bit about what the objective is here. The objective we have is to make sure we have safe and age-appropriate experiences for young people on our platform, and irrespective of regulation we’ve been working to do that, across our apps, whether that’s Facebook, Instagram, or the VR and AR products that we offer. We’ve actually adopted a best-interests-of-the-child framework that our teams use to develop products and features for young people, and while I won’t go into all of those considerations, there are two important considerations that I do want to talk about. The first, and it was lovely to hear from you, Maryem, is exactly engagement with young people and families who are using our products; it’s really important to engage not just teens but also parents. We’ve done that: in the last couple of years we’ve rolled out parental supervision tools across our products, including Meta Quest, which is available in the metaverse. Why parental supervision tools are really important is exactly to the point that somebody made: parents don’t necessarily know how to talk to their young kids about the metaverse or virtual reality. We had consultations and engagement with parents and teens sitting in the same room to help us design our parental supervision tools, and we’ve designed them in a way that respects privacy and promotes autonomy for young people, but also gives parents some amount of oversight and insight into their teen’s activities, especially in the VR space. So, for example, you can see how much time your teen is spending in VR, you can set daily limits, you can schedule breaks, and you can approve or disallow apps. These are some of the things that are built into our parental supervision tools. Along with these tools, and there was a mention of digital literacy as well, we’ve worked with experts to make resources available in our digital hub so that parents can get guidance on how to talk to their kids about virtual reality and AR and about how to stay safe online. This is one consideration we have when we’re building for young people. The second is building safe and age-appropriate experiences. Irrespective of whether you choose to have parental supervision or not, and I think it’s really important to have parental supervision, the other really important thing is what we are doing to make sure that young people using the metaverse products we offer are safe, and that is essentially a set of inbuilt protections. For example, 13-to-17-year-olds have default private profiles, and if you’ve used Instagram or Meta Quest, you know a private profile is very different from a public profile: you can control who is allowed to follow you, and not everybody can see what you’re doing. The second is that, by default, we have a personal boundary, and I don’t know if any of you who’ve used Meta Quest have used that. The personal boundary is an invisible bubble around you that is on by default, to protect against unwanted interactions when you’re in avatar space engaging with other avatars.
The third thing we’ve done is limit interactions between teens and adults they’re not connected with on our platform. You might want your teen to use the metaverse to connect with an aunt in a different country, but you don’t necessarily want them engaging with strangers, and we have built-in protections in place to limit those interactions. The other thing that’s really important is having really clear policies: all the apps listed on the Meta Horizon store, for example, carry an age and content rating, like a film rating, and teens cannot download apps that are inappropriate for their age. There’s a lot more we’re doing to build safe and age-appropriate experiences on our platform. One more thing I want to point out — I know somebody said earlier that a majority of teens online are right now using these spaces for gaming and entertainment — is that at Meta we feel the potential of the metaverse for immersive learning is immense. I remember that at school we used to read about interesting places like the Pyramids of Giza or the Eiffel Tower. In virtual reality, young people can actually visit those places, and young people from different countries and different economic backgrounds can actually study together. Any regulatory or legislative regime should also protect and promote that kind of innovation. Meta has set aside a $150 million fund just for immersive learning, and there are many projects that I could talk about. I don’t know how much time I have left.
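
To make the “safe by default” approach described above concrete, here is a minimal sketch, in Python, of how age-based default settings of this kind could be modeled. All class names, fields, and values are illustrative assumptions for this example, not Meta’s actual implementation.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SafetyDefaults:
    """Illustrative per-account safety settings (hypothetical model)."""
    private_profile: bool               # only approved followers see activity
    personal_boundary: bool             # invisible bubble around the avatar in VR
    contact_from_unknown_adults: bool   # can unconnected adults reach the user?
    daily_limit_minutes: Optional[int]  # parent-adjustable time budget

def defaults_for_age(age: int) -> SafetyDefaults:
    """Teens (13-17) start from the protective configuration by default."""
    if 13 <= age <= 17:
        return SafetyDefaults(
            private_profile=True,
            personal_boundary=True,
            contact_from_unknown_adults=False,
            daily_limit_minutes=120,    # illustrative value
        )
    # Adults can loosen their own settings; the boundary stays available.
    return SafetyDefaults(
        private_profile=False,
        personal_boundary=True,
        contact_from_unknown_adults=True,
        daily_limit_minutes=None,
    )

assert defaults_for_age(15).private_profile is True
```

The point of the pattern is that protection is the starting state: settings can only be loosened deliberately, rather than discovered too late to have been open.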

Jutta Croll: Time is up, but we can take some more questions. I’m looking around the room — there we have a question, and another question over there. The question is for you. Can you hear me now? Yes, we can hear you.

Audience: So, we heard about Privately — I believe that was the company’s name — and we know there are other technologies out there to verify children’s age on platforms. What is Meta doing about age verification on its platforms?

Deepali Liberhan: We have a number of ways we use to assure age on our platforms, and one of the important things to understand is that it’s a fairly complicated area, because we want to make sure we’re balancing different principles. Let me talk about a couple of them. The first is data minimization. Many years ago, people said it would be really easy for everybody to collect digital IDs at the point of verification — but you don’t want to do that, for multiple reasons. I don’t think anybody, least of all regulators or legislators, wants companies like us to collect more data. So, what is an effective way to assure age that balances data minimization with effectiveness and proportionality? We have a number of approaches. For example, we have trained people on our platforms, such as Facebook and Instagram, who identify underage users and are able to remove them. We’ve also invested in proactive technology that looks at certain kinds of signals — for example, the kind of accounts you’re following or the kind of content you’re posting — and that technology, which keeps developing, is something we use to identify and remove underage accounts. We’re also working with YOTI — I don’t know if you’ve heard of it — a third-party organization, like Privately, that has come up with a privacy-protective way to identify an age range based just on a selfie. You’ll see it if you use Instagram: if we find someone has lied about their age — say, a young person who’s actually 12 — that person is not allowed on the platform. But if, for example, a 15-year-old wants to change their age to 18, one of the options to verify their age is to provide an ID; another is to use YOTI, where you take a video selfie that is kept for a short time and then deleted, in order to estimate your age. So there are a variety of ways we’re working to assure age on our platforms.
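
To make this layered approach easier to follow, here is a minimal sketch of how the pieces described above might fit together: a hard rule for under-13s, proactive behavioral signals, and escalation to verification (an ID upload or a short-lived video selfie processed by a third party) when an account tries to cross the 18+ boundary. The function, signal score, and threshold are assumptions for illustration only — this is not Meta’s or YOTI’s actual logic or API.

```python
from enum import Enum
from typing import Optional

class AgeDecision(Enum):
    ALLOW = "allow"
    REMOVE = "remove_underage_account"
    VERIFY = "escalate_to_verification"  # e.g. ID upload or selfie age estimate

def assure_age(stated_age: int,
               underage_signal_score: float,
               requested_age: Optional[int] = None) -> AgeDecision:
    """Layered age-assurance flow (hypothetical sketch).

    `underage_signal_score` stands in for proactive signals such as the
    accounts a user follows or the content they post; the 0.8 threshold
    is invented for illustration.
    """
    # Layer 1: under-13s are simply not allowed on the platform.
    if stated_age < 13:
        return AgeDecision.REMOVE
    # Layer 2: behavioral signals suggest the stated age is a lie.
    if underage_signal_score > 0.8:
        return AgeDecision.VERIFY
    # Layer 3: crossing the 18+ boundary triggers verification, e.g. an ID
    # or a video selfie that is processed and then deleted shortly after.
    if requested_age is not None and stated_age < 18 <= requested_age:
        return AgeDecision.VERIFY
    return AgeDecision.ALLOW

print(assure_age(15, 0.1, requested_age=18))  # AgeDecision.VERIFY
```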

Jutta Croll: We can explain more about YOTI later, perhaps. We have another question there. Please be brief.

Audience: Hi, just briefly, I’ll also pick up on age verification. You talked about data minimization. There are plenty of options already available that do age estimation or age verification without gathering any data, and none of the social media platforms employ them properly. I believe Meta reported several billion dollars of revenue from under-13s in its latest results. So it’s clear that, whether it’s Meta or any of your competitors, none are really taking this seriously. Some of the limits you’ve talked about — restricting access to content by age — are lovely, but without effective age verification they’re also meaningless, sadly. So it’s a comment rather than a question: I think all the social media platforms could do far better. At the moment they do the bare minimum, in my opinion, and there’s a lot more you could do. Thank you.

Deepali Liberhan: I’ll quickly respond to your comment, and I’m available afterwards if you want to have a broader discussion. We are exploring options to do age assurance in a more effective way. YOTI is one such organization we work with; it has been recognized by the German regulators, and we’re looking at how to use it at more points of friction on our platforms. The other thing I would say is that this also calls for a broader ecosystem approach. One legislative solution we’ve proposed, which we think strikes a fine balance between data minimization and making things really simple for parents, is to do age verification or age assurance at the app-store level or the OS level. That would allow parents to oversee the approval of their teens’ apps in one place, and it would also confine data collection to one place. A number of third parties have likewise argued that this is a way to approach the problem from a broader ecosystem perspective. While those discussions are happening, we are working to build more effective age assurance on our own platforms. We’ve recently launched teen accounts in certain countries and will roll them out in the rest of the world, and we’re investing in proactive technology to give us the right signals that teens are not lying about their age — because, as you know, they will lie about their age.
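
The “broader ecosystem” proposal — verifying age once at the OS or app-store level, with parental approval gathered in one place — can also be sketched as an interface. Everything below, including the class and the app identifier, is hypothetical and intended only to illustrate the idea; no existing operating system exposes this API.

```python
from typing import Set, Tuple

class DeviceAgeSignal:
    """Hypothetical OS-level age-assurance service.

    The operating system (or app store) verifies age once, and apps query
    a coarse signal instead of each collecting identity data themselves,
    which supports data minimization.
    """
    def __init__(self, age_range: Tuple[int, int], approved_apps: Set[str]):
        self._age_range = age_range
        self._approved = approved_apps

    def age_range(self) -> Tuple[int, int]:
        # Apps learn only a bracket such as (13, 15), never a birthdate or ID.
        return self._age_range

    def parent_approved(self, app_id: str) -> bool:
        # Parents approve apps once, in one place, for the whole device.
        return app_id in self._approved

# Usage: an app gates its teen experience on the device-level signal.
device = DeviceAgeSignal(age_range=(13, 15), approved_apps={"com.example.study"})
if device.parent_approved("com.example.study") and device.age_range()[0] >= 13:
    print("launching with teen-safe defaults")
```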

Jutta Croll: They already do. Yes, thank you so much. We are a bit under time pressure, which is why I’m moving now to the last block of our session: what can internet governance do to support a common approach to keeping children safe in digital and virtual environments? I’m handing over to my colleague Torsten, who will give us a short reflection on the Global Digital Compact and the responses it gives regarding children’s rights and the metaverse. Over to you, Torsten.

Torsten Krause: Thank you very much, Jutta. Having listened to all the thoughts and information shared, we want to take a closer look at the Global Digital Compact, which was adopted this September at the United Nations Summit of the Future — you may recall it. It is part of the Our Common Agenda process of the United Nations and sets out basic principles for shaping the digital environment. It is not a legally binding document like the Convention on the Rights of the Child, but the states express their commitment to aligning the further development of the digital environment with it. What does that mean? The GDC describes several objectives. To name some of them: to close all digital divides and accelerate progress across the Sustainable Development Goals; to expand inclusion and reap the benefits of the digital space for all; and to foster an inclusive, open, safe and secure digital space that respects, protects and promotes human rights — and children’s rights are part of human rights, as you all know. The states also declare their intent to create safe, secure and trustworthy emerging technologies, including AI, with a transparent, human-centric approach and effective human oversight — so everything that is done should, in the end, be controlled by humans. Now, looking more closely at what the GDC says about children’s rights: I don’t know how closely you followed the process, but in the first draft children’s rights were not directly expressed, and several organizations took positions and gave their perspectives and comments to get children’s rights into the compact. In the end, there are several points we can touch on, and the biggest field of child rights covered is protection rights. The states are asked to strengthen their legal systems and policy frameworks to protect the rights of the child in the digital space — so each of you can ask your government how it does this and how it has put its policy framework in place. States should also develop and implement national online child safety policies and standards, and the compact calls on digital technology companies and developers to respect international human rights and principles. We heard some of that in this session: all companies, developers, and social media platforms should respect human rights online and implement measures to mitigate and prevent abuses, including effective remedy procedures, as Sophie also mentioned, in line with the general comment. A broader part concerns countering and addressing all forms of violence, including sexual and gender-based violence. Hate speech is also mentioned, as are discrimination, misinformation and disinformation, but also cyberbullying and child sexual exploitation and abuse. It is therefore necessary to monitor and review digital platform policies and practices on countering child sexual exploitation and abuse, and to implement accessible reporting mechanisms for users. Looking at provision rights, the GDC says that states are responsible for connecting all schools to the internet, referring to the Giga initiative of the ITU and the United Nations Children’s Fund. With regard to digital literacy, it says that children — and all users, of course — should be able to use the internet meaningfully and securely and navigate the digital space safely.
Therefore, digital skills and lifelong access to digital learning opportunities are very important, and states are responsible for establishing and supporting national digital skills strategies and for adapting teacher training programs as well as adult training programs. That speaks to your question, Vincent: it is not just necessary to teach the children, the users, but also the responsible people around them, so they can provide support and protection. When we look at participation, it is very brief. The compact says that meaningful participation of all stakeholders is required, but it does not say explicitly that children should be part of that. In that regard, it is all the more important that children and young people take part in the Internet Governance Forum at the multi-stakeholder level, to bring in their voices and perspectives, so that they enter into this meaningful participation process of all stakeholders. That was a short overview of the GDC, and I hope it was meaningful.

Jutta Croll: Yes, thank you so much, Torsten. We would also like to remind everybody: if you haven’t had a look at the Global Digital Compact yet, please do so. It is open for endorsement by individuals as well as organizations, companies, and so on, and the more people endorse the Global Digital Compact, the more weight all these recommendations will carry. So, finally — Hazel, thank you for your patience waiting in the Zoom room. We are happy to have you here to give us the perspective of children, with a special focus on the Asia-Pacific region. Over to you.

Hazel Bitana: Thank you, Jutta. I’m Hazel from Child Rights Coalition Asia, or CRC Asia, a network of organizations working for and with children. I would say that to keep children safe and help them reap the benefits of virtual environments, internet governance should be anchored in child rights principles, founded on General Comment No. 25, as Sophie presented: the best interests of the child, child participation that takes into consideration children’s evolving capacities, non-discrimination and inclusion, and children’s overall development and full enjoyment of all their rights. These are the underlying messages from children — for example, when my organization, Child Rights Coalition Asia, held our 2024 Regional Children’s Meeting in Thailand in August, we had 35 child delegates representing their own national or local child-led groups based in 16 countries in Asia. Although we focused our discussions mainly on emerging practices such as sharenting, kidfluencers, and generative AI, in line with civil and political rights, the recommendations we gathered from this platform can be applied to emerging technologies like the metaverse. Summarizing their inputs: one recommendation is an approach that recognizes children as rights holders and not just as passive recipients of care and protection. Children want to be involved and empowered to be part of the discussions and policymaking processes. They want child-friendly spaces and information, including child-friendly versions of the terms and conditions or privacy policies they agree to. Another of their key recommendations: involving children in decision-making processes allows us to gain a holistic perspective. We get to learn how children are leveraging these emerging technologies to create positive change, which is not usually highlighted in discussions. When we talked to children about generative AI, they said it is beneficial for their advocacy work as child rights advocates or child human rights defenders, for their recreation — the enjoyment of their right to play and leisure and to take part in creative activities — and for their education. I think these points are echoed in the metaverse as well. By getting children’s perspectives, you also get to see the impact of these emerging technologies on a number of children’s rights. There is already evidence of sexual exploitation and abuse in the metaverse. And at our regional children’s meeting, the child delegates raised concerns about the impact of generative AI on their right to a healthy environment, in the context of climate change, due to the energy consumption of generative AI — and this could be a concern when we talk about the metaverse as well. Another concern relates to the right to privacy and informed consent, especially considering the unique privacy and data protection issues raised both by generative AI and by the metaverse, as outlined by a number of our speakers today. Additionally, echoing one of Maryem’s points from earlier, a key approach to internet governance is ensuring non-discrimination and inclusion in a number of respects. For one, due to the price of virtual reality hardware and the internet bandwidth required, the metaverse risks widening the digital divide. In terms of freedom of expression, including gender expression, the metaverse has the potential to give children a platform to enjoy this freedom, with avatars serving as a creative tool for expression, as
Michael expounded earlier. But at the same time, without safeguards, this positive potential could instead deepen discrimination. Cultural diversity should also be taken into consideration to keep children safe in virtual environments: in the metaverse, harmful body language, gestures and non-verbal communication are additional aspects that should be included in the list of violations children can report. And this brings me to my next point, regarding the importance of having child-friendly reporting mechanisms and effective remedies in the metaverse across a diversity of languages, contexts and social norms — especially in the Asian region, which often feels left behind because our languages are not the dominant ones in the digital environment, all the more so now that body language and gestures come into play in the metaverse. Given this diversity, specialized regional or local safety teams should be part of the internet governance system; this facilitates timely and effective response and prevention mechanisms. And lastly,

Jutta Croll: I need to cut in because of time, because we now have two interventions from the online participants and I want to give them a bit of time as well. Maybe they have questions for you or for anyone else. Thank you so much. Sophie, will you hand over to the online participants, or will you read out their questions from the chat?

Audience: I had one raised hand when we were talking about children’s rights and the children’s rights block. It was from Andrew De Alvis, but I’m not sure if they’re still with us, because I cannot see them in the participants list anymore — so maybe the participant with the question has already left. If not, feel free to raise your hand again. But I think they’re gone. And I have another question, from Marie Yves Nadeau, which I can read out. She has a question for the Meta speaker: the OECD report highlights the impact of virtual reality on children’s development. What prompted Meta to lower the age requirements, and how does the company address the potential risks to children? Over to you, please — and thanks for the question.

Deepali Liberhan: As I said before, we’ve worked with parents and young people as part of our best interests of the child framework, and a lot of parents themselves want their kids to be able to start experiencing Meta products in a very protective way. So we’ve done two things. The first is for 13- to 17-year-olds, who are allowed on our platform with certain default protections as well as the option of parental supervision. Below that age, accounts are actually managed by the parents. We did something similar for Facebook as well: we have Messenger Kids, which is managed by parents and gives them the opportunity to manage their child’s presence online really effectively and to deliver all the benefits in a very protective way. So we’ve done this in consultation with parents as well as with experts.

Jutta Croll: Thank you, Deepali. We now have Ansu here in the room, not online. Please go ahead — I can give you only one minute, and one minute for the response.

Audience: Yeah. So, the question is — anyone can answer this — have you considered design principles for a governance framework? I’m a researcher in this area. Thank you.

Jutta Croll: So, you’re asking for the design principles of regulation. Yeah.

Audience: Design principles when developing a governance framework — has anyone considered that? And if so, which ones?

Jutta Croll: Okay. Will you be able to answer that? I think she’s talking about privacy by design, safety by design, child rights by design —

Emma Day: Do you mean that kind of design principle, or design more generally? Yeah. There are safety-by-design and privacy-by-design regulations, but the theme of the Internet Governance Forum is also multi-stakeholder governance, right? Tech Legality has been working with UNICEF on data governance for edtech, and some of that relates to this immersive learning type of work. As part of it, we’ve been looking at the use of regulatory sandboxes. There is an organization called the Datasphere Initiative that has been trying to build a multi-stakeholder regulatory sandbox: bringing together, even across borders, regulators, the private sector, and civil society — civil society is often the missing piece — to look at these frontier technologies and think about how to regulate them. I don’t know if that answers your question. If you want to hear more, we have a panel at 4:30 p.m. today local time, where we’re looking at governance of edtech, neurotech, and fintech, and we’ll take that multi-stakeholder perspective. Yeah. Thank you.

Jutta Croll: Yes. Thank you so much. Thanks to all of you who have been in the room or taken part online. I now have only three minutes to wrap up, but I will try to do my very best. I think we have learned that, as we said at the beginning, virtual environments — virtual worlds — are already inhabited by children. We heard from Deepak that 51% of the 600 million users are under the age of 13, even though we know there are age restrictions, and that is mostly because we find virtual worlds in the gaming area. But we also heard that social media platforms are already virtual worlds, because we are inhabitants of these social media environments: we provide our data and our profiles there — a good deal of information about our identity. That led us to the question of data minimization, which stands in a certain tension with knowing the age of users, whether through age estimation or age verification, since both go hand in hand with gathering and collecting data about users. That was also something Meta’s representative Deepali told us: that they are trying to balance this and to minimize the data. We have some regulations — the GDPR was mentioned in particular, which is applicable only in Europe but has been copied in several parts of the world — and these give us some orientation on the principle of data minimization and how it could also be applied to the metaverse. But then we also learned that larger amounts of more sensitive data are collected there, which is why we need another level of data governance for the metaverse, considering that we already have a regulatory gap when it comes to virtual reality. Finally, I would like to come back to children’s rights. We have the areas of protection rights, provision rights, and participation rights, and when we talk about virtual environments, we tend to focus a bit more on children’s safety. But we have also seen that there is a huge opportunity to build on the evolving capacities of children, to provide them with education — including peer education — in virtual environments, and that will also ensure their right to participation. Finally, we heard that children want to be empowered and involved, and that they see generative AI as an instrument for children’s advocacy. I think that is a very important and future-oriented message to receive at the end of our session, and it is the message I will conclude with: let’s focus on the opportunities that the metaverse will provide children without neglecting their right to be protected. Thank you so much.


Michael Barngrover

Speech speed: 163 words per minute
Speech length: 1215 words
Speech time: 444 seconds

Virtual worlds encompass VR, mixed reality, 3D gaming, and social media platforms

Explanation: Michael Barngrover explains that virtual worlds are a broad concept including various digital environments. He highlights that these range from fully immersive VR experiences to traditional 3D gaming and even social media platforms.

Evidence: He mentions specific examples like Fortnite as a 3D gaming world and notes that social media platforms can also be considered virtual worlds.

Major Discussion Point: The nature and scope of virtual worlds/metaverse

Agreed with: Deepak Tewari, Jutta Croll

Agreed on: Virtual environments are already heavily populated by children


Deepak Tewari

Speech speed: 159 words per minute
Speech length: 1634 words
Speech time: 616 seconds

Metaverse has 600 million monthly active users, with 51% under age 13

Explanation: Deepak Tewari provides statistics on metaverse usage, highlighting the significant presence of children. He emphasizes that a majority of users in these virtual environments are minors.

Evidence: He cites specific statistics: 600 million monthly active users, 51% under age 13, and 84% under age 18.

Major Discussion Point: The nature and scope of virtual worlds/metaverse

Agreed with: Michael Barngrover, Jutta Croll

Agreed on: Virtual environments are already heavily populated by children

Privacy-preserving age detection technology exists but faces pushback

Explanation: Deepak Tewari argues that technology for privacy-preserving age detection is available but not widely adopted. He suggests that big tech companies are resistant to implementing such technologies due to potential impacts on their business models.

Evidence: He mentions his company’s development of age-aware cameras and microphones that can detect age in real time without storing personal data.

Major Discussion Point: Data collection and privacy concerns

Differed with: Deepali Liberhan

Differed on: Effectiveness of age verification methods


Jutta Croll

Speech speed: 0 words per minute
Speech length: 0 words
Speech time: 1 second

Social media platforms are already virtual worlds based on user behavior

Explanation: Jutta Croll suggests that social media platforms can be considered virtual worlds due to user behavior. She argues that users inhabit these platforms by providing personal data and creating digital identities.

Major Discussion Point: The nature and scope of virtual worlds/metaverse

Agreed with: Michael Barngrover, Deepak Tewari

Agreed on: Virtual environments are already heavily populated by children

Generative AI seen as tool for children’s advocacy

Explanation: Jutta Croll highlights that children view generative AI as a potential tool for advocacy. This perspective emphasizes the positive potential of emerging technologies for empowering children.

Major Discussion Point: Opportunities and risks of metaverse for children


Deepali Liberhan

Speech speed: 167 words per minute
Speech length: 1881 words
Speech time: 675 seconds

Metaverse offers immersive learning opportunities beyond gaming

Explanation: Deepali Liberhan emphasizes the educational potential of the metaverse beyond entertainment. She argues that immersive virtual environments can provide unique learning experiences for children.

Evidence: She mentions the possibility of virtually visiting historical sites like the Pyramids of Giza or the Eiffel Tower, and collaborative learning across geographical and economic boundaries.

Major Discussion Point: Opportunities and risks of metaverse for children

Meta implements default privacy settings and parental controls for teens

Explanation: Deepali Liberhan describes Meta’s approach to child safety in virtual environments. She highlights the implementation of default privacy settings for teens and parental supervision tools across Meta’s products.

Evidence: She mentions specific features like default private profiles for 13-17 year olds, personal boundary settings in VR, and parental controls for time limits and app approvals.

Major Discussion Point: Children’s rights and safety in virtual environments

Agreed with: Sophie Pohle, Lhajoui Maryem, Hazel Bitana

Agreed on: Need for age-appropriate content and safety measures in virtual environments

Differed with: Hazel Bitana

Differed on: Approach to child safety in virtual environments

Need to balance age verification with data minimization principles

Explanation: Deepali Liberhan discusses the challenge of verifying users’ ages while adhering to data minimization principles. She emphasizes Meta’s efforts to find effective age assurance methods that don’t compromise user privacy.

Evidence: She mentions the use of trained personnel to identify underage users, proactive technology using behavioral signals, and third-party solutions like YOTI for age estimation.

Major Discussion Point: Data collection and privacy concerns

Differed with: Deepak Tewari

Differed on: Effectiveness of age verification methods


Sophie Pohle

Speech speed: 114 words per minute
Speech length: 961 words
Speech time: 501 seconds

General Comment 25 provides framework for children’s rights in digital environments

Explanation: Sophie Pohle introduces General Comment 25 as a guiding document for children’s rights in digital contexts. She explains that it offers a practical framework for implementing the Convention on the Rights of the Child in digital environments.

Evidence: She outlines the four general principles of GC25: non-discrimination, best interests of the child, right to life and development, and respect for the views of the child.

Major Discussion Point: Children’s rights and safety in virtual environments

Agreed with: Deepali Liberhan, Lhajoui Maryem, Hazel Bitana

Agreed on: Need for age-appropriate content and safety measures in virtual environments


Lhajoui Maryem

Speech speed: 154 words per minute
Speech length: 1313 words
Speech time: 510 seconds

Need for age-appropriate content and safer digital experiences for children

Explanation: Lhajoui Maryem emphasizes the importance of creating safe and age-appropriate digital experiences for children. She argues for the need to involve children in the design and policy-making processes of digital platforms.

Evidence: She mentions the use of playful tools and workshops to engage children in discussions about online safety and digital identity.

Major Discussion Point: Children’s rights and safety in virtual environments

Agreed with: Deepali Liberhan, Sophie Pohle, Hazel Bitana

Agreed on: Need for age-appropriate content and safety measures in virtual environments


Hazel Bitana

Speech speed: 140 words per minute
Speech length: 724 words
Speech time: 309 seconds

Importance of child-friendly reporting mechanisms and effective remedies

Explanation: Hazel Bitana stresses the need for child-friendly reporting mechanisms and effective remedies in virtual environments. She argues that these systems should consider cultural diversity and language differences, especially in the Asian region.

Evidence: She mentions the need for specialized regional or local safety teams to facilitate timely and effective response and prevention mechanisms.

Major Discussion Point: Children’s rights and safety in virtual environments

Agreed with: Deepali Liberhan, Sophie Pohle, Lhajoui Maryem

Agreed on: Need for age-appropriate content and safety measures in virtual environments

Differed with: Deepali Liberhan

Differed on: Approach to child safety in virtual environments

Children want to be involved in policymaking processes

Explanation: Hazel Bitana emphasizes children’s desire to be actively involved in discussions and policymaking related to digital environments. She argues for an approach that recognizes children as rights holders rather than passive recipients of protection.

Evidence: She cites inputs from the 2024 Regional Children’s Meeting in Thailand, where child delegates expressed their desire for involvement and child-friendly information.

Major Discussion Point: Regulation and governance of virtual environments

Potential for creative expression but also deepening discrimination

Explanation: Hazel Bitana discusses the dual nature of the metaverse for children’s expression. While it offers opportunities for creative expression through avatars, she warns that without proper safeguards, it could deepen existing discrimination.

Major Discussion Point: Opportunities and risks of metaverse for children


Emma Day

Speech speed: 151 words per minute
Speech length: 1267 words
Speech time: 500 seconds

Unprecedented amounts of sensitive data collected in metaverse

Explanation: Emma Day highlights the increased data collection in metaverse environments. She explains that this data can be more sensitive and voluminous than on traditional online platforms, potentially including physiological responses and brainwave patterns.

Evidence: She mentions potential uses of this data for targeted marketing, commercial surveillance, or government surveillance.

Major Discussion Point: Data collection and privacy concerns

Existing regulations like GDPR apply but new challenges emerge in metaverse

Explanation: Emma Day discusses the applicability of existing regulations like GDPR to the metaverse. She points out that while these regulations still apply, the metaverse presents new challenges in areas like determining data controllers and processors.

Evidence: She gives an example of the complexity of providing privacy notices in different parts of a virtual environment.

Major Discussion Point: Regulation and governance of virtual environments

Multi-stakeholder approach needed for governance frameworks

Explanation: Emma Day advocates for a multi-stakeholder approach to developing governance frameworks for new technologies. She suggests using regulatory sandboxes to bring together regulators, private sector, and civil society to address frontier technologies.

Evidence: She mentions work with UNICEF on data governance for edtech and the Datasphere Initiative’s efforts to create multi-stakeholder regulatory sandboxes.

Major Discussion Point: Regulation and governance of virtual environments


Torsten Krause

Speech speed: 123 words per minute
Speech length: 743 words
Speech time: 361 seconds

Global Digital Compact calls for protecting children’s rights in digital spaces

Explanation: Torsten Krause introduces the Global Digital Compact as a framework for shaping the digital environment. He highlights its emphasis on protecting children’s rights in digital spaces.

Evidence: He cites specific objectives from the GDC, including closing digital divides, fostering inclusive digital spaces, and creating safe and trustworthy emerging technologies.

Major Discussion Point: Data collection and privacy concerns

States responsible for implementing child safety policies and standards

Explanation: Torsten Krause emphasizes the responsibility of states in implementing child safety measures in digital environments. He argues that states should develop and enforce national online child safety policies and standards.

Evidence: He references the GDC’s call for states to strengthen legal systems and policy frameworks to protect children’s rights in digital spaces.

Major Discussion Point: Regulation and governance of virtual environments


Audience

Speech speed: 140 words per minute
Speech length: 781 words
Speech time: 333 seconds

Risk of addiction and disconnection from reality

Explanation: An audience member raises concerns about the potential for addiction to virtual environments. They question how to prevent children from becoming overly immersed in the metaverse and disconnecting from reality.

Major Discussion Point: Opportunities and risks of metaverse for children

Agreements

Agreement Points

Virtual environments are already heavily populated by children

Speakers: Michael Barngrover, Deepak Tewari, Jutta Croll

Arguments: Virtual worlds encompass VR, mixed reality, 3D gaming, and social media platforms; Metaverse has 600 million monthly active users, with 51% under age 13; Social media platforms are already virtual worlds based on user behavior

Summary: Speakers agree that virtual environments, including social media platforms, are already widely used by children and constitute a significant part of their digital experience.

Need for age-appropriate content and safety measures in virtual environments

Speakers: Deepali Liberhan, Sophie Pohle, Lhajoui Maryem, Hazel Bitana

Arguments: Meta implements default privacy settings and parental controls for teens; General Comment 25 provides framework for children’s rights in digital environments; Need for age-appropriate content and safer digital experiences for children; Importance of child-friendly reporting mechanisms and effective remedies

Summary: Multiple speakers emphasize the importance of creating safe, age-appropriate digital experiences for children with proper safeguards and reporting mechanisms.

Similar Viewpoints

Summary: These speakers highlight the tension between effective age verification and data privacy concerns in virtual environments, acknowledging the need for balance and the challenges posed by extensive data collection.

Speakers: Deepak Tewari, Deepali Liberhan, Emma Day

Arguments: Privacy-preserving age detection technology exists but faces pushback; Need to balance age verification with data minimization principles; Unprecedented amounts of sensitive data collected in metaverse

Summary: Both speakers advocate for involving children in the design and policy-making processes of digital platforms, emphasizing the importance of children’s perspectives in creating safe and appropriate digital experiences.

Speakers: Lhajoui Maryem, Hazel Bitana

Arguments: Need for age-appropriate content and safer digital experiences for children; Children want to be involved in policymaking processes

Unexpected Consensus

Potential of virtual environments for education and advocacy

Speakers: Deepali Liberhan, Jutta Croll, Hazel Bitana

Arguments: Metaverse offers immersive learning opportunities beyond gaming; Generative AI seen as tool for children’s advocacy; Potential for creative expression but also deepening discrimination

Summary: Despite discussions largely focusing on risks and safety concerns, there was unexpected consensus on the positive potential of virtual environments for education and children’s advocacy. This highlights a balanced view of the metaverse’s impact on children.

Overall Assessment

Summary: The main areas of agreement include the widespread use of virtual environments by children, the need for age-appropriate content and safety measures, and the challenges of balancing age verification with data privacy. There was also recognition of the potential benefits of virtual environments for education and advocacy.

Consensus level: Moderate consensus was observed among speakers on key issues. While there were differing perspectives on implementation details, there was general agreement on the importance of protecting children’s rights in virtual environments. This consensus suggests a shared foundation for developing policies and regulations for children in the metaverse, but also highlights the need for continued dialogue to address complex challenges like data privacy and age verification.

Differences

Different Viewpoints

Effectiveness of age verification methods

Speakers: Deepak Tewari, Deepali Liberhan

Arguments: Privacy-preserving age detection technology exists but faces pushback; Need to balance age verification with data minimization principles

Summary: While Deepak Tewari argues that effective privacy-preserving age detection technology exists and should be implemented, Deepali Liberhan emphasizes the need to balance age verification with data minimization, suggesting current solutions may not fully address privacy concerns.

Approach to child safety in virtual environments

Speakers: Deepali Liberhan, Hazel Bitana

Arguments: Meta implements default privacy settings and parental controls for teens; Importance of child-friendly reporting mechanisms and effective remedies

Summary: Deepali Liberhan focuses on platform-specific safety measures implemented by Meta, while Hazel Bitana emphasizes the need for broader, culturally sensitive reporting mechanisms and remedies across virtual environments.

Unexpected Differences

Perception of generative AI by children

Speakers: Jutta Croll, Hazel Bitana

Arguments: Generative AI seen as tool for children’s advocacy; Potential for creative expression but also deepening discrimination

Summary: While Jutta Croll presents a positive view of children using generative AI for advocacy, Hazel Bitana highlights both the creative potential and the risk of deepening discrimination. This unexpected difference shows the complexity of emerging technologies’ impact on children’s rights.

Overall Assessment

Summary: The main areas of disagreement revolve around the implementation of age verification technologies, the approach to child safety in virtual environments, and the involvement of children in policymaking processes.

Difference level: The level of disagreement among speakers is moderate. While there is general consensus on the importance of protecting children’s rights in virtual environments, speakers differ significantly in their proposed approaches and solutions. These differences highlight the complexity of balancing privacy, safety, and children’s participation in the rapidly evolving digital landscape, particularly in the context of the metaverse. The implications of these disagreements suggest that a multi-stakeholder approach, as proposed by Emma Day, may be necessary to develop comprehensive and effective governance frameworks for children’s safety and rights in virtual environments.

Partial Agreements

Summary: All speakers agree on the importance of child safety in virtual environments, but differ in their approaches. Deepali Liberhan focuses on platform-specific measures, while Hazel Bitana and Lhajoui Maryem advocate for more direct involvement of children in policymaking and design processes.

Speakers: Deepali Liberhan, Hazel Bitana, Lhajoui Maryem

Arguments: Meta implements default privacy settings and parental controls for teens; Children want to be involved in policymaking processes; Need for age-appropriate content and safer digital experiences for children


Takeaways

Key Takeaways

– Virtual worlds/metaverse are already widely used by children, with 51% of users under age 13

– Existing regulations like GDPR apply to the metaverse, but new challenges emerge around data collection and privacy

– Children’s rights frameworks like General Comment 25 should guide metaverse governance

– Age verification and parental controls are important but must be balanced with data minimization

– The metaverse offers opportunities for education and creative expression but also risks like addiction and discrimination

– A multi-stakeholder approach involving children is needed for effective governance

Resolutions and Action Items

– States should implement national online child safety policies and standards as per the Global Digital Compact

– Companies should conduct child rights impact assessments for metaverse products

– More research is needed on the impacts of virtual reality on child development

– Develop child-friendly reporting mechanisms for metaverse environments

Unresolved Issues

– How to effectively verify age in the metaverse without excessive data collection

– Appropriate consequences for virtual crimes or harms

– How to close digital divides in access to metaverse technologies

– Balancing innovation with protection in metaverse regulation

Suggested Compromises

– Age verification at the device/OS level rather than by individual platforms, to minimize data collection

– Use of privacy-preserving age estimation technologies instead of strict verification

– Allowing children access to the metaverse under parental supervision with strong default protections

Thought Provoking Comments

“With virtual worlds, it is a challenging term, because it’s very broad and very encompassing. So there are at least four broad concepts of virtual worlds that I think are very relevant.”

Speaker: Michael Barngrover

Reason: This comment provided a framework for understanding the complexity and breadth of virtual worlds, setting the stage for a more nuanced discussion.

Impact: It led to a deeper exploration of different types of virtual environments and their implications for children, moving the conversation beyond simplistic notions of the metaverse.

“There is a cognitive load to be thinking about and managing your activities and your presence, and thus the activities and presence of others in multiple worlds, the non-digital world, but also the digital virtual worlds.”

Speaker: Michael Barngrover

Reason: This insight highlighted an often overlooked aspect of virtual worlds – the cognitive demands they place on users, especially children.

Impact: It shifted the discussion to consider the psychological and developmental impacts of virtual worlds on children, beyond just safety concerns.

“Technology exists today from companies like Privately, which have developed fully privacy-preserving, GDPR-compliant solutions for detecting age of users.”

Speaker: Deepak Tewari

Reason: This comment introduced concrete technological solutions to age verification, a key challenge in protecting children online.

Impact: It moved the conversation from theoretical concerns to practical solutions, sparking discussion about implementation and effectiveness of such technologies.

“The metaverse is an evolving space and some of it we’re talking about is a little bit futuristic and may not actually exist yet, but we’re talking about a kind of virtual space where unprecedented amounts of data is collected from child users and adult users as well.”

Speaker: Emma Day

Reason: This comment grounded the discussion in reality while highlighting the potential future challenges, particularly around data collection.

Impact: It refocused the discussion on the need for proactive regulation and governance to address future challenges in virtual environments.

“Children want to be involved and be empowered to be part of the discussions and policymaking processes. They want child-friendly spaces and information, having child-friendly versions of the terms of end conditions or privacy policies that they agree.”

Speaker: Hazel Bitana

Reason: This comment brought the crucial perspective of children themselves into the discussion, emphasizing their desire for agency and understanding.

Impact: It shifted the conversation from a protective stance to one that also considered children’s rights to participation and information, leading to a more balanced discussion of safety and empowerment.

Overall Assessment

These key comments shaped the discussion by broadening its scope from a narrow focus on safety to a more comprehensive consideration of children’s experiences in virtual worlds. They introduced technical, legal, and child-centric perspectives, leading to a richer, more nuanced dialogue about the challenges and opportunities of the metaverse for children. The discussion evolved from defining virtual worlds to exploring their cognitive impacts, from theoretical concerns to practical solutions, and from adult-centric protectionism to child-empowering approaches.

Follow-up Questions

How can addiction to the metaverse be prevented in children?

Speaker: Youth ambassador from Hong Kong

Explanation: This is important to address potential negative impacts of immersive virtual environments on children’s wellbeing and development.

How can parents and educators be effectively involved when it comes to children in the metaverse?

Speaker: Audience member from India

Explanation: Parental and educator involvement is crucial for ensuring children’s safety and positive experiences in virtual environments.

What are the design principles for developing a governance framework for the metaverse?

Speaker: Audience member Ansu

Explanation: Establishing clear design principles is important for creating effective and ethical governance structures for virtual environments.

How can age verification be implemented more effectively on social media platforms without compromising data minimization principles?

Speaker: Audience member (unnamed)

Explanation: Balancing effective age verification with data protection is crucial for ensuring child safety while respecting privacy rights.

What prompted Meta to lower the age requirement for the Quest, and how does the company address the potential risks to children?

Speaker: Marie Yves Nadeau (online participant)

Explanation: Understanding the rationale behind age requirement changes and associated risk mitigation strategies is important for assessing the impact on child safety.

How can cultural diversity be effectively incorporated into safeguarding measures in the metaverse?

Speaker: Hazel Bitana

Explanation: Ensuring cultural sensitivity in safety mechanisms is crucial for creating inclusive and effective protection for children from diverse backgrounds.

How can child-friendly reporting mechanisms and effective remedies be implemented in the metaverse?

Speaker: Hazel Bitana

Explanation: Developing accessible and effective reporting and remedy systems is essential for protecting children in virtual environments.

Disclaimer: This is not an official record of the session. The DiploAI system automatically generates these resources from the audiovisual recording. Resources are presented in their original format, as provided by the AI (e.g. including any spelling mistakes). The accuracy of these resources cannot be guaranteed.