Child participation online: policymaking with children | IGF 2023 Open Forum #86

11 Oct 2023 05:15h - 06:15h UTC


Disclaimer: It should be noted that the reporting, analysis and chatbot answers are generated automatically by DiploGPT from the official UN transcripts and, in case of just-in-time reporting, the audiovisual recordings on UN Web TV. The accuracy and completeness of the resources and results can therefore not be guaranteed.

Full session report

Audience

The audience contributions cover a range of topics related to youth engagement and online safety. One contribution explores the effectiveness and necessity of age verification systems online. It discusses Marie Stella's inquiry into the opinions of youth and children regarding age verification. Stella noted that, while adults have not yet found a perfect solution, the issue still needs attention.

Another contribution focuses on the opinions of young people regarding age verification. It raises the question of whether awareness-raising education alone is enough to prevent access to harmful online content, and it emphasises the importance of further examination and dialogue within the context of quality education and strong institutions.

The third contribution highlights the importance of engaging companies in child participation, particularly in contexts where democratic participation is restricted. It discusses how companies can contribute to the design and decision-making processes that affect children. While Microsoft is suggested as a potential partner, other companies are also encouraged to get involved. This contribution emphasises the role of SDG 16 (Peace, Justice and Strong Institutions) and SDG 8 (Decent Work and Economic Growth) in promoting child participation.

The fourth contribution stresses the need to develop convincing strategies to engage companies in child participation, linking this to SDG 17 (Partnerships for the Goals) and SDG 9 (Industry, Innovation and Infrastructure). Microsoft is suggested as a possible partner, but other companies are also welcome.

In conclusion, the contributions highlight the importance of addressing these issues to ensure the online safety and well-being of young people. They emphasise the need to explore effective age verification systems, consider youth opinions, and promote awareness-raising education. Engaging companies in child participation and developing convincing strategies to do so are also seen as vital. These discussions align with several Sustainable Development Goals, including SDG 4 (Quality Education), SDG 16 (Peace, Justice and Strong Institutions), SDG 8 (Decent Work and Economic Growth), SDG 17 (Partnerships for the Goals), and SDG 9 (Industry, Innovation and Infrastructure).

Courtney Gregoire

The digital environment, although not originally designed for children, has a significant impact on their rights and potential. The policies of technology providers play a crucial role in shaping this impact. It is important to transition from a mode of protection to empowering youth voices. For example, Microsoft has a long-standing commitment to children’s online safety. They recognize the need to understand how children use technology in order to better design it. One way they have addressed this is through their gaming platform, where they introduced ‘Home Sweet Hmm’ to promote online safety.

The argument put forth is that children learn through play, highlighting the role of educational gaming in their development. Microsoft’s ownership of a gaming platform further emphasizes their involvement in promoting learning through play and fostering a safe digital environment for children.

Regarding product development, it is crucial to engage children in the process. Microsoft has convened three councils for digital good, where children aged 13 to 17 have provided valuable feedback on their services and apps. This demonstrates Microsoft’s commitment to involving children and incorporating their perspectives in the development of their products.

The potential of artificial intelligence (AI) is also highlighted: AI has the ability to do good, but responsible use is key to ensuring positive outcomes.

The summary also emphasizes the importance of incorporating children’s online behavior into policy-making. This reflects the need to understand how children ask for help online and to consider their experiences when shaping policies related to child safety and well-being.

Microsoft’s approach to child participation is noted, as they leverage existing organizations to engage children in product safety and design. They have previously convened Councils for Digital Good, collaborating with NGOs and academia to gather information and stimulate conversations on these issues.

Finally, the argument is made that children’s voices should influence both company rules and regulatory/legal rules. Microsoft actively involves their child participants in direct interactions with regulators, demonstrating their belief in the importance of children’s influence at various levels of decision-making.

In conclusion, this summary outlines the significance of the digital environment for children's rights and potential, the importance of empowering youth voices, the role of play and education in children's development, the value of engaging children in product development, the responsible use of AI, the integration of children's online behavior into policy-making, and Microsoft's efforts to involve children in shaping rules and regulations.

Afrooz Kaviani Johnson

The analysis emphasises the importance of actively involving children in decision-making processes and policy development, particularly in the area of online safety. The Convention on the Rights of the Child recognises and upholds children’s right to freely express their views. The speakers in the analysis highlight that by involving children, policymakers can tap into their creativity, skills, and unique understanding, leading to more effective and tailored policies and programs.

It is crucial to consider that children interact with digital technology in ways that differ greatly from adults. Therefore, their perspectives and experiences must be taken into account when formulating policies and programmes related to online safety. Including children’s insights allows policymakers to gain a better understanding of their needs, enabling the creation of more relevant and effective guidelines.

The Committee on the Rights of the Child, in its General Comment No. 12, sets out nine basic requirements for effective child participation: processes should be transparent and informative, voluntary, respectful, relevant, child-friendly, inclusive, supported by training, safe and sensitive to risk, and accountable.

Examples from Tunisia and the Philippines illustrate how children’s voices have helped shape national plans and legislation. In Tunisia, children’s voices played a crucial role in formulating the National Plan of Action on Child Online Protection. By consulting with children, policymakers were able to gain valuable insights and develop a plan that truly addressed their concerns and needs. Similarly, in the Philippines, consultations with children informed the development of a new national plan of action on children’s issues and other legislative instruments.

The analysis also highlights that involving children in decision-making requires careful planning, allocation of resources, and adequate training. In the consultations held in the Philippines, young adults from the communities acted as facilitators, ensuring that children felt comfortable and supported. Additionally, programming for parents and caregivers was implemented, and an emergency response plan was in place to safeguard children in case of any disclosures.

To conclude, actively involving children in discussions and decision-making processes is essential for developing effective policies, particularly in areas like online safety. The Convention on the Rights of the Child recognises their right to express their views, and involving them leverages their unique perspectives and understanding. Transparency, respect, inclusivity, and accountability are all key elements for successful child participation. Examples from Tunisia and the Philippines highlight how children’s voices can shape national plans and legislation. However, it is important to note that involving children in decision-making requires careful planning, allocation of resources, and adequate training to ensure meaningful and impactful participation.

Hillary Bakrie

The Protection Through Online Participation (POP) initiative aims to provide a safe online space for children and youth to access protection support. It emphasizes the importance of peer-to-peer support and encourages children-led solutions and initiatives. Hillary Bakrie, a supporter of POP, believes that the internet can be a valuable tool for young people to seek support and highlights the value they place on peer-created solutions. Young people also desire to be included as partners in decision-making processes, particularly regarding online safety and cybersecurity. This inclusive approach ensures that policies and measures are effective and relevant to their needs. To enable effective youth participation, addressing the digital divide and investing in education and skills are essential. Transparency, accessibility, and the recognition of young people’s contributions in policy-making processes are also emphasized. Overall, POP and its supporters advocate for an empowering online environment that values the expertise and experiences of young people.

Moderator

The discussion focused on the importance of child participation in policymaking, particularly in the context of online safety. Participants highlighted the significance of involving children in discussions and considering their rights in the digital environment. It was stressed that children have a unique understanding of their experiences online, and their perspectives should be taken into account when designing policies and interventions.

The Child Online Protection Initiative (COP) and ITU's role in implementing guidelines for child safety online were mentioned as important efforts in this area. COP aims to facilitate the sharing of challenges and best practices among member states. The ITU has been co-leading the initiative, providing support to countries in implementing the guidelines. The discussion noted that the involvement of children in policymaking can help ensure that their views and experiences are considered, leading to more effective and relevant policies and programs that address the specific needs of young users.

The role of Microsoft in promoting child online safety was also highlighted. Microsoft has a longstanding commitment to this issue and has developed a suite of products and services that intersect with children's online lives. The company engages in conversations with young people to understand their needs and enhance the way they interact with technology.

Examples from Tunisia and the Philippines showcased the value of children's input in shaping national action plans and legislative instruments related to online safety. In Tunisia, consultations with children helped shape the first-ever National Plan of Action on Child Online Protection. In the Philippines, involving children in consultations contributed to the formation of national action plans.

The ITU In-Country National Assessment was proposed as a valuable resource for governments to improve child safety online. By conducting a comprehensive assessment of the existing situation and developing a strategy and action plan based on global best practices, countries can enhance their policies, standards, and mechanisms.

Overall, the discussion highlighted the importance of involving children in policymaking and designing online safety interventions. Children's participation ensures that their perspectives are taken into account, leading to more effective and relevant policies and programs. The involvement of youth in decision-making processes was also stressed, emphasizing the need for an inclusive approach that reflects the realities and aspirations of young people. The discussion recognised the value of partnerships between stakeholders, such as the ITU, Microsoft, and governments, in promoting child online safety.

Amanda Third

Children’s meaningful participation in the design of services and online safety interventions is considered crucial. The drafting of the UNCRC General Comment 25, which focuses on children’s rights in relation to the digital environment, was informed by consultations with children in 27 countries globally. This approach ensured that the key issues reflected not only the perspectives of adults but also the lived experiences of children themselves.

The International Telecommunication Union has taken steps towards promoting online safety by developing an online safety app, game, and trainings for three different age groups of children. What sets these initiatives apart is the involvement of a children's advisory group, ensuring that children's voices contribute to the creation of these tools.

To further support children’s participation, Amanda led the establishment of national child task forces in five countries. These task forces serve as guides for the government’s approach to online safety policy, emphasizing the importance of involving young people in crafting policies that directly affect them.

Youth participation in policy-making is highly valued and encouraged. Amanda suggests that shadowing decision-makers could enhance children’s influence in shaping online safety policies. Additionally, Amanda proposes that organizations’ platforms should actively seek young people’s input in a daily, approachable manner. This ongoing, real-time conversation would allow organizations to better understand children’s needs and preferences.

A notable finding from the consultations conducted in 27 countries is that children expressed their desire for improved online protections and data security. This highlights the importance of addressing these concerns to ensure a safe digital environment for children.

It is worth mentioning that attempting to restrict children’s online activities without considering their input can often lead them to find ways to circumvent such systems. Therefore, involving children in the decision-making process can lead to more effective and sustainable solutions, as children become active participants rather than passive subjects.

In conclusion, the engagement and participation of children in the design of services and online safety interventions are crucial. Through consultations, the UNCRC General Comment 25 incorporates children’s perspectives, ensuring that their unique experiences are reflected. Initiatives such as the online safety app and the establishment of national child task forces further demonstrate the commitment to involving children in shaping online safety policies. Encouraging youth participation and seeking their input in an ongoing manner will create an environment that better meets children’s online safety needs. By addressing their desires for better protection and data security, we can foster a digital environment that is safe and supportive for children.

Boris Radanovic

The analysis highlights the positive and impactful role played by youth in addressing various pressing issues. One notable example is the development of the Bully Blocker app by a group of teenagers, which aims to combat cyberbullying. This app demonstrates how youth-led initiatives can effectively address societal challenges, particularly in the realm of online safety. Another inspiring initiative is a fake online shop created by a Polish high school student to offer a lifeline to domestic abuse victims trapped at home during the coronavirus lockdown. These examples exemplify the creative and innovative solutions that young people bring to complex problems.

Furthermore, the analysis emphasizes the importance of involving youth in decision-making processes regarding their own issues. It argues that discussions on how to support children often lack the direct participation of children themselves. However, in order to create valuable actions and solutions, it is essential to include youth input. The presence of youth-led advisory boards is acknowledged, but it is stressed that following through on their advice is crucial to ensure meaningful outcomes.

In terms of online safety, the analysis recommends that government representatives apply for the ITU In-Country National Child Online Safety Assessment. This assessment provides a comprehensive understanding of the existing situation of children's online safety and aids in drafting national strategies and action plans that incorporate global best practices. It is argued that such assessments can enhance national policies, standards, and mechanisms to protect children in the digital realm. Additionally, the analysis highlights the importance of local adaptations of global strategies, as cultural, social, and regulatory differences affect the effectiveness of online safety measures.

The analysis also addresses the issue of children encountering adult or abusive content unwillingly on the internet. It argues that children do not want content that is not intended for them in their online spaces, emphasizing the need for adults to implement protections to prevent children from accessing inappropriate material. It acknowledges that the internet and its content were not specifically created for children and therefore, proactive measures are necessary to safeguard their online experiences.

Furthermore, the analysis recognizes that age verification poses a significant challenge in ensuring online child safety. However, it suggests that with children’s input, a solution can be achieved. It is concluded that working collaboratively with children and implementing their perspectives and ideas can lead to more effective and comprehensive measures to protect them online.

Overall, the analysis highlights the important contributions of youth in tackling critical issues, the need to involve them in decision-making processes, the recommendation for government action in enhancing online safety, and the significance of age verification in protecting children online. By considering these insights and recommendations, society can better empower and protect the younger generation in an increasingly digital world.

Session transcript

Moderator:
All right, again, thank you very much for coming. Let me welcome you to the workshop number 86, open forum number 86, Child Online, Child Participation Online Policymaking with Children. So I’m your moderator. My name is Preetam Malur. I’m the head of the Emerging Technologies Division at the International Telecommunication Union. And the reason why many of you don’t know me, probably, is because I’m substituting for my colleagues, Carla and Fanny, who are the subject matter experts on the topic. So Carla couldn’t come. There was a last minute cancellation, so I offered to step in. So please indulge me if I don’t use the right terminology or make some mistakes. But to my credit, I’m an expert on this topic because I have my child here, who’s sitting next to me. So since the topic is on involving children in policymaking, if you have any hypothesis you would want to test during this session, here is a subject. No, I can guarantee you he’ll answer it. The quality of the answer is questionable. But anyway. All right. It’ll be great, it’ll be great. So with this, let me just start with a few introductory remarks, and then, which will be very quick, and then we can go to our panel. So some points on the Child Online Protection Initiative. We’ve been working, we as an ITU, has been working on this topic since 2009. Initially, the initiative was founded to facilitate the sharing of challenges, best practices among member states, addressing issues of violence against children in the online environment. Of course, we’ve broadened the focus now, actively involving children in discussions, and considering all child’s rights in the digital environment, including the right to participation, education, access to information, and many others. And the activities are now balanced between protection and participation online. So just to give you a background on the Child Online Protection Guidelines, because I know many of you were involved in drafting that. They were initially developed in 2008, revised again, comprehensively rewritten in 2020 by an expert group of more than 30 organizations from the UN, from NGOs, the private sector, academic sector, you know, so it was a truly multi-stakeholder effort. A global program was launched in 2021, you know, set to run till 24, aimed at assisting member states in implementing these guidelines. And it’s been a success story. You know, they are currently being implemented in 15-plus countries across all regions. You know, they include capacity building. They include policy assistance for member states in developing the national strategies, policies, legal and regulatory frameworks. And there’s a lot of activities going on. And among some of the new collaborative activities is the alignment with the new participation-based approach. 
And ITU has been co-leading the POP initiative, you know, Protection Through Online Participation, with the Office of the Special Representative of the Secretary General on Violence Against Children. And we have a colleague from that office here. A few words on POP, you will hear more from her. It collaborates with over 30 global partners, including the UN, universities, NGOs, youth, private sector companies such as Meta, Disney, Lego, Microsoft, and Roblox. Hillary will tell you more about the effort. And in light of, you know, ITU’s efforts, alignment with global trends with regards to the work of expert organization in this area, discussions around how children can be best involved in matters that are relevant to them, and in particular with regard to child online protection and online safety are clearly more relevant than ever. And this is the emphasis on this session. So we have one hour, so I really urge you to stick to the three minutes for every intervention that’s allocated to you. So without any delay, let me turn to Afroz Kaviani-Johnson. Afroz, so I’ll start to, you know, I’ll start the discussion with a fundamental question. Could you share with us why it’s so important to work closely with children on matters that concern them? And more specifically, when we talk about child online safety. Afroz, over to you.

Afrooz Kaviani Johnson:
Thanks a lot. And a special welcome to your child as well. We’re delighted to have you around the table. Working closely with children on issues that affect them, like online safety, is crucial. And there’s a number of reasons. I’m just going to focus on three main ones. Firstly, it’s a right. The Convention on the Rights of the Child provides that children have the right to freely express their views on all matters and decisions that affect them, and to have those views taken into account. So it’s the right of every child without any exception. And of course, children encompasses like a very broad range of ages. The definition of a child is anyone under the age of 18. So obviously, it’s vital to adjust approaches to suit different ages and capacities. The second main point is that working with children really enhances programs, involving children leverages their creativity, their skills, their unique understanding of their own lives to create and monitor more effective and relevant policies, services and practices. And thirdly, particularly in this space of online safety, it provides real world relevance. So obviously, online safety programs and policies, which are typically designed by adults, will reflect adult concerns and may miss, you know, it will miss, not may, it will miss the nuances and, you know, the things that are important to young users. And children interact with digital technology in ways that are very different to adults. I can say this, you know, in my personal capacity as a parent, but also looking at, you know, the masses of research that UNICEF has undertaken with children around the world, around their online experiences. So consulting children, working with children kind of opens these avenues that we can explore and comprehend actual risks kind of versus perceived threats in the online environment. And it’s not just about listening. It’s really about ensuring that our policies, our programs, you know, are rooted in their lived experiences and tailored to their needs. So involving children, it’s not just beneficial, but it’s essential if we want to make our programs and our policies applicable and relevant to them and to achieve the purposes that we want to. Thanks.

Moderator:
Thanks, Afroz. In fact, you know, I’m always pleasantly surprised when I chat with my son about child online safety because they use an iPad at school. And, you know, the way they set their own passwords, the kind of protections they take, you know, the perspectives they have, you know, we usually discount them. And, you know, that’s to our disadvantage. Anyway, thank you very much. So now we have three speakers that I’m going to pose the same question, bringing three different perspectives based on obviously the stakeholder groups they belong to. We have Amanda Third from the Western Sydney University, Boris Radanovic, who’s an online safety expert from SWGFL, and Courtney Gregoire, Chief Digital Safety Officer from Microsoft. So let me start with Amanda. But it’s the same question that I’m going to pose to all three of you. So the question of how children can meaningfully contribute to creating solutions to the challenges they face online. And again, the three that we have, academia, civil society and the private sector. Could you provide us with concrete examples of situations where children have played an active role in creating or developing solutions for online issues from your own perspective, own respective fields of work? And how did this outcome of your work differ from work that is solely driven by adults? So let me first ask Amanda.

Amanda Third:
Thanks so much. And I think I would start by saying that thankfully, there are now lots of examples that we can use to illustrate the meaningful participation of children and young people in the design of services and online safety interventions in particular. There has been a recent trend towards meaningful engagement, which many people in this room are a part of. And I guess what I would do is I would just highlight a couple of examples that I’ve had recent involvement with. The first would be to cite the consultations with children in 27 countries globally to basically inform the drafting of UNCRC General Comment 25 on children’s rights in relation to the digital environment, which is a piece of evidence-based guidance for states about how to implement the Convention on the Rights of the Child in relation to the challenges and the opportunities of digital technology. And that process involved working with child-facing organisations in those 27 countries, designing a creative and participatory-based methodology where children attended workshops of five hours in length. And five hours is very important here because what we wanted to do was to create enough space for children to actively explore the issues because often what we’re doing when we consult children about things is we’re asking them to talk about things about which they have a lot of experience and expertise, but they haven’t necessarily had an opportunity to put those things into words. So, allowing them enough time and space to really work through what are the issues, what do we know about them, how would we put our experiences into language? This is a really important part of meaningful engagement, I would argue. Anyway, the upshot of that is that we now have a general comment. The children’s perspectives were used as a check and balance all the way through the two-year drafting process. And now I think we do have, as a result, a general comment that really encapsulates the key issues from adults’ perspectives, but filtered through children’s own lived experiences of these issues. So, that would be the first one. Another one that I would point to, given the sponsors of this panel, is that yesterday the International Telecommunications Union released an online safety app, game, and set of trainings for three different age groups of children internationally. And that’s a very exciting moment. Again, this is another piece of work where we engaged a children’s advisory group from six different countries around the world, from memory, and they were with us right the way through, from conceptualisation to the refining of the final products. And I think what we know from these and many other examples, as Afrooz just pointed out, is that this does result in online safety interventions that are much better able to address children’s real experiences, to speak about those experiences from the perspective of a child, as opposed to this sort of top-down methodology. And I think there’s another initiative that the ITU has underway, taking the lead from the East Safety Commissioner, but maybe I’ll talk about that next time, because I think my three minutes are probably over. Thank you.

Moderator:
Thank you, Amanda. So, Boris?

Boris Radanovic:
Thank you. Hello, everybody. At SWGFL, we’re a not-for-profit charity, and for the first time ever, we’re seeing the next generation coming into the workforce. And just a couple of weeks ago, we realised that the young are leading the unexperienced who are managing a system created by the elders, whatever the system is, and that creates a lot of issues and a lot of problems. Thank you for the question as well. There are a lot of examples that we can find, especially in the European Union, pan-European and worldwide. I think those examples of youth-led activities, children-led activities, or apps, or many of those examples need to shine even more through. But I did manage to find a couple that I think are worth mentioning and worth definitely shining a light. There’s something called the Bully Blocker app. A group of teenagers developed an app called Bully Blocker to address online bullying. Think before you type a campaign, which was started to raise awareness about the consequences of online hate speech and cyberbullying started by teenagers. I Can Help movement, funded by a group of students that later on became a literal movement, and digital literacy initiatives by youth all over the world. But at the same time, I found an example that just amazes me. A teen in Poland, disturbed by the reports of rising domestic violence under coronavirus lockdown, a Polish high school student decided to launch a fake online shop to offer a lifeline to victims trapped in their homes. Victims could look at lipsticks and other forms of makeup, but look for help in the descriptions of those lipsticks of different kinds of domestic abuse. I think that just showcases the different way of thinking and the richness that this little angel next to you gives to this conversation and to many others. And I’m saddened that often, and I am not young, and I’m often the youngest in the room, the youngest person in the room, when we are discussing how and what to do and how to help children. So I would love to see in the future conversations about this, to have children around the table and discuss the same things and principles, because in the last 10, 25 years of our charity, it has been evident that we, as adults, do not have enough experience to connect with what children are living through today, because we had the fortune or misfortune of not having 2K, 4K, HDR-ready connected cameras all around us and having to basically put yourself out there in front of the whole world to see. So it’s sometimes really difficult to understand the issues they are going through if you are not ready to listen. But then I will take it a step further and then action on that. It’s really nice to have a youth-led board. It’s really nice to have a youth advisory board. But the conclusions and advice that are coming out of that require, I would say they demand from us action so we can create a better world for them, because it’s going to be their world, and we are just managing it for the time. Rather badly, but I think with time, if we are listening to them a little bit more, that will be helpful. Thank you.

Moderator:
Boris, that’s a good point. Even within the UN, I’ve spent 15 years here, and I can see, and I’m largely involved in running processes that are member-state facing, where I sit at the ITU, and I’ve seen the delegations change in nature with more youth involvement. Of course, the definition of youth changes from country to country. So in some countries, you’ll see a 35-year-old, 40-year-old youth delegate, and in some they are much younger. But still, it’s getting younger, the age of the delegation. And also, we’ve had specific consultations with youth on so many of these topics that I’ve seen earlier being just decided in closed meetings among traditional set of delegates. So hopefully it’s all changed for the better. Thank you very much. And I can assure you there is no angel sitting next to me. They’re calling you an angel, man. Okay. So let me go to Courtney. Courtney, over to you.

Courtney Gregoire:
Well, thank you very much. And just at the outset, I think it’s valuable to think about the concept of our conversation today, policymaking with children. And I respect that many of our conversations today are about how we ensure that children’s voices are at the center of laws and regulation. But when we’re talking about the digital environment, let’s be perfectly honest, there are multi-layers of what policy is made and what it means to have policy made by tech providers that equally impacts the ability of children to unlock their potential, their rights in the digital environment. It’s also worth just stating the fundamental reality, that the digital environment was not designed for children. And we now have to recognize how significant a role it plays in their lives. Microsoft has a longstanding commitment to child online safety. And we also recognize the need to think and evolve from a protection mode to truly an empowering youth voices as to how they can unlock their potential through tech. Microsoft has a suite of products and services that intersect with children from their gaming lives, their social lives, to their economic and educational opportunities of the future. And we think it’s pretty critical to understand at the core how children are using our technology to better design it to fit their needs. First and foremost, I probably have to thank every single other panelist here because the work you’ve done has informed how we think about product design and build it in through our standard. Whether it’s recently a conversation with Amanda in Australia, reminding me that, you know, as a parent, how well does it go when you give a list of thou shall nots? The top ten things not to do online does not exactly inspire the young people to think about how to unlock their potential. And has had a huge impact on me reshaping how we think about that. Giving you two concrete examples about how we put this into practice, Microsoft has convened three councils for digital good over the past couple of years and I really respect exactly what you said. We structure that intentionally to be a conversation. We thought at the first outset it was important to create a baseline understanding with our young people, ages 13 to 17, how we think about privacy, safety and cyber security and how that’s built into our products. But with that baseline, bring it on. Tell us how you engage with our services and our apps and how those can better achieve what you want. They did have a final project and actually one of the most wonderful parts of my job has been reviewing some of those final projects. They were responsible for saying what they wanted their digital life to look like in five years. And then how do we together co-create that reality. And one of the most fascinating things was their sense of responsibility to their friends and their peers. They understood that maybe they had not reported something they’d seen online, but now they understood they were doing it for their community. The feedback we also got, we should learn because we obviously own a gaming platform, that kids learn through play. And so one of our releases just about seven months ago, Home Sweet Hmm, is a fun and educational way to introduce young people to online safety within Minecraft where they already spend a bunch of time. I had to be told why it’s called Home Sweet Hmm. That’s because I’m not an average Minecraft user. But you may know that the Minecraft iconic villagers don’t speak but rather grunt hmm. 
But in the cyber safe adventure, this sound is also intended to represent that pause. To think about what it means to make sure you are setting the tone for what you want for your online future. So we think through play when we get that active engagement, we can better help bring that to life.

Moderator:
Thank you, Courtney. Again, I’m using him as a subject. Yes, that is true. Minecraft, you do hmm. But, you know, about a decade and a half ago, I was also involved in the Child Online Protection Initiative for a few years. Microsoft has been a steady partner of this initiative over many years. So, you know, thank you very much for that. All right. Hillary. Hillary Bakrie, Associate Program Officer on Youth Innovation and Technology from the Office of the SG’s Envoy on Youth. Hillary, thank you for being here. So let me pose the question I have for you. So together with the ITU and thank you for highlighting ITU’s work, Amanda. So I know my colleagues have been working hard and, you know, so also a shout out to Carla and Fanny for the, you know, the new release yesterday. So, okay, Hillary. So together with the ITU, your colleagues present here today and many more partners from the UN, NGO, academic and ICT sectors, you’re working on POP, the protection through online participation. So can you share some insights about this initiative? Give us some ideas on how children and youth can be part of the solution with regards to violence against children and violence online? Hello. Yes.

Hillary Bakrie:
No, yeah, thank you so much for the question. It’s been a very exciting journey actually to be part of POP with ITU, the Office of the SRSG’s on Violence Against Children, working together with Amanda and many of their colleagues are in the room. So as you mentioned, the initiative is called Protection Through Online Participation. We call it POP for short, very youth friendly, children friendly in terms of name. And I believe the name also really speaks for itself, right? It has a vision of a world where children and youth can leverage from the internet, can leverage from digital platforms to safely access protection support either from official services or from their peers. And we often hear a lot about how the internet brings harm or there’s a lot of risks that comes with digital platforms. I think with this initiative we’re taking a slightly different approach. We want to explore the other part of that narrative, right? We want to explore how internet and other platforms could be used to, you know, do good impact, to empower, to create solutions that can help children and young people to stay safe. And with this initiative we’re actually doing a series of mapping, one of the mapping exercise that we are looking into is we’re looking to one, seeing how young people and children are using the internet itself to access protection support, but second, and I think this is one of the key unique aspect of the initiative is that we’re looking into the role of peer-to-peer support. So really looking into children-led solutions and initiative, youth-led solutions and initiative. And I think it was mentioned a little bit by Boris as well, how crucial this is. So we asked young people from around the world a few months ago to share and participate in this mapping exercise and through the survey we learned that the majority of young people, as obviously we assume, use online system and online platforms to seek support when they’re feeling scared, when they’re feeling unsafe or experiencing harm. And then majority of them either find this by themselves or actually through their peers. So this power of young people and children understanding themselves, understanding other children and young people, I think that really speaks for itself, right? Like no one understands children and young people better than themselves. And then I am now a millennial and then going at the end of my late 20s and then I cannot speak on behalf of Gen Z’s who are a few years younger than me. So I think there is also, it’s important to acknowledge that the power of that peer-to-peer support system matters. And from that finding as well, we learned that not only that young people and children have the agency to navigate this challenge when they’re feeling scared or when they’re experiencing risk in terms of harm and violence, but they also believe in the solutions and initiative that their peers created. And funny enough, when we ask young people and children if they know who made the solutions, if they know who made this platform, not many of them are actually aware if children and young people are part of the solutions that created this platform. But when we ask if children and young people should be involved in the design and the creation and the development of the solutions, the majority of them, I think nearly 80% of them really believe that young people and children of their age should be involved, right? 
So I think there was a little bit of this perspective, also mentioned by Courtney earlier on the importance of involving youth directly. And then I think Amanda spoke a little bit also on meaningfully engaging young people and children into the process.

Courtney Gregoire:
In Singapore, but we talked to parents and caregivers about the different conversations you want to be having with children, yes, zero to five, and maturing from there about how you understand children are using technology and being age appropriate in those conversations. It’s just worth noting that as a parent, that is a hard thing to do if you do this job daily. It does mean getting in there and co-play and co-understand how they’re using technology. Just as we think about how we help our children understand the world, we need to be thinking about how they play through technology. Lastly, there is a big question on the table, and I think one of the most interesting things we’ve seen through surveys from kids is the challenge they feel in the misinformation and disinformation space online. They understand the overwhelming nature of what’s coming at them and want the tools to help better understand and make rational decisions about the information they’re coming in contact with. There are opportunities to do that as we really think about content provenance and other spaces in generative AI. But if we don’t do that, thinking about what the information is communicating effectively to young people, I don’t think will really help navigate that new world order in the generative AI context.

Moderator:
Thank you, Courtney. Actually, you gave me two interesting pieces of information among all the things you said. One is that children are more open to asking for help online, which is an eye-opening statistic. And second is you gave some very good examples of the potential of AI to do good, because obviously the conversation is all on governance, you know, which are important conversations to have, the guardrails that are needed for generative AI. But you’ve highlighted the good that it can do and use the right way in a responsible way. Thank you very much.

Courtney Gregoire:
Can I just add, the research, what you said was surprising was surprising to us. So I love that of course children know that they turn online and turn to their friends for peers. We have to acknowledge that for those working at tech companies and in government, the fact that this was eye-opening for us to learn, they’re willing to ask those vulnerable questions of a digital technology that we wouldn’t. We now need to build that into how we think about policy.

Moderator:
Absolutely, absolutely. And I acknowledge that, yeah. All right, so Hilary, let me come back to you. Can you tell us a little bit about how children and young people would like to be involved? You know, what are they requesting us, the international community, to consider when it comes to policymaking in relation to online safety?

Hillary Bakrie:
Yes, well, the short answer is nothing about us without us. I think earlier I shared that through the mapping that the POP initiative did, young people noted that they believe youth should be part of the solution, children should be part of the solution. But even just a few days ago in the IGF Youth Summit, many youth activists and leaders also highlighted that when it comes to policymaking processes and policy implementation on cybersecurity, on online safety, on safeguarding human rights in this digital age, young people are still not included fully, right, in the decision-making table. And sometimes youth is consulted, and I think Amanda mentioned a very exciting trend that young people are consulted. There’s an increasing trend in terms of meaningful engagement. But when it comes to actually making the decisions, delivering decisions, young people are not yet included as partners, as I briefly mentioned earlier. And if you look at the bigger picture, nearly half of the world’s population is actually young people, however, less than 3% of the parliamentarian members are actually under the age of 30. And I think that number speaks for itself, right, the lack of representation of youth. And even for younger youth or adults and then children specifically, right. So I think we need to change this number. We need to make these spaces available for young people and have young people meaningfully included in the policies that really affect their lives. Most policies like on online protection, on cybersecurity and many others. And beyond access, many of young people have not only noted that access is important, access to the policymaking process is important, but we need to make an enabling environment, right, so youth can effectively contribute as partners. Like we need to close the digital divide and make sure that everybody has access to meaningful connectivity. We have to invest in young people’s and children’s educations and skills, right, not just technical and digital skills that will help them become experts and contribute substantively to the subject, but also skills that will allow them to navigate how the policymaking process looks like. And to build on this, we have to make information on policies, process, policies, processes, not only transparent, but also accessible, right, to every layers of community, both children, young people, and not accessible in terms of just language, but also taking into account what is the cost to access information, disability inclusion, and many other factors that could help it to become a more inclusive process for both children and youth. And lastly, I think many young people have also voiced out that they are all also working in this sector as well. Many of them are young innovators. Many of them are young people in STEM, or even contributing to policy process in regional or national level. I think really just reiterating what I’ve been saying, it’s important to acknowledge them as partners and experts in this, again, so that they could have an equal footing in this conversation, yeah. Thank you.

Moderator:
Thank you very much. Afroz, the next question is for you. We’ve heard what children and young people are asking for, but in practice, how can we actively involve children in policymaking that concern them? And can you provide some examples, you know, good practices or lessons learned from successful initiatives, meaningfully involving children in policymaking?

Afrooz Kaviani Johnson:
Yeah, thank you. I think I’m gonna pick up a lot of what Hilary just mentioned. I just wanna point to another general comment of the Committee on the Rights of the Child, which is number 12, which actually talks about child participation. And I think this is really important when we’re thinking about what makes effective child participation. And they talk about all processes in which children are heard and participate. They’ve got nine, nine basic requirements. So one, that they’re transparent and informative. Two, that they’re voluntary. Three, that they’re respectful, child-friendly, inclusive, supported by training. I think it’s a big one. It’s not just something that happens. The people that are facilitating, the children that are participating need to be supported by training. Another really important one, safe and sensitive to risk, recognising that it’s not always a safe process to engage children on some of these sensitive issues. And sometimes even when it seems like it is fine, things can come up. So being ready for that. And then very importantly, accountability. So being accountable as well. So I’m just gonna share two quick examples in my, I don’t know how many minutes I’ve got left, very few, that strive to kind of embody those requirements. And so they’re examples from my colleagues around the world. So firstly in Tunisia, where Children’s Voices have helped shape the first ever National Plan of Action on Child Online Protection. And the impetus for this plan actually came from children. It came from a qualitative study with children about their online experiences. And there was a series of focus group discussions around the country with girls and boys aged 11 to 17, taking place in different parts of the country. So really trying to make it accessible and inclusive. And children were consulted not only for the input into the plan of action, but also to kind of validate and provide feedback on the draft plan of action. Children that were involved in the consultations, it wasn’t just your usual kids on school councils or a convenient sample of children, but children who were in school, but also out of school, and those also living in alternative care. So even in residential care facilities. And I think there were a lot of insights from that process that wouldn’t have been garnered if it had just been an adult-led process, just insights as to the topics that were most important for kids and privacy and data protection, coming out really strongly. The kids shared preferences on how they wanna receive information, be it peer-led initiatives or online or school programs. So it was a process in which children could not only kind of voice their concerns, but also help shape the measures that the country is now gonna take. The second example, which I’ll summarize very quickly, is from the Philippines, where there’s actually a longstanding practice of child participation that’s being refined and kind of improved over time. So our colleagues at UNICEF recently supported a series of consultations to inform the new national plan of action on children. And they’ve used similar methodologies for informing other pieces of legislation, including the recent legislative instrument on online exploitation and also one on child marriage. So just some kind of success factors around the methodology. The facilitators are young people. So they’re not older adults. They’re adults, but they’re younger adults in the scheme of things. 
And they’re young adults that have been trained over years and supported in this process. The facilitators are also from the communities in which children are from, right? So there’s already trust and there’s already a relationship there that, and yeah, it makes it more accessible as well. There’s minimal adult intervention, or I should say older adult intervention, so that children can, they feel safe to voice their perspectives. They’re not intimidated. They’re not influenced. Interestingly, there was also programming for the parents and caregivers to the side, right? Because it’s not always that easy to gather children and consult children. Well, what are the parents gonna do during that time? So there was programming to the side to engage them in parallel sessions. There was an emergency response plan in place for safeguarding, and there were social workers ready in case there were disclosures. And in fact, there were disclosures from participation. Just quickly, the methodology was child-friendly. So a lot of games in terms of inclusivity. A lot of diverse groups of children were involved, children with disabilities, children who were in alternative care, children in street situations. But you can’t just bring all these kids from all these different situations together without some careful planning, thought, preparation. And what I loved when my colleagues were telling me about this was that they wanted every child to go home feeling like they were seen and heard through that process. And that really stood out to me as kind of a principle. And I think just to close up, some of the lessons learned, especially from colleagues in the Philippines who have been working on evolving these practices of child participation is that it takes resources, it takes funding, it takes deliberate investment in kind of the capacity of facilitators over time. So some of the young people that are now facilitators were actually consulted, you know, at times during their childhood. So there was that nice kind of building of that capacity over time. So I’ll leave it there. Thank you.

Moderator:
Thanks, Afroz. Very interesting information. I think we have 15 minutes, so we’d like to have at least six, seven minutes of Q&A, if not more. So let me quickly move on to Boris. Boris, so you’ve concretely worked with the ITU to draft national strategies on child online protection in several countries. Can you tell us a bit more about this work and how it positively affects children’s well-being online?

Boris Radanovic:
Thank you. I think that’s the best part of my job, honestly, from the last 10 years I’ve been doing that. I’m going to call this a love letter to your government. Whoever government is looking at this and listening for the last three days of IGF and you’re wondering, where am I supposed to start? What am I supposed to do? ITU In-Country National Assessment is where you should start. So basically, the principle is, if you apply and discuss it with ITU, you’ll get the support. The worst part of that support, you’re going to meet me, but everything else is awesome. So, the National Child Online Safety Assessment. I honestly urge each and every one of them, especially the government, the government representatives listening to this, that you consider applying for this. This includes a comprehensive assessment of the existing situation, the development of a national strategy, and then the action plan, the much-needed action plan with recommendations based on global best practices. I had the pleasure of visiting many countries, and the difference that 50 to 100 miles makes in culture, approach, consideration, and the data behind it is just remarkable. Marking that we can have global solutions, but we need local adaptations, and they need to be carefully, carefully managed. With this, you not only enhance your national policies, standards, and mechanisms, that you can ensure the safety and well-being of children in the digital realm, but for all children in the entire country. I heard a lot of words in the IGF in the last… power words in the last couple of days of inspiration, speakers of many… paradigms, new shifts, and I love it, but I come from a non-profit sector, we are there for impact and action. So the time is now that you can apply for that, and I think this is the first step for any government official in considering where to start. This is a beautiful first step, and you need to take that step as a responsible government to understand where you are right now so you can understand where you want to go. Do not leave children of your countries just behind by not adapting simple actions that have wide-reaching consequences to protect them literally immediately while we are doing the assessment. I’m going to tell you a little bit about what it is. And all of us, but especially anybody working in the government, have a duty to protect children in your country, and we want to help you with that, and pretty much that is it. Let us help you skip years, and in some countries decades, of stumbling in the dark and endangering children by your lack of knowledge and experience and just awareness of the global best practices or what to do and how to skip some of those issues. Please do reach out to ITU to start the process, and we can together create a better and safer environment for all children in your country, but for all children in the world then afterwards. By working together, we can build that environment, but we have to understand that while understanding the issues on a global scale, we need to understand that each of those issues is represented really differently in each of the countries. So how does it look? It’s rather simple after the application process starts with ITU. There is research that goes on first, and we love to do that research because we send it at the same time to children and parents, and some of the questions are the same, but the parents and children don’t know that they have the same question. 
In some countries, 80% of parents say their children would always talk to them if they found any issue online, while only 10% to 20% of children say they would really speak to them. So it is evident from the get-go that we as adults have a totally different picture of what is happening on the ground. Then come the interviews, and I think that is the biggest advantage of this process, because we do a multi-stakeholder interview, basically a marathon of 12 to 14 hours a day, with every part of the government, NGOs, industry and other stakeholders, every part of the internet society in each country, asking them similar or sometimes exactly the same questions to see the different perspectives, then combining that into a report and providing positive examples from the global scale. Literally in the moment of an interview, you can find the gap. In some countries, child sexual abuse material was illegal to distribute and to download, and the law stopped there. Then we asked a simple question: what about possession? It was not in the law. So one word was added to the law, and suddenly the police could take action. In another country, they were really proud that they had a cyber-bullying law, but it only applied if you were cyber-bullying a child in your own school; if you were cyber-bullying a child in a different school, the law did not apply. There are many, many other simple fixes to well-intentioned actions that have already been taken, and they can literally be spotted in a couple of hours, if not a couple of days. So this is bespoke assistance done by experts in the most sensitive way, listening especially to the voices on the ground, interviewing and listening to children in schools at the same time, and letting their experiences shine through in the report as well. If you are thinking about doing anything, especially after this wonderful IGF, I honestly believe this is a great first step, and I think that is the way we create a better and safer Internet for us all: thinking globally but looking at how to implement it locally. Thank you.

Moderator:
Thanks, Boris, and again, thanks for highlighting this important work of the ITU. Of course, we wouldn’t have been able to do it successfully without partners such as you, and all of you at the table, so thanks again. I think this is the last question, and it’s for Amanda. You’re currently working with the ITU to involve children in the development process of national strategies related to child online protection. Can you share a little bit more about this effort and what you’re expecting to gain from it?

Amanda Third:
Yeah, sure. I am indeed leading a piece of work with the ITU around the development of national child task forces in five countries to support the development of national strategies on the ground, which is a really exciting piece of work that I’m very happy to be part of. It builds on a previous piece of work, going back two years now, when the eSafety Commissioner in Australia commissioned the Young and Resilient Research Centre, which I direct, to design a national task force to guide the government on online safety policy and programming across the country. What we did was work with a group of young people aged 10 to 18 to really dig deep into their experiences of online safety, but also of online safety interventions, to get a nice deep snapshot of the strengths and limitations of all the good work that is going on. The eSafety Commissioner was very invested in doing this work so that they could really understand whether or not their messaging was hitting the mark and whether there were any impacts emanating from some of their interventions, and certainly we did come across those. But young people in that study also gave us a really strong reality check on where messaging is failing to land. They reacted very strongly to top-down messaging, the list of don’ts. We all know that, don’t we, from our interactions with children in our everyday lives. But they also spoke passionately about the ways they felt they had expertise that could be channelled meaningfully into policy-making processes. So after we had identified the strengths and limitations of the messaging and programming, we worked with them to co-design a national youth advisory. One of the beautiful things about the process this time around was that the old people like myself, and I’m never the youngest person in the room, Boris, were not face-to-face with the young people. We trained a team of youth researchers, around the age of 18 to 20, to implement the work. And wow, it was a complete game-changer, because those young people opened up in ways we just hadn’t seen before. That was really exciting, and subsequently we have gone on to design a youth co-research toolkit to support young researchers to be part of teams. I’m happy to share a link to that if anybody would like it. Out of this process, young people designed a mechanism, if you like, whereby a diverse group of young people is appointed over a two-year period to guide and shape the government’s approaches to online safety policy and programming. It’s early days yet; we have only just started to implement that program, and indeed it was implemented rather more quickly than we had anticipated when a federal politician took a liking to it, so we’re still working out some of the bumps and roadblocks. But now what we hope is that we can translate this process, working with people on the ground in different countries to culturally adapt it and to create an ongoing mechanism for children to feed into the process of developing national strategies in their various countries.
So I think what I would say is that it is very encouraging to see so many different entities, from Microsoft to national governments to NGOs, embracing advisory mechanisms to guide their work. But let’s not rest on our laurels. It’s really wonderful to have these mechanisms, but we need to make sure they stay fresh and remain open to young people’s insights and perspectives, because they can get ritualistic; they can become, as I said in an earlier panel today, tick-the-box kinds of mechanisms. And I think we need to remember, too, that not all young people’s experiences can be appropriately reflected through formal advisory mechanisms, right? So there are two things we should be thinking about in particular. The first is to level up on Boris’s challenge: not just to respond to children’s and young people’s insights, which is absolutely important, but really to reflect on our own processes, to think about the ways we as adults often run things behind closed doors, to reflect on our decision-making processes, and to expand them and make new spaces for young people to become part of an ongoing, real-time conversation rather than the thing we do when we need some input, right? How do we embed it? For example, could there be young people shadowing important decision makers, guiding them on what children and young people might like to do, and so on? So that is one thing to think about: how do we transform our institutions so that young people can genuinely influence the agenda? But also, how could we use our products and platforms to seek young people’s input on issues that relate to policy in a really everyday way that taps into their everyday interactions? How can we make spaces for their input to be channelled through to the decision-making processes inside organisations? So, like I say, I’m really excited and encouraged to see the ways these youth participation models are being embraced around the world. It’s fabulous to see how far we’ve come in the last decade, when we really were incredibly focused on protection and not really thinking about these questions of participation. But let’s continue to think creatively about participation. Let’s not rest, and let’s treat this question of participation as one we won’t fully resolve and have to keep paying attention to as things unfold. Thank you.

Moderator:
So thanks, Amanda. Your call for action was a good closure to the formal set of questions. Let me open it up for Q&A; I think we can take five minutes. I have an online question, so let me start with that. It refers to the UN Youth Envoy’s and ITU’s plan to empower young minds against cyberbullying. Hillary, would you take that? Thanks.

Hillary Bakrie:
Thank you. I think we had a chance to see the full question that was shared online, and thank you so much for sharing it. If I’m not mistaken, the question also highlighted that many young people have been working on solutions that help other young people and children navigate cybersecurity and similar challenges, so it’s actually exciting to see a real-life example of that. It would be really great if you are a young person, child or adolescent working on these solutions to get in touch with us. If the online moderators could help out, perhaps we can share the link to our initiative on protection through online participation, because we want not only to hear from you but also to learn from you specifically: how have you been building these solutions, and why did you decide to build them? We want to learn from your expertise as children and young people with lived experience who are navigating these challenges of cybersecurity and online protection, and we want to partner with you, as we indicated earlier in the panel, because it is important to recognise you as partners. So if you would be open to connecting with us, that would be really great, and perhaps the online moderators can share the link on how to get in touch with us. And just a quick shout-out again to the ITU and the other interagency partners, like the SRSG on Violence Against Children and UNICEF, who have so many resources and have been building the capacity of children and young people to contribute to this space as well.

Moderator:
Thank you, Elriana. In fact, to the online moderator: if there are other questions being posed online, I just don’t have access to them, so please ask for the mic and read out the questions. Meanwhile, anyone here who… ah, yes, please go ahead.

Audience:
Hi, everyone. I’m Marie Stella from the Philippines. So I’m just wondering, based on your interactions with the youth and the children, did they say anything about age verification? Because adults like us can’t seem to find the perfect solution to age verification. Do they think that we even need age verification online? Or is awareness-raising education enough to make sure that they will not access illegal or harmful content online? Thank you.

Boris Radanovic:
I’ll try to be quick and give space to others. It’s a wonderful question, and thank you so much. In over 17 countries where I have had the pleasure of speaking with children, they have told me they do not want content that is not intended for them in their spaces. And sometimes that is evidenced by data: a significant proportion of children encounter adult content or other abusive content unwillingly. That is the problem, and that is step number one. Step number two: we as adults do have to implement protections for children, and we should consider their opinions about it, because the internet and the content on it are not created for children, so we need to find a way to make it safer. Unfortunately, the solution to age verification is another million-dollar question. But I think we will get there, and we will find a way to do so, and we will get there with children, because those spaces need protection so that children can feel safe and protected. I hope that helps a bit.

Amanda Third:
I would just add quickly that when we spoke to children in 27 countries for General Comment 25, they were very clear with us that they wanted better protections online. They were equally clear that they want their data to be protected, and they want to know how their data is being protected and why. So our solutions around age assurance and age verification obviously need to balance these tensions. But they were also quite clear with us that systems set up to prevent them from doing things are often an invitation to subvert them. So we also need to think through the implications of protecting children through a range of mechanisms, and to ask: does that actually keep a child safe all the time, or does it make them unsafe in ways we then can’t deal with? There is a very complex set of questions there that we need to work through, and, as Boris says, in partnership with children.

Moderator:
Before I give you the floor, I now have the online question in full, so let me acknowledge the person who posed it. It’s Omar Farooq, a 17-year-old boy from Bangladesh who is working actively to ensure children’s rights and mental health. He is the founder and president of Project OMNA, an upcoming AI-powered mobile project focused on children’s mental health and child rights.

Audience:
Thank you to the panelists. I’m Dora from UNICEF. I just wanted to ask, maybe Microsoft but open to the other companies as well: in contexts where child participation or democratic participation is more restricted, what could be the winning arguments to engage companies in child participation in design, or what could be winning strategies to convince them?

Courtney Gregoire:
A great question, and you’ve opened a new window in my brain. It’s interesting because, as I mentioned before, Microsoft had previously convened Councils for Digital Good and leveraged them as an important mechanism for a conversation about how to think about safety by design. As we’ve thought about how to scale, we have taken a step back and said we want to work with existing organisations that think day in, day out about how to engage child participation, and to leverage their understanding and the information they can gather as we think about product and safety by design. Our expectation is that we should be doing that with the NGO and academic communities, who are building in the fundamental principles everyone here has mentioned for meaningfully engaging children: that the engagement is representative, that it happens in a safe environment, and that there is trust. We’ve had that moment of saying, okay, that’s how we should be leveraging the ecosystem. But I think we need child voices and participation at both layers; they have to have influence. And one thing I failed to mention before: we had designed our child participation and the councils to inform product design, yes, but we also opened the door for them to talk directly to regulators, whether that was Ofcom or Arcom, so that it really was the true multi-stakeholder engagement we wanted from those voices. They knew they should be influencing all the rules that impact them, from the regulatory and legal perspective to the rules of the road set by companies.

Moderator:
I’m getting a signal that we have run out of time, so let me quickly close by thanking our speakers. It was a fantastic panel. Afrooz, Amanda, Boris, Courtney, Hillary, thank you very much. One takeaway that is very clear is that the influence of children on the design of online spaces is unquestioned; it is also important that we, as decision makers and product makers, make sure those voices are properly heard and taken into account meaningfully. So again, thank you very much. Let’s end with a round of applause for the speakers, and we hand over the room to the next session. Thank you.

Speakers’ statistics

Afrooz Kaviani Johnson — Speech speed: 180 words per minute; Speech length: 1476 words; Speech time: 492 secs

Amanda Third — Speech speed: 166 words per minute; Speech length: 1901 words; Speech time: 686 secs

Audience — Speech speed: 144 words per minute; Speech length: 148 words; Speech time: 62 secs

Boris Radanovic — Speech speed: 217 words per minute; Speech length: 1907 words; Speech time: 528 secs

Courtney Gregoire — Speech speed: 200 words per minute; Speech length: 1385 words; Speech time: 416 secs

Hillary Bakrie — Speech speed: 192 words per minute; Speech length: 1606 words; Speech time: 501 secs

Moderator — Speech speed: 171 words per minute; Speech length: 2437 words; Speech time: 853 secs