Safe Digital Futures for Children: Aligning Global Agendas | IGF 2023 WS #403

10 Oct 2023 08:00h - 09:30h UTC

Event report

Speakers and Moderators

Speakers:
  • Marija Manojlovic, Intergovernmental Organization
  • Andrea Powell, Civil Society, Western European and Others Group (WEOG)
  • Albert Antwi-Boasiako, Government, African Group
  • Matthew Watson, Government, Western European and Others Group (WEOG)
  • Julie Inman Grant, eSafety Commissioner Australia
  • Ambassador Henri Verdier, French Ministry of Europe and Foreign Affairs Children’s Online Protection Lab
  • Cailin Crockett, Senior Advisor, Gender Policy Council, US White House Task Force to Address Online Harassment and Abuse
  • Salomé Eggler, Director of Digital Transformation Centre, German Agency for International Cooperation Kenya
  • Mattito Watson, Senior Technical Advisor of Children in Adversity Team, USAID
  • Ananya Singh, USAID Digital Youth Council
Moderators:
  • Marija Manojlovic, Intergovernmental Organization

Disclaimer: It should be noted that the reporting, analysis and chatbot answers are generated automatically by DiploGPT from the official UN transcripts and, in case of just-in-time reporting, the audiovisual recordings on UN Web TV. The accuracy and completeness of the resources and results can therefore not be guaranteed.

Full session report

Albert Antwi Boasiako

Ghana has made significant progress in integrating child protection into its cybersecurity efforts. The country has passed a Cybersecurity Act that incorporates child online protection, and it has established a dedicated division within the Cybersecurity Authority to protect children online. This demonstrates Ghana’s commitment to ending abuse and violence against children, as highlighted in SDG 16.2.

Furthermore, Ghana has seen a remarkable improvement in its cybersecurity readiness, with a rise from 32.6% to 86.6% between 2017 and 2020. This progress aligns with SDG 9.1, which aims to build resilient infrastructure and foster innovation.

Research and data have played a crucial role in shaping Ghana’s cybersecurity policies and laws. Through research, Ghana has identified the challenges faced by children accessing inappropriate content online, leading to more comprehensive child protection strategies. This highlights the importance of evidence-based decision-making, as emphasized in SDG 9.5.

However, Ghana has faced challenges in implementing awareness creation programs, particularly in reaching a larger percentage of the population. With a total population of about 32 million, Ghana has achieved only around 20% of its awareness creation mandate. Closing this gap is crucial to combating cyber threats effectively.

Fragmentation within governmental and non-governmental spaces has been a significant obstacle in child online protection efforts in Ghana. To address this, Ghana needs to institutionalize systematic measures and promote collaboration among stakeholders. This will ensure a unified approach and enhance response effectiveness.

Albert Antwi Boasiako, a proponent of child protection, advocates for the integration of child protection into national cybersecurity frameworks. Albert emphasizes the importance of research conducted with UNICEF and the World Bank in shaping cybersecurity policies, aligning with SDG 16.2.

Public reporting of incidents is also essential for maintaining cybersecurity, a point Albert underscores. The establishment of the free national hotline 292 in Ghana has proven effective in receiving incident reports and providing guidance to the public. This aligns with SDG 16.6’s objective of developing transparent and accountable institutions.

Implementing cybersecurity laws can pose challenges, particularly in certain developmental contexts. Factors like power concentration and specific country conditions can hinder their practical application. Overcoming these challenges requires continuous effort to ensure equal access to justice, as outlined in SDG 16.3.

In the African context, achieving uniformity in cybersecurity strategies is crucial. Discussions on streamlining online protection and combating cyberbullying in Africa are vital for better cooperation and enhanced cyber resilience across the continent.

Ghana supports regional integration for successful cybersecurity implementation, sharing its expertise with other countries. However, fragmentation within the region remains a challenge that needs to be addressed for effective collaboration and coordination in countering cyber threats.

In conclusion, Ghana’s efforts to incorporate child protection, improve cybersecurity readiness, and promote evidence-based decision-making are commendable. Overcoming challenges related to awareness creation, fragmentation, law implementation, and regional integration will contribute to a more secure digital environment for children in Ghana and beyond.

Marija Manojlovic

Online child safety is often overlooked in discussions surrounding digital governance, which is concerning as protecting children from online harm should be a priority. This issue is further exacerbated by a false choice that is frequently posed between user privacy and online safety. This notion that one must choose between the two is flawed and hinders progress in safeguarding children in the digital realm.

The fragmentation within the digital ecosystem hampers progress in advancing child online safety. Marija, a leader in the field, has observed that collaboration and coordination among various stakeholders, including governments, the private sector, and academia, are crucial. However, there is an alarming level of fragmentation that impedes progress and the development of effective strategies to ensure children’s safety online.

One positive aspect that emerges from the discussions is the recognition that failures and learnings should be shared openly. Marija proposes that companies and organizations not only share what has worked but also what has failed. Transparency and the sharing of experiences can lead to better solutions and a more cooperative approach to addressing online safety challenges.

To truly drive change, it is essential to understand the root causes of digital challenges. Marija suggests moving upstream and examining the design and policy choices that contribute to online safety issues. This entails exploring how societal norms and technological design enable child exploitation, gender-based violence, and other online hazards.

Creating a unified digital agenda is crucial for maximizing the benefits of digital technologies and ensuring online safety for children. Misalignment in digital agendas can hinder progress, but engaging in meaningful discussions and sharing innovative solutions can help establish an internet environment that is beneficial for all, particularly children.

An evidence-focused and data-informed approach is necessary to effectively protect children online. Marija emphasizes the significance of testing, experimentation, and the sharing of results to inform decisions and shape policies. Building evidence through a cooperative spirit between different stakeholders is key.

Ghana serves as a unique example where child protection has been institutionalised in their cybersecurity work. This highlights the importance of countries actively integrating child protection into their cybersecurity strategies and policies.

However, it is disheartening that the innovation ecosystem is not always inclusive of those who most need safety measures. This exclusion reinforces the need to address safety concerns in order to create a more inclusive and diverse innovation ecosystem.

The intersection of online child safety, inclusive digitisation, and gender balance should not be disregarded. Ensuring online safety is crucial for promoting inclusivity and achieving gender equality in the digital realm.

More work needs to be done in preventing gender-based violence and image-based abuse online. These serious issues require attention and effective strategies to protect individuals from harm.

Additionally, it is essential to challenge and address the prevailing narratives and perceptions of these digital challenges that are rooted in gender norms. Overcoming these deeply ingrained biases and stereotypes is crucial for creating a safer and more equitable online space.

While the internet presents numerous opportunities for young people, their participation and protection must be prioritised. Their experiences and perspectives need to be recognised and incorporated into decision-making processes to ensure their safety and well-being.

Moreover, it must be ensured that existing vulnerabilities, such as the gender divide, toxic masculinity, and extremism, are not exacerbated in the online world. Digital platforms should actively work towards a safer and more inclusive environment that nurtures positive interactions and discourages harm.

Lastly, increased investment in the field of online safety and protection is needed. Governments, industry leaders, and other stakeholders must allocate resources and finances towards robust initiatives that safeguard children from online threats.

In conclusion, addressing online child safety is essential and should not be overlooked within the digital governance discourse. It is imperative to dispel the false dichotomy between user privacy and online safety, overcome fragmentation, and foster collaboration among diverse stakeholders. Sharing successes and failures, understanding the root causes of digital challenges, building a unified digital agenda, adopting an evidence-focused and data-informed approach, institutionalising child protection, promoting inclusivity, challenging gender norms, ensuring youth participation and protection, and increasing investment in online safety are all integral to creating a safer and more inclusive digital environment for all, particularly children.

Mattito Watson

The analysis examines USAID’s strategies and initiatives related to youth and digital experiences. It notes that USAID’s digital strategy was released in 2020, marking the organization’s adoption of digital technologies in its development practice. Coming from one of the largest development organizations in the world, this digital adaptation is significant in terms of reach and impact.

Additionally, USAID has implemented a child protection strategy, demonstrating its commitment to safeguarding children’s well-being. Mattito Watson, who leads child protection efforts within USAID’s Children in Adversity team, plays a key role in this area. Moreover, USAID has a youth strategy that emphasizes collaboration and partnership with young people rather than a paternalistic approach.

The analysis highlights the importance of involving youth in decision-making processes. To facilitate this involvement, USAID established a Digital Youth Council, which serves as an advisory body and nurtures future leaders. The council consists of 12 members, seven girls and five young men, underscoring USAID’s commitment to inclusivity.

Understanding the digital experiences of youth is vital. Mattito Watson’s efforts to comprehend the digital experiences of different youth demographics have led to the establishment of the Digital Youth Council, reinforcing the commitment to engage and empower young people.

In conclusion, the analysis reveals USAID’s strategies and initiatives to involve youth and incorporate digital experiences. The release of the digital strategy, implementation of child protection and youth strategies, and the establishment of the digital youth council showcase USAID’s efforts to stay relevant and foster inclusive development practices. By recognizing the importance of involving youth and understanding their digital experiences, USAID is taking a forward-thinking approach that can drive positive change and reduce inequalities in line with the Sustainable Development Goals (SDGs).

Andrea Powell

The internet has brought both great opportunities and risks for children. On one hand, children now have more access than ever to knowledge, entertainment, and communities, empowering them in various ways. However, there are also troubling aspects of cyberspace, with the dark web being used for criminal activities.

In terms of digital diplomacy and internet laws, there is a call for coherence: everything that is forbidden in real life should also be forbidden online, and everything guaranteed offline should also be guaranteed online. Efforts have been made to implement this principle, such as long-running discussions at the UN on how to apply the UN Charter and the Geneva Conventions to conflict in cyberspace.

Solutions to digital challenges should come from a cooperative effort involving all stakeholders. Governments, companies, civil society organizations, and researchers all have different responsibilities and prerogatives that can contribute to problem-solving in the digital sphere.

One pressing issue is the lack of attention and resources given to child protection online compared to other areas. The field of child protection online is weaker, with less funding and organization, especially in comparison to efforts against terrorist content.

Creating an environment for effective testing and sharing of solutions to digital issues, such as age verification, is crucial. Different approaches to age verification exist, offering different levels of privacy, efficiency, and centralization. Finding the right balance is important.

Image-based sexual violence is a growing global issue that disproportionately affects vulnerable groups. There are over 3,000 websites designed to host non-consensually shared intimate videos, and young people are increasingly exposed to this form of violence. Survivors often experience psychological distress, trauma, anxiety, and even suicidal thoughts. Shockingly, over 40 cases of child suicide as a result of image-based sexual violence have been uncovered.

There is a need for better knowledge and public awareness of image-based sexual violence. Most law enforcement agencies lack knowledge of the issue, and public misunderstandings perpetuate victim-shaming attitudes. Global regulation and policies need to be harmonized to tackle this issue effectively. Barriers to addressing the issue include the need to prove the intent of the abuser, and it is argued that online sexual violence should be classified as a serious crime.

Tech companies are also called upon to take more accountability and engage proactively. Currently, there are over 3,000 exploitative websites that could be de-indexed, and survivors are left to remove their own images, effectively cleaning up their own crime scenes. Tech companies should play a more active role in preventing and dealing with image-based sexual violence.

In order to support victims of image-based sexual violence, global standardization of support hotlines is necessary. The INHOPE network provides a model of global hotline support for child online sexual abuse, and this approach could be expanded to address the needs of victims of image-based sexual violence.

In conclusion, while the internet provides numerous opportunities for children, it also poses risks that need to be addressed. There is a call for coherence in digital diplomacy and internet laws, solutions to challenges should involve a cooperative effort from all stakeholders, child protection online requires more attention and resources, image-based sexual violence is a pressing global issue that demands better knowledge and regulation, tech companies should be more accountable, and global standardization of support hotlines is crucial.

Henri Verdier

The analysis examines topics such as online crime, the dark web, internet fragmentation, internet companies, innovation, security and safety, and violence and gender issues. It reveals that a significant portion of online crime occurs on the dark web rather than social networks, with real-time videos of crimes for sale. To combat this, the analysis suggests increasing police presence, investment, and international cooperation. It also highlights the issue of internet fragmentation at the technical layer, which needs to be addressed.

Additionally, there is a disparity in trust and safety investment by internet companies, with greater investment in larger markets and less in smaller ones, especially in Africa. The analysis argues for equalizing trust and safety investment. Market concentration is also opposed, with a call for a more balanced approach to internet companies.

Contrary to popular belief, the analysis argues that innovation and regulation can coexist, with regulations sometimes driving innovation. Furthermore, the analysis emphasizes that security, safety, and innovation are not mutually exclusive, and solutions can be found by considering all three.

The analysis also explores the interconnectedness of violence and gender issues, noting that social networks play a role in radicalization and that violence often targets gender and minority groups. Ignoring gender issues can lead to overlooking other interconnected issues. In conclusion, the analysis provides a comprehensive examination of various topics and offers valuable insights for addressing these complex issues.

Cailin Crockett

The analysis highlights broad agreement among the session’s speakers on the importance of addressing gender-based violence, particularly online violence. All forms of gender-based violence, they argue, stem from common root causes and risk factors, often driven by harmful social and gender norms. Furthermore, these crimes are significantly underreported.

The Biden-Harris administration strongly supports efforts to end all forms of gender-based violence. They have taken a comprehensive approach to tackle the issue, including setting up a White House Task Force dedicated to addressing online harassment and abuse. This demonstrates their commitment to promoting accountability, transparency, and survivor-centered approaches with a gender lens. The administration acknowledges that gender-based violence has ripple effects on communities, economies, and countries.

In combating online violence, the speakers underline the importance of prevention, survivor support, accountability for both platforms and individual perpetrators, and research. These pillars form the basis of the strategy against online violence. The task force comprises various government departments, such as USAID, the Justice Department, Health and Human Services, Homeland Security, and more. The Biden-Harris Administration has already outlined 60 actions that federal agencies have committed to taking to address online harassment and abuse.

The speakers note that the United States’ federalist nature leads to multiple approaches being taken across different states and territories to address abuse issues. This diversity reflects the unique challenges and needs of each region. Additionally, they assert the need to balance the interests of children with the rights of parents, as parents may not always be inherently able or willing to represent the best interests of their children.

Investing in prevention and adopting an evidence-informed approach are crucial in addressing gender-based violence. The administration recognizes the importance of maximizing options and support for survivors of abuse to effectively prevent and combat violence.

The CDC’s analysis, titled ‘Connecting the Dots’, aims to identify shared causes of violence across the lifespan. This research contributes to a better understanding of the various forms of interpersonal violence and helps inform prevention strategies.

Finally, the speakers call on civil society to demand government investment in tackling these issues. They emphasize the importance of allocating resources to effectively combat gender-based violence and online violence. This partnership between civil society and the government is crucial for making progressive changes and achieving the goal of ending all forms of violence.

Overall, the analysis emphasizes the urgent need to address gender-based violence, with particular emphasis on online violence. It acknowledges the comprehensive measures taken by the Biden-Harris administration and stresses the significance of prevention, survivor support, accountability, and research. The speakers’ insights shed light on the diverse approaches taken across the United States and highlight the importance of balancing the rights of children with the rights of parents. Investing in prevention and evidence-informed policy is considered essential, and the CDC’s efforts to identify shared causes of violence are valued. Lastly, civil society plays a vital role in advocating for government resources to effectively combat these issues.

Salomé Eggler

The analysis highlights the significant role played by GIZ in integrating child online safety into its projects. GIZ is committed to incorporating child online safety from the outset of its projects, ensuring that the protection of children in the digital space is a top priority. This proactive approach underscores GIZ’s commitment to safeguarding children’s rights and well-being.

Furthermore, GIZ takes a comprehensive approach to ensure child online safety is embedded in every aspect of its projects. By integrating safety requirements at every stage, GIZ creates genuine child online safety projects specifically designed to address the unique challenges and risks faced by children online. This holistic approach is crucial in effectively protecting children from online threats and promoting their digital well-being.

To aid in the implementation of child online safety, GIZ utilises user-friendly tools that do not require extensive expertise in child protection. The Digital Rights Check tool is one such example, helping to assess projects in terms of human rights considerations, including child online safety. This tool allows GIZ to evaluate the extent to which its projects uphold fundamental rights and make necessary adjustments to ensure the protection of children’s rights.

However, the analysis highlights the challenges faced in implementing child online safety. Various cross-cutting issues, such as gender, climate change, and disability and inclusion requirements, need to be balanced with child safety considerations. This requires GIZ practitioners to find a delicate balance between these competing priorities to ensure that child online safety is not compromised. Moreover, limited budgets and time constraints further complicate the implementation process.

Nevertheless, the analysis indicates that increasing digitalization projects present an opportunity to mainstream child online safety. As GIZ’s digital projects continue to expand, there is a chance to incorporate child online safety into more frameworks and tools. By leveraging the digital rights check and other appropriate measures, GIZ can ensure that child protection considerations are integrated into larger projects, leading to a safer online environment for children.

Overall, the sentiment towards GIZ’s efforts in integrating child online safety is positive. GIZ’s commitment to embedding child online safety into its projects and using tools to assess projects in terms of human rights, including child online safety, demonstrates a proactive approach towards protecting children’s rights in the digital age. However, the challenges associated with implementing child online safety, along with limited resources, highlight the need for ongoing commitment and collaboration to overcome these obstacles.

In conclusion, GIZ’s role in integrating child online safety is crucial. By prioritising child protection from the outset of projects, adopting a comprehensive approach, utilising user-friendly tools, and capitalising on digitalisation opportunities, GIZ demonstrates its commitment to creating a safer online environment for children. Continued efforts, collaboration, and resource allocation are essential to overcome challenges and ensure the effective implementation of child online safety measures.

Moderator

Omar Farouk, in collaboration with UNICEF and the UN Tech Envoy, is actively involved in Project Omna, aiming to tackle pressing digital issues such as cybersecurity, bullying, and privacy on a global scale. The project is focused on addressing the challenges faced by children in the digital space and ensuring their safety.

The importance of balancing child safety and economic growth in the digital realm is a key aspect of the discussion. It is evident that as the world becomes increasingly interconnected, it is crucial to protect children from the potential harms that exist online while fostering an environment that promotes economic growth and innovation.

One of the primary arguments put forward is the need for strong partnerships between government, businesses, and civil society to effectively address child safety in the digital space. Collaborative efforts among these stakeholders are crucial in developing strategies and implementing measures that protect children from online threats. By working together, they can leverage their respective expertise and resources to create a safer digital environment for children.

Child safety online, government-business partnerships, civil society, and the digital space are closely interconnected topics. Effective child protection in the digital space requires cooperation and collaboration among all of these stakeholders.

Furthermore, the discussion emphasizes the role of partnerships in achieving one of the Sustainable Development Goals (SDG 17: Partnerships for the Goals). This demonstrates the global recognition of the importance of collaboration in addressing complex challenges like child safety online.

While no specific supporting facts or evidence are cited, the involvement of UNICEF and the UN Tech Envoy in Project Omna is a strong indication of the credibility and importance of the initiative.

In conclusion, the expanded summary highlights Omar Farouk’s involvement in Project Omna, undertaken in partnership with UNICEF and the UN Tech Envoy, to address critical digital issues. The discussion emphasizes the necessity of balancing child safety and economic growth in the digital space and calls for strong partnerships between government, businesses, and civil society. By working together, these stakeholders can effectively tackle the challenges faced by children online and create a safer digital environment for all.

Julie Inman Grant

The issue of online safety for children is a significant concern that requires attention. Children make up one-third of global internet users, and they are considered more vulnerable online. The sentiment towards this issue is mainly negative, with arguments emphasising the need for safety measures and awareness to protect children.

One argument highlights that the internet was not designed for children, and thus, their safety should be considered. This emphasises the negative sentiment regarding the lack of adequate safeguards for children online. The related Sustainable Development Goal (SDG) is 3.2, which aims to end preventable deaths of newborns and children.

Another argument focuses on the long-term impacts of children becoming victims of online abuse. Victims of child abuse are more likely to experience sexual assault, domestic violence, mental health issues, and even become offenders themselves. This negative sentiment highlights the serious societal costs associated with online abuse of children. The related SDGs are 3.4, which promotes mental health and well-being, and 5.2, which aims to eliminate all forms of violence against women and girls.

Education and awareness are seen as crucial factors in addressing online safety for children. The positive sentiment is observed in the argument that prioritising education and awareness regarding internet safety is essential. Programmes and initiatives aimed at parents and young people demonstrate the commitment to promoting safety. The related SDG is 4.7, which focuses on education for sustainable development and global citizenship.

The inadequacy of age verification on online platforms is highlighted, with a negative sentiment towards platform responsibility. The argument is that platforms need to improve age verification, as even eight and nine-year-olds are reporting cyberbullying. It is emphasised that young children lack the cognitive ability to handle risks on such platforms. The related SDG is 16.2, which aims to end abuse, exploitation, trafficking, and violence against children.

The importance of developing technology with human safety, particularly children, as a core consideration is emphasised. The positive sentiment is expressed in the argument that the welfare of children should be considered from the beginning of technology development. Anticipating and mitigating risks is crucial to ensure their safety. The related SDGs are 9.5, which promotes enhancing scientific research and technological capabilities, and 16.2, which aims to end abuse, exploitation, trafficking, and violence against children.

The effectiveness of self-regulation in dealing with cyberbullying and image-based abuse is questioned, expressing a negative sentiment. It is argued that self-regulation is no longer sufficient; by contrast, formal regulatory removal schemes are cited as achieving roughly a 90% success rate in removing cyberbullying content and image-based abuse. The related SDG is 16, which focuses on peace, justice, and strong institutions.

Cooperation between regulatory bodies and industry is advocated as necessary for prevention, protection, and proactive and systemic change. The positive sentiment is observed in the argument that such cooperation is essential to effectively address the issue. Initiatives and networks have already been established to work together in removing abusive content. The related SDG is 17, which emphasises partnerships for achieving goals.

It is noted that there is no need to start from scratch when building regulatory models for online safety, expressing a positive sentiment. The argument is that localized materials have been developed in multiple languages to ensure wider accessibility, and sharing experiences, including mistakes, can help prevent future harm. The related SDG is 16, which focuses on peace, justice, and strong institutions.

Lastly, it is argued that online safety must be a collective responsibility, reflecting a positive sentiment. The argument emphasizes that no one will be safe until everyone is safe. This highlights the importance of individuals, communities, and organizations working together to ensure online safety for all. The related SDG is 16, which focuses on peace, justice, and strong institutions.

In conclusion, the importance of online safety for children is a pressing issue. The negative sentiment arises from concerns over their vulnerability and the long-term impacts of online abuse. Education and awareness, improved age verification, technology development with child safety in mind, and cooperation between regulatory bodies and industry are crucial for prevention and protection. The limitations of self-regulation are observed, and the need for collective responsibility is emphasized. Addressing these issues is vital to ensure a safer online environment for children.

Audience

During the discussion on the protection of children’s rights, several key points were raised by the speakers. One speaker emphasised the need to draw practical measures to prioritise child rights. This is particularly important in addressing issues such as abuse, exploitation, trafficking, and violence, which are central to SDG 16.2. The speaker highlighted their work at the Elena Institute, a child rights organisation, and their involvement in the Brazilian Coalition to End Violence.

Another speaker emphasised the importance of laws and design in avoiding fragmentation and effectively implementing new ideas. This is crucial in the context of child rights, as effective implementation requires a holistic approach. The speaker did not provide any specific supporting facts for their argument, but the need for coordination and coherence in policy and legislation is broadly recognised in this field.

The discussion also touched upon the need for better cybersecurity strategies and laws to protect online users, especially in African countries. The speaker highlighted the progress made by Ghana in this regard and stressed the importance of addressing cybersecurity in the context of digital inclusion and progress. They suggested gathering best practices and suggestions at both the national level and civil society level to combat issues such as cyberbullying.

There were also concerns expressed about balancing parental supervision tools with a child’s right to information and seeking help. The speakers pointed out the high rates of online abuse in Brazil, and the potential risks of violence coming from within the family, highlighting the need for caution with supervision tools.

The debate over prevention measures, such as sexual education, in conservative countries was mentioned as well. The discussion highlighted the challenges faced in advocating for such strategies, as they can be seen as taboo in conservative countries. The importance of finding practical approaches to deal with child abuse and exploitation, while considering cultural and social contexts, was emphasised.

In conclusion, the discussion emphasised the importance of practical approaches in safeguarding children’s rights. It called for the development of effective strategies and laws to address issues such as abuse, exploitation, and violence in both physical and online contexts. It highlighted the need for coordination, coherence, and best practices at multiple levels, including national and civil society. The debate also shed light on the challenges of balancing parental supervision tools with a child’s right to information and the difficulties in advocating for prevention strategies in conservative countries. Overall, the discussion underscored the need for comprehensive and contextually sensitive approaches to protect and promote children’s rights.

Ananya Singh

The USAID Digital Youth Council plays a crucial role in involving youth in digital development. The council has been created by USAID to ensure that the voices of young people are incorporated in the implementation of their digital strategy. They provide a platform for youth to have their voices heard and influence the development strategies. This initiative is aligned with SDG 4: Quality Education and SDG 8: Decent Work and Economic Growth.

The speaker, who is part of the USAID Digital Youth Council, actively works towards providing the platform for youth to have their voices heard and influence development strategies. This highlights the importance of giving young people a voice in shaping digital development. The sentiment is positive towards this argument, as it recognises the need for youth to have a platform to be heard.

Furthermore, the council has been instrumental in guiding the implementation of the USAID’s digital strategy and raising awareness about digital harms. They have co-created sessions on emerging technologies, which indicates their active involvement in shaping the digital landscape. This is in line with SDG 9: Industry, Innovation, and Infrastructure, and SDG 17: Partnerships for the Goals.

Moreover, the council members have designed apps to educate young people about digital harms, showcasing their creativity and commitment to addressing challenges in the digital world. This demonstrates the council’s dedication to empowering young people and equipping them with the necessary knowledge to navigate the digital space safely.

Involving youth in decision-making processes has been found beneficial, and the Digital Youth Council exemplifies this. Ananya Singh, a member of the council, shared the stage with the USAID Administrator and U.S. Congress representatives, indicating the recognition and importance given to the council’s involvement. Additionally, young council members were involved in planning and speaking at multiple USAID sessions at the Global Digital Development Forum, further highlighting their active participation in decision-making processes. This aligns with SDG 16: Peace, Justice, and Strong Institutions.

Overall, the Digital Youth Council’s work has been a success story in empowering youth and promoting digital engagement. By providing a platform for young people’s voices to be heard, guiding the implementation of digital strategies, raising awareness about digital harms, and actively participating in decision-making processes, the council is contributing to the advancement of SDGs and ensuring that youth are active and equal partners in digital development.

Session transcript

Marija Manojlovic:
Welcome, everybody. Welcome to people in the room around this huge roundtable at the end of the day. My name is Maria Manojlovic, and I’m director of SAFE Online. This event is called SAFE Digital Futures, Aligning the Global Agendas. I want to welcome participants as well, and if you’re joining us online, please, we have an online moderator, my colleague, Natalie Shroop, so please drop in the chat quickly where you’re joining us from and feel free to drop in the questions throughout the session. We will be monitoring the chat and making sure that we can respond to your questions. As I said, my name is Maria. I lead the work on SAFE Online as part of the End Violence Global Partnership. We are the only global fund focused on the safety of children issues online. We fund system strengthening across different sectors. We fund research and data, and we fund technology tools that are looking into tackling harms and risks to children in digital environments. So far, we have invested around $100 million in over 100 projects, making impact in over 85 countries. So this work, we interact with a wide variety of players and stakeholders from various sectors and fields of engagement. We interact with governments, private sector, and industry. We work with child protection organizations, with civil society organizations, as well as industry and academia. And through that engagement, we have realized one thing, and this is the reason why we have organized this session today on the alignment of various digital agendas, which is that there is alarming level of fragmentation in this ecosystem, which is truly hampering progress in many aspects. But in particular, when it comes to safety of children, children are too often left aside and not even considered in the discussion of digital governance and development. So some of the common reactions when you work in this field are the following. We have had literally people turn their backs on us when we mentioned children in our interactions in relation to online issues. They would say, well, we work on infrastructure or protocols, or we work on connectivity or access, but we really don’t deal with kids. Our engagement is focused only on women and girls. Sorry, we can’t really speak about kids more broadly. Or we just work on human rights more broadly, but kids are not really part of that. Or we really just care about education. Education is really critical in access, but safety is not that critical for what we do. But somehow placing children and safety in the overall global agenda on digital development and human rights has been particularly hard. Then there is the famous privacy and safety dichotomy, the tension between how do you assure privacy of users at the same time ensuring that users can. And not only you, I mean, I actually hate the term users. It’s humans and people. Like users, it’s not some other category of creatures roaming around. So when we think about privacy and safety, we need to be thinking more broadly about how do they interact at the level of humans. But when you speak about prevention and response to online child sexual exploitation and abuse, that’s even more harsh distinction. So you will find, if you find yourself at the end of the spectrum who cares about online safety, you will end up being accused of various things. Like the latest really fancy thing that we’ve been accused of is that we are trying to end privacy online, which sounds really cool. But it’s just, it’s unbelievable. So we believe that this dichotomy is really false. 
And we believe that more nuanced conversations are needed. We believe that we should not be forced into choosing which one matters more. And when we know that we can and should have both. So as we were discussing this yesterday with Maria Ressa and Justina Arden was saying, you can and should have both. And this should not be a matter of choice that we should be making. And sometimes having much more deep and upstream discussions is going to be needed for us to be making some nuanced and meaningful contributions in that regard. So now that I can vent and complain a lot, let me be more positive. What are the causes of this misalignment? And I believe there are a few things that we can think about. But in order to advance the state of the internet, which is beneficial for humans, and in order to maximize the benefits of digital technologies, we have to invest efforts to understand where these misalignments come from and how we can overcome them so that we are in fact more aligned and more impactful. That is why I believe that the most important, this is the most important discussion that we can have. And in order to do that, I want to ask you to do a couple of things. First one is let’s move more upstream. Instead of focusing on manifestations of the issues in these various fields, like technology facilitated at GBV or gender-based violence, lack of connectivity and access, cyber crime, and so on, to actually upstream design infrastructure and policy choices that enable these things. We are repeatedly seeing that the driving forces and engagement techniques behind radicalization, violent extremism, political extremism, misogyny, child sexual exploitation and abuse are very similar, both from societal and norms perspective, as well as from the technological point of view, in terms of how the design choices of digital platforms are enabling these phenomena and how not only they’re enabling them, making them worse and exacerbating them. Second thing that I want to ask you to do today is share learnings and failings openly, not only what has worked and succeeded, but what has not in your previous engagements, so we can do better. You will hear a lot from our speakers today about that, but also speak about solutions and approaches that work across the landscape. How to engage with governments, how do you create political will, what will it take to do that, how do you engage with industry and create incentive for more action, accountability and transparency will be critical. So today we have brought a lot of speakers, eight speakers and experts from various fields to help us frame this discussion. We will not make them speak all at once, so don’t be scared. I will introduce them throughout the session, but we want to make the session as interactive as possible, so we’re going to split the session in three segments. We’re going to have speakers introduce for five minutes their catalyzing, igniting remarks, and then we will open the floor for discussion. We have asked people, and again asking people to please come to the table if you want to join us at the roundtable, but also people online, please drop in your questions in the chat and we’re going to be making sure that you can participate. For all of those online, yes, please get engaged. And finally, there’s a huge diversity of perspectives and expertise in the room, so please be respectful when you speak, and this is a safe space for people to express their opinions. 
For people who are new to this field of online child sexual exploitation and abuse, there may be some sensitive things said here and some triggering facts, so we’re just giving some warning to you. Take care of yourself. If people need to step out, please step out at some point. And then again, be mindful that we all want to speak, so please keep your remarks concise and focused. And with that, let’s dive in. So, the first segment we’re going to talk about today, we have labeled cybersecurity and online safety, but again, these are such fluid agendas, and you will see how we are going to try to unpack all of that. I will kick off with my question to Ambassador for Digital Affairs, Henri Verdier from France, and what we want to do is see how various agendas around cybersecurity and online safety interact around issues of child online safety. So, Henri, in many ways, we spoke this morning as well, but you basically sit at the intersection of this issue that you want to discuss today. You are someone who has worked in private sector, public sector. You have worked in academia. You have worked on digital commons. You have worked on counterterrorism. You have been one of the instrumental people leading on the Christchurch call from French side, but also on the Paris call for cybersecurity. And most recently, and that’s how we started interacting, you’ve been leading the work on child online protection lab. So, as somebody who is wearing literally like 15 hats, can you tell us a little bit more about, from the global perspective, but also French perspective, how do you see all of these issues aligning, and what have you learned throughout these engagements, and what are the opportunities and challenges around these potential efforts to make these things more aligned? Over to you.

Henri Verdier:
Wow, in five minutes. Thank you very much for the invitation and for the opportunity to exchange with such a panel. Yes, as you said, we try, like our friends of the US, for example, to build a global and coherent digital diplomacy, because everything is interconnected and you can start from cybersecurity or education or else, at the end of the day, you have to to be coherent. And since I’m the first speaker, probably you, all of you will say the same, but let’s recognize that internet is something great, even for children, that they have access to more knowledge, more entertainment, more communities, more empowerment than ever, and that something is disbalanced now. So first, we have some troubles with the cyberspace itself, with the dark web, which is a very efficient tool for criminal activities. We did commoditize a lot of things, like payment or taking a room, which is very efficient for a lot of businesses, even for crime business. We have big companies that are very, very big, monopolistic, and why not, but sometimes they have unexpected negative externalities from their business model. And for example, we can observe filter bubble or echo chambers or radicalization. And regarding all of this, we have to find solutions that does respect the promises of internet. That’s the first point. And for this, yes, I was thinking this morning during another panel, 30 years ago, John Perry Barlow did write the Declaration of Independence of Cyberspace, because at this time, we could consider cyberspace as something external to the society. There were a place somewhere named cyberspace. Today, we could say cyberspace did eat the world. So it did contaminate and transform everything. And so to start to answer to your question, we have two principles for diplomacy. First, everything that is forbidden in our life should be forbidden online. And everything that is guaranteed in our life should be guaranteed online. So freedom of speech. So we have to forbid what is forbidden and to protect what should be protected. That seems very simple, but we all remember how difficult it was to implement. For example, when I go to New York to discuss in the UN about international law in the cyberspace, we are speaking about few very simple laws, like Geneva Convention. But we did discuss during 25 years to be sure that we do understand in the same way how we will apply the UN Charter or the Geneva Convention within a conflict, which is just one topic. So this idea that seems simple is not so simple to implement. But the second thing is that we, government, we didn’t build this system. We don’t understand how it does work. I’m an entrepreneur. I did create three internet companies, small, but I understand, but I don’t build it. I did never seen the algorithms themselves. I don’t access to the source code. So the companies and of course, civil society and researchers, but the companies has to be part of the solution. So we need not even a multi-stakeholder approach, but an efficient multi-stakeholder approach, which cannot just be a room where we discuss politely. We need to put the pressure, we need to ask for results. We need to, everyone in the room has responsibilities and prerogatives, and sometimes a business model or mandates, but we don’t have any other way. We need to be sure that we will find the solution all together and that the companies will contribute to find the solutions. Here I’m speaking generally about terrorism, harassment, gender balance, and child protection. 
If we go to child protection, what I did learn in this journey is that this is a very difficult topic, maybe one of the most difficult. First, as you say, this is very difficult to engage the conversation on those issues. People don’t want to recognize that, I don’t know, in France, for example, 25% of children less than 10 years did accede to pornographic content. That’s a big problem and we know that 20% of the adults did, were a victim of some kind of sexual offense in their life. So that’s one person out of five. So people don’t really want to recognize this because it would conduce them to change a lot of things and we could recognize that there is much less money in this field than, for example, in the fight against terrorist content. Regarding terrorist content, you have strong organization, you have a lot of technologies, you have a lot of monies. Or let’s mention this, if you want to, all of us, we could try, if you try to publish a small part of a Hollywood movie online, in 10 minutes it will be removed because Hollywood did finance solutions to detect this and to intervene very fast, very quickly. So this is a weaker field with less money and, I’m too long? Okay, I finished, with a wide range of issues and that’s the second thing, because everyone agreed to protect children, but here we can speak about strong and heavy criminality like human trafficking or whatever you can imagine, child abuse, but you can go to harassment, online harassment, so something lawful but harmful. But you can even speak about the consequences of some algorithms regarding the way you observe your body, for example, and the connection between some over-representation of some pictures and anorexia, and we should pay attention to this. So this is a wide range of topics, very impressive, not the same level of heaviness, if I may. So I will conclude and we’ll continue, but you did ask for a project, something more positive. As you know very well, we are trying to launch the Child Online Protection Lab. The idea here is to build evidences all together, in a cooperation spirit between companies, civil society organizations, research and governments, because I feel that one part of the issue, this is a very ideological conversation, everyone we should make this or this, and no one tests, no one experiments, no one shares the results. So for example, and I finish with this, if we just speak about age verification, so which should be normal, you should be able to test the age of someone pretending to go to a pornographic website, but you have dozens of approaches and some of them are better for privacy, others are more efficient, others are centralized, decentralized, etc. So we need to see the details and to test and to implement and to share the results. That’s one approach France will encourage deeply during the next years. Thank you. Thank you, Henri, and I like really

Marija Manojlovic:
the focus that you put on evidence and data that can really help us bridge these debates, but also bring back home the actual work on solutions, not only at the level of principles, and I think that’s something that we can also jointly think about as one of the ecosystem pushes to be more evidence-focused and more data-informed in our discussions and experiment more cross-sectorally as well. So thank you for that. So moving on from this kind of a very global and an interesting initial intervention, I would like to move to Dr. Albert from Ghana. Dr. Albert is the Secretary General of the Cybersecurity Authority of Ghana, and Ghana is really unique in many ways, but one of the ways that we really are particularly interested in is that it is a unique example in the world where issues of child protection have been streamlined fully into the work on cyber security at the national level. You are director of the Cybersecurity Authority and you have been at the center of those developments. Can you tell me a little bit more about what were the key factors leading to this outcome? The fact that you’ve managed to institutionalize child protection as part of the cybersecurity work, what was the political will, the ripeness of the issue, public attention, institutional setting, legislation, what were kind of the key driving forces behind that? In five minutes. Thank you.

Albert Antwi Boasiako:
Oh, certainly. Thank you. First of all, a pleasant afternoon to my colleagues here, participants, but also our colleagues who have joined virtually. Maria, I want to thank you on behalf of my government for the invitation, but not just that. The support your institution and violence against children has rendered to us over the last few years. You’re right. I think I’ve been around for a while. For the past six years, I’ve been leading Ghana’s cybersecurity development as a national advisor and then as director general of the agency responsible for cybersecurity development. You are right. A number of competing factors there. There’s a national security interest. Of course, the issue of terrorism, cyber terrorism comes up. There’s a private sector interest issue of protecting critical information infrastructure, the intelligence aspect of cyber, the civil society, academic part, but you can’t take away the critical concerns around children. I think at a national level, one always expects to have a 360 degree about some of this development. I think we’ve achieved some successes when we started this process. Ghana’s cybersecurity readiness according to the IT ranking was around 32.6%. That was the middle of 2017. At the end of 2020, Ghana’s rating jumped to 86.6, basically university cycle from F to A. A number of things have been done. Permit me to highlight some of this. I think one approach we also adopted, of course, the political commitment is key. I keep on telling my colleagues, I’m lucky to be running Ghana’s cybersecurity because I have the support of my government. My minister runs when she’s presented with a sound policy or personal matter. We don’t delay. Of course, my government is the president is committed to that. I think within the past six years, it’s been quite a sighted journey, notwithstanding the challenges, including financial challenges. We had a unique approach in terms of based on this approach on data. Research is key. We had to conduct research for this process to reference. We work with UNICEF in 2017 to look at opportunities, but also risks of children and interesting dynamics. On one hand, which you all know, there’s a trend in which children are using internet and devices. Consequently, we established that four out of 10 children had also had contact within a part with sexual content. On one hand, you have a positive development with respect to opportunities for children, even on seven underserved communities using the internet. On the other hand, you also have this disturbing trend in which they are always coming into contact with content that certainly had a potential to impact on their well-being. We also had to do a research with World Bank support, with Oxford University, the cyber security capacity maturity model, which also highlighted the gaps around the protection of children. This research led to a number of interventions. The first one was legislation, we’re going to pass a cyber security act that incorporates child online protection as a whole division within my authority, but also the law criminalizes certain sexual offenses. We’re lucky to tackle what has become the sextortion quickly within our law. It has had a lot of positive impacts afterwards. Of course, awareness creation also was put into legislation to make it mandatory for the states to lead that process. That is one aspect of the institutionalization of child online protection, but we had to also look at policy aspect of things. 
We developed a child online protection framework, which incorporated a number of best practices, including the WeProtect framework, but also the ITU guidelines that were provided to us; they're pretty important. As part of the institutionalization, as I've mentioned, my agency has a division for child online protection headed by a director, a very senior person. It's not just that. Through the work we did with UNICEF, and with support from my agency, we established a child online protection forensic lab, which was the first one in the sub-region. It was set up to help investigative bodies with forensic evidence to support the work, because deterrence is key. The criminal justice response is also one of the areas we needed to look at as part of the national response mechanism. Most importantly, and this is where I draw a lot of inspiration from, is the institutional arrangement. Certainly, in my experience, somebody needs to lead. You need a champion, but you need to carry people along. Different agencies have a responsibility: the gender ministry, the education ministry, civil society, academia, the telecommunication service providers. We needed to bring all these actors together. I think anybody who has visited us has seen we've achieved a lot of success. There's a consensus at the table on a way forward to be able to address child online protection. The last two areas where we've also achieved success are, first, incorporating awareness creation around those risk areas and around children into our national programme. Ghana launched what we call a Safer Digital campaign. We came out with four pillars: government, business, the public, but also, specifically, a dedication to children. This has been institutionalized. You don't treat awareness creation around the risks that children are facing as a sub-theme. No. That's one area where we've achieved a lot of success, going through schools across the whole country to raise awareness, in collaboration with UNICEF. The last one is reporting. We need to empower the public, including children, to report. In Ghana, when you call 292, it's free. You can call that on a smart device or any other device and you can report incidents. We've been lucky. This is just to conclude. Initially, when we set up this national hotline, we thought it was going to receive only incidents; in other words, that citizens and children were going to report only after they had been affected. No, that has changed. It's becoming more like a tool for them to seek guidance. When somebody tells them, send me your photo, or click on this link, they're able to call us. We advise and encourage them: please call 292, it's free, don't pay anything, 24 hours a day, and at least conduct some minimum due diligence first. Personally, as a public servant, that has been the most important deliverable, a service that is good for the public. I really want to recommend that we look at those options as a best practice. Of course, there have been challenges. I wish we could speed up awareness creation. Ghana is big, 32 million people. I don't think I've been able to achieve even 20% of my awareness creation mandate. I feel very uncomfortable; there's a huge gap. We need to scale up our efforts, and the needs are there. Thank you.

Marija Manojlovic:
Thank you so much, Albert. That was really good. I do want to say, this morning when we talked about the work in Ghana, one thing that really struck me was how eloquently you described your strategic intent behind immediately legislating, because you wanted to remove the uncertainty around: there is political will now, but there may not be in the next political cycle. How do you immediately ensure that you institutionalize this thing, and also create incentives for ecosystem ownership? Not only that it's you who has the political will and leads on these things, but make everybody take a bit of responsibility and accountability for it, so that ecosystem responsibility is shared. Thank you for that. Julie, I want to come back to you now. Your work is globally known. You lead the first governmental regulator, an independent agency, focused on online safety. You've done tremendous work, both for Australians but also for the global population. We use your resources all the time, and they're always of the highest quality possible. eSafety is a regulator, but it's also an agency that works on prevention of various forms of crime. You have a wide range of powers and functions that you try to apply really comprehensively. What is interesting about your agency is that many people don't know it started as being focused only on children. It kind of went from children to encompass everything else. It's really great to have you here to give us a sense of how, because it was centered around kids, you see kids' issues now being embedded in this broader risks and harms ecosystem, and what are some challenges and opportunities for us to make that, as you did, part of a joined-up effort? Over to you. That is a great and very hard question to encapsulate

Julie Inman Grant:
in five minutes, but really it was actually a political decision that it would be focused on children initially. There was a well-known media personality who was open about her mental health struggles. She had a nervous breakdown. She was very active on Twitter. I was interviewing for a role with Twitter to start their trust and safety and public policy roles across Southeast Asia, Australia, and New Zealand. She tragically ended up taking her life. It became known as the Twitter suicide, and a petition started to government that just said: government, you need to step in and do something. This was in 2014. Because of concerns about freedom of expression, the ICT minister at the time, Malcolm Turnbull, who became the prime minister, said, we're going to start small with children's e-safety, because nobody can argue that children aren't more vulnerable than adults. We took a bunch of functions from across the government and put them into the Children's eSafety Commissioner. That included being Australia's hotline for child sexual abuse material and taking reports on terrorist content, but we also set up the world's first youth cyberbullying scheme, where we serve as a safety net when the platforms fail or miss cultural context, and the seriously harassing, intimidating, humiliating content targeting children doesn't come down. When I took the role in January 2017, I was asked to set up the revenge porn portal. I said, no, I'm not going to call it revenge porn. Let's call it what it is, image-based abuse, for everyone. That's how that started, but I think it's really important to know that we take a vulnerability lens to everything we do, and nobody can argue, again, that children aren't the most vulnerable cohort online, because the internet clearly was not made for children, although children make up one-third of global internet users. And young people today don't differentiate between their online and offline lives. It is their playground. It is their school room. It is their friendship circle. All that said, we had stunning national research, the Australian Child Maltreatment Study, which found that 28.5% of Australians have experienced sexual abuse by the time they're 18. That's more than one in four. So we need to think beyond it as an online issue or an individual issue. We take down content because it's retraumatizing, but there are comorbidities that follow a child throughout their entire life: they're more likely to experience sexual assault later in life, to be in family and domestic violence situations, to have drug and alcohol dependencies, to have serious mental health issues and suicidal ideation, and also to become sex offenders themselves. So we need to think about this in terms of the long-term societal costs as well. And did you know that our Canadian counterparts found that 20% of survivors are recognized on the street for the child sexual abuse series they've been seen in? So you can imagine how traumatizing that is. So when we have that debate about adults' privacy versus a child's dignity or a child's right to be free from online violence, I think: what about a child's right to privacy when they're being tortured and abused? We need to really rethink how we rebalance this. So what have we done, in just three broad areas, to address this? We have these complaint schemes where we're doing trends analysis all the time.
So we know that kids are actually coming to us younger to report cyberbullying, because kids are on TikTok and Instagram and Snap at eight or nine, so now we're getting reports of cyberbullying of kids that age. And this goes back to Henri's comment about age verification. We need the platforms doing a better job. Eight and nine-year-olds have no business being on these platforms; they don't have the cognitive ability to deal with this. So we do the fundamental research. We've got the programs. We know that 94% of Australian children have access to a digital device by the time they're four years old. So parents need to be the front line of defense. We've got a program for parents of under-fives: to be safe, to be kind, to make good choices, and to ask for help. And then when they get into the primary years, it's about the four Rs of the digital age: respect, responsibility, digital resilience, and critical reasoning skills. We have youth advisory committees so that we can hear from young people about what is going to work for them. So we have them running our scroll campaign, so it's authentic and it's resonating. We have them writing letters to big tech saying: this is what we want from you. We want you to take abuse seriously. We want you to take action. We are your future customers, users, humans. But then we also have systemic and process powers, where we're compelling more transparency from the major platforms on what they're actually doing to address child sexual exploitation, sexual extortion, and harmful algorithms. And next week we'll have a major announcement and enforcement action; we'll be holding five more companies to account in this area. So the more we can shine light on what is and isn't happening, the more we can push safety standards up. And that goes to the whole idea of safety by design as well. Again, we can't have safety be an afterthought, the welfare of children be an afterthought. We really need to revolutionize the way technology is developed, with humans and safety at the core, not after the damage has been done. We need to get ahead of technology changes so that we're anticipating the risks. We're never going to get a hold of generative AI if we're not focusing scrutiny on how the data is chosen and how it's trained. And if we wait until it's released out into the wild, we're going to be playing a huge game of whack-a-mole, or whack-a-troll, as I like to say. There we go.

Marija Manojlovic:
Thank you, Julie. And thank you for always grounding us back in the research and data that you collect, and for always trying to think in terms of long-term engagement. Because by engaging with kids as young as zero to five, we are building the foundation for healthier engagement later on, and from a prevention lens that's really critical, because we are seeing that perpetration is also starting earlier and earlier, yet we keep engaging with just a certain group of kids, namely adolescents, with no engagement with younger generations. So thank you for that. I know that you and Dr. Albert will need to leave at some point, so I'm just going to give that heads-up to people. But with that, I'm going to give the floor to anybody who wants to ask questions at this point, after the first round of interventions. If there are any questions or comments, please raise your hand now and we can pass you the mic. Or if there is anything coming in online, do you want to… So is there anybody in the room who has… Oh, there it is. There is one. I think you can use the mic over there. Yes, thank you.

Audience:
Hello, I'm Ana from Brazil. I work at Instituto Alana; it is a child rights organization. And we are part of the coordination team of the Brazilian Coalition to End Violence. I would like to hear your thoughts based on INSPIRE: how can we draw practical measures to think about the priorities in this area? Is it the law, is it the design, or how can we think about standards to avoid fragmentation and to implement all these new ideas that you were talking about?

Marija Manojlovic:
I’m looking at Julie, but anybody can pick up the mic, please.

Julie Inman Grant:
I think we all will probably agree that self-regulation is no longer enough. And this sounds strange coming from a regulator. I don’t think purely regulation is going to be enough either. And that’s why we have this 3P model with prevention, protection, and what I call proactive and systemic change. And that does mean working cooperatively with the industry to achieve outcomes. We have a 90% success rate in terms of getting cyberbullying content and image-based abuse down because we work informally and cooperatively with the networks. And that is the way we get that content taken down more quickly. To sort of solve this issue as more governments are thinking about how they set up either their own independent regulatory authorities or how they start small if they don’t have the political will, we’ve started the Global Online Safety Regulators Network. We now have six members of the network. I’m going to be calling you Dr. Albert soon. But we also have observers who don’t yet have independent regulators who can learn from these models. But please go to esafety.gov.au. We have a strategy. We’re trying to do as much capability and capacity building as we can. We were the only ones for a while doing it, trying to write the playbook as we go along. And we’ve made a lot of mistakes. We’re happy to share those as well. But I don’t feel like anybody needs to start from scratch. Even if it means we’ve localized a lot of our materials into multiple languages, take it, use it, localize it in a way that works for you. We’ve got to be in this together. None of us are safe until all of us are safe.

Marija Manojlovic:
Thank you so much, Julie. And Dr. Albert, do you want to? Just a quick one. I felt the sentiment of my Brazilian colleague, especially when she used the word fragmentation. That’s the reality. She’s speaking from a context. And I think I saw this when I was first appointed.

Albert Antwi Boasiako:
Again, it is a problem, because, you see, this is what I call ad hoc. Ad hoc happens even in the non-governmental space; ad hoc activities are happening in government settings. I think that recognition is key for an effective response. So institutionalization essentially means you are taking systematic measures. That is what Ghana did, and I keep drawing on my own experience from the development context, the developmental context; again, there are a lot of things that come from our context. You start by putting the necessary structures in place. A champion is key. You need to have someone with the drive to bring all this together. In Ghana, I identified, among the civil society institutions, one that was quite active and very respected, and we used it. So we had the CSA and that institution, and we mobilized others around them. In government, we had to carry the gender ministry, children, and education along. That was deliberate, intentional. Other countries haven't succeeded; I will mention that it's a struggle. And power concentration, and I'm saying this from the developmental context, is real. Without being conscious of this and identifying what I call champions in all sectors, it's likely to be a little bit problematic. You may still have the law, and some of my Western colleagues are sometimes surprised: you have this law, it's been there, but nothing is working. In my context, the law is good, but frankly, getting people even to sit at the table can be a challenge. And that is why, regarding the Brazilian situation, I really felt that picking the right champions matters. Of course, the child online protection ecosystem is a collection of different players, and I think the first step is to look at those who are quite active, who are respected within the ecosystem, and who are able to mobilize others; they will drive it. Thank you.

Marija Manojlovic:
Thank you, Dr. Albert. And just one more note for Brazil: I know that you're a member of the WeProtect Global Alliance as well. The Model National Response is one of the frameworks that you can use to start thinking about and charting the different areas of engagement and how that needs to happen. But again, we'll be happy to chat with you afterwards as well. I will excuse Dr. Albert and Julie, who need to go to the next session. Yes, I know. Can I make a brief comment? Ambassador, sorry. Thank you so much. My name is… Sorry, I'll make a brief comment. I want to make this in front of you.

Henri Verdier:
So first, let's recall that a large part of the issues we are speaking about is not on social networks. On the dark web, for example, if you want to buy a real-time video of the rape of a baby, because that does occur, it's not on commercial platforms; it's on the dark web. And there we need more police, more investment, more international cooperation. But this is not about company regulation; this is about fighting crime. Regarding company regulation, I understand that it would be better to have a world with one common set of rules. But this is not what I call fragmentation. The fragmentation of the internet is a fragmentation of the deep technical layer; we have to fight that. But we are democracies; we have the right to have our own rules, or we are not democracies anymore. And we are not here to build a single market for five companies. And I want to say there is another fragmentation that is very important: the fragmentation of investment in trust and safety. Because most of those companies, and we can understand why they do this, invest according to their sales. So they invest a lot in big markets and much less in small markets, and especially in Africa, for example, they don't invest a lot. And we should ask them, and we could do this within the framework of the UN, to equalize the investment a bit, and to take a small part of the investment in Europe or the US and invest it in Africa, for example, or Brazil, why not?

Marija Manojlovic:
Thank you. Thank you.

Audience:
Maybe first in the room, and then we can do the online. But we will also need to move to the next segment. But go ahead. Okay, thank you. My name is Peter King. I'm from Liberia. I would like to thank the Cybersecurity Authority boss from Ghana. The question is an open question, but I would like him to help with suggestions that can serve as best practice for other countries like Liberia, who are struggling to put in place cybersecurity strategies and laws to protect people online. What suggestions can he offer to African countries that are not at the level Ghana has reached, in terms of the tools used to create awareness on cybersecurity issues? The reason is that when we look at inclusion and the level of progress, I'm thinking about uniformity. I just want him, and other panel members, to share some of these best practices or advice on how to actually streamline online protection and cyberbullying responses in our African context. The European context may be different: maybe a four-year-old in Europe has an idea of how to use a mobile phone, while in our African context he doesn't even know what it is. Can we look at these dynamics, and what are the best practices and suggestions at the national level, civil society tools that can be used, and also perhaps at the level of the security sector, to combat these kinds of issues? Thank you so much. With your permission, Dr. Albert, I will put you in touch with…

Albert Antwi Boasiako:
In fact, I brought a card, and just a quick one, because I'm being moved to another session. The good thing is we are in touch with Liberia. A number of African countries have messages for us, and they keep coming. We share the little we've achieved; I think we're sharing within the region. The only problem I've seen within the region is fragmentation: you have one ministry visiting you, while others are left out. That's why I was stressing the Brazilian situation. So there has been contact with the body in Liberia. Unfortunately, we haven't been able to really integrate the structures. But please, we will discuss. Thank you very much.

Marija Manojlovic:
Thank you so much. Nathalie, do you want to move ahead or do you want to ask a question? One brief question from online.

Moderator:
So from Omar Farouk, a 17-year-old from Bangladesh, passionate about child safety online. Started Project Omna and working with local and international organizations like UNICEF Bangladesh, UN Tech Envoy, to tackle digital issues like cybersecurity, bullying, and privacy, not only in their country but globally. Given the rise of cyberbullying and privacy concerns for children, how can we strike a balance between protecting kids online and fostering innovation and economic growth in digital space? What strategies can be developed to create strong partnerships between government, businesses, civil society, ensuring child safety is top priority? So just perhaps speaking to that balance between the economic growth and innovation piece along with child safety. So perhaps, Ambassador, if you don’t mind speaking a little bit to that.

Marija Manojlovic:
Ambassador Verdier, do you want to take that one? We're kind of looking at you, and you're trying to avoid the look. That's the eternal question, the big question.

Henri Verdier:
As a former entrepreneur myself, I just want to say two things. First, sometimes if you forbid something, you simply forbid it; that's not a problem of innovation. For example, a century ago when we forbade child labour, the private sector said, ah, we cannot work like this, et cetera. But finally, we all adapted. So that's important: there is not always a contradiction between innovation and regulation, and some regulation can be a tool for innovation; for example, good standardization can be regulation and still be good for innovation. The second thing is that very often people set security, privacy, and innovation against each other. But very often, if we work a bit more, we can find a solution. You have to take into consideration that you are looking for three goals at once, for example security, safety, and innovation, so probably your first idea won't be a good idea. You need to work a bit more, but you can still find solutions. And that's why we need efficient multistakeholder processes, to work all together and find solutions. I could, I won't, but I could share dozens of examples of how we fine-tuned some good balances between all those goals. It was not the first idea, but then we did find solutions. Thank you for that.

Marija Manojlovic:
And thank you for the question from Bangladesh. I think one of the things that neatly ties into the next segment that I want to open now is that sometimes the innovation ecosystem is not inclusive of people who need to be part of it, for various reasons, including safety. Women becoming part of the innovation ecosystem was for a long time not an option, because they just didn't feel welcome in certain environments. So we need to make sure that innovation is not separate from ensuring safety in various environments. Now we want to move to a segment on gender-based violence and image-based abuse. One of the key things that we really want to unpack here is: how can online child safety be better positioned as crucial to inclusive, gender-balanced digitalization? And another thing that we always struggle with is: how can more be done in prevention work to address common narratives and perceptions of these issues grounded in gender norms, and to better center survivors? So with that, I will introduce Cailin. Cailin, you're a senior advisor to the White House Gender Policy Council. You're working on issues of technology-facilitated abuse and harms, and you've been involved in the development of some of the landmark principles, guidance, and coalitions in this space, including the Global Partnership for Action to tackle technology-facilitated GBV. So how do you see the convergence of these various agendas from the White House perspective, and also from the perspective of the drivers of abuse, harassment, and other harms? And how do they intersect with child safety and protection? Huge question, but over to you for five minutes. Thank you so much, Marija, for that question, and to Marija, Natalie, and Safe Online for hosting this critical discussion, which is so important to be having here at the Internet Governance Forum.

Cailin Crockett:
As Marija mentioned, I'm Cailin Crockett. I am a senior advisor with the White House Gender Policy Council. I'm also director for military personnel and defense policy with the National Security Council. And for the past two-plus years, I've coordinated the Biden-Harris administration's efforts to address sexual violence in the military and also to counter online harassment and abuse as a feature of our domestic and foreign policy. These two portfolios might seem quite distinct, but they actually share a lot in common and, I think, speak to the heart of our discussion today. The first and foremost is that all forms of gender-based violence and interpersonal violence across the life course share root causes and common risk and protective factors, perpetuate and are driven by harmful social and gender norms, and are some of the most underreported crimes and abuses, because survivors are too often shamed, silenced, and made to feel invisible. This has certainly been true for survivors of sexual violence in the military, as well as for survivors of child sexual abuse. There are also core values that I think bind together the child online safety agenda with the ongoing work we must all do to promote a safe, secure, and inclusive digital ecosystem for all people, but particularly for women and girls, children, and LGBTQ+ people. This really means three things in particular: accountability and transparency, being survivor-centered with a gender lens, and, of course, prevention. I'm really fortunate to work for an administration led by a president and vice president who have been lifelong champions of addressing gender-based violence and standing with survivors. The administration really understands that the consequences and costs of gender-based violence impact communities in addition to individual survivors, and that the ripple effects of gender-based violence and all forms of abuse are felt across our communities, our economies, and our countries. And it must be said in this conversation that women and girls from marginalized communities, including people of color, LGBTQ+ people, and individuals with disabilities, among others, are disproportionately impacted. It's also important to be clear here that online violence is violence, and it can result in dire consequences for victims, ranging from psychological distress, self-censorship, and decreased participation in political and civic life to economic losses, increased self-harm, suicide, and forms of physical and sexual violence. In his campaign, President Biden made a commitment to convene a national task force to develop recommendations for federal and state governments, technology platforms, schools, and other public and private entities to prevent and address all forms of online harassment and abuse, with a particular focus on technology-facilitated gender-based violence. And in June of 2022, the president issued a memorandum that established a White House task force to address online harassment and abuse, which I've had the fortune to coordinate. This is an interagency effort that I think really speaks to the ecosystem approach that other colleagues have raised. It is co-led by the Gender Policy Council and the National Security Council, and involves many diverse government departments and agencies, from USAID to the Justice Department to Health and Human Services, Homeland Security, and several others.
And the senior representatives across the agencies that comprise the task force have met regularly with justice system practitioners, public health professionals, researchers, advocates, parents, youth, and, importantly as well, partner governments to identify best and promising practices, gather recommendations, and learn from lived experiences to inform a blueprint for action. The initial actions were previewed in an executive summary that the White House released this past March, and they will ultimately be fully captured in a public final report and blueprint of the task force that we are working to compile towards the end of this year. And again, most importantly, we've met with survivors, and especially youth, who shared how experiences of online violence have disrupted their lives, impacting their well-being, their health, relationships, careers, and career aspirations. While each of their stories is unique, they share common threads and lessons that inform the work of the task force, and they have outlined concrete, measurable actions, 60 and counting so far, that federal agencies have committed to in order to address online harassment and abuse. And I know I'm already over time, so I'll just briefly mention the four pillars of the blueprint, which are inherently multisectoral. Those are prevention, survivor support, accountability for both platforms and individual perpetrators of harm, and research. As an administration, we're working truly across the whole of government. We've committed to updating and expanding resources to address gender-based violence online, including child sexual exploitation. For example, the Justice Department is dedicating an unprecedented amount of resources to address cybercrimes that particularly impact women and girls, including image-based sexual abuse. And we've also really recognized the outsized impacts and harms of online harassment and abuse on children and youth, including, in May, the Surgeon General issuing an advisory on youth mental health and social media, which particularly emphasized the intersection of gender-based violence and child sexual exploitation online. So with that, I will look forward to sharing more in the Q&A. Thank you. Thank you so much, Cailin, and thank you and the Biden administration for really

Marija Manojlovic:
taking such a strong lead and position on these issues, because, as everybody has been saying, a majority of the platforms and companies we speak about are based in the U.S., and what the U.S. does is really going to matter for a lot of other people across the world. So we are really looking to you for action on this. Particularly, thank you, and the team, and everybody else in the global partnership, for making sure that we are not siloing the work on child online protection and the issues of gender-based abuse and violence. Unfortunately, Andrea Powell from the Panorama image-based abuse program has not made it in time from the airport, so if she comes, we'll just include her in the discussion, but if not, we will move ahead. I'll just open the floor for one or two quick questions or comments on this, and then, if there are none, we will move ahead with the next segment. I'll wait for a little bit. Natalie, is there anything coming in online, or anybody in the room? Oh, there is. Please come on in.

Audience:
Hello, everyone. First, thanks for the great session. I'm really, really happy to be listening to you all. I'm Emanuela. I'm from Brazil as well. I work at Instituto Alana with child rights, and I have a question about this theme of gender-based violence and also about child abuse and exploitation. In Brazil, we have high rates of abuse that happens online and that happens at home, so I have two questions about this. The first is about parental supervision tools. How can we balance this complicated debate, when we have supervision but we also know that the violence can come from the family, and this could be a risk for a child's right to information and a child's right to seek help? How do we do this in a practical way? That is my first question. The second is that we also have a very conservative country, and when we talk about prevention measures like sexual education, this can be a tough, tough debate that raises a lot of different issues. I would really like to hear your approaches to advocacy on these kinds of prevention strategies, given the taboo that this theme can evoke in more conservative countries. Thank you.

Marija Manojlovic:
Thank you, Emanuela. Cailin, over to you.

Cailin Crockett:
Thank you so much for that question. As many of the experts in this room are aware, the United States is a federalist country, so we have 50 diverse states as well as territories, and a multitude of approaches to these issues has been emerging across the states. From the administration's perspective, we want to be really careful about balancing what you've said and recognizing those concerns, given that parents may not always be able or willing to represent the best interests of their children, and we always want to maximize options and support for survivors of abuse at any age. So I think it's a very timely question, and I think it's really important, in line with your second question, that we take an evidence-informed approach and really focus on prevention as well. One of the areas that we've continued to invest in, through our Centers for Disease Control and Prevention, is taking a public health approach to recognizing the shared causes of violence across the lifespan. We have an analysis the CDC has done called Connecting the Dots that I quite like, because what it really does is connect the dots between multiple forms of interpersonal violence: sexual violence, intimate partner violence, child abuse and neglect, cyberbullying, youth violence, and community-based violence. So that's one area where we've seen promise, but of course, with everything, resources are so important too, and so the voice of civil society demanding that governments invest proportionally in these problems is so critical. So thank you for your work.

Henri Verdier:
A brief comment. Of course, there is a tendency everywhere, including in France, to say that these are questions for woke, decadent, and very liberal people, but actually, no, everything is connected, and I share your point. This is about violence, and maybe I can share two examples. First, within the crisis group, we are now speaking about algorithmic radicalization. I won't mention Israel, which is a different situation, but in the EU, most of the terrorist attacks we have had to face were carried out by very young people, with social networks playing a role in the radicalization process, and most of the attacks we had in Europe were coming from masculinist movements. It was not jihad; it was masculinist people against LGBT people or against others. So all those issues are connected, and if you pretend you can avoid gender balance and gender protection, you will miss a large part of the rest. Thank you so much. This is exactly why this session exists, to make these links and make

Marija Manojlovic:
them really clear in everybody's mind, but also in our ability to create policy and other responses to these phenomena. Andrea has made it. She literally just ran into the room, so she's still in time for her intervention in the same segment. Perfect timing, Andrea. I hope you have had time to take a breath. So, Andrea Powell is the director of the image-based abuse initiative at Panorama Global. In your work, you're building partnerships and mobilizing efforts to ensure that no one experiences the enduring trauma that results from image-based abuse and other types of online harm. We are very proud to be part of this coalition and your work, and have been above all so impressed with how you've ensured that lived experience experts are an essential part of this coalition. What I want to ask you is: what opportunities do you see for better alignment between the IBSA work, the image-based sexual abuse work, and online child protection? And also, from your perspective, what have been some definitional and content-related issues, as well as practical tools and best practices we can build between these two fields? So, over to you. Thank you very much. My apologies for

Andrea Powell:
being late. Very happy to be here with all of you. Again, I'm Andrea Powell, and I am the director for the image-based sexual violence initiative housed at Panorama Global, and we most recently launched a new coalition, the Reclaim Coalition, that brings together over 50 stakeholders from 23 countries, most notably from civil society, academia, law and policy, as well as lived experience experts, often called survivors. What I mean in that context is not just individuals who have endured this ongoing trauma, but individuals who are active in the field of addressing image-based sexual violence. Image-based sexual violence is the act of creating and sharing intimate images without someone's consent. It is a form of online sexual violence and a violation of privacy that disproportionately affects women and girls, LGBTQ+ people, and Indigenous and BIPOC individuals. Anyone who deliberately views, shares, or recreates these non-consensual images is participating in a sex crime whose unique feature is that the abuse lives on long after it's over, growing in magnitude for the whole of the world to see. When non-consensual imagery is shared over text messages or online forums, or posted on social media platforms, it can quickly reach a global audience via uploads to pornographic websites that do not or cannot reliably verify age or consent. Those who are victimized live in a state of constant trauma and fear that their victimization may happen again. Will their parents find out? Will their friends, co-workers, college admissions counselors, or future employers? It is never post-traumatic stress disorder, because the trauma lives on continuously. This type of technology-facilitated gender-based violence is growing in global prevalence. There are over 3,000 websites online that are designed purely to host non-consensual intimate videos and images, reflecting a vast enabling environment that facilitates this form of abuse. And what we know from the survivors in the Reclaim Coalition is that younger and younger children are more and more being exposed to this form of violence. Those who are impacted, whether they are adults or children, frequently experience elevated levels of psychological distress, trauma, extreme and prolonged anxiety, and suicidal ideation. In the early stages of the formation of the coalition, we uncovered over 40 cases of children who had ended their lives as a result of image-based sexual violence, often within 24 hours of their abuse, leaving their parents little to no time to intervene. As a woman who was, as a child, a victim of sexual violence, I chose not to reach out for help. I chose to live in silence, and I never thought that silence was a privilege. Yet the survivors who bravely advocate in the Reclaim Coalition never got a chance to make that choice. Their sexual abuse is there for all the world to see. And thus, this is a global problem, but there is hope and there are global solutions. Many child victims of image-based sexual violence are adults when they discover their victimization. And many survivors who are now adults and are part of the Reclaim Coalition experience repeated abuse online every time they dare to advocate publicly on this issue. As a matter of fact, as I boarded this plane, I was working with an individual who had just come out publicly and had all of her images re-uploaded to a site called Pornhub. This trauma does not stop on their 18th birthday. The very real harms do not go away, and abusers continue to share and re-upload more content.
Leading up to the launch of the Reclaim Coalition, we hosted a private summit with lived experience experts from eight countries. What I thought was going to be an informative, tightly agendaed program became a witnessing session where survivors shared their stories and created the foundation for 17 recommendations that we shared with our colleagues, most notably at the Gender Policy Council, as well as forming the basis for our first landscape report, I Didn't Consent, which centers this issue on privacy and consent in an innovative way that eliminates the questions of why the image was put up there and what the intent of the abuser was. Because in all reality, we don't ask domestic violence victims why their husband hit them. We don't need to ask online survivors of image-based sexual violence why their abuser abused them. I came up with five core areas of intervention where I think we can take lessons from the area of child protection and build on them, to look at this issue not as siloed, intermittent interventions across children's spaces and adult spaces, but as things we can do across those divides. First, we need to build knowledge. The public misunderstands and lacks awareness about online sexual violence. Without knowledge, survivors don't know where to get help, law enforcement doesn't know how to intervene, and, frankly, the public misunderstands and continues to shame victims instead of the abusers. Second, we need to harmonize global regulation and policies. The policies to address both child and adult online sexual violence can and should be more harmonized. This includes removing the barrier of having to prove the intent of the abuser, as well as classifying this as a serious sex crime. In fact, we should ensure that, across the globe, the online sexual violence of children and adults, which most affects women and girls, is taken seriously and given the same serious criminal penalties that offline sexual violence carries. Third, we need to standardize global hotline support, to ensure that hotlines addressing adult image-based sexual abuse are held to the same global standards as those in the child space. There is an allied network that may have been brought up today called the INHOPE network. It is a phenomenal network of, I believe, over 80 hotlines across the globe looking at child online sexual abuse. We need to do this in the adult space as well. Fourth, we need more tech accountability. Those 3,000 websites that I mentioned earlier could simply be de-indexed and go away. So why haven't they been? There needs to be an opportunity for tech to engage proactively with civil society, learning from lived experience experts, and this is quite possible. Fifth, we know that image removal is a critical piece of justice and healing for survivors. It's very difficult to heal if your abuse continues to be posted online, where anyone can Google your name, your address, your school, and learn everything about your exploitation. Image removal should not be different across different platforms and sites, but what we hear from survivors is that they're effectively left to create their own digital rape kit and clean up their own crime scene, and that is an unacceptable standard. In closing, I want to say that we have the will, we have the solutions, and our children depend on us. If we address image-based sexual violence for everyone today, there truly will be fewer child victims tomorrow. Thank you.

Marija Manojlovic:
Thank you so much, Andrea. There is so much I want to pick up on, but there is literally no time, so we're going to just leave you to have conversations after this, mic drop, literally. So, thank you for that. I need to move immediately to the next speakers because there is not enough time; we have only 15 minutes left. What I'm going to move on to now is the digital innovation ecosystem. That was a question we had from online, but we also just want to discuss this entire ecosystem a little more broadly. Salomé, I want to go to you. You come from the German development agency, GIZ, and you are the director of the Digital Transformation Center in Kenya. GIZ is famous for investing and working in the fields of digitalization, innovation, cybersecurity, and skills. I'm very curious to hear from you a bit about the challenges and opportunities of integrating child online safety into this work across all of these areas, whether that has been done successfully so far, and what the plans are for the future?

Salomé Eggler:
four minutes. So sorry. I'll try my best, Marija. Thank you so much for that question. I'll start with two disclaimers. I'm not sitting here as a child online protection expert; I'm sitting here as a practitioner whose goal is to mainstream, to bring child online protection into our activities. And as you were saying, those activities are manifold, right? They range from digitizing government services to working with the tech ecosystem and tech entrepreneurs, from SMEs going digital to working on a transition, and everywhere there are angles around child online protection. And yet, and maybe that's the entry point I'll take, we have a twofold approach in GIZ for how we try to work around this question. The first part of the approach is really that mainstreaming idea, and I like to use the image of a braid. When you braid hair, we ideally want to braid in child online safety measures and considerations from the onset of a project, and not, and I'll say we at GIZ have also been guilty of that in the past, add it as a bow at the end of your braid; you really have it by design in all of our activities. The second part of the approach is genuine child online safety projects that not only integrate and mainstream the topic into other activities, but specifically try to tackle a certain topic. For instance, one of our activities, created jointly with children, for children, is a set of online training nuggets where children aged 10 to 15 can explore how to navigate the online world safely in a very easygoing way. This is one of the aspects where we try to have that initiative and these trainings available in up to 10 languages by now, including in Kiswahili, for instance, for Kenya. We saw how important it is to translate all these phenomenal tools that we have by now, by ITU, by UNICEF, et cetera, into other languages as well, to make them accessible to all the children and youth growing up around the world. So that's the twofold approach that we are pursuing with GIZ at the moment. And now maybe I'll come to your questions around the challenges and opportunities I see. What struck me in your introduction was your point where you said: you talk to someone working on infrastructure, and they say, oh, that's not about children; you talk to someone working on digital skills, and they say, oh, no, no, that's not really what we're doing. Reflecting on that, within GIZ it might be a slightly different variation. I would rather call it an attention economy, where I, as a practitioner on the ground, am interacting with my colleagues who work in the child protection unit, and they tell me how important it is that we mainstream these activities and considerations and safeguards, et cetera. And I see the importance. At the same time, I talk to my colleagues in the gender department, in the climate change department, in the disability and inclusion department, et cetera. So in the end, I know that all these considerations are so important, and yet my reality on the ground is that I have a highly political, highly dynamic environment, I have technological debates that evolve very quickly, and I have a limited budget and limited time. And often, and maybe that's a lesson learned for myself as well, I end up going with those who scream the loudest.
Or, and maybe that's the positive side, and maybe what has been helpful for me, is where I have resources that are easy to use off the shelf, and I don't have to become half a child protection expert in order to implement these activities; there are tools that I can really take, use, and implement. That has been really helpful for some of the activities we have been able to do, for instance on data protection: take these tools and apply them without having to become an expert in the topic itself, because in the end we're working at a very interdisciplinary level. So that's maybe one of the challenges slash opportunities that I see. And, okay, then I'll come to the opportunity; that was the challenge. Let me talk about an opportunity that I see as well: development agencies, financial assistance organizations, et cetera, are setting up bigger and bigger digitalization projects. When I started at GIZ, we had a 3 million project, and that was it. And now, I don't even know how much; the Digital Transformation Center in Kenya alone is a 30 million project, right? So we are getting bigger and bigger, and I think it's an opportunity for us to mainstream, in our own frameworks and our own tools, ways to include these considerations. And something, maybe a best practice, that I could mention here: we've developed what we call the digital rights check. It's an online tool. It takes you 30 minutes as a practitioner to go through that tool to assess your project, either at the project design stage or at the implementation stage. The check is a bit broader, it's about human rights in general, but there's a specific part on child online protection. It tells you exactly: have you thought of XYZ? This is what you could do. These are further resources. These are people you can contact. And that has been highly valuable to me, because it caters to all these needs whose importance is clearly there, while meeting the daily environment in which I operate. So maybe that's also an opportunity: to have these kinds of hands-on tools, like the digital rights check, to guide our activities on the ground. Thank you so much, Salomé,

Marija Manojlovic:
because I think you brought some reality back into the context, in terms of the lack of resources and trying to align all of the resources across different agendas. And when you speak about how you actually decide what you need to work on, I think I have a perfect answer for you, because Mattito and Ananya are here, and they are going to tell us a little bit more about USAID's work. But let me just introduce that for a minute. So Mattito, you are USAID's lead on child protection within the Children in Adversity team, and most recently you have been leading a cross-agency effort to define USAID's approach and roadmap on digital harms. And I think you embarked on the process with a set of premises and then switched everything around when you started involving young people. And I think that is really the most thoughtful thing that USAID could have done: to engage with young people at the early stages. So tell us more about that journey and what you've learned through it. You've established your Digital Youth Council. Ananya is here, who was part of the first cohort and became an advisor afterwards. So I'm going to give you both six minutes. Sorry to do that to you. Please. Thank you so much. I'm so sorry that this is going to become like a

Mattito Watson:
running game. First of all, thank you; I'm the last one speaking. The good news is that a lot of people have already said things I wanted to say today, so I can zip through my talking points. The bad news is that we've lost part of our crowd. So thank you to everybody who has stayed to the very end; you're going to get the best part of the session right now. And I want to thank you for saying that USAID was thoughtful; we're not always called thoughtful over at USAID. We're one of the largest development organizations, the international branch of the U.S. government. Our job is to save lives, reduce poverty, strengthen democracy, and help people move beyond assistance. And to do that, we've got to be always looking towards the future. USAID came a little late to the party in terms of our digital strategy; it just came out in 2020. It's very comprehensive, it's very robust, and I recommend people go online and read it. But when they were developing it, the question came up from our team, the child protection team: where are the children and youth? They are our future. They are going to be picking up whatever we lay down, and they're going to be driving it as the next generation moving forward. So as the child protection person at USAID, or one of them, leading within our Children in Adversity team, they asked me to lead on our digital strategy. I am a field practitioner. I spent 25 years in Africa working with children and youth. I am not a digital person, and that ended up being a good thing, for reasons I was going to talk about but will skip over for the moment. I said to myself: how do I get around my blind spot? How do I really understand what's happening with youth? How do I understand what's going on with a 16-year-old girl online in Brazil, or with a 12-year-old boy in Ukraine? And so my brilliant idea, which I somewhat stole from Microsoft, I'll give them that nod of credit, was to develop a digital youth council. At USAID, with our youth strategy, we want to work with youth, not for youth. And that means bringing youth to the table, listening to what they have to say, and incorporating their viewpoint in our strategy and our implementation. So two and a half years ago, I created a 12-member Digital Youth Council consisting of seven young women and five young men, not only to advise us in terms of: is our strategy on point? Where are we going? But also to build the next generation of changemakers, the next generation of leadership. We are now in our second year. Putting my money where my mouth is, and to make this last part go as quickly as possible, I am going to turn the floor over to Ananya, the voice of the youth, to tell us: what was it like working with USAID? Did you see us responding to your voice? And how did you feel about the overall process?

Ananya Singh:
First of all, thank you very much for inviting me today. Not only is this topic very close to what I'm deeply passionate about, but I've relentlessly worked on it for the past three years, and this session provides us an opportunity to reflect on what some of the best practices in this area have been. As the youth advisor to the USAID Digital Youth Council, I am very happy to have been invited to shed light on the success story that our council has been. And so, as I speak, I hope the story inspires more people to take action for the future, with the future. As a generation of young people born into the digital age, we understand how digital technologies impact or impair our aspirations and rights. All we need is a platform to actually be heard. Given that digital technology helps to enhance our capacity to engage with and empower the youth, there is no excuse anymore not to reach out and actually seek input from the youth in a more participatory way, treating them as the active and equal partners in digital development that they are. Recognizing this, USAID, which has long prioritized positive youth development, established the Digital Youth Council in 2021. I consider it my absolute privilege to have been a part of the Digital Youth Council since its very first day. Over the past two and a half years, the council has not only served as an important voice in helping to guide the implementation of USAID's digital strategy, but has also helped to raise awareness about digital harms in many countries and to influence national leaders, the private sector, civil society, local communities, and other youth on how best to keep safe while learning, playing, and exploring in the digital world. We have co-created and led sessions on innovation and emerging technologies, such as machine learning, artificial intelligence, large language models, and ChatGPT, and tried to establish a connection with the digital harms that target young children, including our young council members. With the support, training, mentorship, resources, and encouragement provided through our extremely carefully designed program, our council members have been able to design apps that educate young people about digital harms through interactive games and other modern features. In fact, one of these apps is about to go live on the Google Play Store by the end of this year. We're also very proud to have involved our young council members in planning and speaking at multiple USAID sessions at the Global Digital Development Forum in 2022 and 2023. Personally, I had the opportunity to speak at the USAID Youth Policy Launch in 2022. USAID enabled a young person like me to share the stage with and ask questions directly to the USAID Administrator and U.S. Congress representatives. I also had the opportunity to emcee USAID's International Youth Day event in 2022, where 1,200 people from across the globe joined us to celebrate young people and engage in a panel discussion on intergenerational solidarity, inclusion, protection, and mental well-being. But our magnum opus, the first virtual symposium on protecting children and youth from digital harms, attracted the attention of thousands of leaders in government, civil society, and the private sector. We organized this in collaboration with Save the Children and TechChange.
This event brought together influential policymakers and our young council members for panel discussions on themes including, but not limited to, online harms, hate speech, and cyberbullying. The symposium helped to further the U.S. government's APCA strategy and USAID's digital strategy.

Mattito Watson:
And as you can see, she makes my life a lot easier. By really providing that platform for the voice of the youth, we are actually seeing that change start to happen, and we are getting it right in terms of where the U.S. government should be investing its dollars to protect children and promote employment for them. Thanks.

Marija Manojlovic:
Thank you so much, Mattito and Ananya; you have made my job easier as well, because that was a perfect closure to this discussion. I do want to thank all the participants. I will just take a minute or two to sum up some of the main takeaways. I think we all agree, and young people tell us, that the Internet is great. They love it. They like to be online and to engage online, and it opens up so many opportunities for them. But for them, the online and offline worlds are not separate; this is just the way the world is. We need to make sure that the rules that apply in the offline world are applicable in the online world, and vice versa.

Some things that will help us align across different agendas are a much more rigorous focus on the participation of people with lived experience, young people who can tell us what the needs are, and a vulnerability lens for understanding online trends and threats, so that, as we build this great online world, we do not exacerbate existing vulnerabilities: the existing gender divide, issues around gender norms and toxic masculinity, radicalization and extremism, and all the other expressions of violent behavior and power dynamics that carry over from the offline world into the online world. The last thing I want to say is that we have really seen, and are calling for, increased investment in this particular field, because it sorely lacks dedicated investment from governments, industry, and other players, whether that is investment in foreign policy goals, investment domestically, investment in internal organizational infrastructure, or investment in the frontline services that we all need to have.

With that, I thank you all for participating in this discussion. I have definitely been too ambitious in terms of the topics we wanted to cover and the people we wanted to hear from, but I am really grateful that you are all here. I will run to my plane right now, but I will leave you all to chat a little bit more; hopefully you go for drinks or something. Those who are online, please reach out: go to safeonline.global and follow us on social media, and we will be happy to engage with all of you. Thank you all for joining us, and thank you for the session.

Speech statistics (speech speed; speech length; speech time)

Cailin Crockett: 149 words per minute; 1232 words; 497 secs
Albert Antwi Boasiako: 156 words per minute; 1675 words; 644 secs
Ananya Singh: 165 words per minute; 738 words; 268 secs
Andrea Powell: 164 words per minute; 2426 words; 886 secs
Audience: 162 words per minute; 608 words; 225 secs
Henri Verdier: 166 words per minute; 731 words; 263 secs
Julie Inman Grant: 166 words per minute; 1386 words; 500 secs
Marija Manojlovic: 194 words per minute; 4378 words; 1352 secs
Mattito Watson: 190 words per minute; 655 words; 207 secs
Moderator: 179 words per minute; 131 words; 44 secs
Salomé Eggler: 177 words per minute; 1057 words; 359 secs