Open Forum #40 Building a Child Rights Respecting Inclusive Digital Future

24 Jun 2025 09:15h - 10:45h


Session at a glance

Summary

This UNICEF-hosted discussion focused on creating safe online environments for children and closing digital equity gaps, particularly for women and girls. The session was divided into two segments, with the first examining how different stakeholders can contribute to child online safety, and the second addressing gender divides in digital solutions.


In the first segment, speakers from China, South Africa, Norway, and the Netherlands shared diverse perspectives on protecting children online. Ms. Zhao from China outlined comprehensive legal frameworks including dedicated online protection laws and industry self-regulation initiatives. Advocate Lindhorst from South Africa described regulatory approaches combining platform accountability, digital literacy education, and enforcement mechanisms involving social workers and law enforcement. Caroline Eriksen from Norway’s sovereign wealth fund highlighted how investors can influence companies across sectors to respect children’s rights, noting that failure to do so creates material risks. Alex Galt from IKEA demonstrated how brands can take responsibility for child rights impacts throughout their digital marketing value chains, even when working with third-party platforms.


The second segment focused on addressing gender inequalities in digital access and design. Speakers noted that 31% of women globally are not in education, employment, or training, and that nearly half of documented AI bias targets women and girls. Lisa Sivertsen from NORAD and Silje Dahl from Sweden’s development agency discussed supporting digital public goods and innovative financing mechanisms to bridge these gaps. Tawhida Shiropa from Bangladesh shared how her mental health app was co-designed with girls to ensure emotional safety and privacy protection. Annina Wersun from OpenCRVS explained how open-source civil registration systems can challenge assumptions and promote inclusive design by default. The discussion concluded with calls for increased investment in women-led startups, sustainable funding models for digital public goods, and the recognition that digital infrastructure is as essential as roads or electricity for inclusive societies.


Key points

## Major Discussion Points:


– **Multi-stakeholder approaches to child online safety**: The discussion explored how different stakeholders (governments, regulators, investors, brands, and civil society) can collaborate to create safer digital environments for children, with examples from China, South Africa, Norway, and the Netherlands showing various regulatory frameworks, industry self-discipline measures, and public-private partnerships.


– **Gender equity in digital innovation and access**: A significant focus on addressing the digital gender divide, highlighting that women and girls face systemic barriers in accessing and benefiting from digital technologies, with statistics showing that only 2-3% of investment goes to women-founded companies and female-focused health solutions.


– **Digital Public Goods and open source solutions as enablers of inclusion**: The conversation emphasized how open source technologies and digital public goods can promote transparency, accountability, trust, and accessibility, particularly for marginalized communities, with concrete examples from mental health apps in Bangladesh and civil registration systems.


– **Innovative financing mechanisms for inclusive technology**: Discussion of how traditional development funding can be leveraged to de-risk private investment, including guarantee schemes, blended finance approaches, and the need for donors to support digital public goods infrastructure as essential public infrastructure.


– **Design principles for women and girls’ safety and empowerment**: Emphasis on co-designing solutions with end users, ensuring emotional safety alongside technical security, implementing human rights standards, and moving beyond passive protection to active empowerment of women and girls in digital spaces.


## Overall Purpose:


The discussion aimed to explore concrete strategies and partnerships for creating more inclusive, safe, and equitable digital environments, with particular focus on protecting children’s rights online and closing gender divides in digital access and innovation. The forum sought to move beyond identifying problems to sharing practical solutions and successful models from different sectors and regions.


## Overall Tone:


The discussion maintained a consistently collaborative and solution-oriented tone throughout. Speakers demonstrated mutual respect and built upon each other’s insights, creating a constructive dialogue. The tone was professional yet passionate, with participants clearly committed to the causes they discussed. There was an underlying sense of urgency about addressing digital inequities, balanced with optimism about the potential for positive change through coordinated action. The atmosphere remained engaging and forward-looking, with speakers offering concrete examples and actionable recommendations rather than dwelling on problems.


Speakers

**Speakers from the provided list:**


– **Sunita Grote** – Lead of UNICEF Ventures team, exploring how openly built emerging technologies can accelerate results for children


– **Josianne Galea Baron** – Co-moderator from UNICEF’s Child Rights and Business function


– **Zhao Hu** – Associate Professor and Secretary General of China Federation of Internet Societies, PhD in political science and theory from School of Government at Peking University


– **Makhosazana Lindhorst** – Advocate, Executive responsible for research, regulatory development, registration, and licensing compliance and enforcement at South Africa’s Film and Publication Board


– **Caroline Eriksen** – Head of the social team within the active ownership area at Norges Bank Investment Management (NBIM)


– **Alexander Galt** – Digital ethics leader for Inter-IKEA Group with responsibility to develop digital ethics practices across the IKEA value chain


– **Lisa Sivertsen** – Director of Human Development at NORAD


– **Silje Dahl** – First Secretary for Development Cooperation at the Embassy of Sweden in Pretoria


– **Tawhida Shiropa** – Founder and CEO of Monar Bandhu (Bangladesh), recipient of UNICEF Venture Fund


– **Annina Wersun** – Co-Founder and Chief Impact Officer of OpenCRVS


**Additional speakers:**


None identified – all speakers mentioned in the transcript were included in the provided speaker list.


Full session report

# UNICEF Forum on Child Online Safety and Digital Gender Equity


## Executive Summary


This UNICEF-hosted discussion brought together stakeholders from government agencies, international organizations, private sector companies, and civil society to examine child online safety and digital gender equity. The forum was structured in two segments, with speakers participating both in-person and online, addressing how different actors can contribute to protecting children in digital spaces and reducing barriers to women’s and girls’ participation in digital innovation.


## Opening Framework and Context


Josianne Galea Baron, co-moderator from UNICEF’s Child Rights and Business function, framed the discussion with a key insight: “So our task is to protect and empower children as active participants and pioneers of the digital world as opposed to protecting them from the digital world.” She noted that research shows “children who experience online sexual abuse or exploitation and online bullying have significantly higher levels of anxiety, more suicidal thoughts and behaviors, and are more likely to self-harm.”


Sunita Grote, Lead of UNICEF Ventures, explained their role in exploring “openly built emerging technologies” and provided context for gender equity challenges, highlighting that 31% of women globally are not in education, employment, or training, and that nearly half of documented AI bias targets women and girls. She noted that only 2-3% of investment goes to women-founded companies.


## Segment One: Child Online Safety


### Government and Regulatory Approaches


Zhao Hu, Associate Professor and Secretary General of China Federation of Internet Societies, outlined China’s comprehensive legal framework for child online protection. He noted that China has “the world’s largest population of child internet users” and emphasized that “industry self-discipline and collaboration between public-private sectors strengthens child protection efforts.” He announced plans for an international conference on child online protection in September.


Advocate Makhosazana Lindhorst from South Africa’s Film and Publication Board described their balanced regulatory approach combining platform accountability with digital literacy education and accessible harm reporting mechanisms. She highlighted the role of social workers and law enforcement in investigating online child abuse cases.


### Private Sector Responsibility


Caroline Eriksen, who heads the social team within the active ownership area at Norges Bank Investment Management (NBIM), demonstrated how investors can leverage influence across portfolio companies. She noted that “companies beyond tech sector impact child rights through digital advertising and marketing practices” and that failure to respect children’s rights creates material risks for businesses.


Alexander Galt, Digital ethics leader for Inter-IKEA Group, explained how brands can take responsibility throughout their digital marketing value chain: “brands like IKEA are the companies that fund this digital marketing ecosystem. We buy the media space to engage with people who want to buy our products. And we think that comes with responsibilities and ability to influence.”


Eriksen announced the official launch of disclosure recommendations and guidance for companies on child rights impacts through a webinar on June 27th, displaying a QR code for registration.


## Segment Two: Digital Gender Divides


### Barriers to Women’s Digital Participation


Lisa Sivertsen, Director of Human Development at NORAD, explained how “gender-blind digital tools enhance existing gaps and structural discrimination against women and girls.” She emphasized that women face multiple barriers including lack of access to devices, internet connectivity, and financial services. She highlighted DHIS2 as an example of a Norwegian-supported digital public good now used by over 100 countries.


Silje Dahl, First Secretary for Development Cooperation at the Embassy of Sweden in Pretoria, discussed addressing technology-facilitated gender-based violence through coordinated policy responses. She mentioned how anonymous and safe information access is vital for youth in regions where topics like sexual health are controversial, noting that one partner’s app has reached “9 million young people.”


### Inclusive Design Examples


Tawhida Shiropa, Founder and CEO of Monar Bandhu in Bangladesh and UNICEF Venture Fund recipient, shared insights from developing mental health solutions for girls. She explained their approach to ensuring both technical security and emotional safety, emphasizing co-design with target users and implementation in local languages. Her organization has served thousands of students and adolescents through their mental health services.


Annina Wersun, Co-Founder and Chief Impact Officer of OpenCRVS, explained how their birth registration system challenges traditional assumptions. She described how “OpenCRVS shows an alternative, and that is a birth registration form that does not mandate the father’s information,” demonstrating how design choices can address systemic inequalities. She mentioned their use of a fictitious default country called “Farajaland” to demonstrate inclusive configurations.


### Open Source and Digital Public Goods


Several speakers emphasized the value of open source approaches for promoting inclusion. Sivertsen explained how open source systems allow for external audits and adaptability to different user needs. She provided an example of how a solution designed for low-resource contexts proved valuable even for wealthy nations: “During the pandemic, it actually proved so easy to use and such a safe tool that it spread also and it was taken into use also by a lot of middle-income and high-income countries, including Norway.”


### Financing Challenges


Speakers identified significant financing gaps, noting that less than 1% of venture capital funding goes to Africa, and only 2-3% goes to women-founded businesses globally. Dahl described alternative financing mechanisms like guarantees and capital mobilization to reduce donor dependency while de-risking private investment, with SIDA planning to expand these models to health and sexual and reproductive health sectors.


Wersun explained that “digital public goods need upfront investment to become self-sustainable while maintaining their public benefit mission,” with OpenCRVS committed to achieving self-sustainability within five years. Shiropa suggested that “the VC mind should be slightly changed” to better understand open source business models.


## Key Themes


### Multi-Stakeholder Collaboration


Speakers across both segments emphasized the need for coordination between governments, regulators, investors, brands, civil society, and affected communities to address both child online safety and gender digital divides.


### User-Centered Design


Multiple speakers highlighted the importance of involving end users, particularly marginalized communities, in the design process. This was demonstrated through examples from Bangladesh’s mental health services to birth registration systems.


### Digital Infrastructure as Essential Infrastructure


Speakers treated digital infrastructure as comparable to roads or electricity, influencing discussions about financing, sustainability, and public investment in equitable access.


## Concrete Commitments


– UNICEF and BSR committed to launching disclosure recommendations through a June 27th webinar


– China Federation of Internet Societies committed to hosting an international conference on child online protection in September


– OpenCRVS committed to achieving self-sustainability within five years


– SIDA committed to expanding guarantee and capital mobilization models to health and SRHR sectors


## Ongoing Challenges


The discussion identified several areas requiring continued attention: changing investment patterns to value impact alongside innovation, scaling local solutions while maintaining cultural relevance, addressing technology-facilitated gender-based violence across jurisdictions, and ensuring meaningful participation of marginalized communities in digital system design.


The forum established frameworks for ongoing collaboration through international conferences, webinars, and collaborative initiatives, providing a foundation for continued progress on child online safety and digital gender equity.


Session transcript

Sunita Grote: Good morning, everyone. Welcome. It is my honor to welcome you to this session on behalf of UNICEF. My name is Sunita Grote. I lead the UNICEF Ventures team where we explore how openly built emerging technologies can accelerate results for children. I’m joined for this session by my co-moderator online, Josianne Galea Baron from our Child Rights and Business function at UNICEF. We have prepared for you today a two-segment open forum. The first one will look at and bring together diverse perspectives on the unique role that different stakeholders can play in creating a safe online environment for children. In the second segment, we’ll be looking specifically at how to close equity divides that we still find in the online world and in the design and implementation of digital solutions. We’re going to be focusing particularly on the gender divide and looking at how to design and implement solutions for women and girls. For the first segment, I’m going to hand over to Josie, who will take us through that discussion. Excuse us while we try and sort out the technology. There we go. That looks promising. Josie, can you- Hi. Can you hear me? Excellent. Okay. Fantastic. So over to you for the first segment, please.


Josianne Galea Baron: Thank you so much. Forgive the technical issues. It’s great to be live now. It’s great to see everybody in the room. Thank you, Sunita, for setting the scene. Let’s get started with the first part of this open forum. As has already been introduced, the first part of our forum today will focus on one key question, which is how can different stakeholders from across sectors play their part in delivering a digital world that works for children, one that keeps them safe, respects their rights, and positively supports their well-being. This task represents one of the defining challenges of our times, as we know, with countries around the world grappling with some critical questions, like how will digital technologies, including generative AI, impact children’s lives positively, negatively now and in the future? Or what is the right age for children to participate in online platforms? Or what emerging risks and dangers await children as they explore new digital environments? These worries about what happens online speak to real-world harms and challenges experienced by children today that demand urgent action. Research published by UNICEF Innocenti earlier this month found that children who experience online sexual abuse or exploitation and online bullying have significantly higher levels of anxiety, more suicidal thoughts and behaviors, and are more likely to self-harm. So these are real-world challenges. Having said all that, meaningful access, as we’ll hear more about later, is oftentimes a critical lifeline to opportunities. So our task is to protect and empower children as active participants and pioneers of the digital world as opposed to protecting them from the digital world. So with that, I’m delighted today to be joined by four distinguished speakers to help us understand and explore these issues. 
Our panelists will take us on a tour around the world, from China to South Africa to Norway and the Netherlands, to hear different perspectives representing distinct stakeholder groups, from regulators to investors to brands and beyond, and explore how these issues are being addressed in different contexts. We have limited time, so we’ll do our best to engage. We will ask you to put questions or comments in the Zoom chat only if you’d like to engage with us, and we’ll do our best to either address them at the end of this Part 1 or follow up with you later. So with that, let me turn to our first distinguished speaker, Ms. Zhao, Associate


Professor and Secretary General of China Federation of Internet Societies, which is a body that convenes the industry under the leadership of the government. Ms. Zhao graduated with a PhD degree in political science and theory from the School of Government at Peking University, and she has many years of public management experience in the field of education and a great deal of international experience as well. Ms. Zhao, thank you for joining us today. With the world’s largest population of child internet users and a rapidly growing digital ecosystem, we’d love to hear from you. What measures is China putting in place to guide its internet industry to respect children’s rights in the digital age? Over to you.


Zhao Hu: Hello everyone. Good morning. It’s a great honor to address this forum. The China Federation of Internet Societies was established in May 2018. We are honored to hold special consultative status with the UN ECOSOC. Currently, we have 528 members, including major internet corporations like Tencent, Baidu, and Alibaba. Our goal is to promote the development of the digital economy and informatization. China has the world’s largest population of child internet users. We have made sustained efforts in protecting minors online. First, strengthening legal safeguards. In 2020, the Law of the People’s Republic of China on the Protection of Minors introduced a dedicated online protection chapter. In 2021, the Personal Information Protection Law further specified special protections for children’s personal data. In 2024, the Regulations on the Protection of Minors in Cyberspace addressed critical issues such as cyberbullying, data breaches, and internet addiction. Complementary regulations like the Provisions on the Cyber Protection of Children’s Personal Information and the Interim Measures for the Management of Generative Artificial Intelligence Services have been enacted under the guidance of the Cyberspace Administration of China. Second, enhancing government oversight. The CAC conducts annual campaigns to regulate minors’ online environments, particularly during summer breaks, urging platforms to standardize services aimed at young users. Leading internet companies have improved minor mode functions, established rapid reporting mechanisms, and adopted restrictive measures against cyberbullying. Collaborative efforts among public security, market regulation, and culture and tourism authorities target game account trading and anti-addiction system circumvention. Third, mobilizing social forces. 
The CFIS has established a professional community called the Committee on Minors’ Online Protection and launched the ad-spread public welfare initiative, protecting the rights and interests of minors. The online security and digital literacy courses we produced have received over 10 million views. We compiled and published the Minors’ Online Protection Annual Report 2024 and promoted the preparation of national standards such as guidelines for the safety of AI technology involving minors, enhancing the protection of children’s rights through industry self-discipline. Fourth, deepening international cooperation. China has learned from international best practices and shared its experience with the world. CFIS partnered with UNICEF China in 2024 to launch the safety by design corporate case collection activity. The selected cases will introduce China’s progress to global internet companies. Distinguished guests, building an inclusive digital future that respects children’s rights is our shared responsibility. In September this year, we will hold the International Conference on Child Online Protection in China. We sincerely invite all of you to come to China at that time to share and discuss practical experiences and future prospects of child online protection with us. Let us collaborate to create a safe, empowering online environment. Thank you, thank you.


Josianne Galea Baron: Excellent, thank you so much Ms. Zhao for so clearly and succinctly outlining the importance of crafting strategies across a range of different levers and building blocks: the role of building digital literacy skills, the role of lawmakers, and the importance of knowledge exchange and cooperation. I really appreciate that intervention. Now let’s move from China to South Africa, turning to our second speaker to bring the perspective of a national regulator for online services. I’m very delighted to introduce Advocate Lindhorst joining us online, who is an executive responsible for research, regulatory development, registration, and licensing compliance and enforcement at South Africa’s Film and Publication Board. She has more than 10 years’ experience in regulatory policy, legal, compliance, and enforcement. Welcome to you, Advocate Lindhorst. The South African Film and Publication Board has taken a proactive stance on child online safety through regulatory frameworks, public education and community outreach. In your efforts to protect children in digital spaces, how do you navigate the balance between driving systemic accountability among digital platforms, equipping users, especially children and caregivers, with digital literacy, and ensuring accessible mechanisms for redress when harm occurs? So, a very easy question for you. Thank you so much for joining us. Over to you.


Makhosazana Lindhorst: Good morning, and thank you very much for having us. So, as the Film and Publication Board, we are an online safety regulator that is really responsible for ensuring that South Africans are protected from online harms. As a regulator, we have regulatory frameworks in place. Streamers of films, games and certain publications are required to register with us as the Film and Publication Board. That gives us an opportunity to ensure that we have checks and balances, and through the law we are able to ensure that the measures they have in place provide online safety for children. More importantly, when it comes to social media platforms, as a regulator, we ensure that where there is prohibited or harmful content brought to their attention, we issue a takedown notice to ensure that prohibited content that might cause psychological harm to children is removed. When it comes to education and awareness, that is also an area we focus on as a regulator, because giving digital skills, just like China, is our priority, to ensure that South Africans are given digital skills, especially parents. That becomes the most important part, because children have a right to responsible parenting and they also have the right to privacy. Whilst it is outside the mandate of the FPB, as a country we also have privacy laws that ensure children’s rights are protected, and platforms are required to comply with those laws. On a day-to-day basis, we have a team of dedicated advocacy officers who go to schools and ensure that children are protected and educated in terms of how they are supposed to conduct themselves online. We also have toolkits that empower them, including teachers and parents. So that is the work that we do as the Film and Publication Board. We work closely with law enforcement
agencies, especially on issues of child sexual abuse material, where we have a team of social workers, who are also our investigators, that works on these cases closely with the police to make sure that they compile the reports that we take to court to serve as evidence. We have a dedicated committee, so if one feels their rights are being affected by online content, they can lodge a complaint with us, and it can be taken to the Enforcement Committee, which is a quasi-judicial body. What makes it important is that we are not competing with other matters through the court process. This is a dedicated quasi-judicial body that focuses on issues of online safety and other compliance issues within our mandate.


Josianne Galea Baron: Wonderful. Thank you so much, Advocate Lindhorst, for giving us that very different perspective of a regulator. You used different keywords that tease out some important groups of people and professionals that play an important role in this: social workers, law enforcement, investigators. These are very important elements for us to think about. But with that, we’re going to take another shift of gears. When we think about children’s online experiences, it’s natural to think of regulators or the responsibility of social media companies or other players in the tech sector. We also have our two next speakers who will help to shed light on the important roles and responsibilities of other players and influencers in the broader ecosystem that also have a very critical part to play in delivering positive change for children. They will also share about work being done with UNICEF. I’ll turn to Caroline Eriksen, who joins us in the room, who is the head of the social team within the active ownership area at Norges Bank Investment Management, or NBIM. In this role, she leads the work on policy development and engagement with portfolio companies on social sustainability topics, including children’s rights. Welcome to the panel, Caroline. As an investor, NBIM interacts with a large number of companies all around the world. What is the role that investors can play in advancing responsible business conduct for child rights and safety online, and how is NBIM addressing the topic of child rights and responsible business in the digital space? Over to you, Caroline.


Caroline Eriksen: Thank you so much, Josie. I’m delighted to be here. Thanks to everyone following here in the room and online. NBIM is a global financial investor in almost 9,000 companies in 70 markets. So we own a small slice of most of the companies in the world. As a global investor, we depend on well-functioning markets and long-term sustainable development in an economic, environmental, and social sense. We want companies to operate responsibly and with a long-term horizon. And to ensure long-term returns, as a minority shareholder, we can use our leverage to influence companies and improve market practices. Child rights in the digital environment is an important topic for us. We are a fund for future generations, and about 30% of today’s population are children, and they spend an increasing amount of time online. Failure to respect children’s rights can be a material risk for companies. It can entail legal, financial, and reputational risk, and it can affect their license to operate. And it goes beyond digital companies, right? As Josie mentioned, think about retail, for example. They may not consider themselves a tech company, but they may impact children’s rights through digital advertising and other parts of their operations. So it’s a topic that’s relevant to companies across sectors and across markets. It’s also an area where we see an increasing amount of regulation. I can mention the EU Digital Services Act or other types of regulation in other markets. This is regulation that asks companies to be open about how they’re addressing and managing impacts on children’s rights online. This is why we have entered a partnership with UNICEF over the last few years addressing exactly this topic. Together, BSR and UNICEF have done research on companies’ reports and looked at more than 200 reports to see how they address child rights impacts online. 
They found that only a few of them actually meaningfully address how the companies impact children’s rights. We also experience this in our bilateral dialogue with companies. They see it as a material topic, but how to address it in reporting is still an area of development. UNICEF and BSR, together with industry, academia and civil society, have developed a set of disclosure recommendations and guidance that we will soon publish and share with everyone. In fact, I can give a little teaser today. We will launch this officially on Thursday, the 27th of June. I’m really excited about this. It will be a webinar, and I’m going to actually show… So yes, here on the screen, you can scan this QR code and sign up to attend the launch webinar. Really excited about this, watch this space, and thank you so much. Hope to see you there.


Josianne Galea Baron: I think it’s so important for us to expand our minds and think about the different roles that actors who might be more of the quote-unquote unusual suspects and the very important influence and role that they can play in advancing the goals that we’re talking about. And I hope to see all of you joining us in two days on Thursday for the launch of the disclosure recommendations, which will also include the research that Caroline described. So with that, I will move us on to our final speaker. But before I do, a quick reminder, if you are joining in person or on the Zoom, you are very welcome to make a comment or a question in the chat, and hopefully we will have time to address some of those at the end. So again, and this is something also that Caroline gestured towards, is that when thinking about the variety of industry players that are relevant for child rights and safety online, the deployers of digital technologies are also key stakeholders, not only those that develop digital technologies. Alex Gelt joins us online. He’s the digital ethics leader for Inter-IKEA Group with responsibility to develop digital ethics practices across the IKEA value chain. He has experience of initiating and embedding digital ethics in several public and private organizations through his work with the UK Civil Service and Deloitte. Alex, what does Inter-IKEA mean to you? inter-IKEA group have to do with child rights and safety online? This is a question that I’m sure you get asked in different places. And what actions are you taking? How can other brands play an active role in understanding and addressing their impacts on child rights and safety online? Over to you, Alex. I don’t think we can hear you, Alex, if we can test the mic. It looks like it’s working now.


Alexander Galt: Thanks, Josianne. And good morning to those of you in Norway, and hello to everyone joining online. Thanks for having me here today at the IGF, and thank you to the great speakers we’ve had so far and to my fellow panelists. I’d echo a lot of what Caroline has just said about being a responsible business. So, first of all, what does IKEA have to do with child rights? We’ve made the commitment to integrate child rights into everything we do across all of the IKEA organizations. And we take the responsibility to ensure that children come to no harm as a result of the direct contact that we have with children, and the indirect contact that we have wherever we do business in the IKEA value chain. And whilst we think that coming to the stores is a key part of the IKEA experience, we know that contact with people happens increasingly in an online context. So, actually, most of our shopping journeys start online, through the more established e-commerce websites and applications that we have, but also through platform search or social media and other types of emerging platform engagement, whether that’s peer-to-peer reselling marketplaces or social gaming. And where we have direct contact, we put measures in place to protect child rights: our policy and processes on child safeguarding, the data protection mechanisms that we put in place when we engage with children while developing our products, or the technologies that we put in the home. This is our IKEA smart product range, where we’ve taken an inclusive design approach with diverse families to hear the voice of the child in relation to the design choices we should make when we’re developing those products. And as Caroline mentioned, marketing, and digital marketing in particular, is a big area where we have contact with children.
So when it comes to marketing, we have had a strong point of view for many years on how we portray children in marketing material, that is, the images and media content that go out into the world. We make sure that we show children in a respectful way, in a way that brings them no harm and shows them in a positive light. This is largely what’s in our control and can be directly governed within the IKEA value chain. However, we also want to take a child rights perspective in the more indirect contact points we have across the value chain, and that needs a more nuanced approach based mostly on the relationships we have with partners. So again, on this marketing topic, we’re very clear on what we require when we are addressing children: there is no direct targeting of children in our commercial messages. However, we don’t just stop there. We know that a lot of child rights harms can also come from practices other than direct targeting, whether that’s new monetization techniques like gamification, or new models of engagement, from influencer marketing to content creator marketing that can represent a new form of child labor. We know this happens on the platforms we’re engaged with, and we want to take responsibility for how that conduct happens. Brands like IKEA are the companies that fund this digital marketing ecosystem; we buy the media space to engage with people who want to buy our products. We think that comes with responsibilities and the ability to influence. And we want to influence, because we know there is growing knowledge that these interactions have the potential for harm. So in the partnership we’ve had with UNICEF, we’ve wanted to research those harms and understand how best to counteract them, especially through the complexity of the different actors in the digital marketing ecosystem.
It’s not just a binary of brands and platforms; there are many actors within the space and many leverage points that we need to engage with. This allows us to make informed decisions when we’re partnering with media agencies, ad tech companies and the other partners we choose to engage with, and we pass this through our value chain and into how we govern our brand and marketing operations. So this research has enabled the development of a toolkit that breaks down the due diligence actions and responsibilities that each actor in the value chain should take, both us and those we then give our money to. It helps us move from binary thinking of yes or no, good or bad, should we be online or not, to a more holistic picture of what, when, who, and how we should engage. This toolkit will be openly available to all organizations that want to use it, but we especially encourage brands, who are at the start of this value chain, to adopt it and to work with the organizations they outsource their actions to, so that those organizations adopt its requirements and recommendations and we can uplift the child rights perspective. Thank you, Alex, for rounding off this panel of different perspectives.


Josianne Galea Baron: I think that you tease out such an important element that we return to in so many different areas of business and human rights, which is to take a value chain approach to not only look at one particular actor, but also the bigger picture of the connections in the industry, the different leverage that different actors might have on other players, and the importance of human rights due diligence, which is certainly a very important red thread that goes through the different resources that we’ve discussed. And again, I hope to engage with many of you in the room and online around the digital marketing work that we will be issuing later this year as well. So definitely watch that space. We do have some moments for engagement, so please do engage with us on the chat. Thank you to one participant who’s sharing some research that they have. This is wonderful. And if you are in the room and would like to engage, then please also do it in the chat. I will give a moment or two to anybody who’d like to participate. And if no questions, I will very happily hand over to Sunita to guide us through part two of the forum. Fabulous. It was all very clear. No questions at all. I will hand over back to you, Sunita.


Sunita Grote: Thank you so much. Very much looking forward to part two of this forum. Thank you very much, Josianne. I’m going to ask you both to exit the stage, please, on that side, and we’ll move over to the second segment of this forum. We are again going to be joined by two panelists online as well as two in the room, so I’ll ask Silje and Lisa please to join me on stage. In this segment we’re going to keep our focus on the importance of concrete actions that various stakeholders can take, particularly to ensure that the value and impact the digital world has to offer is more accessible and available to those who could benefit the most. For this panel today I am joined here on stage by Lisa Sivertsen, Director of Human Development at NORAD. I’m joined also by Silje Dahl, First Secretary for Development Cooperation at the Embassy of Sweden in Pretoria. And online we’re joined by two speakers: Tawhida Shiropa, Founder and CEO of Moner Bondhu, who joins us from Bangladesh, and Annina Wersun, Co-Founder and Chief Impact Officer of OpenCRVS. So keeping in mind the reflections and recommendations we heard in the first segment, we’re going to focus our discussion further on an often still overlooked and excluded half of the world’s population: women and girls. What we know is that at present 31% of women worldwide are not in education, employment or training. 740 million women in developing economies remain unbanked. One in five adolescent girls are married before the age of 18. These are some of the wicked challenges, some of the big problems, that many of us here in the room, and definitely those of us on stage, are working to tackle on a day-to-day basis. Digital technologies do in fact hold significant promise to help us address some of these challenges, but we are currently falling far short in realizing that potential.
In fact, we see that we are often missing solutions to address those challenges completely in the market. When they do exist, we’re not able to access them; we’re not able to scale them because they sit behind often closed or prohibitive intellectual property regimes. When they are accessible, when solutions are being designed for everybody, we often see that they actually fail to meet the needs of women and girls. For instance, we know that almost half of publicly documented bias in AI systems is bias against women and girls. We know that only about 2% of medical research funding goes towards pregnancy, childbirth and reproductive health. If we look at the private sector, only 3% of investment going into digital health is focused on solutions that address female health challenges. Similarly, in private capital, only 2% of investment in 2023 went to companies that were co-founded or founded by women. We at UNICEF run a venture fund that provides seed funding to entrepreneurs in developing countries, and we really had to learn that the hard way: we realized that when we did investing as usual, we just mirrored what we saw in the private capital industry. But at the same time, the investment opportunities in women’s health in particular are massive; the industry is projected to grow to $1.2 trillion by 2030. So clearly, for all stakeholders around the table, there are massive gains to be made if we start focusing our efforts more deliberately on meeting the needs of women and girls. So what can we do to close this gap? How can we make technology inclusive for everyone? How can we leverage this opportunity? I’d like to start with you, Lisa, please. Thank you for joining us today. I think you had the shortest commute of all of us to get here.
We have partnered for a number of years together, particularly around digital public goods, and we co-founded the Digital Public Goods Alliance. Would you be able to share with us your observations of some of the gaps you’ve seen in your work, where efforts around digitalization and innovation at the country level have really failed to meet the needs of women and girls? Thank you, thank you Sunita, and thank you for inviting us. Yeah, I did have a short commute, but I was delayed because the trains were delayed, so sorry to everyone about


Lisa Sivertsen: that on behalf of Oslo. Thank you so much for inviting us to be here. I think your introduction really highlighted so many of the gaps and the structural barriers that women and girls are facing, and gender-blind digital tools and infrastructures will only deepen those gaps and disparities, and the structural discrimination. So, from NORAD and the Norwegian side, we are very committed to doing what we can to break down those structural barriers. The gaps really do still exist, and we also see some very worrying negative trends when it comes to gender and marginalization, both in access to digital tools and in the failure to design gender-diverse tools. There is also a lot of research pointing to a lack of trust in those systems, which I think is a big issue for all societies in the world. What gives me hope is that there are also so many examples of partners and governments working together with the private sector, research institutions, civil society and women’s social movements to close those gaps. If I could highlight some of those examples, there are so many. From the Norwegian side and NORAD, we try to do what we can to support initiatives and new tools, but we also really try to focus on a systems approach: what can we do to enhance and support global digital public goods, but also the infrastructure that every country in the world really needs to invest in to make sure you have inclusive digitalization. We are so grateful for the opportunity to work with UNICEF and also with UNESCO. We work on engaging more women and girls to get education and work within the STEM sectors; I think that’s a really crucial investment that we need to make for the years and the future to come. We are also really excited about the digital education strategy of UNICEF.
Norway has been supporting for more than 20 years a digital public good infrastructure designed by the University of Oslo. It has a really complicated name, DHIS2, but it’s an excellent tool. It was designed as a low-cost, open-source infrastructure for dealing with health data in lower-income countries. During the pandemic, it actually proved so easy to use and such a safe tool that it spread and was taken into use by a lot of middle-income and high-income countries as well, including Norway; we actually started using it ourselves. So that’s an excellent and important tool that we will continue to support. More than 100 countries are now using it. And then OpenCRVS is another system that we have invested in. It’s an open-source system for birth registration and a lot of other official registration purposes as well, and being open source, it also builds trust and makes it possible to adjust to the needs of women and girls.


Sunita Grote: Yeah, fantastic. And you’ve set the scene really nicely for some of the discussion points we’ll dig into a little bit later. We’ve actually got Annina joining us from OpenCRVS; I think she can probably speak to that specific example even more. And we’ll try to talk a little bit more about this crossover between safety, trust and open source in the second half with the two product owners who are joining us online. So thank you for that. Silje, I’d be interested to hear from your experience leading Sida’s engagement regionally: what successful approaches have you seen in your region around digital innovation that is being designed with and for women and girls? Maybe specifically how that’s influenced your viewpoint, your engagement, your choices and approaches in the region.


Silje Dahl: Thank you, Sunita, and thank you for inviting me. I think it’s great that we have both Sida and NORAD in the same room, since we are of course allies and work closely on a lot of different thematic areas. At the Swedish Embassy in South Africa, we work specifically on implementing the Swedish government’s strategy for sexual and reproductive health and rights in Africa. So we work with many different stakeholders: UN agencies, civil society, social businesses, entrepreneurs. And I think globally Sida has about 140 partners working specifically on digitalization as well. So it is a priority for us and, of course, for the Swedish government. But for Sida it’s very important that we support programs and partners that are actually bridging that digital gap and the gender divide as well, and ensuring that women and girls can actually access digital tools that are useful for them. That’s one of the reasons why we have recently entered into a partnership with UNICEF for this specific FemTech initiative. Because we also know that entrepreneurs are the ones who will develop the new models; they will come up with the new technologies, with the new ideas, that can potentially enable economic growth but also lift people out of poverty, which is, of course, our end goal. However, as you very well said, Sunita, women are very underrepresented in this area. And we see a lot of digital products being developed that are not at all addressing the actual needs of women and girls, especially within SRHR, where we work a lot. We also know that in many rural areas, in many countries, women and girls don’t have access to digital devices; they don’t have internet. So how do we reach them with information? So for us it’s important to support initiatives that not only cover this gap, but also work on providing correct data.
They might work on domestic financing, for example, or on inclusive policies and legislation in their specific countries. But most importantly, the solutions we are supporting must be locally adapted and actually work for the groups we intend to work with, including the groups that are normally left behind, like the LGBTQI community, for example. A few examples from our region of partners we support. I think in the previous sessions the downside of digitalization and the harm to children online was touched upon as well. We have one partner working specifically on ending gender-based violence, and they have developed different tools for that work. They work specifically on addressing technology-facilitated GBV, which is a new term I just learned, in the Southern Africa region. It’s very interesting, because they also see that GBV has increased online and is posing a risk of normalizing violence. So they work with both civil society organizations and governments to ensure that technology-facilitated GBV is incorporated into national legislation and policies. So that’s one example of how we can work with this in regards to women and girls. We also have another partner working on improving access to SRHR information in West and Central Africa, which is a region where it is very difficult to work on SRHR and sometimes very controversial as well. So you need to find ways to share information in a way that is not overly controversial. We also know that youth are more vulnerable, for example, to HIV, to other sexually transmitted diseases, and to early and unwanted pregnancies. With this app, they can access information anonymously and safely, and they can also get information on how to find health facilities where they can get support. And so far, this app has reached about 9 million young people in that region.
So, just two examples on how we work on that.


Sunita Grote: Yeah, fantastic. Thank you. And you called out specifically, I think, the ingenuity of entrepreneurs in putting forward and trying new approaches. So I’d like to turn now to our online speakers. We have Tawhida joining us from Bangladesh. She’s a recipient of the UNICEF Venture Fund as part of our health cohort that we launched last year, and her team provides holistic well-being services, in person and online, in Bangladesh for children and young adults. They’ve developed an app where, with support of the UNICEF Venture Fund, they’re looking at building an open machine learning model to help them enhance detection and also recommendations to parents. So, Tawhida, I’d love to dig a little deeper into an actual, specific solution, a specific product, and hear from you how you’ve put the needs and rights of women and girls, like safety, data privacy and others, at the center of how you’ve developed your product. Over to you, please.


Tawhida Shiropa: Thank you. Thank you very much, Sunita. A very good morning and a very good afternoon from my side, and thank you for having me today. First of all, I really want to show my gratitude to the UNICEF Innovation team, especially the UNICEF Venture Fund, because I’m here because of that journey. Now I’d like to talk more about the technology and the product we are building right now in Bangladesh. Moner Bondhu is a women-led start-up, and we are transforming mental health and well-being for all, especially women and girls; because it’s a women-led company, women are always our first priority, and when we build the technology team and everything else, we always mention and prioritize that. Moner Bondhu is transforming mental health and well-being using AI and machine learning models and human-centric approaches, including counseling, psychometric assessments, and well-being tips and techniques. And when we talk about girls’ and women’s safety, this is something we always think about; it’s the first thing we have to make sure of. When we started the conversation from the very beginning of building our technology, especially the Manoshi app, which is for child well-being, we used open source and aligned very much with digital public goods. Keeping those in mind, and in that spirit, we made sure the app was co-designed with girls. It’s not something we built only from our own ideas and thoughts. Thousands of students, not only from urban areas but also rural, semi-urban and even remote areas, take our services every day, and we wanted to build this app with them. We sat with them.
We talked for hours and hours about their struggles and what they have been going through. They told us about being afraid to report abuse, sometimes abuse within their community, or gender-based violence they had faced; about hiding their anxiety, especially exam anxiety, and sometimes anxiety around cell phone issues. And they shared that when they were going through these kinds of issues, there was no private space where they could share them, because they thought it would become public, or that they would be bullied by their peers, on social media or other online platforms. So we built the app around their thoughts. We built anonymized assessments, we designed a quick check-in space in our app, and we made sure they could choose a female counselor, so they can reach out to us through our platform at any time. Another thing we always wanted to make sure of: for girls, safety is not only about encryption. Our app is end-to-end encrypted, but what we have seen and experienced is that for girls, safety is emotional safety. So when a girl fills out a well-being check-in or any assessment, we want to make sure from our side that we prioritize protecting her data. Sometimes they are under 18, so they don’t have access to a mobile phone or smartphone; in that case we take verbal consent from their parents. Sometimes the parents are not that educated, and then we also call them and discuss these things with them.
We explain that this would be helpful for their child’s well-being, and then we get consent from them. Of course, we want to make sure of data security. In our back-end system we have risk flags, high-risk flags, to respond in real time: whenever we see any gender-based or trauma-related signal in the anonymized data, our counselors, our female counselors, immediately respond to the client. And of course, we always maintain a sensitive escalation protocol. Number three, I wanted to mention privacy. When we developed our whole process, even the UX design, we always got feedback from the users and fed that feedback back into our site and the app as well. Everything is encrypted and stored with tightly controlled access by our tech team, and we also follow global standards. And it is true, as Sunita mentioned in the first part, and as I heard in the conversation with our amazing panelists, that AI models can be biased to some extent, and that includes large language models and machine learning models. So we always want to make sure that the data is protected and as accurate as possible, so that we can provide these services. Later on, maybe, we will discuss more on open source. But what we have seen so far, deploying this with almost 10,000 students and adolescents, including a majority of adolescent girls in rural schools and some schools in urban areas, is that we got a huge response from their side, and that helps us build a more accurate data processing system.
We wanted to make sure the whole protocol around sensitive matters is gender-sensitive; we maintain a gender-sensitive framework, and of course it has to be very human-centric. We really wanted to get all the advantages of AI and technology so it can be more accessible for everyone. And one thing I really want to make clear from my side, and this is our vision: the Manoshi app is not just an app. We wanted it to be a promise to all young girls that they can feel safe, that they will be heard, and, most importantly, that whatever they have gone through, they don’t need to go through it alone. This is what we built from our side. And yes, here we are. Thank you to the UNICEF Venture Fund; they gave us lots of ideas and insights about digital public goods and, of course, open source. So we wanted to make sure it’s very cost-effective, with low maintenance costs, and very transparent in what we’re building right now.


Sunita Grote: Thank you very much. Thanks so much, Tawhida. Really good points, I think. And thank you, you’re getting some applause in the room in case you can’t hear it. Really good points around emphasizing that product development and product design is a necessary part, but in no way sufficient, if we think of the inclusivity of an entire program and of making sure we reach women and girls and provide them the emotional safety that you spoke to. I’d like to dig a little deeper into an aspect that several speakers have already touched on, which is this idea of creating digital solutions that are digital public goods. If you haven’t already, and you’re not familiar with the concept, there’s a great booth just outside by the Digital Public Goods Alliance where you can learn a little more about the concept and meet some of the partner organizations. There’s also a high-level panel this afternoon that’s going to bring together major players in the digital public goods space. My little plug. But here we share this commitment, and we have panelists that have actively chosen to develop in the open and to build open-source solutions. So I’d like to turn to you, Annina, because we haven’t heard from you yet, and hear your perspective: what role does this approach and focus on open source actually play in improving the inclusivity and safety of solutions? How does that approach enable us to build that trust and to actually


Annina Wersun: operationalize this commitment that we have? Yeah, absolutely. Thank you so much, Sunita, and good morning, good afternoon to all of you. It’s an honor to join the panel, and I feel truly inspired, as I so often do, when listening to the work that other people are doing in this area. So thank you as well for all the work that you’re doing. As Sunita mentioned, I’m from OpenCRVS, and we are a digital public good for civil registration and vital statistics: the registration of births, deaths, marriages, divorces and adoptions, and the use of that data for vital statistics purposes. When we talk about inclusivity by design, it’s really not just a principle; it’s something we can demonstrate in practice. As a digital public good, OpenCRVS gives countries a chance to see what inclusive digital services can look like right out of the box. So when a country starts to use OpenCRVS, they see a pre-configured version of the system based on our default country, a fictitious country we’ve created called Farajaland, and it supports a number of different vital events. The forms they see initially are designed specifically to spark important conversations. Countries can subsequently configure the forms and make them work exactly as they need; that’s the whole premise of OpenCRVS, that it’s configurable for a range of different country contexts. But they can see out of the box what is possible. So what’s an example of trying to spark these conversations? Take birth registration. In many countries, the father’s details are either assumed or required. But OpenCRVS shows an alternative, and that is a birth registration form that does not mandate the father’s information.
And that simple design choice opens the door to discussions, even at the highest levels, about how to create more inclusive services for single mothers, for survivors of gender-based violence, or for any situation that doesn’t fit a narrow norm. In many countries, when this comes up and we do a demonstration of the system, we get the reaction: oh, why are the father’s details not there? Surely a woman would always be married. And these conversations at the highest level really allow us to explore these assumptions, and also to inform and educate those who may just not have had access to this information. We’ve had some really interesting, engaging discussions where people have come away with a different perspective. It’s a powerful example of how design can lead policy, and how a digital public good can educate and influence across government just by showing what’s possible. We also use data to reinforce this idea. OpenCRVS includes dashboards specifically configured to highlight insights into the experiences of women and girls, insights that too often go unseen. And as civil registration is the only continuous source of population data, it really is so rich if you just ask it the right questions. Imagine if a government could pinpoint where young mothers were giving birth, so they could direct maternity services to those areas, or could understand where and how women are dying, to target life-saving interventions. Just being able to visualise this data makes a difference. Out of the box, OpenCRVS comes with these options, and countries can take them away if they don’t want them. But more often than not, they see this data and accept it, and in fact they might bring in another ministry, for instance the Ministry of Women’s Affairs in many countries, who can take something away from this data.
They can use it to make services more inclusive and to target specific interventions. And this really, for us, is the power of inclusive design backed by meaningful data. It’s not just about technology, and as Tawhida said, there’s so much more to an implementation of OpenCRVS than just this design. But it certainly allows us to begin shifting systems and showing governments what good can look like, in order to achieve what we want to accomplish and what we want to change.
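The design choice Annina describes, a birth registration form whose father's-details fields are optional by default, can be sketched as a configurable form definition. The snippet below is a hypothetical illustration only; the field names and types are invented for this example and are not the actual OpenCRVS country-configuration format:

```typescript
// Hypothetical form-field definitions illustrating inclusive defaults.
// In a configurable CRVS system, each field declares whether it is
// mandatory; leaving the father's details optional means, for example,
// that a single mother can still register a birth.
type FormField = {
  id: string;
  label: string;
  required: boolean;
};

const birthRegistrationForm: FormField[] = [
  { id: "child.firstName", label: "Child's first name", required: true },
  { id: "child.dateOfBirth", label: "Date of birth", required: true },
  { id: "mother.firstName", label: "Mother's first name", required: true },
  // Inclusive default: the father's details are captured when available,
  // but never block registration.
  { id: "father.firstName", label: "Father's first name", required: false },
];

// A record is registrable when every *required* field has a value.
function isRegistrable(
  form: FormField[],
  values: Record<string, string>
): boolean {
  return form.every((f) => !f.required || Boolean(values[f.id]?.trim()));
}

// A single mother's registration succeeds without the father's details.
const singleMotherRecord = {
  "child.firstName": "Amina",
  "child.dateOfBirth": "2025-01-15",
  "mother.firstName": "Fatima",
};
console.log(isRegistrable(birthRegistrationForm, singleMotherRecord)); // true
```

A country adopting a different policy would simply flip `required` on the relevant fields; the point of the sketch is that inclusivity lives in the configuration default, not in code changes.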


Sunita Grote: Thank you. Thank you very much, Annina.


Tawhida Shiropa: Of course. The first thing, from our experience: thousands, even millions, of people are getting our service, and we have to do a lot to ensure inclusivity and safety, especially when we are working with vulnerable communities, young girls, adolescent girls, marginalized communities, and of course women. One thing we wanted to make sure of by using open source is transparency, because our code is open. Of course, the data is anonymized and encrypted; we don’t show anyone’s personal information, but the whole codebase can be open, and the communities and stakeholders, including the government, can see exactly how the systems work, what decisions have been made, and even the safeguarding policies we have adopted so far. One of the vital things we wanted to get right is handling sensitive data: mental health and well-being data is some of the most personal and sensitive data we deal with. And that’s where we believe open source, aligned with the digital public goods standard, makes us accountable to our customers, to the young girls we serve. Maybe they don’t understand open source, and they don’t understand data security, but they understand their privacy. So open source and the digital public goods standard really keep us accountable to the girls who share their data with us, or to their parents, their mothers. And accountability builds trust.
So I think that’s the fundamental thing when you are providing mental health services to everyone: trust and confidentiality. That’s what we align with. Then there are another two things: localization, and making the tools meaningful for the communities we are building for. We went to many communities while developing and deploying the Manoshi app. Many communities don’t have proper applications, and some parents are very low income and don’t understand data and everything around it. So we contextualized all of this in their language, sometimes in our language, Bangla, and sometimes in more local terms, to make sure they understand what we are collecting from them. When I read the Digital Public Goods Alliance’s mission and statements, we felt this was something very good and right for us. When we work with an open system, local developers and organizations can customize it, and that’s also very cost effective, because they don’t need to start from scratch. Anywhere in the world, they can remodel what we have developed and deploy it in their local context. And of course there is flexibility. I think that’s what turns a digital public good into a human solution, and that’s one of the most fascinating things we found.
And number three, the last thing I wanted to mention: removing all those license fees. When we started Moner Bandhu, we suffered and struggled with license fees, because sometimes we had a credit card but couldn’t pay due to the rules and regulations of the Central Bank of Bangladesh. So we struggled a lot. For the people coming up who are passionate about the tech industry, and especially the female entrepreneurs, I really want to cheer for them and advocate for them, so they can take advantage of the open source code we are building. This makes everything more inclusive and, of course, safer, because, as I mentioned, accountability ensures safety and trust. That’s why we are here. It also helps low-resource institutions, so they don’t need to spend lots of money; they can add to the tool, share it, and scale it. They just need to take it up and maintain the accountability and the trust. And of course it’s not about the code, it’s the values. Digital public goods have values, and the UNICEF Venture Fund taught us how to leverage those good values and incorporate them into our impactful business. That’s what we are doing right now, and we are really making sure no one is left behind. We have to be accountable to everyone, and I think open source and digital public goods are the two fundamental things that can ensure that. And we are ready to scale this, in our program and in our app, across the world. Thank you very much.


Sunita Grote: Thanks so much, Tawhida. I think you touched really nicely on all the different benefits we see from open source: from a product design and development perspective, but also from a business model perspective, having to tackle questions around fees and licensing, and how that model enables others to pick up your work, adapt it, and scale it more easily. So thank you. The other piece I appreciated is that we often associate digital public goods just with their open-sourceness, but you brought to our attention very nicely that the digital public goods standard isn’t just about licensing. It’s also about safety, about protection, about human rights. That standard is something many of us are trying to hold ourselves to, in order to ensure we are respecting human rights in our digital efforts, as well as access and inclusivity. So thanks so much for highlighting those different elements. Lisa, I’d like to come to you on the same question, probably from a different perspective, as I don’t know whether you have built and coded open source products yourself. Reflect a little bit more with us on what value you’ve seen in your focus on digital public goods and open source, and how that’s been a conduit and facilitator of your focus on safety and inclusivity.


Lisa Sivertsen: So yeah, thank you so much. It’s really interesting to hear from OpenCRVS, but also from the entrepreneurs building these systems. You mentioned it already: these systems make it possible to build accountability and trust, because they are transparent, they are open to external audits, and they are adaptable. On our side, we’ve actually developed our own open source data policy that we are trying to apply across the different areas we work in, and there is a lot of resistance, you know, from companies selling systems that are not open source. This is something we are really trying to push in areas such as education and health, but also in our work on environment and climate. It’s a bit challenging, but I think it’s crucial; it’s the only way forward to build these accountable and open systems. It’s also a lot about costs, because open standards make it possible for different systems to talk to each other and work together. I think we’ve all been frustrated trying to access digital systems that don’t work together; that carries a lot of costs and is very challenging for everyone. I like the term Annina was using about emotional safety: giving up your data and engaging with digital systems, whether national ID systems or health and education, involves a lot of very sensitive data, and you need to trust that it’s not going to be misused by someone at the other end. And of course, open source systems have the potential to be better adapted to different kinds of users, including marginalized parts of the population in any country, and we really need that to keep working against those gaps. Privacy concerns in particular are especially relevant for those who tend to be more marginalized and traditionally excluded.


Sunita Grote: That’s almost of even higher importance when we work with those sorts of communities. As we move towards the last piece of our discussion, I’m going to ask each of the panelists to reflect a little on the role of stakeholders who are not present on this panel, and what their calls or reflections might be. Before we do that, I wanted to turn specifically to Silje. As part of our ongoing discussions around how we partner together on the Venture Fund, we have also started exploring how we can leverage private capital and different financing modalities, beyond traditional grants and traditional development funding, to move capital flows towards inclusive technology. So I’d love to hear a little from you on Sida’s approach to engaging those different financing instruments. What have you seen happen in that space?


Silje Dahl: Thank you, Sunita. And first of all, thank you to all the panelists; very interesting. What you touched upon now is so important. I think we have all seen the changes in the donor landscape over the past year, and it has of course been challenging, both for donors and for the partners on the other side trying to find financing for the work they do. For the Swedish government, it is a very high priority that Swedish development cooperation should focus on leveraging funding from other kinds of sources, and on supporting our partners in doing that. We believe that part of the additional financial flows towards long-term solutions that empower women, girls, and youth needs to come from elsewhere, for example private investment, and also that our partners shouldn’t be so donor dependent, to be honest. In doing that, we as donors have a very important responsibility as well. We know that today there are very limited opportunities for private actors to do business in fragile contexts or in emerging markets; local entrepreneurs have very limited risk absorption capacity, for example. So how can we as donors support them? At Sida, we have different financial models that have been quite successful, I would say. One of them is the guarantee, where Sida helps reduce the risk to a lender. For example, if a new startup or an organization working on women’s health and rights in a country in Africa wants to expand its business, enter new markets, or sell products in markets it hasn’t been in before, the local bank might hesitate, saying it’s too big a risk. Sida can then cover that potential risk for the bank and enable the organization to grow its business. We have done a lot of that in other sectors, such as energy, transport, and environment, and this year we will start doing it in health and SRHR.
So I’m very excited about that; I think it’s a very successful model for finding other kinds of funding. Another model we use is the mobilization of capital, where Sida plays a very important catalytic role in connecting partners with each other. We can help our partners meet other potential financial partners who can support them, and this is something we’ve been working on for many, many years. It’s also a way of creating sustainability for our partners, less dependency on us, and an incentive for them to leverage new money: we say that if you can find X amount of money from other funders, we will match it. I think that’s a model that works really well, especially now that we want to expand into femtech, work more closely on that, and build new kinds of partnerships. It’s very important that we look at different financial models to create more sustainability for our partners, which I think should be an end goal of aid: government donors should provide less, and other kinds of financing flows should come in.


Sunita Grote: Yeah, thanks so much, Silje. I think there’s a really long way to go to close that financing gap, so I really appreciate those efforts. We know, for instance, that less than 1% of VC, venture capital funding, goes to Africa. It’s crazy once you think about how unequal that distribution is. And I spoke earlier about the small proportion that goes to female-founded businesses. If you put those on top of each other, you end up with a very skewed picture of where private capital is currently flowing. On the UNICEF Venture Fund side, we’ve seen similar experiences, and what we’ve been really excited about is seeing how the development funding you mentioned can be catalytic. It can de-risk: it can showcase how those risks can be managed and addressed, what those markets actually have to offer, and how competitive those solutions actually are from an investment-opportunity perspective. So we’re really looking forward to having more of these conversations with private investors, to see how that capital can support some of the aims we’ve discussed here today. I’ll turn my attention to our two speakers online first, and Lisa, I’ll come back to you at the end for a final call to action: who else should be at this table to move from what are still fairly niche approaches to shifting entire systems, and what role do they need to play? Annina, I’ll start with you.


Annina Wersun: Yeah, thanks so much. I think we could probably talk for a long time on this topic alone, but I’m going to build, if I may, on what Silje was talking about, and I am also going to call on the donors. We are incredibly lucky at OpenCRVS: we have the support of NORAD, who as a donor are gold for us, because they see the importance of funding the core product. As a digital public good, that’s something really, really difficult to find: funding that is very flexible, that understands we need to manage and maintain a digital product that ultimately supports critical government infrastructure, without tying that funding to specific implementations, for instance. But to build further on Silje’s point, the challenge we have as digital public goods is being able to invest in a few different things. One is absolutely to become sustainable. We have to think about how we can generate revenue, and at OpenCRVS we’re spending a significant amount of time on this, looking at how, in a market that really expects us to be free forever, we can change that and start charging for our services. In fact, several countries and partners have seen the value we can bring to the table, but we do have to start operating more like a business, and sometimes that can be very uncomfortable for partners working in our domain, which is totally understandable. In the same way, to become sustainable we need to think about creative ways of generating revenue. And just as most for-profit companies invest in research and development, we also want to invest in research and development.
And in fact, we would love to establish a research and development branch specifically for women and girls, because we truly believe there is so much potential to unlock, both in their experiences and from a protection perspective, but also in the economy. We have to recognize the value that women bring to the economy. And civil registration touches on all of life’s events, so there are so many opportunities to protect women and girls and prevent them from experiencing a life that doesn’t allow them to become active members of society and the economy. But we need donors to be able to invest in that too. In the donor landscape, as Silje mentioned, we ask that donors take a risk and support digital public goods upfront. Our promise to donors, in the conversations we’re having now, is that in five years’ time we want to be self-sustainable. But to get there, we need upfront investment: in R&D, in our business development strategies, in trying out different business services. It’s really not traditional; it’s different, and not necessarily what our industry is used to. But we’re certainly excited about that future, because we want to become self-sustaining. We believe the value of OpenCRVS has been realized: we’re working with eight countries at the moment, and many more are lined up to explore. And if we think about the potential of, for instance, preventing child marriage by knowing children’s ages and having the data to take active steps, that’s something we want to explore further within our research and development capacity.
Just think about the long-term effects that could have on women and girls, on the economy, and for businesses around the world. So this is really a call to donors. There are always immediate challenges and problems; the world we live in, unfortunately, is experiencing too many of them at this time. But we truly believe that with upfront investment we can become self-sustainable, and that digital public goods can bring about more and more positive outcomes for people around the world. Thank you very much.


Sunita Grote: Thanks so much, Annina. Tawhida?


Tawhida Shiropa: Yeah, on this question, I want to start with a change in mindset. First, we need to change the mindset of the stakeholders and, to some extent, the investors. I have met many investors who are not convinced about open source models; they wanted to see revenue from day one. But throughout the process we learned how to generate revenue, and right now we are generating revenue, we are cash-flow positive, and we are profitable as a startup. This journey was quite difficult, because access to finance as an entrepreneur, and I want to say as a woman entrepreneur, is always very difficult, especially through the banking system. In Bangladesh, it’s really difficult to access finance through the banks. And of course, we need investors who really value impact alongside innovation. As Sunita mentioned earlier in the conversation, less than 2% of VC funding goes to women-led startups, and we really want to change that picture. We have the capacity and the capability, we have the resources; we just need trust from the VC side. Because we have the scalability, we have safety and security, we ensure inclusivity, and we want to provide proper services to young girls and underserved people. So yes, the VC mindset should change a bit; that could make a bigger difference and, of course, a huge impact in this world, while supporting this open source model. We are the proof that it can be profitable, that it can generate revenue. You just need to trust us a bit. Thank you very much.


Sunita Grote: Thanks, Tawhida. Lisa?


Lisa Sivertsen: Thank you. I think we definitely need to continue to develop public-private partnerships in order to mobilize funding, and we need to engage emerging donors more. It’s great to have IKEA and others on board, but we need more participation from the private sector. At the same time, the tools and systems will not really meet their goals unless they also engage the users, and a diversity of users, finding ways to engage marginalized communities and people to make sure the systems actually work for everyone. One last point from me: we need to recognize that digital public infrastructure and digital public goods are essential infrastructure for any society, just like roads or electricity. We need those systems to build inclusive digital societies, and that’s a recognition we need to continue to champion across every geographical context.


Sunita Grote: Thank you. Thank you so much to all of the panelists, including those in the first segment, for all of your openness, anecdotes, and reflections today. It was a very rich discussion. From my side, I hope that as we go about our business here at the Forum and back home, this discussion gave each of you one specific action point you can take home as you look at designing or building products. We heard very much about the need to be deliberate in design and research to ensure diversity: deliberate in how products are designed, in what code we choose, what models we choose, what data sets we choose. We heard how important it is to be able to look under the hood of a product, to create actual safety and actual data privacy, but also that emotional safety, that empowerment, so that we don’t just have passive users but empowered, informed users who can engage with the digital solutions we put out into the market and can question us. We heard from Tawhida Shiropa about how openness raises the bar on accountability, which I thought was a really powerful framing. And perhaps not surprisingly, as all of us face unprecedented challenges in the financial landscape, we heard a great deal about how and where each dollar, each krone, each euro is put can really shape not just what’s in the market, but who uses it, and to what extent those users are actually builders and owners of what’s put out there in today’s digital landscape. So I encourage you to approach the panelists who are in the room for any further discussion you might be interested in. Thank you very much for joining us and for spending your precious time listening to this discussion. I, for one, really enjoyed the dialogue.
And maybe at the next IGF, I’ll be facilitating a panel of four men talking about the importance of women’s health and looking at inclusive solutions. That’s my hope as I walk away from this panel today. So thank you all and enjoy the rest of your day.


Z

Zhao Hu

Speech speed

75 words per minute

Speech length

462 words

Speech time

369 seconds

Legal frameworks and government oversight are essential for protecting minors online

Explanation

China has implemented comprehensive legal protections including the Law on Protection of Minors with a dedicated online protection chapter, Personal Information Protection Law for children’s data, and regulations addressing cyber bullying, data breaches, and internet addiction. Government oversight through annual campaigns and collaborative efforts among agencies ensures platform compliance and standardization.


Evidence

2020 Law of the People’s Republic of China on Protection of Minors with online protection chapter; 2021 Personal Information Protection Law; 2024 regulations on protection of minors in cyberspace; annual CAC campaigns during summer breaks; collaborative efforts among public security, market regulation and culture/tourism departments


Major discussion point

Child Online Safety and Protection


Topics

Children rights | Legal and regulatory | Cybersecurity


Disagreed with

– Makhosazana Lindhorst

Disagreed on

Role of government oversight vs. industry self-regulation in child protection


Industry self-discipline and collaboration between public-private sectors strengthens child protection efforts

Explanation

The China Federation of Internet Societies mobilizes social forces and promotes industry self-discipline through professional committees and public welfare initiatives. Leading internet companies have improved minor mode functions, established reporting mechanisms, and adopted measures against cyber bullying under this collaborative framework.


Evidence

CFIS has 528 members including Tencent, Baidu, and Alibaba; Committee on Minors Online Protection; online security courses with over 10 million views; Minors Online Protection Annual Report 2024; national standards for AI technology involving minors


Major discussion point

Child Online Safety and Protection


Topics

Children rights | Economic | Cybersecurity


Agreed with

– Makhosazana Lindhorst
– Caroline Eriksen
– Alexander Galt

Agreed on

Multi-stakeholder collaboration is essential for effective child online protection


International cooperation and knowledge sharing are vital for effective child online protection

Explanation

China emphasizes learning from international best practices while sharing its own experiences globally. The partnership with UNICEF and hosting international conferences demonstrates the importance of collaborative approaches to child online protection across borders.


Evidence

CFIS partnership with UNICEF China in 2024 for safety by design corporate case collection; upcoming international conference on child online protection in China in September; sharing China’s progress with global internet companies


Major discussion point

Child Online Safety and Protection


Topics

Children rights | Development | Cybersecurity


M

Makhosazana Lindhorst

Speech speed

146 words per minute

Speech length

446 words

Speech time

182 seconds

Regulatory frameworks must balance platform accountability, digital literacy, and accessible harm reporting mechanisms

Explanation

South Africa’s Film and Publication Board requires platform registration and compliance while issuing takedown notices for harmful content. The regulator combines enforcement with education through dedicated advocacy officers who provide digital skills training to children, teachers, and parents.


Evidence

Registration requirements for streamers of films, games and publications; takedown notices for prohibited content; dedicated advocacy officers visiting schools; toolkits for children, teachers and parents; quasi-judicial enforcement committee


Major discussion point

Child Online Safety and Protection


Topics

Children rights | Legal and regulatory | Sociocultural


Agreed with

– Zhao Hu
– Caroline Eriksen
– Alexander Galt

Agreed on

Multi-stakeholder collaboration is essential for effective child online protection


Disagreed with

– Zhao Hu

Disagreed on

Role of government oversight vs. industry self-regulation in child protection


Law enforcement and social workers play critical roles in investigating and addressing online child abuse

Explanation

The Film and Publication Board works closely with law enforcement agencies on child sexual abuse material cases, employing dedicated social workers as investigators. This collaborative approach ensures proper evidence compilation and court proceedings for online child protection cases.


Evidence

Team of social workers who are investigators; collaboration with police; compilation of reports for court evidence; focus on child sexual abuse material cases


Major discussion point

Child Online Safety and Protection


Topics

Children rights | Legal and regulatory | Cybersecurity


C

Caroline Eriksen

Speech speed

121 words per minute

Speech length

467 words

Speech time

230 seconds

Investors can leverage their influence to promote responsible business conduct regarding child rights across portfolio companies

Explanation

As a global investor in almost 9,000 companies, NBIM uses its leverage as a minority shareholder to influence companies and improve market practices. The fund depends on sustainable development and well-functioning markets, making child rights protection a material concern for long-term returns.


Evidence

NBIM invests in almost 9,000 companies in 70 markets; 30% of today’s population are children who spend increasing time online; failure to respect children’s rights creates legal, financial, and reputational risks


Major discussion point

Stakeholder Roles in Digital Child Rights


Topics

Children rights | Economic | Human rights principles


Agreed with

– Zhao Hu
– Makhosazana Lindhorst
– Alexander Galt

Agreed on

Multi-stakeholder collaboration is essential for effective child online protection


Companies beyond tech sector impact child rights through digital advertising and marketing practices

Explanation

Child rights in digital environments extend beyond traditional tech companies to include retail and other sectors that may impact children through their digital advertising and operational practices. This broader perspective recognizes that various industries affect children’s digital experiences.


Evidence

Retail companies may not consider themselves tech companies but impact children’s rights through digital advertising; relevant to companies across sectors and markets


Major discussion point

Stakeholder Roles in Digital Child Rights


Topics

Children rights | Economic | Sociocultural


Agreed with

– Tawhida Shiropa
– Annina Wersun
– Alexander Galt

Agreed on

User-centered design and co-creation are fundamental for inclusive technology


Due diligence and transparency in reporting child rights impacts need significant improvement across industries

Explanation

Research examining over 200 company reports found that only a few meaningfully address how companies impact children’s rights online. Despite companies recognizing this as a material topic, there’s a significant gap in how they report and address these impacts in practice.


Evidence

Research by BSR and UNICEF on more than 200 company reports; only a few meaningfully address child rights impacts; companies see it as material but struggle with reporting; increasing regulation like EU Digital Services Act


Major discussion point

Stakeholder Roles in Digital Child Rights


Topics

Children rights | Legal and regulatory | Economic


A

Alexander Galt

Speech speed

157 words per minute

Speech length

882 words

Speech time

336 seconds

Brands must take responsibility for their entire digital marketing value chain and its impact on children

Explanation

IKEA recognizes that most shopping journeys start online and that brands fund the digital marketing ecosystem, creating responsibilities for how that conduct affects children. The company takes a value chain approach to understand and address potential harms through various actors in the digital marketing space.


Evidence

Most IKEA shopping journeys start online through e-commerce, search, social media, peer-to-peer marketplaces, and social gaming; brands fund digital marketing ecosystem by buying media space; research partnership with UNICEF on digital marketing harms


Major discussion point

Stakeholder Roles in Digital Child Rights


Topics

Children rights | Economic | Sociocultural


Agreed with

– Zhao Hu
– Makhosazana Lindhorst
– Caroline Eriksen

Agreed on

Multi-stakeholder collaboration is essential for effective child online protection


Companies beyond tech sector impact child rights through digital advertising and marketing practices

Explanation

IKEA demonstrates how non-tech companies have both direct and indirect contact with children through digital channels. The company has developed policies for child safeguarding, data protection, and inclusive design while also addressing indirect impacts through marketing partnerships and platform engagement.


Evidence

IKEA’s commitment to integrate child rights across all organizations; direct contact through e-commerce and smart products; inclusive design approach with diverse families; clear policies on portraying children in marketing; no direct targeting of children in commercial messages


Major discussion point

Stakeholder Roles in Digital Child Rights


Topics

Children rights | Economic | Human rights principles


Agreed with

– Tawhida Shiropa
– Annina Wersun

Agreed on

User-centered design and co-creation are fundamental for inclusive technology


S

Sunita Grote

Speech speed

172 words per minute

Speech length

2736 words

Speech time

950 seconds

Massive investment gaps exist with only 2-3% of funding going to women-founded companies or female health solutions

Explanation

Despite the women’s health industry being projected to grow to $1.2 trillion by 2030, only 2% of private capital investment in 2023 went to women co-founded or founded companies. Similarly, only 3% of digital health investment focuses on female health challenges, while medical research funding for women’s health remains at 2%.


Evidence

Only 2% of investment in 2023 went to women co-founded companies; 3% of digital health investment focuses on female health solutions; 2% of medical research funding goes to pregnancy, childbirth and reproductive health; women’s health industry projected to grow to $1.2 trillion by 2030


Major discussion point

Gender Digital Divide and Women’s Exclusion


Topics

Gender rights online | Economic | Development


Agreed with

– Silje Dahl
– Tawhida Shiropa
– Annina Wersun

Agreed on

Alternative financing mechanisms are needed to support inclusive technology development


Women face multiple barriers including lack of access to devices, internet, and financial services in developing economies

Explanation

Significant structural barriers prevent women’s participation in the digital economy, with 31% of women worldwide not in education, employment or training, 740 million women in developing economies remaining unbanked, and one in five adolescent girls married before age 18. These challenges limit their ability to benefit from digital technologies.


Evidence

31% of women worldwide not in education, employment or training; 740 million women in developing economies unbanked; one in five adolescent girls married before age 18


Major discussion point

Gender Digital Divide and Women’s Exclusion


Topics

Gender rights online | Development | Economic


AI systems demonstrate significant bias against women and girls in almost half of documented cases

Explanation

When digital solutions are designed for everyone without specific consideration for women and girls, they often fail to meet their needs. The prevalence of bias in AI systems against women and girls represents a systemic problem in how technology is developed and deployed.


Evidence

Almost half of publicly documented bias in AI systems is bias against women and girls


Major discussion point

Gender Digital Divide and Women’s Exclusion


Topics

Gender rights online | Human rights principles | Legal and regulatory


L

Lisa Sivertsen

Speech speed

114 words per minute

Speech length

1003 words

Speech time

527 seconds

Gender-blind digital tools amplify existing gaps and structural discrimination against women and girls

Explanation

Digital tools and infrastructures that don’t consider gender differences will amplify existing disparities and structural discrimination. NORAD recognizes that without deliberate attention to gender inclusion, digitalization efforts can worsen rather than improve outcomes for women and girls.


Evidence

Worrying negative trends in gender and marginalization regarding access to digital tools; failure to design gender-inclusive tools; lack of trust in systems as a major barrier


Major discussion point

Gender Digital Divide and Women’s Exclusion


Topics

Gender rights online | Development | Human rights principles


Open source systems allow for external audits and adaptability to different user needs, especially marginalized populations

Explanation

Open source systems enable transparency, accountability, and trust-building because they can be externally audited and are adaptable to different contexts. This is particularly important for marginalized communities who may have greater privacy concerns and need for customized solutions.


Evidence

Norway’s open source data policy across different areas; resistance from companies selling non-open source systems; DHIS2 used by more than 100 countries; systems need to work together and reduce costs


Major discussion point

Open Source and Digital Public Goods


Topics

Digital standards | Development | Human rights principles


Agreed with

– Tawhida Shiropa
– Annina Wersun

Agreed on

Open source and digital public goods enable transparency, accountability, and inclusivity


Public-private partnerships are essential for mobilizing diverse funding sources for inclusive technology

Explanation

Building inclusive digital societies requires recognizing digital public infrastructures as essential infrastructure like roads or electricity. This necessitates continued development of public-private partnerships and engagement with diverse stakeholders to mobilize funding and ensure systems work for everyone.


Evidence

Need for more participation from private sector; engagement with emerging donors; tools must engage diversity of users and marginalized communities; digital public infrastructures as essential infrastructure


Major discussion point

Financing and Sustainability Models


Topics

Development | Economic | Infrastructure


T

Tawhida Shiropa

Speech speed

156 words per minute

Speech length

2506 words

Speech time

958 seconds

Co-design with target users, especially girls, is essential for creating safe and relevant digital solutions

Explanation

Moner Bandhu’s approach involved extensive consultation with thousands of students from urban, rural, and remote areas to understand their struggles and needs. This co-design process revealed issues like fear of reporting abuse, anxiety, and lack of private spaces to share concerns, which directly informed the app’s development.


Evidence

Consultation with thousands of students from urban, rural, semi-urban, and remote areas; girls shared struggles with reporting abuse, gender-based violence, anxiety, and lack of private spaces; anonymized assessment and female counselor options developed based on feedback


Major discussion point

Inclusive Technology Design and Development


Topics

Children rights | Gender rights online | Sociocultural


Agreed with

– Annina Wersun
– Alexander Galt

Agreed on

User-centered design and co-creation are fundamental for inclusive technology


Emotional safety and trust-building are as important as technical security measures for vulnerable users

Explanation

Beyond technical encryption, emotional safety is crucial for girls using mental health services. This includes providing anonymized assessments, female counselor options, and maintaining sensitive escalation protocols while ensuring data protection and real-time response to high-risk situations.


Evidence

End-to-end encryption; anonymized assessments; female counselor choice; high-risk flags for real-time response; sensitive escalation protocols; parental consent processes for users without phone access


Major discussion point

Inclusive Technology Design and Development


Topics

Children rights | Gender rights online | Privacy and data protection


Digital solutions must address local contexts and be available in local languages for true inclusivity

Explanation

Effective digital solutions require contextualization for different communities, including language localization and consideration of varying levels of digital literacy and income. This ensures that parents and users understand what data is being collected and how services work.


Evidence

Contextualization in Bangla and local languages; consideration of low-income parents and digital literacy levels; explanation of data collection and services in understandable terms


Major discussion point

Inclusive Technology Design and Development


Topics

Development | Multilingualism | Digital access


Open source approaches enable transparency, accountability, and trust-building with users and communities

Explanation

Open source development makes the system transparent to communities, stakeholders, and government while maintaining data privacy. This transparency builds accountability to users, especially young girls, and creates trust through the ability to see how decisions are made and safeguard policies are implemented.


Evidence

Open coding system visible to communities and government; encrypted anonymous data protection; alignment with digital public goods principles; accountability to customers and young girls


Major discussion point

Open Source and Digital Public Goods


Topics

Digital standards | Privacy and data protection | Development


Agreed with

– Lisa Sivertsen
– Annina Wersun

Agreed on

Open source and digital public goods enable transparency, accountability, and inclusivity


Digital public goods reduce costs and licensing barriers while enabling local customization and scaling

Explanation

Open source and digital public goods approaches eliminate licensing fees and enable local developers to customize solutions without starting from scratch. This is particularly important for entrepreneurs in developing countries who face banking and payment restrictions, making solutions more cost-effective and accessible globally.


Evidence

Struggled with license fees due to Central Bank of Bangladesh restrictions; local developers can customize without starting from scratch; cost-effective for low-resource institutions; enables sharing and scaling globally


Major discussion point

Open Source and Digital Public Goods


Topics

Development | Economic | Digital access


Investor mindsets need to change to value impact alongside innovation and understand open source business models

Explanation

Many investors are not convinced about open source models and expect immediate revenue, but the journey demonstrates that profitable, cash-positive businesses can be built with open source approaches. Women entrepreneurs particularly need access to finance and investor trust in their capabilities and scalable solutions.


Evidence

Difficulty convincing investors about open source models; expectation of revenue from day one; achieved cash positive and profitable status; less than 2% of VC funding goes to women-led startups; demonstrated scalability and impact


Major discussion point

Financing and Sustainability Models


Topics

Economic | Gender rights online | Development


Agreed with

– Sunita Grote
– Silje Dahl
– Annina Wersun

Agreed on

Alternative financing mechanisms are needed to support inclusive technology development


Disagreed with

– Annina Wersun

Disagreed on

Approach to financing sustainability for digital public goods


S

Silje Dahl

Speech speed

142 words per minute

Speech length

1236 words

Speech time

518 seconds

Technology-facilitated gender-based violence requires incorporation into national legislation and policies

Explanation

Sida supports partners working on ending gender-based violence who have developed tools to address technology-facilitated GBV in the Southern Africa region. This emerging form of violence is increasing online and poses risks of normalizing violence, requiring integration into national legal frameworks.


Evidence

Partner working on technology-facilitated GBV in Southern Africa; GBV has increased online and risks normalizing violence; work with civil society and governments to incorporate into national legislation and policies


Major discussion point

Regional Approaches and Local Solutions


Topics

Gender rights online | Legal and regulatory | Cybersecurity


Anonymous and safe information access is vital for youth in regions where topics like sexual health are controversial

Explanation

In West and Central Africa, where SRHR topics are controversial, innovative approaches are needed to share information safely. An app providing anonymous access to SRHR information and health facility locations has reached 9 million young people, demonstrating the importance of safe, accessible platforms.


Evidence

App in West and Central Africa reached 9 million young people; anonymous and safe access to SRHR information; helps find health facilities; addresses vulnerability to HIV, STDs, and early pregnancies


Major discussion point

Regional Approaches and Local Solutions


Topics

Gender rights online | Development | Sociocultural


Alternative financing mechanisms like guarantees and capital mobilization can reduce donor dependency

Explanation

Sida uses financial models like guarantees to reduce risk for lenders and capital mobilization to connect partners with other funders. These approaches help organizations become less donor-dependent while leveraging additional funding sources for sustainable growth.


Evidence

Guarantee model where Sida covers potential risk for banks; capital mobilization connecting partners with financial partners; matching funding model requiring partners to find other sources; expansion into health and SRHR sectors


Major discussion point

Financing and Sustainability Models


Topics

Economic | Development | Inclusive finance


Agreed with

– Sunita Grote
– Tawhida Shiropa
– Annina Wersun

Agreed on

Alternative financing mechanisms are needed to support inclusive technology development


Supporting locally adapted solutions that address specific cultural and regulatory contexts is crucial

Explanation

Sida prioritizes supporting programs and partners that bridge the digital gap and gender divide by ensuring women and girls can access useful digital tools. This includes supporting solutions that are locally adapted and address the needs of groups normally left behind, including the LGBTQI community.


Evidence

140 partners working on digitalization; partnership with UNICEF on the FemTech initiative; focus on locally adapted solutions; attention to the LGBTQI community and other marginalized groups


Major discussion point

Regional Approaches and Local Solutions


Topics

Development | Gender rights online | Digital access


A

Annina Wersun

Speech speed

165 words per minute

Speech length

1480 words

Speech time

534 seconds

Design choices can spark important policy conversations and challenge assumptions about inclusive services

Explanation

OpenCRVS demonstrates inclusive practices through its default configuration, such as birth registration forms that don’t mandate father’s information. This design choice opens discussions about creating more inclusive services for single mothers, survivors of gender-based violence, and non-traditional family structures.


Evidence

Default country ‘Farajaland’ with pre-configured inclusive forms; birth registration form not mandating father’s information; sparks high-level discussions about assumptions; reactions questioning why father’s details aren’t required


Major discussion point

Inclusive Technology Design and Development


Topics

Gender rights online | Legal and regulatory | Human rights principles


Agreed with

– Tawhida Shiropa
– Alexander Galt

Agreed on

User-centered design and co-creation are fundamental for inclusive technology


Default configurations in digital systems should demonstrate inclusive practices to influence government adoption

Explanation

OpenCRVS includes dashboards specifically configured to highlight insights into women and girls’ experiences, using civil registration as a continuous source of population data. This approach shows governments what’s possible and often leads to adoption of inclusive features and involvement of additional ministries.


Evidence

Dashboards highlighting women and girls’ experiences; civil registration as continuous population data source; examples of targeting maternity services and understanding women’s mortality; involvement of Ministry of Women’s Affairs


Major discussion point

Inclusive Technology Design and Development


Topics

Gender rights online | Development | Data governance


Digital public goods need upfront investment to become self-sustainable while maintaining their public benefit mission

Explanation

OpenCRVS faces the challenge of becoming sustainable in a market that expects digital public goods to be free forever. The organization is working to generate revenue through service charges while investing in research and development, particularly for women and girls, requiring donor support for this transition.


Evidence

Working with eight countries with more lined up; need to change market expectations about being free; establishing R&D branch for women and girls; five-year sustainability goal; potential for preventing child marriage through age data


Major discussion point

Financing and Sustainability Models


Topics

Economic | Development | Children rights


Agreed with

– Sunita Grote
– Silje Dahl
– Tawhida Shiropa

Agreed on

Alternative financing mechanisms are needed to support inclusive technology development


Disagreed with

– Tawhida Shiropa

Disagreed on

Approach to financing sustainability for digital public goods


Agreements

Agreement points

Multi-stakeholder collaboration is essential for effective child online protection

Speakers

– Zhao Hu
– Makhosazana Lindhorst
– Caroline Eriksen
– Alexander Galt

Arguments

Industry self-discipline and collaboration between public-private sectors strengthens child protection efforts


Regulatory frameworks must balance platform accountability, digital literacy, and accessible harm reporting mechanisms


Investors can leverage their influence to promote responsible business conduct regarding child rights across portfolio companies


Brands must take responsibility for their entire digital marketing value chain and its impact on children


Summary

All speakers emphasized that protecting children online requires coordinated efforts across government, industry, civil society, and other stakeholders rather than siloed approaches


Topics

Children rights | Economic | Legal and regulatory


Open source and digital public goods enable transparency, accountability, and inclusivity

Speakers

– Lisa Sivertsen
– Tawhida Shiropa
– Annina Wersun

Arguments

Open source systems allow for external audits and adaptability to different user needs, especially marginalized populations


Open source approaches enable transparency, accountability, and trust-building with users and communities


Digital public goods need upfront investment to become self-sustainable while maintaining their public benefit mission


Summary

Speakers agreed that open source approaches provide transparency, enable customization for diverse needs, and build trust through accountability, particularly important for marginalized communities


Topics

Digital standards | Development | Human rights principles


User-centered design and co-creation are fundamental for inclusive technology

Speakers

– Tawhida Shiropa
– Annina Wersun
– Alexander Galt

Arguments

Co-design with target users, especially girls, is essential for creating safe and relevant digital solutions


Design choices can spark important policy conversations and challenge assumptions about inclusive services


Companies beyond tech sector impact child rights through digital advertising and marketing practices


Summary

Speakers emphasized the importance of involving end users, particularly marginalized groups, in the design process to ensure solutions meet actual needs and challenge existing assumptions


Topics

Gender rights online | Human rights principles | Sociocultural


Alternative financing mechanisms are needed to support inclusive technology development

Speakers

– Sunita Grote
– Silje Dahl
– Tawhida Shiropa
– Annina Wersun

Arguments

Massive investment gaps exist with only 2-3% of funding going to women-founded companies or female health solutions


Alternative financing mechanisms like guarantees and capital mobilization can reduce donor dependency


Investor mindsets need to change to value impact alongside innovation and understand open source business models


Digital public goods need upfront investment to become self-sustainable while maintaining their public benefit mission


Summary

All speakers acknowledged significant funding gaps for inclusive technology and the need for innovative financing approaches that value impact alongside financial returns


Topics

Economic | Development | Gender rights online


Similar viewpoints

Both speakers emphasized the critical role of government regulation and law enforcement in child online protection, highlighting the need for comprehensive legal frameworks and specialized personnel

Speakers

– Zhao Hu
– Makhosazana Lindhorst

Arguments

Legal frameworks and government oversight are essential for protecting minors online


Law enforcement and social workers play critical roles in investigating and addressing online child abuse


Topics

Children rights | Legal and regulatory | Cybersecurity


Both speakers highlighted how technology can amplify existing gender inequalities when not designed with deliberate attention to women’s and girls’ needs

Speakers

– Sunita Grote
– Lisa Sivertsen

Arguments

AI systems demonstrate significant bias against women and girls in almost half of documented cases


Gender-blind digital tools amplify existing gaps and structural discrimination against women and girls


Topics

Gender rights online | Human rights principles | Development


Both speakers emphasized the importance of creating safe spaces for vulnerable users, particularly young people, to access sensitive information and services

Speakers

– Silje Dahl
– Tawhida Shiropa

Arguments

Anonymous and safe information access is vital for youth in regions where topics like sexual health are controversial


Emotional safety and trust-building are as important as technical security measures for vulnerable users


Topics

Gender rights online | Children rights | Sociocultural


Unexpected consensus

Non-tech companies have significant responsibility for child rights online

Speakers

– Caroline Eriksen
– Alexander Galt

Arguments

Companies beyond tech sector impact child rights through digital advertising and marketing practices


Brands must take responsibility for their entire digital marketing value chain and its impact on children


Explanation

It was unexpected to see strong consensus that companies like IKEA and retail brands have substantial responsibility for child rights online, extending the conversation beyond traditional tech companies to the broader ecosystem of digital marketing and commerce


Topics

Children rights | Economic | Sociocultural


Emotional safety is as important as technical security

Speakers

– Tawhida Shiropa
– Lisa Sivertsen

Arguments

Emotional safety and trust-building are as important as technical security measures for vulnerable users


Open source systems allow for external audits and adaptability to different user needs, especially marginalized populations


Explanation

The consensus that emotional safety is as important as technical security measures was unexpected, showing a sophisticated understanding that goes beyond traditional cybersecurity approaches to include the psychological and social dimensions of safety


Topics

Privacy and data protection | Human rights principles | Sociocultural


Overall assessment

Summary

Strong consensus emerged around multi-stakeholder collaboration for child protection, the value of open source approaches for inclusivity, user-centered design principles, and the need for alternative financing mechanisms. Speakers also agreed on the importance of addressing gender digital divides and the role of non-tech companies in digital rights.


Consensus level

High level of consensus with complementary rather than conflicting viewpoints. This suggests a mature understanding of the challenges and potential solutions in digital child rights and gender inclusion. The agreement across diverse stakeholder perspectives (government, private sector, civil society, entrepreneurs) indicates strong potential for coordinated action and policy development in these areas.


Differences

Different viewpoints

Role of government oversight vs. industry self-regulation in child protection

Speakers

– Zhao Hu
– Makhosazana Lindhorst

Arguments

Legal frameworks and government oversight are essential for protecting minors online


Regulatory frameworks must balance platform accountability, digital literacy, and accessible harm reporting mechanisms


Summary

Zhao Hu emphasizes strong government oversight through comprehensive legal frameworks and annual campaigns, while Lindhorst focuses more on balanced regulatory approaches that combine enforcement with education and community engagement


Topics

Children rights | Legal and regulatory | Cybersecurity


Approach to financing sustainability for digital public goods

Speakers

– Annina Wersun
– Tawhida Shiropa

Arguments

Digital public goods need upfront investment to become self-sustainable while maintaining their public benefit mission


Investor mindsets need to change to value impact alongside innovation and understand open source business models


Summary

Annina advocates for traditional donor support with a transition to self-sustainability, while Tawhida emphasizes the need for investors to fundamentally change their mindset about open source business models and immediate revenue expectations


Topics

Economic | Development | Children rights


Unexpected differences

Emphasis on emotional safety vs. technical security

Speakers

– Tawhida Shiropa
– Makhosazana Lindhorst

Arguments

Emotional safety and trust-building are as important as technical security measures for vulnerable users


Law enforcement and social workers play critical roles in investigating and addressing online child abuse


Explanation

While both work on child protection, Tawhida emphasizes emotional safety and user empowerment while Lindhorst focuses on enforcement and investigation mechanisms. This represents different philosophical approaches to protection – empowerment vs. enforcement


Topics

Children rights | Privacy and data protection | Legal and regulatory


Overall assessment

Summary

The discussion showed remarkable consensus on core principles (need for inclusive design, importance of child protection, value of open source) with disagreements primarily on implementation approaches and emphasis


Disagreement level

Low to moderate disagreement level. Most differences were complementary rather than contradictory, representing different stakeholder perspectives (government, private sector, civil society, entrepreneurs) rather than fundamental philosophical divisions. The implications are positive – showing multiple viable pathways toward shared goals rather than irreconcilable conflicts




Takeaways

Key takeaways

Multi-stakeholder collaboration is essential for child online safety, requiring coordination between governments, regulators, investors, brands, and civil society


Legal frameworks and regulatory oversight must be complemented by industry self-discipline and digital literacy education to effectively protect children online


The gender digital divide represents a massive missed opportunity, with only 2-3% of investment going to women-founded companies or female health solutions despite projected $1.2 trillion market growth by 2030


Inclusive technology design requires deliberate co-design with target users, especially marginalized communities like women and girls, to ensure solutions meet actual needs rather than assumptions


Open source and digital public goods approaches enable transparency, accountability, and trust-building while reducing costs and barriers to access for vulnerable populations


Emotional safety and trust are as critical as technical security measures when serving vulnerable users, particularly in sensitive areas like mental health and reproductive health


Digital public infrastructure should be treated as essential infrastructure like roads or electricity, requiring sustained investment and maintenance


Alternative financing mechanisms including guarantees, blended finance, and public-private partnerships are needed to reduce donor dependency and achieve sustainability


Design choices in digital systems can influence policy conversations and challenge discriminatory assumptions, particularly around inclusive services for women and marginalized groups


Resolutions and action items

UNICEF and BSR will officially launch disclosure recommendations and guidance for companies on child rights impacts on Thursday, June 27th via webinar


China Federation of Internet Societies will host an international conference on child online protection in September, inviting global participation


OpenCRVS commits to becoming self-sustainable within five years through business development and revenue generation strategies


Digital public goods organizations need to invest in research and development, particularly focused on women and girls’ experiences and protection


Donors should develop open source data policies across education, health, environment and climate work areas


SIDA will expand guarantee and capital mobilization models to health and SRHR sectors to support women-led organizations


Industry players should adopt human rights due diligence approaches and value chain responsibility for child rights impacts


Unresolved issues

How to effectively change investor mindsets to value impact alongside innovation and understand open source business models


Addressing the massive financing gap where less than 1% of VC funding goes to Africa and only 2-3% to women-founded businesses globally


Overcoming resistance from companies selling proprietary systems when pushing for open source alternatives


Balancing the need for digital public goods to become financially sustainable while maintaining their public benefit mission


Scaling successful local solutions to address global challenges while maintaining cultural and contextual relevance


Ensuring meaningful participation of marginalized communities in digital system design beyond tokenistic consultation


Addressing technology-facilitated gender-based violence through coordinated policy and legislative responses across different jurisdictions


Suggested compromises

Digital public goods should explore creative revenue generation models while maintaining core open source principles and public benefit mission


Donors should provide flexible, upfront investment in digital public goods with the understanding that organizations will work toward self-sustainability within defined timeframes


Companies should adopt graduated approaches to child rights due diligence, starting with transparency in reporting and moving toward comprehensive value chain responsibility


Regulatory frameworks should balance platform accountability with user empowerment through digital literacy rather than purely restrictive approaches


Investment strategies should combine traditional development funding with innovative financing mechanisms like guarantees to de-risk private capital investment in inclusive technology


Thought provoking comments

So our task is to protect and empower children as active participants and pioneers of the digital world as opposed to protecting them from the digital world.

Speaker

Josianne Galea Baron


Reason

This reframes the entire approach to child online safety from a restrictive paradigm to an empowerment paradigm. It challenges the traditional binary thinking of ‘protection vs. access’ and introduces a more nuanced view that children should be empowered digital citizens rather than passive recipients of protection.


Impact

This comment set the philosophical foundation for the entire first segment, influencing how subsequent speakers framed their approaches. It moved the discussion away from purely regulatory/restrictive measures toward more holistic strategies that include digital literacy, empowerment, and active participation.


For girls, safety is not about the encryption… for girls, safety is the emotional safety. So when a girl fills out a well-being check-in or any assessment, we wanted to make sure from our side that we prioritize their data protection and security… but sometimes parents are not that educated, and in those cases we also call them and discuss these things with them.

Speaker

Tawhida Shiropa


Reason

This insight distinguishes between technical safety measures and emotional/psychological safety, revealing that true safety for vulnerable populations requires understanding their lived experiences. It challenges the tech industry’s tendency to focus primarily on technical solutions while overlooking human-centered needs.


Impact

This comment deepened the conversation about what ‘safety’ actually means in practice, moving beyond technical specifications to human-centered design. It influenced subsequent discussions about trust, accountability, and the importance of understanding user contexts, particularly for marginalized communities.


Take birth registration, for example. In many countries, the father’s details are either assumed or required. But OpenCRVS shows an alternative, and that is a birth registration form that does not mandate the father’s information… these conversations at the highest level really allow us to explore these assumptions and also inform and educate those who may just not have access to this information.

Speaker

Annina Wersun


Reason

This demonstrates how seemingly small design choices can challenge systemic assumptions and create policy change. It shows how digital public goods can be vehicles for social change by making alternative approaches visible and sparking important conversations about inclusion.


Impact

This concrete example shifted the discussion from abstract concepts about inclusivity to tangible demonstrations of how design choices can challenge social norms. It reinforced the theme that technology design is inherently political and can either perpetuate or challenge existing inequalities.


We are so grateful about the opportunity to work with UNICEF and also with UNESCO. We work on engaging more women and girls to get education and work within the STEM sectors… During the pandemic, it actually proved so easy to use and such a safe tool that it spread also and it was taken into use also by a lot of middle-income and high-income countries, including Norway. We actually started using it ourselves.

Speaker

Lisa Sivertsen


Reason

This reverses the typical narrative of technology transfer from developed to developing countries, showing how solutions designed for low-resource contexts can prove superior even for wealthy nations. It challenges assumptions about where innovation comes from and demonstrates the value of inclusive design.


Impact

This comment introduced a powerful counter-narrative that influenced how participants thought about innovation and technology transfer. It reinforced the business case for inclusive design and challenged traditional hierarchies in global development.


And brands like IKEA are the companies that fund this digital marketing ecosystem. We buy the media space to engage with people who want to buy our products. And we think that comes with responsibilities and ability to influence… It’s not just a binary of brands and platforms, there’s many actors within the space and many leverage points that we need to engage with.

Speaker

Alexander Galt


Reason

This expands the conversation beyond the typical focus on tech platforms to include the entire ecosystem of actors who fund and enable digital spaces. It introduces the concept of shared responsibility across the value chain and challenges the binary thinking about who is responsible for child safety online.


Impact

This comment broadened the scope of stakeholder responsibility and influenced the discussion toward systems thinking. It helped establish the framework for understanding how different actors can use their leverage points to create change, which became a recurring theme throughout both segments.


I think the VC mindset should be slightly changed, so maybe that can make a bigger difference and of course a huge impact in this world, and of course ensuring all this open source model. We are the proven example that it can be profitable, it can generate revenue, so you just need to trust us a bit.

Speaker

Tawhida Shiropa


Reason

This directly challenges investor biases and demonstrates that open source, impact-focused models can be financially viable. It provides concrete evidence against the false dichotomy between profit and social impact, particularly for women-led ventures.


Impact

This comment provided a powerful counter-narrative to traditional investment thinking and reinforced the discussion about changing financial flows. It gave concrete evidence for the business case arguments that other speakers were making and highlighted the intersection of gender bias and investment bias.


Overall assessment

These key comments fundamentally shaped the discussion by challenging binary thinking and introducing more nuanced, systems-based approaches to child rights and gender equity in digital spaces. The comments moved the conversation from traditional regulatory/technical solutions toward more holistic approaches that consider emotional safety, inclusive design, value chain responsibility, and alternative business models. They established recurring themes of empowerment over protection, the importance of understanding lived experiences, the power of design to challenge social norms, and the need for systemic change across multiple stakeholder groups. Most importantly, these insights demonstrated how seemingly technical decisions about digital products are inherently political and social, requiring deliberate choices to either perpetuate or challenge existing inequalities.


Follow-up questions

How will digital technologies, including generative AI, impact children’s lives positively and negatively now and in the future?

Speaker

Josianne Galea Baron


Explanation

This represents one of the defining challenges of our times that countries around the world are grappling with, requiring ongoing research to understand evolving impacts.


What is the right age for children to participate in online platforms?

Speaker

Josianne Galea Baron


Explanation

This is a critical policy question that requires further research and evidence-based approaches to determine appropriate age thresholds for different types of online participation.


What emerging risks and dangers await children as they explore new digital environments?

Speaker

Josianne Galea Baron


Explanation

As digital environments rapidly evolve, continuous research is needed to identify and understand new risks to children’s safety and wellbeing online.


How can different stakeholders from across sectors play their part in delivering a digital world that works for children?

Speaker

Josianne Galea Baron


Explanation

This was the central question of the first forum segment, requiring ongoing exploration of roles and responsibilities across various stakeholder groups.


How can we make inclusive technology for everyone and leverage opportunities in women’s health?

Speaker

Sunita Grote


Explanation

With the women’s health industry projected to grow to $1.2 trillion by 2030, research is needed on how to better design and scale inclusive solutions.


How can we close equity divides in the online world, particularly the gender divide?

Speaker

Sunita Grote


Explanation

This was the focus of the second forum segment, requiring research on specific approaches to address digital gender gaps and design solutions for women and girls.


How can digital public goods become self-sustainable while maintaining their open source nature?

Speaker

Annina Wersun


Explanation

This represents a critical challenge for digital public goods organizations that need to balance sustainability with accessibility and openness.


How can donors and investors change their mindset to better support open source models and women-led startups?

Speaker

Tawhida Shiropa


Explanation

Research is needed on effective approaches to shift investment patterns, given that less than 2% of VC funding goes to women-led startups.


How can we prevent child marriage through better use of civil registration data?

Speaker

Annina Wersun


Explanation

This represents a specific research and development opportunity to use data analytics for child protection that requires further exploration.


How can we better measure and address technology-facilitated gender-based violence?

Speaker

Silje Dahl


Explanation

This emerging form of violence requires research on prevalence, impacts, and effective intervention strategies.


Disclaimer: This is not an official session record. DiploAI generates these resources from audiovisual recordings, and they are presented as-is, including potential errors. Due to logistical challenges, such as discrepancies in audio/video or transcripts, names may be misspelled. We strive for accuracy to the best of our ability.