Strengthening Corporate Accountability on Inclusive, Trustworthy, and Rights-based Approach to Ethical Digital Transformation
8 Jul 2025 17:00h - 17:45h
Session at a glance
Summary
This discussion was a roundtable session at the WSIS Forum 2025 focused on strengthening corporate accountability for ethical digital transformation, hosted by the World Benchmarking Alliance. The session examined how to implement the Global Digital Compact (GDC), particularly Article 29B which calls for digital technology companies to develop industry accountability frameworks through inclusive stakeholder consultations. The moderator highlighted that while the digital economy is worth $16.5 trillion and projected to reach $32.9 trillion by 2028, fewer than 14% of 200 major digital technology companies are contributing to digital inclusion according to their benchmark study.
Michael Kende discussed the challenges of addressing misinformation and algorithmic bias, explaining how Section 230 protections limit platform liability but create governance challenges. He noted that while the GDC provides good guidance on digital trust and safety, it lacks binding enforcement mechanisms. Hlekiwe Kachali from UNICEF emphasized that children’s rights must be central to digital governance, pointing out that only 17% of corporate reports mention child rights despite increasing digital dependency across all aspects of life. She stressed the need to rethink regulatory responsibilities as digital interactions become integral to humanitarian work and social participation.
Shamira Ahmed highlighted significant barriers in the think tank ecosystem, particularly the marginalization of Global South researchers and unequal funding that limits meaningful participation in GDC implementation. Jan Gerlach from Wikimedia Foundation advocated for community-led governance models and decentralized decision-making, warning against increased government pressure on companies and the exclusion of civil society from internet governance. The discussion concluded with participants emphasizing the need for practical implementation mechanisms, data collection on compliance, and multi-stakeholder approaches to ensure the GDC’s effectiveness in promoting inclusive digital transformation.
Keypoints
## Major Discussion Points:
– **Corporate Accountability in Digital Transformation**: The central focus on strengthening corporate accountability for inclusive, trustworthy, and rights-based approaches to ethical digital transformation, particularly in light of the Global Digital Compact (GDC) Article 29B which calls for industry accountability frameworks through stakeholder consultations.
– **Implementation Challenges of the Global Digital Compact**: Discussion of how to translate the non-binding GDC into actionable policies and standards, with speakers addressing the gap between high-level international commitments and on-the-ground implementation across different sectors and regions.
– **Child Rights and Digital Safety**: Extensive discussion on protecting children’s rights in digital spaces, including the need for child rights impact assessments, addressing online harms that have offline consequences, and ensuring digital technologies are designed with children’s wellbeing in mind.
– **Content Moderation and Platform Governance**: Analysis of challenges around misinformation, algorithmic bias, and platform accountability, including discussion of Section 230 protections, the balance between freedom of expression and harm mitigation, and different governance models (centralized vs. decentralized community-led approaches).
– **Global South Representation and Think Tank Limitations**: Critical examination of systemic barriers facing Global South researchers and think tanks in contributing to digital governance discussions, including funding inequities, limited access to networks, and potential conflicts of interest that may undermine meaningful participation in GDC implementation.
## Overall Purpose:
The discussion aimed to conduct a stakeholder consultation on implementing the Global Digital Compact, specifically focusing on how governments, multilaterals, and various stakeholders can create public standards that hold technology companies accountable for advancing digital inclusion while protecting human rights.
## Overall Tone:
The discussion maintained a professional, collaborative tone throughout, with speakers demonstrating expertise while acknowledging the complexity of the challenges. The tone was constructive but realistic, with participants openly discussing limitations and systemic barriers rather than offering overly optimistic solutions. There was a sense of urgency given the critical timing with the WSIS+20 review and recent GDC adoption, but the conversation remained measured and thoughtful rather than alarmist.
Speakers
– **Dio Herdiawan Tobing** – Global Policy Lead at the World Benchmarking Alliance
– **Michael Kende** – Senior Advisor at Analysys Mason and Board Chair of the Datasphere Initiative
– **Hlekiwe Kachali** – Senior ICT Advisor at UNICEF, Technology Advisor
– **Shamira Ahmed** – Policy Leader Fellow at the European University Institute, South Africa; Executive Director of the Data Economy Policy Hub
– **Jan Gerlach** – Public Policy Director at the Wikimedia Foundation
– **Audience** – Various audience members who made interventions
Additional speakers:
– **Nicola Sewe** – Engagement Lead on Digital Inclusion Benchmark at the World Benchmarking Alliance (mentioned by Dio but did not speak)
– **Lizzie** – Representative from the UK Foreign Office
– **Axel Leblois** – From the Global Initiative for Inclusive Information and Communication Technologies
Full session report
# Strengthening Corporate Accountability for Ethical Digital Transformation: A Comprehensive Analysis of Global Digital Compact Implementation
## Executive Summary
This roundtable discussion at the WSIS Forum 2025, hosted by the World Benchmarking Alliance, brought together leading experts to examine the critical challenge of implementing the Global Digital Compact (GDC), particularly Article 29B which calls for digital technology companies to develop industry accountability frameworks through inclusive stakeholder consultations. The session addressed the stark reality that whilst the digital economy is valued at $16.5 trillion and projected to reach $32.9 trillion by 2028 (representing 17% of global GDP), less than 14% of 200 major digital technology companies are contributing meaningfully to digital inclusion according to the World Benchmarking Alliance’s 2023 benchmark studies.
The discussion featured diverse perspectives on translating high-level international commitments into actionable policies that can address pressing challenges including content moderation, algorithmic bias, children’s digital rights protection, and systemic inequalities in global policy development. Participants shared practical tools and approaches whilst acknowledging the complex challenges of implementing non-binding international frameworks in rapidly evolving digital environments.
## Key Participants and Perspectives
The discussion featured diverse expertise spanning international organisations, policy research, and digital rights advocacy. **Dio Herdiawan Tobing** from the World Benchmarking Alliance moderated the session, establishing the context that corporate accountability for digital inclusion remains critically inadequate despite the sector’s massive economic footprint. **Michael Kende**, Senior Advisor at Analysys Mason and Board Chair of the Datasphere Initiative, provided technical analysis of platform governance challenges, particularly around Section 230 protections and content moderation complexities. **Hlekiwe Kachali**, Technology Advisor at UNICEF, brought a children’s rights perspective, emphasising the urgent need to protect young people in digital spaces with a temporal framework extending to 2100. **Shamira Ahmed** from the Data Economy Policy Hub offered critical analysis of systemic barriers facing Global South researchers in digital governance discussions. **Jan Gerlach** from the Wikimedia Foundation advocated for community-driven governance models as alternatives to top-down regulatory approaches.
## Opening Context: The Digital Accountability Gap
Tobing opened the discussion with sobering statistics about the scale of the digital economy and the accountability gap within it. Despite the digital economy’s enormous scale and growth trajectory, the vast majority of technology companies are failing to contribute meaningfully to inclusive digital transformation. The World Benchmarking Alliance’s 2023 benchmark revealed that less than 14% of 200 digital technology companies are contributing to digital inclusion, representing a fundamental challenge for implementing the GDC’s vision of ethical digital development.
This accountability gap becomes more concerning when considered alongside the sector’s economic influence. As Tobing noted, the digital economy represents 17% of global GDP and is projected to nearly double by 2028, yet corporate performance in digital inclusion remains critically inadequate. The discussion aimed to explore how Article 29B of the GDC, which calls for industry accountability frameworks developed through inclusive stakeholder consultations, might address this gap.
## Content Moderation and Platform Governance Challenges
Kende provided crucial historical context for understanding current content moderation challenges by explaining the origins and implications of Section 230 protections in the United States. He traced the development of these protections through key legal cases, including Stratton Oakmont vs. Prodigy and the CompuServe case, explaining how the “26 words of Section 230”, originally designed to protect early platforms from liability for user-generated content, now create significant challenges for addressing harmful content.
The discussion revealed the complexity of platform governance in addressing different types of problematic content. Kende distinguished between misinformation, disinformation, and hate speech, whilst highlighting how algorithmic bias creates automated discrimination in content curation systems. He explained that platforms face difficult decisions about content moderation that balance protecting freedom of expression with mitigating various forms of harm.
These technical challenges illustrate the complexity of implementing GDC principles in practice, where high-level commitments to digital rights and safety must be translated into specific platform policies and automated systems that operate at massive scale.
## Children’s Rights and Digital Protection
Kachali’s contributions significantly shaped the discussion by highlighting the particular vulnerabilities of children in digital environments and the temporal challenges of digital governance. She presented compelling evidence that online harms have direct offline consequences, with UNICEF research showing that children who experience bullying or sexual abuse online demonstrate much higher rates of self-harm in real life.
This evidence challenged artificial separations between online and offline experiences that often undermine policy effectiveness. As Kachali stated, “we talk about online, we talk about offline, there isn’t really that much of a distinction anymore. It is the same life.” This perspective influenced the broader discussion by emphasising that digital governance cannot be treated as separate from physical world welfare and safety.
Kachali also introduced a crucial temporal dimension to the discussion, noting that whilst the UN Convention on the Rights of the Child was adopted in 1989 (coinciding with the birth of the World Wide Web around 1989-1990), solutions developed today must remain relevant and effective through 2100. This long-term perspective requires frameworks that can adapt to technological change whilst maintaining core human rights protections.
The conversation addressed practical tools for protecting children’s digital rights, including UNICEF’s digital child rights impact assessment tool for private sector organisations to evaluate their digital products. However, Kachali noted that UNICEF’s study of 195 corporate reports from 95 countries found that only 17% mention child rights, despite children’s increasing dependence on digital technologies for education, social interaction, and development.
## Systemic Barriers in Global Policy Development
Ahmed’s intervention introduced a critical dimension to the discussion by exposing structural inequalities within the global think tank ecosystem that could undermine equitable GDC implementation. She identified three key barriers facing Global South researchers: lack of diversity, equity, and inclusion in think tank leadership and research; limited and unequal funding that restricts meaningful participation; and risks of self-censorship when private sector organisations fund research on digital economy solutions.
This analysis revealed how funding structures and representation gaps could undermine the GDC’s inclusive aspirations. Ahmed warned about the potential for “ethics-washing” when private sector organisations fund research on creating equitable solutions, highlighting conflicts of interest that may compromise the independence and credibility of policy research.
The discussion of these systemic barriers added complexity to implementation discussions by revealing that the challenge is not simply about translating policy into practice, but about ensuring that the voices and perspectives shaping implementation are themselves representative and independent.
## Community-Driven Governance Models
Gerlach’s advocacy for community-driven governance provided a constructive alternative to discussions about regulatory challenges and structural inequalities. He presented Wikipedia’s governance model as demonstrating that “decentralized decision making” can achieve both content integrity and freedom of expression through consensus-building and open debate among volunteers.
This perspective showed that alternative governance models are not merely theoretical but practically viable. Gerlach argued that governments should support multi-stakeholder approaches and public policies that empower communities behind digital public goods, rather than forcing companies into content decisions that may stifle innovation or exclude civil society voices.
The Wikipedia model offered insights into how digital governance might evolve beyond traditional regulatory frameworks, demonstrating that community-driven approaches can effectively manage complex content decisions whilst maintaining both freedom of expression and equitable access to information.
## Implementation Examples and Practical Tools
The discussion identified several concrete tools and approaches for advancing GDC implementation. Beyond UNICEF’s digital child rights impact assessment tool, participants discussed various mechanisms for strengthening corporate accountability and measuring progress.
Lizzie from the UK Foreign Office contributed important context about the Online Safety Act implementation, presenting it as one model for making platforms responsible for user safety whilst protecting freedom of expression. She also announced a UK review of their approach to ensuring responsible business conduct, incorporating insights from businesses, investors, trade unions, academia, and civil society as a multi-stakeholder approach to developing public standards that hold companies accountable.
Axel Leblois from the Global Initiative for Inclusive ICTs contributed important information about digital accessibility measurement, describing the organisation’s annual survey of digital accessibility progress and its roughly 150 country panels of advocates. He expressed hope that data on how much of the GDC is actually being implemented could be ready by September 2025, providing concrete metrics for tracking progress and strengthening advocacy around implementation.
These practical examples demonstrated that whilst the GDC’s non-binding nature creates implementation challenges, various stakeholders are developing concrete tools and mechanisms to advance its principles in practice.
## Global Digital Compact Implementation Challenges
A central theme throughout the discussion was the challenge of translating the GDC’s aspirational language into concrete action. Speakers acknowledged that whilst the GDC provides valuable shared language and a comprehensive framework for digital governance, its non-binding nature creates significant implementation challenges.
The discussion revealed the complexity of coordinating between multiple international instruments – the GDC, Sustainable Development Goals, the Pact for the Future, and the Declaration on Future Generations – without creating overwhelming reporting obligations for organisations. Participants recognised the need for better integration and coordination amongst global governance systems.
Speakers agreed that the GDC provides important guidance on digital trust and safety through international cooperation and media literacy curricula, but acknowledged that the lack of binding enforcement mechanisms limits its effectiveness in compelling corporate behaviour change. This challenge requires innovative approaches that can make international frameworks effective in practice.
## Areas of Consensus and Shared Principles
Despite different perspectives on implementation approaches, participants demonstrated strong agreement on several key principles. There was universal recognition that the GDC provides valuable shared language and frameworks for digital governance, even whilst acknowledging its limitations as a non-binding instrument.
Speakers agreed on the need for more inclusive, accountable, and rights-based approaches to digital governance. They shared recognition that current frameworks are insufficient and that meaningful progress requires addressing both technical challenges and structural inequalities in global governance systems.
Participants also agreed on the importance of multi-stakeholder approaches that meaningfully include diverse voices, particularly marginalised communities and Global South perspectives. This consensus suggested potential pathways for collaborative implementation despite the identified barriers.
## Ongoing Challenges and Future Directions
The discussion concluded with recognition of several ongoing challenges that will require continued attention. The fundamental tension between the GDC’s non-binding nature and the need for enforceable accountability mechanisms remains a key challenge. Participants acknowledged that making international frameworks effective in compelling corporate behaviour change requires innovation in governance approaches.
The challenge of coordinating between multiple international instruments without creating overwhelming reporting obligations for organisations also requires ongoing attention. Participants recognised the need for better integration and coordination amongst global governance systems.
Systemic inequities in the global think tank ecosystem that limit Global South participation in policy development represent another ongoing challenge. Addressing these inequalities may be prerequisite to meaningful implementation of the GDC’s inclusive aspirations.
## Implications for Digital Governance
This discussion highlighted the complexity of translating international digital governance frameworks into effective action. Whilst the GDC provides important shared language and comprehensive approaches to digital challenges, successful implementation will require addressing both technical challenges and structural inequalities in global governance systems.
The conversation suggested that effective digital governance may require hybrid approaches that combine regulatory frameworks, community-driven governance models, and innovative accountability mechanisms. The challenge lies in developing approaches that can adapt to rapid technological change whilst maintaining core human rights protections.
The discussion also emphasised that digital governance cannot be separated from broader questions of equity, representation, and power in global policy-making systems. Addressing these structural challenges may be essential for achieving the GDC’s vision of inclusive and ethical digital transformation.
## Conclusion
This roundtable discussion demonstrated both the promise and challenges of implementing the Global Digital Compact in practice. Participants shared strong consensus on the need for more accountable and inclusive digital governance whilst identifying significant barriers ranging from the non-binding nature of international frameworks to systemic inequalities in global policy development.
The conversation revealed that successful GDC implementation will require innovative approaches that combine regulatory frameworks, community-driven governance models, and systematic efforts to address structural inequalities in global governance systems. The practical tools and examples shared by participants – from UNICEF’s impact assessment tools to Wikipedia’s community governance model to the UK’s multi-stakeholder review process – provide concrete pathways for translating the GDC’s aspirational vision into actionable improvements.
The discussion’s emphasis on children’s rights, community governance models, and Global South participation provides important guidance for future implementation efforts. The collaborative tone of the discussion, with participants building on each other’s contributions and sharing practical solutions, suggests that whilst significant challenges remain, there is substantial commitment among diverse stakeholders to advance the GDC’s vision of ethical and inclusive digital transformation.
Moving forward, the challenge lies in scaling these practical approaches whilst addressing the systemic barriers that limit meaningful participation in digital governance. The temporal framework introduced by Kachali – developing solutions today that remain effective through 2100 – provides an important reminder that digital governance frameworks must be both immediately actionable and adaptable to future technological developments.
Session transcript
Dio Herdiawan Tobing: Good afternoon, everyone. This is a very critical time for everybody. Five o’clock in the afternoon, very much appreciated that everyone is joining our session today. So we are from the World Benchmarking Alliance. And I think it’s probably better for me to kind of stand up so that I can see everybody. So again, thank you so much for attending our events, side events as a part of the WSIS Forum 2025. And the session this afternoon is about strengthening corporate accountability on inclusive, trustworthy, and rights-based approach to ethical digital transformations. My name is Dio Herdiawan Tobing. Dio, I’m a global policy lead at the World Benchmarking Alliance. And I’m alongside a colleague, Nicola Sewe, who is the engagement lead on the digital inclusion benchmark. And why we are doing this is that the purpose of the roundtable is an avenue of stock-taking and discussions among everybody who is part of the side event. So we will not have a panel format for this evening, because we trust that everybody, having seen that this is 5 p.m., would have attended so many panels as a part of side events and high-level sessions, probably in the ITU, the AI for Good Summit. And this is a reflection session for us to discuss, especially as we are in a critical momentum this year. As you probably heard, part of the ongoing discussions is the WSIS Plus 20 review. We have just had the Global Digital Compact last year, adopted by a number of states and being implemented. And a critical part that we are looking at is Article 29B of the GDC, which looks at the need and call for digital technology companies and developers to develop industry accountability frameworks through inclusive stakeholder consultations, which define responsibilities, set enforceable standards, and commit to the publication of auditable reports. And why is this important? It’s that the digital economy is a significant and growing portion of the world’s overall economic output, currently estimated to be worth around $16.5 trillion, and by 2028 it’s projected to reach $32.9 trillion. The value is doubling from an estimate of $11 trillion back in 2016 to $23 trillion by 2025. And this figure, because it’s growing, is representing approximately 17% of global GDP. And the sector has created so many jobs globally, estimations of 73 million jobs in 2024. It’s going to go even higher; the forecasted number is 92 million by 2030. And for digital platforms and social media especially, the users now represent 64% of the world’s population, or around 5.1 billion people. But what does it mean, all this growing figure on the digital economy, social media platforms, digital platforms, all the infrastructure digitally being installed? What we did at the World Benchmarking Alliance through our digital inclusion benchmark, in 2023, in the same forum, at WSIS, we launched it and found out that less than 14% of the 200 digital technology companies, less than 14%, are contributing to digital inclusion. Now what does it mean when we are referring to digital inclusion?
It is the performance of tech companies on enhancing universal access to digital tech, improving all levels of digital skills, fostering trustworthy use by mitigating risks and harms, and innovating openly, inclusively, and ethically. So today, at this very important global milestone, we have the GDC, we have the WSIS Plus 20 process under review, and all eyes are now on the strengthening of digital cooperation, the IGF as well. Our big question is how should governments, multilaterals, come up with public standards that hold tech and the private sector accountable in advancing digital inclusion. We have a round of interventions this evening. We are joined by, first of all, I would like to invite Michael Kende, I’m sorry to put you on the spot, Senior Advisor at Analysys Mason and Board Chair of the Datasphere Initiative, and probably to start the discussions on this very important topic. You have advised different public bodies on internet policy issues. Could we give the first floor to you on the question: given the persistent challenge of misinformation and algorithmic bias, how can global governance instruments like the GDC meaningfully influence platform design, content moderation, and policies to uphold freedom of expression while mitigating harms?
Michael Kende: Great question, and it’s a hard one. I guess I don’t need, do I need a microphone? For this one, if anyone doesn’t mind. Okay. So just for definition: misinformation is false or inaccurate information not necessarily with malicious intent, unlike disinformation, and certainly not like hate speech, which is a different category. And algorithmic bias, which you asked about, is automated curation that might discriminate against certain groups or promote or discriminate against certain pieces of content. Now, the source of the challenge of this is in the kind of famous Section 230, which limits the liability of intermediaries like platforms when they’re publishing user-generated content. They’re not treated as the publisher of that content, so they’re immune from liability for things that users put up. If it’s illegal, that’s a different story, but for misinformation, which is not illegal in most countries and in most contexts, they’re not obliged to take it down, but they are protected for their content moderation decisions. They can leave things up and be protected. They can take things down and be protected as long as it’s in good faith. Now, in hindsight, today, given the challenges that we’re discussing, this may look a bit excessive or causing some of the problems, but if we go back in history, Section 230 stemmed out of some cases against some early platforms that were kind of mail exchange lists. And the company Stratton Oakmont, which was at the heart of the Wolf of Wall Street for shady financial practices, sued one of the online companies, Prodigy, for defamation. They didn’t sue the person who wrote it, or they might have, but they sued the platform. And the judge said at the time that since Prodigy was moderating some content, they were responsible for everything that they left up. And a parallel one against CompuServe said CompuServe didn’t moderate anything, therefore they weren’t liable for defamation for the things that were published, because they were just purely publishing. So this led to quite a conflict, because there was no way that any platform could check every piece of content, whether or not it was defamatory or illegal or otherwise, and take down or leave things up. So they came up with these 26 words of Section 230 that gave liability protection so that companies could choose their own content moderation for what they put up and take down. Now, obviously we’ve seen the challenges of that, and that can be embodied in Twitter or X with the takeover, for better or worse, that gets a bit political. Obviously the content moderation changed a lot. People that had been banned before were now allowed back on. They changed the way that the content was moderated with no effect in the US, potentially some effect under the Digital Services Act or Digital Markets Act in Europe. But that’s taking its time. So that’s the challenge that has to be faced. The Global Digital Compact looked at a lot of this and really has a lot of language on digital trust and safety, that this is urgent, that we have to counter this kind of speech. There’s talk about international cooperation, curricula to teach users to have the skills to understand what is misinformation and what is not, which is very important, and ultimately for being able to digest and decide what to believe, I guess, or not. Access to and dissemination of independent, fact-based information to counter misinformation.
There’s a call on tech companies and social media platforms to enhance their transparency, to provide researchers with access to data and other pieces. So very good, but the challenge of course is that it’s not binding. It’s not binding on countries, much less on companies. So that’s the challenge, but it puts some things out there. For instance, if countries come up with a good curriculum that’s good at teaching children how to recognise disinformation, or worse, that can be good as it can be disseminated and best practices shared.
Dio Herdiawan Tobing: Hlekiwe Kachali, Senior ICT Advisor at UNICEF. This is very much relevant to what Michael said. This is not binding, but it gives a sense of standards, it gives a sense of guidance, and you are currently actively advising UNICEF on the ICT strategy, do I get it right? And working with children as emerging users of digital platforms and tech. With young children increasingly being active in the digital space, how can the GDC be leveraged, despite its limits, to ensure that digital technologies are designed and governed in a way that protects children’s rights, that the well-being of children is being considered, and that it is bridging the digital divide affecting access to education and opportunities? At the end of the day, it’s about the children’s rights being considered as a part of the GDC.
Hlekiwe Kachali: Technology Advisor at UNICEF. Good people, it’s 5 p.m. on a Tuesday. You’ve been walking around. This is a difficult room, and I’m sitting in a difficult place. If you want to interject, feel free, although apparently we’ll have time to talk by the time all of us have said something. I will start by asking everyone a question. How many of us here have kids? How many of us here have children? How many of us here know children? There are people who don’t know children? Oh, I am verifying, validating, checking. The question that you’ve asked around how do instruments like the GDC and others, how do we leverage those to ensure that we promote, we advocate for, and we protect child rights? Difficult one. I am in a semi-reflective mood, I suppose, and I will start by saying this is 2025. The UN is 79 years old. We are going to be 80 in October. 2025 still, and we are 75 years from the turn of the next century. We are about midway between that aspirational moment in 1945 and the turn of the century in 2100. Why am I talking about 2100? I have a nine-year-old nephew, for instance, and I believe that 2100 is enough of a horizon or a time span for his lifetime and for him and his potential children to still see 2100. That means what we do today should be useful, should be, for lack of a better word, appropriate for 2100. I am not saying that we’re going to craft solutions today that will solve 2100’s problems, no, but if we have 2100 as a horizon, as a time span, I can imagine, like I said, people’s children and their children, given good health and good leadership from people in this room, for instance, that they will be able to see 2100. What can we do about that? The Convention on the Rights of the Child was adopted in 1989. It is the most widely adopted rights treaty by anyone. If we’re using the 1989-1990 timeframe, if we think of Sir Tim Berners-Lee, that is when the World Wide Web and the Internet as most of us know it was born. The Convention on the Rights of the Child, those rights have remained immutable. They have not changed. Think though, WWW from 1989 or 1990 and what you see today, the digital platforms that Dio is asking us about. There has been a marked shift in that and we have now instruments like the Pact for the Future signed in September 2024. Why am I talking about the pact? Because the Global Digital Compact is an annex of the pact. I do believe when we think of the GDC, we must also start from the pact. There are lofty ambitions in the pact, in the GDC, in the Declaration on Future Generations, which is the second annex. The pact itself has 56 actions. The Global Digital Compact has five objectives which are elaborated on. The declaration has its own commitments, its own guiding principles. My point is that most of us agree with what the substance in those documents is. However, those lofty aspirations, how do we take those and make them actually operational, make them actionable, so that for most of the people who are not sitting in this room and who are not coming to AI for Good, it actually makes a difference in their lives? What is the concrete manifestation of our lofty ambitions? Before going into what we can do as actual concrete actions for the GDC and some of the actions that we have done and how we can leverage the GDC, I would like for us to keep a few things in mind. We all lead digital lives.
How we live, how we learn, how we earn, all of that, all the way to 2100, I can almost assure you with the confidence of ChatGPT hallucinating, is that we are digitally dependent and our lives are digitally mediated. It does not matter how offline or online you are. You are affected by digital. We must also remember that we as individuals and as corporations, we interact with each other generally without state mediation. For those of you on WhatsApp, for instance, you ticked whatever boxes you ticked for WhatsApp. There wasn’t really much state mediation and yet you still interacted with an organization. And then, on top of that, if I take my sector, the humanitarian sector, we used to talk about singular technologies used by humanitarians. That’s not the case anymore. Now, technology is an integral part of everything we do. It’s how we register our beneficiaries. It’s how we serve our beneficiaries. It’s how they hold us to account. All that to say is we cannot get away from it. How then do we bring in instruments like the Pact for the Future and the Global Digital Compact and make them actually actionable? What GDC has done, and this has already been spoken about, is that it gives us shared language. Most of us agree with the GDC. We may not agree with implementation, but we agree with GDC. It gives us a critical path. It gives us a critical roadmap and helps to define the actual direction or the general direction for where we should go. That is an important starting point for human beings, considering you ask 17 human beings what the definition of something is and you get 46 different answers. It gives us a starting point. What the GDC also does, it specifically… calls out to, in this case, everyone, private sector included, to use the UN guiding principles on business and human rights. It’s very specific about those instruments. The GDC also calls on private enterprise to use impact assessments. And in this case, UNICEF, because of our child rights mandate, we recently released a digital child rights impact assessment that can be used by commercial, by private sector, and by others. Specifically, if you have any products, any assets, if you are a private sector entity or otherwise, you can use our child rights impact assessment. It’s specifically meant for parts that are digital or parts of your assets or your products that are digital, or parts of your products if your product is fully digital as well. And then the Global Digital Compact is also quite specific about transparency and reporting. You, Dio, spoke about auditable reports. The Global Digital Compact talks about transparency. UNICEF did a study a couple of years ago. We took 195 corporate reports from 95 different countries, but surely that means that it’s across different continents. And we found that only, only 17 percent of those even made mention of child rights. To end for this part, what I will say is, I had spoken, for instance, about humanitarian action and how digital is a completely integral part of everything that we do. What we need to remember now is that non-private sector, whichever way you want to classify that, and private sector, we are now intersecting increasingly, and we are intersecting every day. What I mean is… When I say we use digital, it means that we are potentially using products mainly from private sector. 
That means that the interactions between those of us who are not in private sector and those who are in private sector, that means that our interactions, our collaborations, our partnerships, that thinking needs to change. And to your question, what I would say to end is that in this era where we’re all digitally dependent, where almost everything is digitally mediated, and you have to have some version of digital to participate in society, we must potentially also rethink, using instruments like the Global Digital Compact, how we assign regulatory responsibilities.
Dio Herdiawan Tobing: Thank you so much, Hlekiwe. And thank you for highlighting the need for, you know, regulations to be immune, in a sense, you know, for the future. Thanks for highlighting also the UNICEF report that, you know, only a limited amount of companies, you know, have mentioned child rights. Now, the questions about implementation, because this is a new instrument, the GDC has just been adopted. I agree completely with you that it has to be read alongside the Pact for the Future and the other declarations, like the Declaration on Future Generations. I would like to go to Shamira Ahmed now, Policy Leader Fellow at the European University Institute, South Africa. And she’s also the Executive Director of the Data Economy Policy Hub. My question relates to implementation and especially your role as an academic. What do you think is the role of a policy think tank like the one you are leading, and of a scholar as well, to bridge that gap between international digital governance frameworks like the GDC and the actual on-the-ground implementation, like you mentioned? Especially as emerging technologies, like AI now, are developed and deployed, there is a need to adhere to an ethical standard and it needs to be inclusive at the same time. Please.
Shamira Ahmed: Yes, thank you, Dio. And as my fellow panelists mentioned, it is quite complex. So if we think of think tanks in the GDC ecosystem, I think framing multi-stakeholder engagement, which is one of the core principles of the GDC, is a way to think of how think tanks can support enabling participation for marginalized groups. They can also contribute to accountability frameworks. They can convene diverse actors. Many think tanks operate with governments, technical experts, civil society and other groups beyond the think tank space, and marginalized communities. So often think tanks co-create solutions and ensure that there’s implementation for different aspects within the national context, and they can be leveraged. The existing knowledge can be leveraged for GDC implementation, particularly where it is interdisciplinary and grounded in contextual real-world needs. So essentially, policy think tanks on the ground can be a key tool for translating high-level commitments that are developed at the GDC level and narrowing them down or filtering them down to actionable context-specific strategies at the national and local levels, within and between communities, because they already have the existing networks, the existing stakeholders at the local level, and ensuring that they have the capacity and they are recognized as key stakeholders in the process can be useful. And I think one of the key challenges that a lot of think tanks face, especially in the global south, when it comes to amplifying underrepresented voices and helping ensure no one is left behind, as with the aspirations of the GDC and the Pact for the Future, is that there are often gaps in, and risks in, the think tank ecosystem that could potentially undermine GDC implementation. Because there is an existing ecosystem, before we can start thinking of think tanks as implementers and a tool to support the GDC processes, let’s look at the existing global think tank ecosystem. There’s been a lot of research on the gaps of the global think tank space, especially when we’re looking at global policies or transnational policies such as the Global Digital Compact that are expected to be filtered down from the international level to the national level and to more local levels within and between communities. There’s a lack of diversity, equity and inclusion in leadership and research in the think tank space, and this can undermine the GDC’s commitment to inclusion and diversity. It can result in underrepresentation of marginalized groups who are not given a seat at the table to highlight what the issues are, especially global south researchers in the policy and network space. And also I think it also comes down to money; there’s limited and unequal funding for global south researchers and policy think tanks. There have been a number of research projects that have highlighted this in the think tank space and overall in policy research. And one of these organizations is the Partnership for Economic Policy, where they highlight that researchers from the global south are often marginalized, they’re not recognized, and this restricts their ability to contribute meaningfully to GDC implementation. So from procurement to potential collusions with funders, with the private sector, with governments, it creates an uneven working space where a lot of global south researchers can’t contribute to the discussions in a meaningful way.
And also, I think there’s also a risk, given the lack of funding in the think tank space where there’s a potential of self-censorship, because a lot of organizations that are funding the research on the digital economy, I won’t mention the names, they are private sector organizations funding research on creating equitable solutions and policy for the digital economy, and then that can create some form of self-censorship, and it can lead to minimizing contentious topics, and it can be a form of ethics-washing, for example, where private actors will use the research as a form of accountability to minimize the risks of what they’re doing because they’ve paid the organization to conduct the research, and it’s not only private actors, international development organizations also do this, it’s quite prominent. So yeah, basically there are a lot of barriers for southern researchers in the think tank policy space to contribute. And beyond a lack of access to networks, resources, and data, all of these factors would hinder their capability to actually produce meaningful implementation of the global digital compact. So I think before we start thinking of think tanks as enablers or as platforms or tools to enhance the GDC, let’s take a step back and think of the existing inequities and the risks of the ecosystem, the think tank system itself, and existing multilateral cooperation. There’s an urgent reform for multilateral cooperation, which requires capacity building, building technical expertise, especially from think tanks from the global south to contribute meaningfully to the discussions, not as an afterthought.
Dio Herdiawan Tobing: Thank you so much, Shamira, for highlighting the difficulties, the limits of the role of think tanks as enablers. And particularly, when we look at the GDC and its implementation, there is also an under-resourcing in terms of access, in terms of representation, especially of the communities as well, that think tanks could potentially facilitate. So we see a double burden there, the burden of the intermediaries, the role as enabler that think tanks can play, but also the access of the groups that need to be represented. And I would like to now move on to Jan Gerlach, the Public Policy Director at the Wikimedia Foundation. And this is important because we’ve discussed the limitations of access. We’ve discussed the limitations of expression of the communities that need to be a part of the GDC implementation. And Wikimedia is about free and open knowledge. So my question to you is, as a champion of free and open knowledge, how does the Wikimedia Foundation see its role in shaping the implementation of the GDC, in particular related to safeguarding freedom of expression and ensuring equitable access to information in the face of growing regulatory pressures on online content?
Jan Gerlach: Thank you, Dio. Hi, my name is Jan. I’m sitting over here because I think I missed the memo that we were all congregating over there and that there was a change of plans. But maybe it serves to actually demonstrate a point that I will make later about decentralized decision making. We’ll see. Anyway, so before I answer the question, and I’ll be brief, I’ll give a bit of context about what the Wikimedia Foundation is and does. We’re the nonprofit organization based in San Francisco that hosts Wikipedia and supports the communities of volunteers around the world who contribute to the online encyclopedia. In 2024, we collaborated with Wikipedians from around the world, most of them actually from the global majority, to submit our aspiration for the GDC, maybe also our lofty ambitions. In the process towards the GDC, our priority was to get negotiating parties to think about and understand the value of communities who build knowledge projects together, and to understand the kinds of policies that these communities need, what supports them in their work. Digital public goods, the open source projects that are free and open to use for everyone, and government support for them were an important part of our engagement in the path towards the GDC. Now that we’re in the implementation phase, one important thing to point out for me is that Wikipedia has been recognized as a digital public good, and we see it as essential infrastructure underpinning access to information, civic participation, and also AI, which seems very important to point out here as half of the building, or maybe two-thirds, have been taken over by a different conversation. The way that Wikipedia is governed, namely through community self-governance that supports freedom of expression and equitable access to information, is an example of how decentralized models can lead to positive outcomes in the context of digital public goods. The Wikimedia Foundation’s role in shaping the implementation of the GDC is really to highlight the effectiveness of community-led governance and decentralized moderation, not just of content, but really decentralized governance of how the whole community thinks about itself, how the platform is built, and the policies that govern the platform as well. So we offer a model where content integrity, as a counterpart to disinformation, is fostered by consensus and open debate and deliberation among the volunteers who build Wikipedia rather than top-down or ad-driven models or even political control. Our concern, however, is that instead we are moving more and more towards a model in which governments put, one, more pressure and responsibility on companies to make decisions about access and content, and two, move away from internet governance, a governance system that supports communities who build public interest projects such as Wikipedia together. So this includes platform regulation that forces companies to make decisions about content or incentivizes closed ecosystems, walled gardens. It also includes a move towards an internet governance framework that excludes civil society, including the communities that we’ve been talking about, who build digital public goods. 
We at the Wikimedia Foundation advocate on behalf of these communities, on behalf of the users and contributors of Wikipedia, and their ability to make decisions about content, set their own platform policies, and engage in conversations such as these about public policies that affect them, whether it’s UN spaces like the IGF, national debates, or local conversations. The people who built the non-commercial side of the internet, the places that we all love and use, they need to be heard. In our age of AI that is built on top of the digital commons, the GDC’s implementation must recognize and prioritize public interest, community-driven platforms like Wikipedia. We ask governments therefore to support multi-stakeholder approaches to internet governance and public policies that support digital public goods by empowering communities behind them through smart platform regulation and privacy protections that allow people to contribute safely. Thank you.
Dio Herdiawan Tobing: Thank you so much, Jan. And now, given all the speakers have already shared their insights about how they can contribute to implement the GDC from different angles, multilaterals, free and open knowledge. Yeah, exactly. And the idea of the event this afternoon is that to gather insights, to gather knowledge that everyone can speak freely decentralized in this room. We have 10 minutes before we are being kicked out of the room. Eight minutes for everybody who would like to share their thoughts and insights, especially related to GDC implementations. Feel free and we’ll deliver the mic to you.
Audience: Thank you. Hi, I’m Lizzie. I’m from the UK Foreign Office. I just wanted to take this opportunity to set out a little bit about how we in the UK are implementing some of the corporate accountability aspects of the GDC. So we firmly believe the GDC must be implemented in a way that respects, protects and promotes human rights, and we welcome practical steps to support this, including the advisory service potentially set out by the OHCHR. We’ve actually hosted a couple of events here with the multi-stakeholder community, so many thanks to our partners here for that. And we hope to continue to work strongly with the multi-stakeholder community on implementation. A couple of points about how we’re implementing this kind of thing domestically. Our Online Safety Act, which started to come into force this year, makes user-to-user and search services more responsible for their users’ safety on platforms. It gives providers new duties to implement systems and processes to reduce the risk that services are used for illegal activity. The strongest protections in the act have been designed for children. Platforms are required to prevent children from accessing harmful and age-inappropriate content. We do, however, recognise the vital importance of freedom of expression as a fundamental human right that we should protect in this country. Safeguards for freedom of expression have been built in through the act. In terms of approach to business and trade, all UK businesses should respect the OECD guidelines and the UN Guiding Principles on Business and Human Rights, as previously mentioned, and are expected to conduct risk-based human rights and environmental due diligence. In our recently launched trade strategy, the UK announced a review of our approach to ensuring responsible business conduct, which will harness the insights and expertise of businesses, investors, trade unions, academia, civil society, and our trading partners. It is vital, given the often borderless uses of internet and AI, that we support a global building of both perspectives.
Dio Herdiawan Tobing: Thank you, Lizzie, for the intervention. Yes. Oh, okay. The gentleman first, and then yourself, or? Gentle first? Okay. Okay. Please, sir. And then back to you. Thank you, Wayne.
Audience: Hi, I’m Axel Leblois from the Global Initiative for Inclusive Information and Communication Technologies. We are a nonprofit dedicated to digital accessibility rights since the launch of the CRPD in 2006. So once a year, we do a major survey to assess how much progress is made in terms of digital accessibility for blind folks, deaf folks, people with cognitive or physical disabilities. And there is a whole area of regulations, bodies, standards already in place, so it’s relatively easy to measure. We were very involved in the GDC preparation, including having to deal with many country delegations to modify the text that was being processed. We have about 150 country panels of advocates that actually report on what’s happening in their countries. So we hope that we will be fast enough to be ready by September, but maybe not, it depends on the speed of data collection. But we think it’s a good way to actually influence what’s going on for implementation, because once you are able to get data on how much is actually implemented and not implemented, then you can actually advocate better. So that’s kind of the contribution I wanted to make: try to get data on how much of the GDC guidelines are actually implemented by the parties to the GDC, which are the countries, the states.
Dio Herdiawan Tobing: Thank you so much, sir. Do you want to comment?
Hlekiwe Kachali: Two, I hope, brief comments. One on the back of the good gentleman there, around compliance, I’ll call it that for the time being, without the enforcement, of who has done what for the GDC. Like I said, the Pact has 56 actions, the GDC has that many words, there are 17 SDGs, 169 targets. That’s before I go into all the other instruments that we have. We also, as a multilateral system, need to think through how these instruments are related to each other. To your point around resourcing, reporting obligations. When it comes to children, for all of us, but let me talk about children for the time being, what happens online has offline consequences, also known as the real world that most of us would think about. UNICEF research just did a study on online bullying and harms perpetrated online and found that kids who were bullied or sexually abused online had a much higher proportion of self-harm in real life. All that to say is we talk about online, we talk about offline, there isn’t really that much of a distinction anymore. It is the same life. It is the same individual.
Dio Herdiawan Tobing: Thank you, Hlekiwe. We now probably one last statement from the floor. We have only one minute actually, but we’ll be happy to entertain one last insights or probably questions from the panelists, if there’s any. The last question is from 3.37 p.m. If not, then probably one minute to close the session this afternoon, and thank you so much. Please give a round of applause to the panelists, as well as all of you who are on this second day of the WSIS Forum. Stay strong for the week, and thank you so much for attending the sessions, and hopefully everyone… feel there is a part to take, especially in the GDC and implementations in whatever that you are doing. And feel free to connect, feel free to reach out if there’s anything that we could potentially link to collaborate. One last statement? No? Thank you so much. And yeah, see you around. Thank you so much. Thank you. Will you take a photo? Thank you.
Dio Herdiawan Tobing
Speech speed
130 words per minute
Speech length
1599 words
Speech time
732 seconds
Less than 14% of 200 digital technology companies contribute to digital inclusion despite the growing $16.5 trillion digital economy
Explanation
Despite the massive growth of the digital economy, which is projected to reach $32.9 trillion by 2028 and represents 17% of global GDP, the World Benchmarking Alliance’s digital inclusion benchmark found that very few tech companies are actually contributing to digital inclusion. This represents a significant gap between economic growth and inclusive development in the digital sector.
Evidence
Digital economy worth $16.5 trillion currently, projected to reach $32.9 trillion by 2028; represents 17% of global GDP; created 73 million jobs in 2024, forecasted 92 million by 2030; digital platform users represent 64% of the world’s population, or 5.1 billion people; World Benchmarking Alliance digital inclusion benchmark from 2023 found less than 14% of 200 digital technology companies contribute to digital inclusion
Major discussion point
Corporate Accountability and Digital Governance Framework Implementation
Topics
Development | Economic | Human rights
Hlekiwe Kachali
Speech speed
152 words per minute
Speech length
1504 words
Speech time
589 seconds
The Global Digital Compact provides shared language and critical roadmap for implementation but faces challenges due to its non-binding nature
Explanation
The GDC offers important benefits by creating consensus on definitions and directions for digital governance, providing a framework that most stakeholders agree with. However, its effectiveness is limited because it lacks binding enforcement mechanisms, making implementation dependent on voluntary compliance rather than mandatory adherence.
Evidence
GDC gives shared language when 17 human beings would give 46 different answers to define something; provides critical roadmap and direction; calls out use of the UN Guiding Principles on Business and Human Rights; calls for impact assessments and transparency reporting
Major discussion point
Corporate Accountability and Digital Governance Framework Implementation
Topics
Legal and regulatory | Human rights | Development
Agreed with
– Michael Kende
Agreed on
Global Digital Compact provides valuable framework but lacks binding enforcement mechanisms
Children’s rights from 1989 Convention remain immutable while digital landscape has dramatically changed since 1990
Explanation
The Convention on the Rights of the Child, adopted in 1989, established unchanging fundamental rights for children. However, the digital world has transformed completely since 1989-1990 when the World Wide Web was born, creating a disconnect between static rights frameworks and rapidly evolving digital platforms that affect children’s lives.
Evidence
Convention on the Rights of the Child adopted in 1989, most widely adopted rights treaty; World Wide Web born in 1989-1990; marked shift from early WWW to current digital platforms; rights have remained immutable while digital landscape transformed
Major discussion point
Children’s Rights and Digital Protection
Topics
Human rights | Children rights | Sociocultural
Only 17% of corporate reports from 195 companies across 95 countries mention child rights
Explanation
UNICEF’s comprehensive study revealed a significant gap in corporate accountability regarding children’s rights. The vast majority of companies fail to even acknowledge child rights in their reporting, indicating a lack of awareness or prioritization of children’s issues in corporate governance and transparency efforts.
Evidence
UNICEF study of 195 corporate reports from 95 different countries across different continents found only 17% made mention of child rights
Major discussion point
Children’s Rights and Digital Protection
Topics
Human rights | Children rights | Economic
Online bullying and sexual abuse have direct offline consequences including higher rates of self-harm in real life
Explanation
UNICEF research demonstrates that the distinction between online and offline experiences is artificial, as digital harms translate directly into real-world consequences. Children who experience online bullying or sexual abuse show significantly higher rates of self-harm in their physical lives, proving the interconnected nature of digital and physical wellbeing.
Evidence
UNICEF research study found kids who were bullied or sexually abused online had much higher proportion of self-harm in real life
Major discussion point
Children’s Rights and Digital Protection
Topics
Human rights | Children rights | Cybersecurity
UNICEF released digital child rights impact assessment tool for private sector to evaluate their digital products
Explanation
In response to the GDC’s call for impact assessments, UNICEF developed a practical tool that private sector entities can use to evaluate how their digital products and services affect children’s rights. This tool can be applied to fully digital products or digital components of broader offerings, providing concrete guidance for corporate compliance with child rights standards.
Evidence
UNICEF recently released digital child rights impact assessment for commercial/private sector use; can be used for products with digital parts or fully digital products
Major discussion point
Children’s Rights and Digital Protection
Topics
Human rights | Children rights | Economic
Agreed with
– Audience
Agreed on
Corporate accountability requires transparency and impact assessments
Multilateral system needs to address how various instruments relate to each other to avoid overwhelming reporting obligations
Explanation
The proliferation of international frameworks creates a complex web of overlapping obligations that may overwhelm organizations and states. With the Pact for the Future having 56 actions, the GDC having multiple objectives, plus 17 SDGs with 169 targets and other instruments, there’s a need for better coordination to make compliance manageable and effective.
Evidence
Pact for the Future has 56 actions, GDC has objectives, 17 SDGs with 169 targets, plus other instruments; need to think through how instruments relate to each other regarding resourcing and reporting obligations
Major discussion point
Implementation Gaps and Systemic Barriers
Topics
Legal and regulatory | Development | Human rights
Michael Kende
Speech speed
161 words per minute
Speech length
678 words
Speech time
252 seconds
Section 230 protections create challenges for content moderation as platforms are immune from liability for user-generated content
Explanation
Section 230 was created to resolve early legal conflicts where platforms faced liability for content they moderated, leading to the famous ’26 words’ that protect platforms from liability for user-generated content. While this enabled platforms to make content moderation decisions without fear of legal consequences, it has created current challenges with misinformation and harmful content that platforms are not obligated to remove.
Evidence
Historical cases: Stratton Oakmont sued Prodigy for defamation because Prodigy moderated some content, making them responsible for everything; CompuServe wasn’t liable because they didn’t moderate anything; led to Section 230’s 26 words giving liability protection; Twitter/X example showing content moderation changes after takeover with no US legal effect
Major discussion point
Corporate Accountability and Digital Governance Framework Implementation
Topics
Legal and regulatory | Liability of intermediaries | Content policy
Disagreed with
– Jan Gerlach
– Audience (UK Foreign Office)
Disagreed on
Approach to content moderation and platform regulation
Misinformation differs from disinformation and hate speech, while algorithmic bias creates automated discrimination in content curation
Explanation
It’s important to distinguish between misinformation (false information without malicious intent), disinformation (deliberately false information), and hate speech (a separate category entirely). Algorithmic bias represents another challenge where automated systems discriminate against certain groups or content types, creating systematic unfairness in how information is curated and presented to users.
Evidence
Misinformation defined as false or inaccurate information not necessarily with malicious intent, unlike disinformation and hate speech; algorithmic bias defined as automated curation that might discriminate against certain groups or promote/discriminate against certain content
Major discussion point
Misinformation and Content Moderation Challenges
Topics
Sociocultural | Content policy | Human rights
Global Digital Compact addresses digital trust and safety through international cooperation and media literacy curricula but lacks binding enforcement
Explanation
The GDC contains comprehensive language on combating misinformation through international cooperation, educational curricula to help users identify false information, and requirements for tech companies to enhance transparency and provide researcher access to data. However, these provisions are not legally binding on either countries or companies, limiting their practical enforcement and effectiveness.
Evidence
GDC has language on digital trust and safety, international cooperation, curricula to teach users skills to understand misinformation, access to independent fact-based information, calls for tech company transparency and researcher data access; but it’s not binding on countries or companies
Major discussion point
Misinformation and Content Moderation Challenges
Topics
Legal and regulatory | Human rights | Sociocultural
Agreed with
– Hlekiwe Kachali
Agreed on
Global Digital Compact provides valuable framework but lacks binding enforcement mechanisms
Shamira Ahmed
Speech speed
116 words per minute
Speech length
808 words
Speech time
414 seconds
Think tanks face diversity, equity and inclusion gaps in leadership and research that undermine GDC’s commitment to inclusion
Explanation
The global think tank ecosystem suffers from significant representation problems, particularly affecting Global South researchers and marginalized communities. This lack of diversity in leadership and research directly contradicts the GDC’s goals of inclusion and diversity, creating a fundamental barrier to effective implementation of the compact’s inclusive principles.
Evidence
Research on gaps in global think tank space shows lack of diversity, equity and inclusion in leadership and research; underrepresentation of marginalized groups and Global South researchers in policy and network space
Major discussion point
Implementation Gaps and Systemic Barriers
Topics
Development | Human rights | Sociocultural
Agreed with
– Jan Gerlach
Agreed on
Need for multi-stakeholder approaches in digital governance
Limited and unequal funding for Global South researchers restricts meaningful contribution to GDC implementation
Explanation
Systematic funding inequalities prevent Global South researchers and policy think tanks from participating meaningfully in GDC implementation processes. Organizations like the Partnership for Economic Policy have documented how Global South researchers are marginalized and not recognized, creating an uneven playing field that restricts their ability to contribute to digital governance discussions.
Evidence
Partnership for Economic Policy research highlighting Global South researchers are marginalized and not recognized; limited and unequal funding restricts ability to contribute meaningfully to GDC implementation; uneven working space prevents contribution to discussions
Major discussion point
Implementation Gaps and Systemic Barriers
Topics
Development | Economic | Human rights
Potential self-censorship occurs when private sector organizations fund research on digital economy policy solutions
Explanation
When private sector organizations fund research on creating equitable digital economy policies, it creates conflicts of interest that can lead to self-censorship among researchers. This dynamic can result in the minimization of contentious topics and may constitute a form of ethics-washing, where private actors use funded research to demonstrate accountability while potentially influencing the research outcomes.
Evidence
Private sector organizations funding research on creating equitable solutions and policy for digital economy; can create self-censorship and lead to minimizing contentious topics; can be form of ethics-washing where private actors use research as accountability to minimize risks; also occurs with international development organizations
Major discussion point
Implementation Gaps and Systemic Barriers
Topics
Economic | Legal and regulatory | Human rights
Jan Gerlach
Speech speed
141 words per minute
Speech length
656 words
Speech time
278 seconds
Wikipedia demonstrates effective community self-governance model supporting freedom of expression and equitable access to information
Explanation
Wikipedia operates through decentralized community governance where content integrity is maintained through consensus and open debate among volunteers rather than top-down control or commercial algorithms. This model shows how community-led governance can effectively support both freedom of expression and equitable access to information, offering an alternative to corporate or government-controlled platforms.
Evidence
Wikipedia recognized as digital public good and essential infrastructure; governed through community self-governance supporting freedom of expression and equitable access; content integrity fostered by consensus and open debate among volunteers rather than top-down or ad-driven models
Major discussion point
Digital Public Goods and Community Governance
Topics
Human rights | Freedom of expression | Sociocultural
Community-led governance and decentralized moderation offer effective alternatives to top-down content control models
Explanation
Decentralized governance models, as demonstrated by Wikipedia, provide effective alternatives to centralized content control systems. These community-driven approaches allow for more nuanced, context-sensitive decision-making about content and platform policies, while avoiding the problems associated with either corporate algorithmic control or government censorship.
Evidence
Wikipedia’s decentralized moderation and governance model; community consensus and open deliberation; volunteers building Wikipedia rather than top-down, ad-driven, or political control models
Major discussion point
Misinformation and Content Moderation Challenges
Topics
Sociocultural | Content policy | Human rights
Governments should support multi-stakeholder approaches and public policies that empower communities behind digital public goods
Explanation
Rather than increasing pressure on companies to make content decisions or moving toward exclusionary governance frameworks, governments should support the communities that build public interest projects like Wikipedia. This includes implementing smart platform regulation and privacy protections that enable safe community participation in building the digital commons.
Evidence
Advocacy for multi-stakeholder approaches to internet governance; support for digital public goods by empowering communities through smart platform regulation and privacy protections; recognition of communities who built non-commercial side of internet
Major discussion point
Digital Public Goods and Community Governance
Topics
Legal and regulatory | Human rights | Development
Agreed with
– Shamira Ahmed
Agreed on
Need for multi-stakeholder approaches in digital governance
Platform regulation should avoid forcing companies into content decisions and instead support open, community-driven ecosystems
Explanation
Current regulatory trends that pressure companies to make content decisions or incentivize closed ecosystems are counterproductive. Instead, regulation should support open, community-driven platforms and avoid creating walled gardens, particularly as AI systems increasingly depend on the digital commons that communities have built.
Evidence
Concern about governments putting pressure on companies to make content decisions; platform regulation forcing companies into content decisions or incentivizing closed ecosystems/walled gardens; AI built on top of digital commons; need to prioritize public interest, community-driven platforms
Major discussion point
Digital Public Goods and Community Governance
Topics
Legal and regulatory | Human rights | Sociocultural
Disagreed with
– Michael Kende
– Audience (UK Foreign Office)
Disagreed on
Approach to content moderation and platform regulation
Audience
Speech speed
164 words per minute
Speech length
549 words
Speech time
200 seconds
UK’s Online Safety Act implements corporate accountability by making platforms responsible for user safety while protecting freedom of expression
Explanation
The UK’s Online Safety Act represents a practical implementation of corporate accountability principles by requiring user-to-user and search services to implement systems that reduce risks of illegal activity and protect users, especially children. The act includes built-in safeguards for freedom of expression, demonstrating how regulation can balance safety and rights protection.
Evidence
UK Online Safety Act started coming into force in 2024; gives providers duties to implement systems and processes to reduce risk of illegal activity; strongest protections designed for children; platforms required to prevent children accessing harmful content; safeguards for freedom of expression built into the act
Major discussion point
Corporate Accountability and Digital Governance Framework Implementation
Topics
Legal and regulatory | Human rights | Children rights
Agreed with
– Hlekiwe Kachali
Agreed on
Corporate accountability requires transparency and impact assessments
Disagreed with
– Michael Kende
– Jan Gerlach
Disagreed on
Approach to content moderation and platform regulation
Digital accessibility compliance can be measured through systematic data collection to influence better GDC implementation
Explanation
The Global Initiative for Inclusive ICT conducts annual surveys to assess progress in digital accessibility for people with disabilities, leveraging existing regulations, standards, and 150 country panels of advocates. This systematic data collection approach provides a model for measuring GDC implementation effectiveness and can be used to advocate for better compliance with digital inclusion goals.
Evidence
Annual survey assessing digital accessibility progress for blind, deaf, and people with cognitive/physical disabilities; existing regulations, bodies, and standards make measurement relatively easy; 150 country panels of advocates reporting on their countries; data collection enables better advocacy for implementation
Major discussion point
Corporate Accountability and Digital Governance Framework Implementation
Topics
Human rights | Rights of persons with disabilities | Development
Agreements
Agreement points
Global Digital Compact provides valuable framework but lacks binding enforcement mechanisms
Speakers
– Hlekiwe Kachali
– Michael Kende
Arguments
The Global Digital Compact provides shared language and critical roadmap for implementation but faces challenges due to its non-binding nature
Global Digital Compact addresses digital trust and safety through international cooperation and media literacy curricula but lacks binding enforcement
Summary
Both speakers acknowledge that while the GDC offers important guidance, shared language, and comprehensive approaches to digital governance issues, its effectiveness is fundamentally limited by the lack of binding enforcement mechanisms on countries and companies
Topics
Legal and regulatory | Human rights | Development
Corporate accountability requires transparency and impact assessments
Speakers
– Hlekiwe Kachali
– Audience
Arguments
UNICEF released digital child rights impact assessment tool for private sector to evaluate their digital products
UK’s Online Safety Act implements corporate accountability by making platforms responsible for user safety while protecting freedom of expression
Summary
Both speakers support the implementation of systematic assessment tools and regulatory frameworks that require companies to evaluate and report on the impact of their digital products and services
Topics
Legal and regulatory | Human rights | Children rights
Need for multi-stakeholder approaches in digital governance
Speakers
– Shamira Ahmed
– Jan Gerlach
Arguments
Think tanks face diversity, equity and inclusion gaps in leadership and research that undermine GDC’s commitment to inclusion
Governments should support multi-stakeholder approaches and public policies that empower communities behind digital public goods
Summary
Both speakers emphasize the importance of inclusive, multi-stakeholder governance models that meaningfully include diverse voices, particularly marginalized communities and Global South perspectives
Topics
Human rights | Development | Sociocultural
Similar viewpoints
Both speakers present empirical evidence showing significant gaps in corporate accountability and inclusion in the digital sector, with the vast majority of companies failing to adequately address rights-based approaches
Speakers
– Hlekiwe Kachali
– Dio Herdiawan Tobing
Arguments
Only 17% of corporate reports from 195 companies across 95 countries mention child rights
Less than 14% of 200 digital technology companies contribute to digital inclusion despite the growing $16.5 trillion digital economy
Topics
Human rights | Economic | Development
Both speakers express concerns about current regulatory approaches to content moderation, though from different angles – one highlighting the challenges of platform immunity and the other advocating for community-driven alternatives to corporate content control
Speakers
– Michael Kende
– Jan Gerlach
Arguments
Section 230 protections create challenges for content moderation as platforms are immune from liability for user-generated content
Platform regulation should avoid forcing companies into content decisions and instead support open, community-driven ecosystems
Topics
Legal and regulatory | Content policy | Human rights
Both speakers identify systemic barriers in the international governance system that prevent effective implementation, whether through resource inequalities or overwhelming bureaucratic complexity
Speakers
– Shamira Ahmed
– Hlekiwe Kachali
Arguments
Limited and unequal funding for Global South researchers restricts meaningful contribution to GDC implementation
Multilateral system needs to address how various instruments relate to each other to avoid overwhelming reporting obligations
Topics
Development | Legal and regulatory | Human rights
Unexpected consensus
Community-driven governance models as viable alternatives to corporate or government control
Speakers
– Jan Gerlach
– Hlekiwe Kachali
Arguments
Wikipedia demonstrates effective community self-governance model supporting freedom of expression and equitable access to information
The Global Digital Compact provides shared language and critical roadmap for implementation but faces challenges due to its non-binding nature
Explanation
Despite coming from very different organizational perspectives (Wikimedia Foundation vs UNICEF), both speakers converge on the value of decentralized, community-driven approaches to digital governance as alternatives to top-down control mechanisms
Topics
Human rights | Sociocultural | Legal and regulatory
Integration of online and offline experiences requires holistic policy approaches
Speakers
– Hlekiwe Kachali
– Michael Kende
Arguments
Online bullying and sexual abuse have direct offline consequences including higher rates of self-harm in real life
Misinformation differs from disinformation and hate speech, while algorithmic bias creates automated discrimination in content curation
Explanation
Both speakers, from different domains (child rights vs internet policy), recognize that digital governance cannot be separated from real-world impacts and requires nuanced understanding of different types of harms and their interconnected effects
Topics
Human rights | Cybersecurity | Sociocultural
Overall assessment
Summary
Speakers demonstrated strong consensus on the need for more inclusive, accountable, and rights-based approaches to digital governance, while acknowledging significant implementation challenges including lack of binding enforcement, resource inequalities, and systemic barriers to meaningful participation
Consensus level
High level of consensus on principles and problems, with convergence around the need for multi-stakeholder approaches, corporate accountability mechanisms, and recognition that current frameworks are insufficient. The consensus suggests a shared understanding of both the urgency of digital governance reform and the complexity of implementation challenges, which could facilitate collaborative approaches to GDC implementation despite the identified barriers.
Differences
Different viewpoints
Approach to content moderation and platform regulation
Speakers
– Michael Kende
– Jan Gerlach
– Audience (UK Foreign Office)
Arguments
Section 230 protections create challenges for content moderation as platforms are immune from liability for user-generated content
Platform regulation should avoid forcing companies into content decisions and instead support open, community-driven ecosystems
UK’s Online Safety Act implements corporate accountability by making platforms responsible for user safety while protecting freedom of expression
Summary
Michael Kende explains how Section 230 creates challenges by protecting platforms from liability, making content moderation difficult. Jan Gerlach argues against forcing companies into content decisions and advocates for community-driven approaches like Wikipedia’s model. The UK representative presents their Online Safety Act as a solution that makes platforms responsible while protecting expression rights, representing a middle-ground regulatory approach.
Topics
Legal and regulatory | Human rights | Content policy
Unexpected differences
Role of private sector funding in research and policy development
Speakers
– Shamira Ahmed
– Hlekiwe Kachali
Arguments
Potential self-censorship occurs when private sector organizations fund research on digital economy policy solutions
UNICEF released digital child rights impact assessment tool for private sector to evaluate their digital products
Explanation
This represents an unexpected disagreement about engaging with the private sector. Ahmed warns about the risks of private sector funding leading to self-censorship and ethics-washing in research, while Kachali presents UNICEF’s collaboration with private sector through impact assessment tools as a positive development. This disagreement is significant because it reveals different approaches to private sector engagement in implementing digital governance frameworks.
Topics
Economic | Legal and regulatory | Human rights
Overall assessment
Summary
The discussion revealed moderate levels of disagreement primarily around implementation approaches rather than fundamental goals. Key areas of disagreement included content moderation strategies (regulatory vs. community-driven approaches), barriers to GDC implementation (non-binding nature vs. systemic inequalities), and private sector engagement (collaboration vs. concerns about capture).
Disagreement level
Moderate disagreement with significant implications. While speakers generally agreed on the importance of digital inclusion, human rights protection, and the value of the GDC framework, their different approaches to implementation could lead to conflicting policy recommendations. The disagreements suggest that successful GDC implementation will require reconciling different philosophical approaches to governance (top-down regulation vs. community self-governance) and addressing both structural barriers (funding inequalities) and legal frameworks (binding vs. non-binding instruments) simultaneously.
Partial agreements
Partial agreements
Similar viewpoints
Both speakers present empirical evidence showing significant gaps in corporate accountability and inclusion in the digital sector, with the vast majority of companies failing to adequately address rights-based approaches
Speakers
– Hlekiwe Kachali
– Dio Herdiawan Tobing
Arguments
Only 17% of corporate reports from 195 companies across 95 countries mention child rights
Less than 14% of 200 digital technology companies contribute to digital inclusion despite the growing $16.5 trillion digital economy
Topics
Human rights | Economic | Development
Both speakers express concerns about current regulatory approaches to content moderation, though from different angles – one highlighting the challenges of platform immunity and the other advocating for community-driven alternatives to corporate content control
Speakers
– Michael Kende
– Jan Gerlach
Arguments
Section 230 protections create challenges for content moderation as platforms are immune from liability for user-generated content
Platform regulation should avoid forcing companies into content decisions and instead support open, community-driven ecosystems
Topics
Legal and regulatory | Content policy | Human rights
Both speakers identify systemic barriers in the international governance system that prevent effective implementation, whether through resource inequalities or overwhelming bureaucratic complexity
Speakers
– Shamira Ahmed
– Hlekiwe Kachali
Arguments
Limited and unequal funding for Global South researchers restricts meaningful contribution to GDC implementation
Multilateral system needs to address how various instruments relate to each other to avoid overwhelming reporting obligations
Topics
Development | Legal and regulatory | Human rights
Takeaways
Key takeaways
The Global Digital Compact (GDC) provides shared language and a critical roadmap for digital governance but faces significant implementation challenges due to its non-binding nature
Corporate accountability in digital inclusion is severely lacking, with less than 14% of 200 major digital technology companies contributing to digital inclusion despite the $16.5 trillion digital economy
Children’s digital rights protection requires urgent attention as only 17% of corporate reports mention child rights, and online harms have direct offline consequences including increased self-harm rates
Content moderation faces fundamental challenges due to Section 230 protections that shield platforms from liability while giving them broad discretion over content decisions
Systemic barriers exist in the global think tank ecosystem, particularly affecting Global South researchers through limited funding, lack of diversity, and potential self-censorship
Community-led governance models like Wikipedia demonstrate effective alternatives to top-down content control, supporting both freedom of expression and content integrity
Digital accessibility compliance can be measured and tracked to influence better GDC implementation through systematic data collection
The distinction between online and offline experiences is increasingly meaningless as digital interactions have real-world consequences
Resolutions and action items
UNICEF released a digital child rights impact assessment tool that private sector organizations can use to evaluate their digital products
UK announced a review of their approach to ensuring responsible business conduct, incorporating insights from businesses, investors, trade unions, academia, and civil society
Global Initiative for Inclusive ICTs conducts an annual survey assessing digital accessibility progress and aims to have its next data ready by September to help track and advocate for GDC implementation
Participants encouraged to connect and collaborate on GDC implementation in their respective roles and organizations
Unresolved issues
How to make the non-binding GDC enforceable and create meaningful accountability mechanisms for tech companies
How to address the fundamental tension between content moderation needs and Section 230 protections
How to reform the multilateral system to better coordinate between various instruments (GDC, SDGs, and other frameworks) without overwhelming reporting obligations
How to address systemic inequities in the global think tank ecosystem that limit Global South participation in policy development
How to balance government regulation with community-driven governance models without stifling innovation or freedom of expression
How to ensure adequate funding and resources for Global South researchers and organizations to meaningfully participate in GDC implementation
How to prevent ethics-washing by private sector organizations funding research on digital economy policies
Suggested compromises
Implementing smart platform regulation that supports community-driven ecosystems while ensuring user safety, as demonstrated by the UK’s Online Safety Act approach
Balancing freedom of expression protections with content moderation needs through built-in safeguards in legislation
Supporting multi-stakeholder approaches to internet governance that include civil society and community voices alongside government and private sector interests
Using existing frameworks like UN Guiding Principles on Business and Human Rights as common standards while developing new digital-specific tools
Leveraging decentralized decision-making models that combine community self-governance with appropriate oversight mechanisms
Thought provoking comments
The famous Section 230 limits liability of intermediaries like platforms when publishing user-generated content… In hindsight, today, given the challenges we’re discussing, this may look a bit excessive or as causing some of the problems, but if we go back in history, Section 230 stemmed out of some cases against early platforms… So they came up with these 26 words of Section 230 that gave liability protection so that companies could choose their own content moderation.
Speaker
Michael Kende
Reason
This comment provided crucial historical context that reframed the entire discussion about platform accountability. By explaining the origins of Section 230, Kende showed how well-intentioned policies can create unintended consequences, adding nuance to what could have been a simplistic discussion about corporate responsibility.
Impact
This historical grounding set the tone for a more sophisticated discussion about the complexities of digital governance. It moved the conversation away from simple blame toward understanding systemic challenges, influencing subsequent speakers to consider implementation difficulties rather than just idealistic goals.
This is 2025. The UN is 79 years old… I have a nine-year-old nephew, and I believe that 2100 is enough of a horizon… That means what we do today should be useful, should be appropriate for 2100… The Convention on the Rights of the Child was adopted in 1989… those rights have remained immutable. They have not changed. Think though, WWW from 1989 or 1990 and what you see today… There has been a marked shift.
Speaker
Hlekiwe Kachali
Reason
This temporal framing was profoundly thought-provoking because it juxtaposed the permanence of human rights with the rapid evolution of technology. By using personal stakes (her nephew) and a 75-year horizon, Kachali transformed an abstract policy discussion into something urgent and human.
Impact
This comment fundamentally shifted the discussion’s temporal perspective from short-term implementation challenges to long-term sustainability. It influenced subsequent speakers to think about durability and adaptability of solutions, and established a more philosophical foundation for practical policy discussions.
We all lead digital lives… We must also remember that we as individuals and as corporations, we interact with each other generally without state mediation… And then, technology is an integral part of everything we do. It’s how we register our beneficiaries. It’s how we serve our beneficiaries. It’s how they hold us to account. All that to say is we cannot get away from it.
Speaker
Hlekiwe Kachali
Reason
This observation was insightful because it highlighted the fundamental shift in how society operates – that digital mediation has become inescapable and that private-public interactions now occur largely outside traditional regulatory frameworks. This challenged assumptions about how governance should work.
Impact
This comment pushed the discussion toward recognizing the inadequacy of traditional regulatory approaches and the need for new frameworks. It influenced later speakers to consider more innovative governance models and highlighted the urgency of the accountability challenge.
There’s a lack of diversity, equity and inclusion in leadership and research in the think tank space… There’s limited and unequal funding for Global South researchers… And also, there’s a risk of self-censorship, because organizations funding research on the digital economy are private sector organizations funding research on creating equitable solutions… it can be a form of ethics-washing.
Speaker
Shamira Ahmed
Reason
This was a crucial intervention that exposed the structural inequalities within the very systems meant to implement equitable digital governance. Ahmed revealed how funding structures and representation gaps could undermine the GDC’s inclusive aspirations, introducing uncomfortable but necessary complexity.
Impact
This comment significantly deepened the discussion by revealing systemic barriers to implementation that other speakers hadn’t addressed. It forced the conversation to confront not just what should be done, but who gets to decide and implement solutions, adding a critical power analysis dimension.
The way that Wikipedia is governed, namely through community self-governance that supports freedom of expression and equitable access to information, is an example of how decentralized models can lead to positive outcomes… We offer a model where content integrity is fostered by consensus and open debate and deliberation among volunteers… rather than top-down or ad-driven models.
Speaker
Jan Gerlach
Reason
This comment was insightful because it offered a concrete, successful alternative to the governance challenges discussed earlier. Instead of just critiquing existing systems, Gerlach presented a working model of decentralized governance that addresses many of the concerns raised about corporate accountability and community participation.
Impact
This intervention provided a constructive counterpoint to the earlier discussions about regulatory challenges and structural inequalities. It demonstrated that alternative governance models are not just theoretical but practically viable, influencing the conversation toward solution-oriented thinking and community empowerment.
What happens online has offline consequences, also known as the real world… UNICEF research found that kids who were bullied or sexually abused online had a much higher proportion of self-harm in real life. All that to say is we talk about online, we talk about offline, there isn’t really that much of a distinction anymore. It is the same life.
Speaker
Hlekiwe Kachali
Reason
This comment was particularly powerful because it challenged the artificial separation between digital and physical realms that often undermines policy effectiveness. By providing concrete evidence of real-world harm from online activities, it grounded the abstract policy discussion in human consequences.
Impact
This final intervention reinforced the urgency and stakes of the entire discussion, reminding participants that digital governance isn’t just about technology or economics, but about human welfare. It served as a compelling conclusion that tied together the various threads of accountability, implementation, and human rights.
Overall assessment
These key comments fundamentally shaped the discussion by progressively deepening its analytical sophistication. Kende’s historical context established that digital governance challenges are systemic rather than simply matters of corporate bad faith. Kachali’s temporal framing elevated the stakes and introduced long-term thinking, while her observations about digital-physical integration challenged traditional regulatory assumptions. Ahmed’s structural critique forced uncomfortable but necessary conversations about power and representation in implementation processes. Gerlach’s Wikipedia example provided hope and concrete alternatives to top-down approaches. Together, these interventions transformed what could have been a superficial discussion about policy implementation into a nuanced exploration of governance challenges, structural inequalities, and innovative solutions. The comments built upon each other to create a comprehensive analysis that acknowledged both the complexity of the challenges and the possibility of meaningful progress through community-centered approaches.
Follow-up questions
How should governments and multilaterals come up with public standards that hold tech and private companies accountable in advancing digital inclusion?
Speaker
Dio Herdiawan Tobing
Explanation
This is the central question of the roundtable discussion, addressing the gap between policy frameworks like the GDC and actual corporate accountability mechanisms
How can global governance instruments like the GDC meaningfully influence platform design, content moderation, and policies to uphold freedom of expression while mitigating harms from misinformation and algorithmic bias?
Speaker
Dio Herdiawan Tobing
Explanation
This addresses the challenge of making non-binding international frameworks effective in influencing actual platform operations and design decisions
How can the GDC be leveraged to ensure that digital technologies are designed and governed to protect children’s rights and bridge the digital divide affecting access to education?
Speaker
Dio Herdiawan Tobing
Explanation
This focuses on the specific implementation challenges for child protection in digital spaces, given that children are increasingly active digital users
What is the role of policy think tanks and scholars in bridging the gap between international digital governance frameworks like the GDC and actual on-the-ground implementation?
Speaker
Dio Herdiawan Tobing
Explanation
This addresses the implementation gap and the role of intermediary organizations in translating high-level policy into actionable strategies
How does Wikimedia Foundation see its role in shaping GDC implementation, particularly in safeguarding freedom of expression and ensuring credible access to information?
Speaker
Dio Herdiawan Tobing
Explanation
This explores how organizations championing free and open knowledge can contribute to implementing digital governance frameworks
How do we assign regulatory responsibilities in an era where we’re all digitally dependent and everything is digitally mediated?
Speaker
Hlekiwe Kachali
Explanation
This addresses fundamental questions about governance structures needed for a digitally dependent society and the intersection between private and non-private sectors
How do we take lofty aspirations from documents like the GDC and make them operational and actionable so they make a difference in people’s lives?
Speaker
Hlekiwe Kachali
Explanation
This addresses the implementation gap between high-level policy commitments and concrete actions that impact communities
How do we reform the existing inequities and risks in the global think tank ecosystem before leveraging think tanks as enablers for GDC implementation?
Speaker
Shamira Ahmed
Explanation
This highlights the need to address structural inequalities in policy research and think tank representation, particularly for Global South researchers
How do we ensure adequate data collection and measurement of GDC implementation progress across different countries and sectors?
Speaker
Axel Leblois
Explanation
This addresses the need for systematic monitoring and evaluation mechanisms to track actual implementation of GDC guidelines
How do we think through how different international instruments (GDC, SDGs, other frameworks) relate to each other and avoid duplicative reporting obligations?
Speaker
Hlekiwe Kachali
Explanation
This addresses the challenge of coordinating multiple international frameworks and reducing the burden of overlapping compliance requirements
Disclaimer: This is not an official session record. DiploAI generates these resources from audiovisual recordings, and they are presented as-is, including potential errors. Due to logistical challenges, such as discrepancies in audio/video or transcripts, names may be misspelled. We strive for accuracy to the best of our ability.