WS #231 Address Digital Funding Gaps in the Developing World
Session at a glance
Summary
This discussion, hosted by the APNIC Foundation at the Internet Governance Forum 2025 in Oslo, focused on addressing digital funding gaps in the developing world. The panel brought together representatives from various organizations including the Tech Global Institute, Swiss Federal Department of Foreign Affairs, ICANN, GIZ, and the APNIC Foundation to examine challenges in internet and digital development funding.
The panelists highlighted that despite 20 years of progress since the World Summit on the Information Society, significant disparities persist, with 32% of the world’s population remaining unconnected and stark differences between global north (93% connectivity) and global south (27% connectivity) regions. A concerning trend has emerged where official development assistance, philanthropic giving, and corporate funding are declining simultaneously, creating unprecedented funding gaps for digital development initiatives.
Key challenges identified include the creation of new digital divides, forced internet shutdowns affecting marginalized populations, and the tension between digital transformation goals and human rights protection. The discussion emphasized that meaningful digital transformation requires more than just infrastructure deployment—it needs local capacity building, community ownership, and alignment with national development priorities rather than externally imposed solutions.
Panelists stressed the importance of shifting from traditional donor-recipient models toward collaborative partnerships that respect local sovereignty and decision-making. They advocated for outcome-focused rather than output-focused approaches, transparency in funding mechanisms, and the mainstreaming of digital solutions across all development sectors including health, education, and climate resilience.
The conversation concluded with calls for greater coordination among stakeholders to avoid duplication of efforts and maximize impact despite shrinking resources. Participants emphasized that digital inclusion should be treated as a fundamental development issue requiring collective action and innovative funding models to ensure sustainable progress across developing regions.
Key points
## Major Discussion Points:
– **Persistent Digital Divides and Funding Challenges**: The discussion highlighted that despite 20 years of progress since WSIS, 32% of the world’s population remains unconnected, with stark disparities between Global North (93% connected) and Global South (27% connected). This is compounded by declining Official Development Assistance (ODA), reduced philanthropic giving, and shrinking corporate funding for digital development.
– **Need for Structural Change and Local Ownership**: Panelists emphasized moving beyond traditional donor-recipient models toward approaches that prioritize local capacity building, government sovereignty in determining digital futures, and community-driven solutions. There was strong criticism of supply-oriented approaches that impose solutions without considering local contexts and procurement processes.
– **Collaboration Over Competition**: A recurring theme was the urgent need to reduce duplication of efforts among organizations and instead focus on meaningful collaboration. Speakers called for breaking down silos, sharing resources, and creating collective impact frameworks rather than competing for the same objectives with shrinking funding pools.
– **Digital Transformation as Cross-Sectoral Enabler**: The conversation explored how digital transformation should be mainstreamed across all development sectors (health, education, climate, etc.) rather than treated as a separate infrastructure challenge. This includes ensuring that digital solutions contribute to broader socio-economic outcomes and sustainable development goals.
– **Balancing Digital Development with Human Rights**: Panelists discussed the tension between rapid digital transformation and human rights protection, emphasizing that meaningful access must include not just connectivity but also digital literacy, affordability, cybersecurity, and the ability for communities to shape their own digital futures while maintaining fundamental rights.
## Overall Purpose:
The discussion aimed to address critical funding gaps in digital development across the Global South and developing regions, bringing together diverse stakeholders to explore innovative funding models, partnership approaches, and collaborative frameworks that could ensure sustainable progress in internet and digital development despite declining traditional funding sources.
## Overall Tone:
The discussion maintained a professional yet candid tone throughout, with speakers demonstrating both concern about current challenges and cautious optimism about potential solutions. The tone was collaborative and solution-oriented, with panelists building on each other’s points rather than debating. There was an underlying sense of urgency about the widening digital divides, but this was balanced by constructive dialogue about practical approaches. The conversation became increasingly inspiring toward the end, with speakers drawing parallels to successful global health initiatives and emphasizing the potential for collective action to overcome seemingly insurmountable challenges.
Speakers
**Main speakers:**
– **Neeti Biyani** – Works with the APNIC Foundation, Session moderator/host
– **Sabhanaz Rashid Diya** – Executive Director at Tech Global Institute, working at the intersection of government, businesses and civil society
– **Remy Friedmann** – Senior Advisor Human Security and Business at the Swiss Federal Department of Foreign Affairs
– **Maarten Botterman** – Director on the Board of Directors at ICANN (participating online)
– **Franz von Weizsaecker** – Responsible for Economic Development and Digital Transformation at GIZ
– **Raj Singh** – CEO of the APNIC Foundation
– **Audience** – Various audience members who asked questions and provided comments
– **Online moderator** – A colleague named Omar, serving as online moderator
**Additional speakers:**
– **Molly** – Works with the Digital Health and Rights Project; released a research report on ODA donors in Europe
Full session report
# Addressing Digital Funding Gaps in the Developing World: A Comprehensive Discussion Report
## Executive Summary
This discussion, hosted by the APNIC Foundation at the Internet Governance Forum 2025 in Oslo, brought together experts to examine critical challenges surrounding digital funding gaps in developing regions. The session, moderated by Neeti Biyani from the APNIC Foundation, featured representatives from the Tech Global Institute, Swiss Federal Department of Foreign Affairs, ICANN, GIZ, and the APNIC Foundation.
The discussion revealed persistent digital divides despite decades of progress. With 32% of the world’s population remaining unconnected and stark disparities between the Global North (93% connectivity) and Global South (27% connectivity), the challenge has been compounded by declining official development assistance and reduced corporate funding for digital development initiatives.
Key themes included the need to shift from traditional donor-recipient models toward collaborative partnerships, the importance of mainstreaming digital solutions across all development sectors, and the tension between rapid digital transformation and human rights protection. Speakers emphasized moving from capacity building to capability building and focusing on sustainable outcomes rather than project outputs.
## Key Participants and Their Main Contributions
### Neeti Biyani – APNIC Foundation (Session Moderator)
Biyani framed the discussion around the need for digital transformation that serves whole-of-society and whole-of-government approaches. She emphasized that meaningful digital transformation must lead to better lives and socio-economic outcomes, questioning the balance between government interventions and market-driven solutions.
### Sabhanaz Rashid Diya – Tech Global Institute
Diya provided critical statistics showing that 32% of the world’s population remains unconnected, with 93% connectivity in the Global North versus 27% in the Global South. She highlighted concerning trends including “296 shutdowns in 54 countries” and identified an “unhealthy tension between digital development and human rights.” Diya criticized the “culture of imposition” where Global North actors decide what gets implemented in Global South countries, describing this as a “norm shapers versus norm takers dynamic.”
### Remy Friedmann – Swiss Federal Department of Foreign Affairs
Friedmann emphasized Switzerland’s approach to partnership, inclusiveness, and community-driven solutions. He stressed that governments have a responsibility to protect human rights while setting digital standards and safeguards. He advocated for mainstreaming digital exclusion within climate, gender, education, and health agendas rather than treating it as a separate issue, calling for collective impact frameworks that center local actors.
### Maarten Botterman – ICANN (Online Participant)
Botterman represented ICANN’s “one world, one internet” vision through capacity building and regional outreach programs. He highlighted ICANN’s Coalition for Digital Africa and engagement with 41 African governments. He emphasized that local capacity building and understanding are crucial for successful digital inclusion initiatives, with communities needing to determine their own digital transformation priorities.
### Franz von Weizsaecker – GIZ
Von Weizsaecker noted that German development funding faces a 10% decline while some philanthropic funding attempts to compensate. He advocated for mainstreaming digital transformation across all sectors as an enabler for sustainable development goals. He warned about potential waste of global public goods like low earth orbit satellite infrastructure due to corporate monopolization and called for open ecosystem approaches.
### Raj Singh – APNIC Foundation
Singh highlighted that despite decades of work, “we seem to be creating new digital divides constantly” and questioned whether the development community is solving problems or perpetuating them. He advocated for a fundamental shift from “capacity building to capability building,” explaining that “when you have capabilities, then you can do things.” Singh noted the APNIC Foundation’s 16-year-old innovation fund and criticized significant duplication of efforts among organizations despite shrinking funding resources.
## Major Discussion Topics
### Persistent Digital Divides and Funding Challenges
The discussion opened with stark statistics highlighting ongoing connectivity gaps. Diya’s presentation showed that despite 20 years of progress since the World Summit on the Information Society, significant portions of the global population remain unconnected, with particularly acute challenges in the Global South.
Singh added regional context, noting that the Asia Pacific region presents unique challenges with the most advanced and least developed economies coexisting. He specifically highlighted that “half of the South Asia sub-region remains unconnected” and observed that “there’s a lot of submarine cables being deployed all across the world. The problem is, it’s the cables that are being deployed. There’s no supporting ecosystem that’s being set up at the same time.”
Multiple speakers acknowledged declining resources across traditional funding streams. Von Weizsaecker reported the 10% decline in German development funding budgets, while Singh noted that private sector funding is shrinking due to margin pressure and reduced corporate social responsibility investments.
### Power Dynamics and Local Ownership
A significant portion of the discussion focused on power imbalances in digital development. Diya’s analysis of the “culture of imposition” highlighted how Global North actors often determine solutions for Global South contexts. She suggested that when communities gain the ability to define conditions for accepting or rejecting interventions, “we see a little bit more of that negotiation, a little bit more of that empowerment and that ownership happening.”
Singh supported this perspective by distinguishing between capacity and capabilities: “It’s no longer for us… it’s not about building capacity anymore, it’s about building capabilities. Because when you have capabilities, then you can do things.” This represents a shift from training-focused approaches toward enabling actual implementation and sustainable outcomes.
### Digital Transformation as Cross-Sectoral Enabler
Speakers emphasized viewing digital transformation not as standalone infrastructure but as an enabler across all development sectors. Von Weizsaecker stated that “digital transformation should be mainstreamed across all sectors as an enabler for sustainable development goals.”
Friedmann reinforced this view, emphasizing that “digital exclusion should be mainstreamed within climate, gender, education, and health agendas.” This mainstreaming approach suggests that digital solutions should contribute to broader socio-economic outcomes rather than being pursued independently.
### Human Rights and Development Tensions
The discussion addressed tensions between rapid digital transformation and human rights protection. Diya identified this as an “unhealthy tension,” noting that digital transformation often comes with “disconnection trends through forced shutdowns and network throttling” that particularly affect marginalized populations.
Friedmann offered a governmental perspective, emphasizing that “governments have responsibility to protect human rights whilst setting digital standards and safeguards,” suggesting that human rights considerations should be integrated into digital development from the outset.
## Key Challenges Identified
### Coordination and Duplication
Singh highlighted the problem of “multiple organizations doing similar work with little actual collaboration despite funding constraints.” He questioned whether progress had been made on collaboration, asking “In six months, have we gone any step forward or not in terms of collaboration and avoiding duplication?”
### Infrastructure Without Ecosystems
The discussion revealed concerns about infrastructure deployment without supporting systems. Singh noted that submarine cables are being deployed without supporting ecosystems, while Von Weizsaecker warned about potential waste of satellite infrastructure due to corporate monopolization.
### Declining Traditional Funding
Multiple speakers acknowledged the convergence of declining official development assistance, reduced philanthropic giving, and shrinking corporate funding, creating unprecedented challenges for digital development initiatives.
### Transparency and Accountability
An audience member from the Digital Health and Rights Project asked: “How can we work together as funders and other stakeholders, with transparency in investment amounts, where they’re being invested and who we’re working with?” This highlighted the need for better tracking and coordination mechanisms.
## Proposed Solutions and Next Steps
### Outcome-Focused Approaches
The APNIC Foundation committed to shifting focus from outputs to outcomes in their grant-making and project evaluation. Singh offered to share expertise on outcome-focused metrics with other organizations, including ICANN.
### Collaborative Frameworks
Friedmann proposed collective impact frameworks that center local actors while bringing together funders, implementers, and communities into strategic alignment. This approach could help reduce duplication while respecting different organizational mandates.
### Mainstreaming Digital Solutions
Multiple speakers advocated for integrating digital solutions across all development sectors rather than treating them as separate initiatives. This requires coordination across health, education, climate, and gender programs.
### Open Ecosystem Approaches
Von Weizsaecker proposed open ecosystem approaches to infrastructure development, particularly for satellite networks, to ensure that global investments automatically benefit underserved regions.
## Audience Participation and Questions
The session included active audience participation, with questions about transparency in digital health investments and the challenges private companies face when trying to offer solutions in developing regions. A representative from a Norwegian company highlighted how skepticism and lack of trust make it difficult for private companies to offer even low-cost solutions to developing regions.
These questions underscored ongoing challenges in creating effective mechanisms for private sector engagement in development contexts while maintaining focus on sustainable, locally-owned solutions.
## Conclusion
The discussion revealed that addressing digital funding gaps requires more than mobilizing additional resources—it demands fundamental changes in how digital development is conceptualized and implemented. Key themes included shifting from donor-recipient to partnership models, from capacity building to capability building, and from outputs to outcomes.
While speakers identified significant challenges including declining funding, persistent coordination problems, and ongoing power imbalances, they also outlined concrete steps forward. These include the APNIC Foundation’s commitment to outcome-focused approaches, proposals for collective impact frameworks, and emphasis on mainstreaming digital solutions across all development sectors.
The conversation concluded with recognition that success will depend not just on what is done, but how it is done—with genuine partnership, respect for local priorities, and focus on sustainable outcomes that serve the communities digital development aims to support. As Singh observed, the challenge is ensuring that decades of development work actually solve problems rather than perpetuate them, requiring honest assessment and willingness to make fundamental changes in approach.
Session transcript
Neeti Biyani: Hello, a very good afternoon to everyone who’s joining us in person in Oslo and good morning or good evening if you’re joining us online. I am Neeti Biyani. I work with the APNIC Foundation and I am going to be hosting this session today, which is titled Addressing Digital Funding Gaps in the Developing World. Considering we do have the scope for a roundtable set up here, I’d request if anyone wants to join us on stage so that we can have a more candid, informal conversation, I’d like you to be as big a part of it as we are. So if anyone wants to please come on stage and join us here, please feel free. All right. Let me just start out by introducing the APNIC Foundation, a little bit about what we do. We are an internet and digital development organisation. We serve across 56 economies in the Asia-Pacific region. We invest in and channel resources towards building technical and human capabilities. We help drive digital innovation and we enable digital transformation across the region, working with a host of different stakeholders and partners. I am privileged to be hosting this conversation on addressing gaps in funding for digital development at the IGF 2025 in Oslo today, this afternoon. As we all know, we’ve seen quite a rapid decline in official development assistance, ODA, philanthropy giving, corporate giving, all of which have together exacerbated gaps in funding available for internet and digital development or ICT for development in the global majority especially. In this context, I hope that you know, as various stakeholders present in the room today, we can discuss, have a rethink of funding models, partnerships and collective impact to ensure that we are impacting sustainable progress and outcomes across the various regions in the developing world. I am joined today by a remarkable panel. I’ll maybe start from my absolute left. I have Sabhanaz Rashid Diya. 
She is Executive Director at Tech Global Institute, working at the intersection of government, businesses as well as civil society. To my left is Remy Friedmann, who is Senior Advisor Human Security and Business at the Swiss Federal Department of Foreign Affairs. I am joined online by Maarten Botterman, who is a Director on the Board of Directors at ICANN. On my right, I am joined by Franz von Weizsaecker. Did I completely murder your last name? He is responsible for Economic Development and Digital Transformation at GIZ. And to my absolute right is Raj Singh, CEO of the APNIC Foundation. And I am Neeti, your moderator today. So, I maybe want to start out by asking Diya: how do you understand digital development efforts across the global majority, across the developing regions so far, maybe, you know, in the last couple of decades? For the establishing of a common understanding of digital development, of internet development, how do we understand that collectively?
Sabhanaz Rashid Diya: Thanks, Neeti. And hi, everybody. Very privileged and excited to be on this panel. I think just diving right into it, you know, if you take the last 20 years, perhaps, starting with the original commitments of the WSIS process, there was a real sort of global understanding around bringing the unconnected online, you know, supporting digital transformation, which is particularly important, and addressing some of the issues in digital divides, gender, you know, sort of creating a more equitable world that is digitally transformative. We’ve seen a tremendous amount of progress. We’ve seen sort of, you know, more countries coming online, communities coming online. There has been a huge push to create jobs, create sort of development outcomes, you know, tackle some of the most pressing issues in society using technology. But at the same time, I think, you know, 20 years later, we’re still at a point where 32% of the world’s population remains unconnected. And, you know, that continues to be a challenge. And the global north-south disparity is still quite prevalent. In the global north, we have 93% of people who are online. In the global south, we only have 27% of people who are online. So that disparity is very much present. And we know, you know, from our work and communities that the people who are most disproportionately affected are women, girls, marginalized populations, you know, minority groups, et cetera, who just are not able to come online, whether that’s for device affordability, whether that’s because of, you know, lack of the actual infrastructure being able to reach certain parts of the world. And so we have a number of challenges, which go beyond just the infrastructure, but also many structural challenges. And so I think there’s still a lot of need for that push to happen.
But also in the last 10 years, I would say, there’s also been trends of not just people not having the resources to come online, but also, when people are online, a tendency to disconnect them. So we talk a lot about the unconnected population, but there is also a tendency of disconnecting populations through forced shutdowns or network throttling. And that’s also a huge concern. And so just based on Access Now’s report, last year alone, there were 296 shutdowns in 54 countries. So that’s an extraordinary number. So even when there is this push in the broader development agenda to get more people online, there’s also this counter, I would say, sort of trend where people are getting disconnected. And in that sort of a situation, I think we’re at a pivotal moment where more and more people, especially in the global south, feel the need to kind of catch up with the global technology race. They’re seeing some countries moving very fast, some countries slowing down. And so I think at this point, and especially in line with the discussion we’re going to have today, the funding reductions we see around the world, the older systems falling, is a really concerning trend, because in many ways, that’s going to really set back years of progress that has been made. And now more than ever, we need that funding to happen, because that disparity between the global north and south, whatever progress we’ve made, is probably going to widen a lot more. And it’s going to put an entire part of the world, perhaps the majority part of the world, in a position where they’re completely left out of any kind of technological progress and any kind of opportunities to see the benefits of digital in their countries. And so I think this broad connection between resource availability, you know, the country’s aspirations for development and broader economic progress is very much at risk today, and I’m happy to speak more about it.
Neeti Biyani: Thanks Diya. I think just bouncing off of what we’ve heard so far, let me turn to Raj. Raj, if you can speak on behalf of the APAC region, which is one of the largest developing regions in the world with the bulk of the unconnected, underserved, remote, dispersed populations. How do you understand, you know, Diya’s reference to the changes that we’ve seen in the funding landscape quite recently?
Raj Singh: So you’re quite right to say that the Asia Pacific is very diverse. It’s the world’s largest region. And just before this session, I was speaking in the parliamentary session and what I said there was, and I’ll just repeat that here, the fact that the Asia Pacific as an example, you know, we’ve got some of the most advanced economies in the world and we’ve got some of the least developed economies in the world. And that in itself becomes a challenge. Now, overlaid on top of that is that, you know, you see parts of the world leaping ahead in various types of new technologies, be it, you know, the government itself either invests in it or they’ve got very mature industries or the private sector who are taking the lead in moving forward. Then you have all these other economies that are nowhere near that level of development. And then, you know, for example, you go to an event like the IGF or to various other multilateral meetings and other conferences around the world and you hear all these stories, people say, no, we’re doing this, we’re doing that. AI does this, you know, IoT or whatever the next iteration is going to be, it’s going to change everything and this is what you need to do. So suddenly you have here an economy which probably doesn’t even have a fully-fledged policymaking unit
in their regulatory section that can actually develop policies and shape what things should look like, because most of the time they’re working off a reactive basis, right? Something happens, they need to react to that. So it’s what I call reactive policy making, not proactive policy making. So that’s just to sort of lay the foundation there of why I think this is a problem, right? Then you have the different levels of development that exist, and again, Asia Pacific is pretty much a poster child. South Asia, which is one of the four sub-regions in the Asia Pacific, half of that sub-region is still unconnected, okay, which is a very stark statistic. Now, there are various challenges on why that is the case, but the fact still remains, half that sub-region is unconnected. And if someone’s heard me speak before, you would know that I always bring up one thing, and that’s that we seem to be creating new digital divides constantly. We’re not stopping. And I’ve been in this sector, this industry, for pretty much all my career, so going on close to 30 years now. We were talking about stuff 30, 20 years ago in a slightly different guise, it was ICT4D. We’re still talking about the same issues. Some of those issues have not been solved. I mean, last year we did a couple of panels at the IGF and we brought up these issues again. Given it’s only been six odd months since the last IGF, the fact is, you know, nothing has changed all that much. Now, on top of all that, what we’ve seen in the front end of this year, there’s been changes made globally in how things have been funded. ODA, for example, overseas development assistance. So that’s one pot of money that has shrunk. But if you look at the private sector, even their own pots of money are shrinking, because there’s margin pressure, right? There was a time when a lot of the private sector would go and fund things out of CSR or other reasons. What we are seeing right now, that’s shrinking very, very rapidly. And then governments themselves, they
have their own priorities. I mean, you know, do they invest in primary health care, or do they invest in a funky new internet infrastructure, right? So that’s sometimes a hard decision to make. So I’m just going to let that hang there, because, you know, what I’m keen to see is, and Franz was with me on the panel last year, we had some interesting discussions on where we should get to, and which means, you know, don’t duplicate stuff, work together, collaborate, complement. But I’m keen to understand, in six months, have we gone any step forward or not? I don’t know.
Neeti Biyani: Thanks, Raj. We’ll definitely come to that in the course of this conversation. I think at this point, let me turn to Maarten. Maarten, you’re online. Hopefully, you can hear me, even though I can’t see you at the moment. But, you know, you’re here representing ICANN. I see you. Hi, Maarten. You’re here representing ICANN, wearing a few more hats, I’m very sure. ICANN’s had a particularly focused approach to supporting initiatives and developments across the world, which is that you want to further your vision of a single, open, globally interoperable internet. I just want to ask you, bouncing off of what we’ve heard from Raj, and, you know, the particular challenges that, you know, either being unconnected or, you know, not having meaningful connectivity or meaningful access. How do you interpret those in terms of ICANN’s mission, the role that ICANN plays, as well as how you envision ICANN’s impact across the landscape, across the sector?
Maarten Botterman: Thanks, Neeti, for the question. It’s indeed, as you say, one world, one internet. And ICANN is there to serve the world. Raj is representing the APAC region. As we know today, most of the internet users live in the APAC region, whereas 25 years ago, most of them lived in North America. So you see a shift in the world. This also means that there’s a shift in the world in terms of experience, where the markets are. And we’re very much aware that we are there to support the world. And that means that in countries where advancement of the internet is less, there’s more to win. And we actually actively reach out to support that. An example is the continent of Africa. The continent of Africa is where, percentage-wise, there’s most growth happening over the years to come in terms of people also getting connected. That is where we see many of the next billion, as well as in other parts of the world, including APAC. But in Africa, we are, for instance, actively engaged in a capacity-building initiative, which is called the Coalition for Digital Africa, which is really to support all kinds of transformative projects that aim at enhancing Africa’s internet infrastructure, cybersecurity, digital inclusivity, and governance. So initiatives we have there include engaging with about 41 African governments, but also with, for instance, the African University Collaboration Group. And things we’ve been doing there include, together with the Internet Society, for instance, setting up new connection points for the internet, installing new root server instances of the ICANN-managed L-root, which now keeps most of the traffic within the region, and also expanding top-level domain performance monitoring in Africa. All this to help Africa to also step up and grab the opportunities.
Next to that, as you know, ICANN is a multi-stakeholder organisation in which different groups have their place, and we also do capacity development for governments. We help them get on board with the GAC, the Governmental Advisory Committee, and become aware of law-making practices, and we have worked with people from over 19 countries on the technical functioning of the internet. Now, this is just a regional example, and we have regional outreach throughout the world, because we are very much aware that we are there to serve the world. A very clear example of that would be the next round of top-level domains, where there will be an opening for new initiatives to have a top-level domain that serves specific regions around the world, for instance in their own character sets and languages. Languages are important; they represent culture. And not everybody in the world is able, or should be expected, to communicate in English. So it’s good that we also support that richness of languages and character sets, and we actively do, also in the new round. We are also aware that for regions that aren’t that advanced yet, extra support is necessary. We foresee that the applicant support programme that has been put in place will help those who are less prepared at the moment to get up to speed with appropriate applications and good business plans. So with that, we truly believe in one world, one internet, and ICANN stands ready to support that effectively.
Neeti Biyani: Thanks Maarten. I think, Franz, if I can come to you. We’ve heard from Maarten that, you know, their support starts with squarely serving the mission of a single, open, interoperable internet. Correct me if I’m wrong, but I feel like GIZ had a slightly different approach, where you support bettering and furthering socio-economic progress and outcomes across the service regions you work in. But you’ve, since 2018, mainstreamed digital solutions as part of the projects that you do support, wherein you’re trying to make sure that every project has a digital solution, a digital infrastructure component that then can be mainstreamed or replicated or scaled. What’s your experience been like working on real, very real world issues, if I may call them that, but with, you know, squarely mainstreamed digital solutions?
Franz von Weizsaecker: Thank you so much. Maybe let me get one thing straight: it shouldn’t come across as if I was not in favor of a free and open internet for everybody. That definitely is part of our agenda. And the general trends in development funding are as you described initially: official development assistance is going down, looking at USAID and so on. At the same time, some philanthropic funding is coming in, with the Gates Foundation trying to compensate for some of those declines. And just today, we got the draft budget of the German government: we have about a 10 percent decline in the German development funding budget of the BMZ ministry, which is not as disruptive as in some other contexts. So indeed, as you mentioned, digital transformation is a mainstreaming topic across our entire portfolio, for achieving all the sustainable development goals, the goals of the African Union’s Agenda 2063, or various national development agendas. There are several mainstreaming topics that development actors work on: gender mainstreaming, human rights mainstreaming and digital mainstreaming. And to be honest, digital mainstreaming is possibly the most successful of those, because for every line ministry and every initiative, in health, in education, and throughout all the sectors, the Internet has become a key enabler for achieving the sustainable development goals and a factor for economic growth, for trade, and for all the other goals. At the same time, some of the goals are conflicting. If you look at the climate goals, of course, we have huge energy consumption and corresponding carbon emissions resulting from AI, from data centers, from digital infrastructures.
But then if you compare how much carbon emission per unit of economic activity you have in the digital sector, that is still much less than in more traditional industries, in mining, manufacturing and all these other economic activities. So you could say the GDP per unit of carbon emission is still quite good in the digital economy by comparison. And yes, we do try to achieve these sustainable development goals in a context where our funding governments, in Germany and in Europe, are going through political shifts we have to deal with. If you look at the European Union, the major initiative is called the Global Gateway. That is the attempt to leverage private capital as well, including the private capital of European investors, for the achievement of development goals in infrastructure: energy, digital, data centers, IXPs. So part of what we are doing is trying to leverage this private capital for the achievement of the development goals, knowing that we cannot entirely rely on public ODA funding all the time. That is maybe the big trend going forward. Of course, at the same time, any private capital depends on the regulatory and investment environment being ready for it. And I cannot say that this is the case in all of the countries that we work with; European investors may be very hesitant to come in and invest in countries where the general investment environment doesn’t seem to be ready. So part of the risk of this tendency is that we lose out especially in countries where the security situation or the general investment environment is not good. That is indeed a big open question of how we best deal with that. For the Asian region, a very large part of it is very ready for private investment, including international FDI.
However, not all of the places, and especially in Africa, many countries, private investors would be concerned to be putting their money in. So, it’s about de-risking from using development banks’ mechanisms, de-risking these private investments, and then, of course, using the ODA where it is needed, where there is no alternative with this, for example, global gateway investment package.
Neeti Biyani: Thanks, Franz. Having spoken of EU governments, we happen to have a representative on our panel. Remy, if I can turn to you, last but not least. How do you see the role of the Swiss government in the debate on digital development, building the digital rights narrative, contributing to this ongoing conversation about more and more people having access, what kind of access, et cetera? While there is an infrastructure and connectivity and access question, there is also a quality-of-access question. So, would you want to comment on that from the Swiss government’s perspective?
Remy Friedmann: Thank you, Neeti, and thank you for inviting me to this very interesting conversation. Hello, everyone. You’ve put me in a slightly awkward position: Switzerland is not an EU member, so I’ll speak for Switzerland, not for the EU. But anyway, Switzerland’s international cooperation is guided by several principles, including promoting sustainable development, alleviating poverty, and fostering peace and human rights. This is embedded in our international cooperation strategy, which is currently being renewed. And we really emphasize partnership and inclusiveness. Switzerland strives to strengthen the digital capacities of its partner countries to improve the resilience of public services and civil society, but risks like the digital divide, which worsens inequalities, must still be addressed. As Switzerland, we are committed to establishing strong digital governance frameworks aligned with international law and the different processes, ensuring fair and secure data use that protects individual dignity and safety. We are all here committed to advancing digital inclusion through open, rights-respecting and sustainable approaches, but we must recognize that the declining landscape of ODA and philanthropic capital poses a serious challenge, and also an urgent call for innovation and coordination in the way we work. You asked what we are striving for: we all believe that meaningful Internet access is not a luxury; it is a foundational element of economic development, democratic participation and community resilience. Switzerland is a member of the Freedom Online Coalition, and in line with the donor principles for human rights in the digital age that the Coalition published in 2023, we support efforts that are transparent, locally grounded and aligned with broader development goals.
So, to this end, we see value in pooling resources, whether through catalytic funding, regional investment mechanisms or blended finance, in order to unlock scalable and context-sensitive solutions. But most importantly, we need to ensure that these solutions are community-driven, inclusive and capable of strengthening local digital ecosystems rather than creating dependency. So really, this session is an opportunity to rethink not just how we fund, but how we work together. Let’s see how we can construct a collective impact framework that builds on existing knowledge, centring on local actors and bringing together funders, implementers and communities into strategic alignment. These are ambitious goals, and a big question as well. So let’s see how we can do so to more effectively close the digital divide and ensure that communities are not left behind in this digital transformation. A question for all of us: how can we collectively address this growing gap and ensure that digital progress benefits everyone, not just a few?
Neeti Biyani: Thank you, Remy. I think you did my job there. That was going to be my next question, actually. So maybe before I just go back to some of the speakers, I wanted to open up the question to the audience and ask if you wanted to step in at this point and share any ideas, any reflections you had about, you know, how do we bridge the gaps that we see in internet development, digital development at the moment? Remy brought in, you know, a very key element of what the rest of the conversation is hopefully going to focus on, which is establishing a collective impact framework or ways of thinking where we are not competing, but we’re sharing space and we’re working with one another to hopefully make change, transformation, impact more sustainable, where we’re not reinventing the wheel, excuse me, but we’re replicating and scaling solutions where we can. The floor is open. The mic’s all yours. If you want to jump in, share your ideas, share your thoughts.
Audience: Yes, hello. I work for a Norwegian company within law enforcement, and we currently sell to police all across the world. What we see is that it’s difficult to sell to countries that are used to getting funding, even though they have support from the UN system. There is a lot of skepticism around how this can be done, and this zero trust is also difficult to handle. So there are ways this works in the current setup, and we are happy to offer a low-cost version of our tools to these regions, but it’s difficult to find the way in. There are options for using the same setups that we’re using in every other country, but it’s the zero trust and the lack of solutions and ways to work around this that is difficult, I would say. So: how to get in, how to be able to market or talk about the solution? Because nobody wants to talk about who they’re collaborating with if it’s a private company. There’s a lot of skepticism around private companies. And I think this is a challenge that could have an easy solution if there were some kind of marketplace or another way of dealing with these challenges. So I don’t know if anybody has any experience with this, or can help.
Neeti Biyani: Okay, maybe we can, yeah, Maarten, just before I come to you, maybe we can take one more reflection before we go back to the panel. Ah, I see someone already at the mic there. Yeah, please go ahead.
Audience: I don’t have my headphones in, so I can’t hear. Should I go for it? Thank you. My name is Molly. I’m with the Digital Health and Rights Project, and I’ve just released a research report looking at seven ODA donors in Europe, alongside the Bill and Melinda Gates Foundation and the Wellcome Trust, and at their investments into digital health. We found it really difficult to understand where investments were being made, who to talk to, and who the different collaborating partners were. I’m really interested in this idea of collaboration and how we can work together as funders and other stakeholders, but transparency about those investment amounts, where they’re being invested and who we’re working with is really important. So I wondered if the speakers had any comments on transparency, tracking and M&E for investment portfolios. Thank you.
Neeti Biyani: That’s a great question. Thank you. Maarten, if I could go to you for the first reflection, if you had any responses before checking in with the rest of the panel. Yeah, no, thanks.
Maarten Botterman: Thanks a lot. I think it’s a very good question. What we see is that there is a lot of willingness to help on offer, but how do you get that help to where it’s most needed? And there are organizations that focus on that. I think the crucial element is that if you talk about digital inclusion, if you really want to serve the world with the internet, if you want the world to access the internet, you also need local capacity building. Without local understanding of what’s needed and what can help, it’s very difficult to land anything there successfully. So there are initiatives that really follow up on that, ranging from youth training programs to, for instance, the Global Forum on Cyber Expertise, which runs capacity-building workshops in diverse regions, where they bring global knowledge together with local knowledge and local stakeholders to match what’s needed today with what good practice looks like and how to implement it, combining insights from the global perspective with local needs, and then work together on an action plan: what actions could we draw from here that would really help us to leapfrog? Because many of the problems that we have in different regions are already solved somewhere in the world. Also, strategically, we recognise this very much in the ICANN strategic plan: if it’s about inclusion, you need to work with locals. You need to make sure that you reach out. The Internet Society is also very active in reaching out to the regions in that way. So, without contact in the local community, how can you successfully land a global donation? The Global Forum on Cyber Expertise is one of those things. But indeed, the challenge is to funnel it right, and transparency is limited. I recall that we had a conference in Accra, Ghana, where somebody from the UN presented all the different funds that were available to stimulate something of the Internet in the region. And there were like 60 different ones.
Impossible to see where they overlap, how they connect, et cetera. So we really need to work on that, and organisations like the GFCE, but also the Internet Society, the Global Cyber Alliance and other bodies can be very helpful there, because they understand it’s about linking the global to what’s needed locally.
Neeti Biyani: Thanks, Maarten. Do any of my other panellists have responses or views to what we’ve heard so far from the audience? Franz?
Franz von Weizsaecker: Yeah, I would like to respond to the intervention from the Norwegian company trying to sell a digital government solution worldwide. I do see a major gap, and I do see an unhealthy tendency sometimes: some aid projects come in basically saying, okay, here’s your solution, we bring it, and this is what we want to bring. It’s a very supply-oriented approach, not looking at how the government procures these services within its national legal system, and not looking at how the operational cost is going to be covered. The result is quite a few solutions that arrive and then, as soon as the project is over, they’re gone. That is a very unhealthy tendency. Therefore, I believe the answer to the challenge you were describing is this: it is a governmental solution, so it needs to go through the national government procurement process. It needs to be a conscious decision by a competent procurement body asking: is this make or buy? Which part of the solution can we build and maintain locally through our ministerial staff? Which part do we need to buy, and what is the mode of purchase: is this software as a service, do we purchase hardware? That requires quite a bit more capacity in procurement than is present in many of our partner countries. So I am definitely a great advocate, even though it’s not a sexy topic, of supporting public procurement for digital solutions. That is really at the core of achieving a lot of the development goals in the line ministries, and in the security sector, as in the example you mentioned.
Neeti Biyani: Thanks Franz, Diya you wanted to respond as well?
Sabhanaz Rashid Diya: Yeah, thank you. I’ll try to get a bit to the second question, but also maybe to the first one. I think one of the challenges we also see in the broader digital transformation narrative is, not really by design, but over the years, sort of an unhealthy tension between digital development and human rights. And I think that itself is the reason why we don’t see the kind of transparency, the kind of ownership, or, I would say, the kind of clarity from many actors, particularly in the Global South, right? So I think that narrative, where transformation comes at the cost of human rights, has in many ways been quite challenging to navigate, particularly in the digital development space. But that has also, at the same time, created this other tension, which is what we call norm shapers versus norm takers, right? You have the Global North, who are shaping the norms. You go to the IGF, look around you, there are just so few Global South representatives, whether it’s governments, civil society or the private sector. And because of that tension, there is also this culture of imposition that has come about over the years, where some of the norm shapers are also deciding what gets done in some of the Global South countries. So to Franz’s point, ODA, while having had meaningful transformation and meaningful impact in many parts of the world, has also entrenched that tension between norm shapers and norm takers, where other countries are constantly defining, in some ways, how you should be thinking about development, how you should be thinking about your economy. And that continues to be a challenge.
And I think part of trying to tackle that, which goes a bit to Remy’s points about how we work collectively together, is really recognizing that digital transformation and human rights actually can work hand in hand. And when people in the norm-receiving parts of the world are able to start defining the conditions under which they’re going to take something or reject something, when they have the right to decide and make those choices, then we see a little bit more of that negotiation, a little bit more of that empowerment and that ownership happening.
Neeti Biyani: Thanks, Diya. That’s a great perspective. And, you know, one that we should try and unpack more, depending on if we have the time. I know that we have a question online. My colleague Omar is with us here as online moderator. Omar, do you want to come in with the question you have?
Online moderator: Yeah, sure. Thank you, Neeti. So we have an insight. One of the participants, Maarten, says: in our work at UNU-EGOV with governments and stakeholders globally, we often observe the following in relation to Internet auctions and digital government investment, including those with donor contributions. First, licensing auctions often end up with a focus on profit optimization. Once a license is issued, this often leads to slower rollout of infrastructure by telcos, subpar infrastructure in remote or less profitable areas, and/or increased relative costs to end users. While governments naturally want to maximize profit, this often becomes counterproductive to digital inclusion and affordable access, and the dilemma is seen in both developed and emerging economies. The second point he mentions is that many government ICT investments do not focus on post-project benefit realization; cost-benefit and productivity gains are generally not monitored or measured. This often means the envisioned resource reallocation does not happen in practice, including reinvestment in new services and new solutions. So that was Maarten. I also have a question from Robert; he has two questions. The first: I would like to know what the panelists have to say about very low digital knowledge in the Global South. Investing in digital infrastructure is fine, but it needs to go hand in hand with skilling. The second question: thank you, moderator, for pointing out the access and quality-of-access issue. Internet down here in Africa is still very slow and expensive. I would like to know what the panelists have to say. Thank you.
Neeti Biyani: Thank you. Thank you, Omar, and thank you to the panelists. Thank you to the audience who’s joining us online and for your wonderful thoughts and you know, useful questions. I think for both of the questions that we’ve heard, let me first maybe turn to Raj. We’ve, you know, at the APNIC Foundation, had quite significant experience with skilling across various different groups. I know that the APNIC Foundation also has quite a bit of work that we’re doing on meaningful access, quality of access, bringing about affordability of connectivity. Would you want to step in here?
Raj Singh: So, yes, just a couple of things. One, there was a reference to submarine cables in one of the comments, and I have a comment on that. There are a lot of submarine cables being deployed all across the world. The problem is, it’s only the cables that are being deployed; there’s no supporting ecosystem being set up at the same time. In particular, I’m talking about regions such as the Pacific Islands, but also other parts of the world. Now, some of the reasons these submarine cables are suddenly being deployed are geopolitical in nature, I think you all recognize that, and some are, of course, very private-sector driven. Irrespective of why it’s happening, the fact is cables are being landed in economies. And when you ask the locals, what are you doing, has it improved capacity? Yes, they’ve got a big, fat internet pipe coming in. But there’s no supporting ecosystem that can actually leverage the potential that the cable has, no supporting ecosystem around creating new industries that could leverage it. Yes, you can try to retrofit old industries into using whatever capacity is there, but what about the new economy that we want to build using digital connectivity? So that’s one issue. And I’ve raised this multiple times, including with some of the governments who are actually funding these cables for various developing economies. There’s no clear answer yet, I’ll also say, on why that focus is not there. In terms of capacity building, there was a comment about whether building skills is as necessary as building connectivity, and I absolutely agree. At the APNIC Foundation, we’ve done something interesting this year. We’ve developed a new strategic plan, and we’re going to start speaking about capacity building in a very different way.
For us, and I would suggest for the rest of the world, it’s no longer about building capacity; it’s about building capabilities. Because when you have capabilities, then you can do things. We’ve been using the term capacity building for probably three decades, if not more, in this sector. I think enough of that; now we need to build capabilities. When I talked about the submarine cable example, the capability does not exist: the connectivity exists, but not the capability to leverage that cable, or what it could do for that economy. And, sorry, just give me one more minute: the comment from my right, where the person talked about M&E and transparency and so on. A quick comment on that. It’s something we’ve been looking very hard at at the Foundation in particular, because part of our remit is grant-making. We’ve got, in fact, the Asia-Pacific’s longest-running innovation fund. It’s been running for 16 years now, and it’s supported some great technologies and developments over the years. But one thing I’ve been looking at since I took over the CEO role: I am no longer interested in the outputs that those projects create. I want to see outcomes. And that goes to something Remy said at the beginning when he was talking about the structures, the ecosystem, that need to be built. That’s structural in nature, right? I think a lot of times we get so carried away trying to do little things at the very granular level that we don’t recognize that if we don’t build a supporting ecosystem, if we don’t make the structural changes, all of that is just a one-shot. I think Franz said something about dropping something in and then you go, and that’s it. So the need for structural changes is very, very important, and more so the focus on outcomes. And that goes to things like M&E, which the person also mentioned.
If I look at a lot of the metrics that are being used, the metrics are very output-related, they’re not outcome-related. And that’s something else that we’re trying to focus on. So Maarten, maybe there’s some advice there for you and your grant-making. I’m happy to have a chat with you and your team on how we can make that better. So thanks, Neeti.
Neeti Biyani: Thanks, Raj. And thank you to everyone who intervened. Maybe we can come back for a final round of reflections once we’ve discovered a few more things with the panel. I think we’ve heard a lot of thoughts about how to make sure that impact and transformation is more outcome-oriented, how it should really help human beings, societies at large, better their social and economic outcomes, their lives, really. How there are still significant structural challenges that we’re experiencing. And finally, because this panel is about the global majority, it is about the developing regions, how do we then therefore determine our own development goals and our own development outcomes and transformation pathways? So maybe, Franz, if I can start by picking your brain, how does the GIZ understand and approach digital transformation? I’d like to also caveat that by saying that there is no commonly understood and accepted definition of digital transformation at the moment. So I’d just like to say that the way that we understand it at the APNIC Foundation is… Digital transformation is a whole-of-society and whole-of-government approach to really using communication technologies, the internet, everything digital and tech, to better lives, to better quality of lives, and to further socio-economic outcomes. Franz, over to you.
Franz von Weizsaecker: Absolutely.
Online moderator: And Neeti, Maarten also has his hand raised, so later on maybe you could give him a chance.
Neeti Biyani: Thank you. Sorry, Maarten, I’ll come to you.
Franz von Weizsaecker: Yes, absolutely. Digital transformation goes across all the sectors, and we don’t talk anymore about “this is a health project.” I mean, we say it’s a health project, but it has become a health-in-a-digital-age project, or an education-in-a-digital-age project, and so on, across all the sectors. But then, of course, you have the underlying digital transformation enablers, which cut across the sectors, internet access being the most prominent one. And may I pick up the question from the online participant on why the internet is so little available and so unaffordable in many parts of Africa? It’s the regulatory environment. It’s the investment environment. And in some parts, it’s simply the GDP density: in some rural parts of Africa, there is no economic incentive to build infrastructure. That maybe leads me to another point. We are about to waste a huge global public good, which is low earth orbit and medium earth orbit, through the scramble for space that is happening, driven by a couple of companies competing to allocate their satellites in those orbits. It is not an open ecosystem that anybody can engage with; that space is falling into the hands of a few very powerful private companies or governments, so that this resource will not be put to its best possible use. And I have some slight hope, I don’t know yet where it’s leading, that what the European Union and other players are trying to set up with the IRIS² initiative will result in more of an open-ecosystem approach to satellite connectivity. As you know, with these orbits, any satellite that flies over Europe automatically also flies over Africa if it’s on a north-south orbit, and likewise on an east-west orbit.
So you will have an investment fostered by the highly developed economies, but automatically you’re building an infrastructure that is also available to the very low GDP-per-square-kilometre parts of the world, and it will make it economically much more feasible to connect those areas as well, if we are able to establish a framework in which there is good competition. And that’s the second answer to the question: there is sometimes a lack of competition in those telecommunications markets. If there’s one incumbent and then there is competition from the sky, that is also a factor in lowering prices. So, yeah, that’s my answer to your question.
Neeti Biyani: Thanks, Franz. Maarten, is your comment or intervention to do with digital transformation?
Maarten Botterman: I think so. Okay, go ahead. Well, basically, in a way, it connects to Franz’s comment about, for instance, low-orbit networks. Accessibility is key if you want to participate in the digital transformation, and as the dear colleague online asked in his question, it’s not only accessibility but also affordability for people, and good quality. Part of the answer is in what Franz gave: competing infrastructures. If you just have Starlink for low-orbit networks, it would be very difficult to have a competitive offer. If there’s some kind of competition between networks, be it Starlink, be it 5G, 4G, 3G networks, or even making good use of LoRaWAN in areas where that would be best, in connection with the sea cables, then I think we’re talking about enabling something. And let me take you back to 1996, when I worked for the European Commission, I admit, and had the honor and pleasure of running the European Telework Agenda. I got to one of the outskirts of Europe; I mean, Europe is not the whole world, but also within Europe there were areas that were less connected. This specific area was the Western Isles, in the very north of Scotland. At that time, they had just one big telecom line from the two large islands to get access, to get bits from other telecom providers and make sure there was good quality. The local council took it upon themselves to define a strategy for how they wanted to do their digital transformation, how they would commit to supporting and promoting it through their own procurement of products and access, but also by stimulating their local community in what they called the telework commitment, and that led to much higher connectivity. But that’s why it’s so important that it’s understood locally what digital transformation can bring for you.
Because if you don’t create a pull from that location, what you will get is what the big companies want to push. And that may not always serve you well. So back to really the matter of making sure that local understanding arises of what’s needed, and then to get it. Last point, on the point that was made by this gentleman on slow and expensive networks in Africa. This is also where governments can make a difference, and a very good example there is India. What one can see in India is that internet access and participation is set as a priority in the Digital India plan. And one of the key conditions that helps a lot is to ensure that access to the internet is affordable for people. And government can play a role there, either by ensuring competition of infrastructures or, if there is too little competition, by imposing lower, reasonable rates. I hope that answers the question.
Neeti Biyani: Thank you, Maarten. I think on that note, I want to come to Remy. Remy, we’ve heard some interesting perspectives, you know, from the panel itself, from our participants online, as well as the audience we have here today. We’ve talked about digital transformation and the key role, the very unique role, that indeed only governments can play in trying to determine what national strategies can look like, what those development outcomes look like, and how we know that we’ve gotten to a place where we can claim that we’ve benefited, you know, many, many of our citizens and our people. On the other hand, we’ve also heard some key questions about, you know, meaningful digital transformation, where we’re factoring in accessibility, affordability, where we’re talking about cooperation, where we’re talking about sharing infrastructure. I’d like to glean your thoughts on these, and I’d like to ask you a follow-up question, which is: what role do you think government interventions can play here, vis-à-vis the role that the market itself and competition can play?
Remy Friedmann: Thank you, big question. Thank you, Neeti. Well, digital transformation is not only ensuring equitable access to the internet worldwide. This comes with a responsibility, because digital transformation is everything that comes with access. Say we all have the same connectivity, the same speed: do we have the capacity to deal with the other side of the coin and be responsible in using the internet? And there, governments have the role of protecting human rights, setting standards, the necessary safeguards. And that’s why governments are coming together, for example, as was mentioned, in the Freedom Online Coalition and its principles. The workforce working in the tech industry must be a rights-respecting workforce, but the rights of that workforce need to be respected as well. So it’s a fourth industrial revolution; it’s a transition. As we speak of a just transition, a fair energy transition, climate action that needs to respect human rights, the same thing happens with digital transformation: it comes with responsibility. States still have the obligation to protect human rights, companies need to respect human rights, and citizens, individuals, need to have access to effective remedy when their rights are not respected. Everything we are discussing here at the IGF and in other spaces about respecting human rights in the digital space becomes relevant when we have equitable access. Do we have the capacity? And there, well, we have capacity building, and that’s also an element: development cooperation must come together with capacity building on cyber resilience, for example, how cyber development comes with cyber capability or capacity building as well. So that’s why I was saying we need to join forces across different disciplines and use existing frameworks, but sometimes governments really have an important role to play. But that goes without saying.
Neeti Biyani: Diya, if I turn to you with a similar question of digital transformation, what role does the government have to play here? I know that some bit of your context is also informed by when governments cannot fulfil that role. I know this is a bit of a stretch, probably not what we are here to interrogate, but because the government has a very key role to play here, maybe a very short reflection from you on a scenario like that.
Sabhanaz Rashid Diya: Yeah, thank you for that. I think Remy actually articulated it quite well. We are both part of the Freedom Online Coalition through the advisory network, and I think a lot of what we talk about is the role of governments, not just to tackle the digital divide, and I think Raj eloquently talked about how we see digital divides being created every day in different ways. It’s not just being connected and unconnected, but it’s also connected and disconnected. It is also, you know, some having rights, some not having rights. It’s about being norm shapers versus norm takers. So there are so many layers of divide that we see across the board. So I think, you know, in the coalition, we talk quite a bit about what role governments can play to ensure that transformation is meaningful, and we say meaningful in that it actually takes into consideration some of the, you know, unique social and political contexts where they’re existing, that it isn’t coming as an imposition and at the cost of people’s rights. So I think the perspective that I will perhaps take here is, I mean, we’ve heard from governments, we’ve heard from APNIC, perhaps that of the actual communities that we serve, for whom we’re doing all of this. And I think from the community’s perspective, you know, oftentimes people start seeing that distance from digital, seeing themselves removed from transformation, when they’re unable to exercise their voices and they don’t feel empowered by it. And I think that empowerment is quite critical to a meaningful transformation agenda, where it’s not just about rights, but it’s also about redress. It’s about being able to shape it in the way that makes sense for them, so it is able to tackle some very real problems for them. 
And I think connectivity is just the first step or access is the first step to a much broader conversation around, you know, how do we think about transformation in a way that actually serves the people it’s intended for.
Neeti Biyani: Thank you, Diya. I think we do have some time to go into one final round of reflections or thoughts or comments from our audience. Would anyone like to participate? The mic’s all yours.
Maarten Botterman: I’m always willing to reflect. I really see a lot coming together here. Digital transformation is ongoing, and we want it to be for the world. We want it to be for all. But we can’t stuff it down the throat of the world. We need to enable the world to come to the table, to participate, to make sure they know what they get, what they want, how they can benefit from all these things that the new technologies are offering, that the Internet is offering. So I’m really very much inspired by the many young people who have engaged in programs like the Youth Ambassadors Program, the AP Youth Program, and other programs, because this shows that new generations will be making even more of a difference than we who have been building the Internet since the old days, from the outset. So start young with capacity building. Take your responsibility as a stakeholder, whether you’re a government or an NGO or a company. Take your responsibility and empower people. Make sure that they know what to ask for, and help them to get it, help to create the circumstances. I think there’s no way back for digital transformation, but we could do it in a way to make sure it’s fair, it’s inclusive and it serves the world as a whole.
Neeti Biyani: Thanks, Maarten. I think you’ve brought us almost to the close of the session on a very inspiring note. I’m just going to turn back to my panel for any final thoughts on everything we’ve heard and discussed today to leave the audience with. Finally, if I may, I’ll start with Raj.
Raj Singh: Thanks, Neeti. Thanks to the panellists for joining us today. We’ve covered a lot of ground, though I’m not sure how well we answered the actual question that we had for today’s panel. So what I will say, very quickly, is that we’re back to that same situation where funding is shrinking, issues keep on popping up, digital divides keep on widening. There are multiple organisations doing lots of things, be they governmental, non-governmental, private sector, philanthropic organisations and family offices and whatnot. Each time I look at what everyone’s doing, including my own organisation, it’s also very clear to me that, while we talk about collaboration, there’s actually very little of it. Everyone’s got their specific objectives, they have to do something, and they go out and try to do it. Yes, there are some discussions and some collaboration at some level, perhaps, but there still is a lot of duplication out there. I think that really is something we need to address, knowing that funding levels are shrinking. Because if we just keep on duplicating someone else’s work, I don’t think we’re actually achieving much in the end. So I’ll just leave it at that, thanks.
Neeti Biyani: Franz?
Franz von Weizsaecker: All right, let me try to leave us with a positive note. In times of shrinking funding, it’s a time for reckoning as well, and it’s a time for maybe reinventing the way the international development community used to operate. And maybe it’s a time to really build on the sovereignty of governments, the sovereignty to determine their own digital future, to get away from the traditional donor and receiver model and to build on what really matters for the individual economies to emerge: good regulatory systems, good public procurement, good rule of law, a fair taxation system that does not rely entirely on taxing telecommunications and making internet very expensive, and all that as a basis to encourage investments also from the private sector, to become sovereign in many ways and less dependent on international development. So ideally I would wish that our role as GIZ at some point may not be needed anymore, at least not in that form, and we’re shifting from traditional development towards an international mode of cooperation. Yeah, we’re all trying to work ourselves out of a job.
Neeti Biyani: Remy?
Remy Friedmann: Thank you. Well, maybe pointing at the fact that digital exclusion is fundamentally a development issue, as was already mentioned, and that we need to somehow break the silos and mainstream digital inclusion within different streams, for example within climate resilience, gender equity, education and health agendas, rather than treating this as really a separate infrastructure challenge. So this could maybe be a way to go, because it has to be mainstreamed; it’s not a separate thing that is only related to infrastructure and access.
Neeti Biyani: Thank you. Diya?
Sabhanaz Rashid Diya: I guess I get the last word, unless Maarten wants to come in. You know, one of the things I always take a lot of inspiration from is that I come from the traditional development sector, right? So I’ve worked in digital, but digital for health, digital for agriculture, the old ICT4D crowd. And, you know, whenever I feel a bit hopeless, I think a lot about the polio movement in the world. It was an impossible problem to solve many, many years ago. It’s not identical, but it had similar infrastructure gaps, funding gaps, collaboration gaps. And I think the problem got really, truly solved when all the different actors began to come together and realize: this is how we’re going to solve polio. We’re going to innovate around vaccines. We’re going to go out to the most rural parts of the world and solve this disease. And in many ways, the world has eradicated polio. A few more cases still come up, but by and large, it has eradicated polio. So if we can solve something as dramatic and as drastic as that, I think there is a natural incentive within the digital transformation, digital infrastructure, digital connectivity community to come together. And I hope this current moment that we’re in, where funding is shrinking and it seems that we’re back to square one, becomes a moment of reckoning to realize that we can really solve tough problems better when we come together. So my hope for the ODA community is that this isn’t just a cry for help, but also a real moment to see the value of coming together as a community.
Neeti Biyani: Thank you. Thank you. Maarten, and just in the interest of time, I am going to wrap this up here wearing my moderator’s hat. I wanted to very quickly say thank you to all of our panelists who joined us today, especially Maarten, who joins us virtually. Thank you for attending this session hosted by the APNIC Foundation, to everyone who’s here in person and online as well. I just want to leave everyone on the note of saying I agree with a lot of the sentiments we’ve heard today: that we are stronger together, that we need to have very informed conversations about exactly how we determine our own transformation, how we collaborate between various stakeholder groups, and how we make sure that we’re holding space for different voices to be heard as we determine our own development futures. Having said that, in the Asia-Pacific, a region that’s as vast and diverse as it is, unlike any other region in the world, the APNIC Foundation has the unique privilege of working across 56 economies, serving the many communities that we do, hopefully, you know, having some amount of impact over the course of our work, touching lives and making sure that we’re leaving communities a bit better than how we found them. We’re very open to having more conversations with different stakeholders, different groups, different communities; you know where to find us. And once again, thank you for being part of this conversation. Hopefully, we’ll see you next year as well.
Sabhanaz Rashid Diya
Speech speed
183 words per minute
Speech length
1770 words
Speech time
577 seconds
32% of world population remains unconnected with stark global north-south disparity
Explanation
Despite 20 years of progress since WSIS commitments, significant connectivity gaps persist globally. The disparity between global north (93% online) and global south (27% online) remains substantial, with women, girls, and marginalized populations disproportionately affected.
Evidence
In the global north, 93% of people are online while in the global south, only 27% are online. Those most affected include women, girls, marginalized populations, and minority groups who cannot come online due to device affordability and lack of infrastructure.
Major discussion point
Digital Divide and Connectivity Challenges
Topics
Development | Infrastructure
Digital transformation comes with disconnection trends through forced shutdowns and network throttling
Explanation
While efforts focus on connecting the unconnected, there’s a concerning counter-trend of deliberately disconnecting populations. This creates additional barriers to digital inclusion beyond infrastructure limitations.
Evidence
Based on Access Now’s report, last year alone there were 296 shutdowns in 54 countries, representing an extraordinary number of deliberate disconnections.
Major discussion point
Governance and Human Rights Considerations
Topics
Human rights | Infrastructure
Unhealthy tension exists between digital development and human rights protection
Explanation
An unintended tension has emerged where digital transformation is sometimes perceived to come at the cost of human rights. This creates challenges in navigation and acceptance, particularly in the Global South.
Evidence
The tension manifests in the lack of transparency, ownership, and clarity from many Global South stakeholders, creating resistance to digital development initiatives.
Major discussion point
Governance and Human Rights Considerations
Topics
Human rights | Development
Agreed with
– Raj Singh
Agreed on
Collaboration needed despite current duplication of efforts
Need to address norm shapers versus norm takers dynamic between Global North and South
Explanation
The Global North shapes digital norms while the Global South becomes norm takers, creating an imposition culture. This is evident in limited Global South representation at forums like IGF across government, civil society, and private sectors.
Evidence
At IGF, there are very few Global South representatives whether from governments, civil society, or private sector, demonstrating the imbalanced participation in norm-setting.
Major discussion point
Governance and Human Rights Considerations
Topics
Human rights | Development
Agreed with
– Maarten Botterman
– Franz von Weizsaecker
Agreed on
Local ownership and understanding crucial for successful digital development
Raj Singh
Speech speed
174 words per minute
Speech length
1643 words
Speech time
563 seconds
Asia Pacific region has most advanced and least developed economies creating development challenges
Explanation
The Asia Pacific’s diversity presents unique challenges as it contains both the world’s most advanced economies and least developed ones. This creates difficulties in policy development and resource allocation across the region.
Evidence
Some parts of the region are leaping ahead in new technologies with mature industries and private sector leadership, while other economies lack fully-fledged policymaking units and operate on reactive rather than proactive policy making.
Major discussion point
Digital Divide and Connectivity Challenges
Topics
Development | Economic
Half of South Asia sub-region still remains unconnected despite progress
Explanation
South Asia, one of four sub-regions in Asia Pacific, demonstrates the stark reality of digital divides with 50% of the population lacking internet connectivity. This represents a significant development challenge requiring targeted intervention.
Evidence
Half of the South Asia sub-region population remains unconnected, which is described as a very stark statistic.
Major discussion point
Digital Divide and Connectivity Challenges
Topics
Development | Infrastructure
Submarine cables are being deployed without supporting ecosystems to leverage their potential
Explanation
While submarine cables are being deployed globally for various reasons including geopolitical ones, there’s no accompanying ecosystem development to maximize their potential. This results in improved capacity without corresponding economic or social benefits.
Evidence
Cables are being landed in economies like Pacific Islands, and while locals report improved capacity with ‘big, fat internet pipes,’ there’s no supporting ecosystem to create new industries or leverage the cable’s potential for new economy development.
Major discussion point
Digital Divide and Connectivity Challenges
Topics
Infrastructure | Development
Need to shift from capacity building to capability building to enable actual implementation
Explanation
After three decades of capacity building in the sector, the focus should shift to building capabilities that enable people to actually accomplish tasks. This represents a fundamental change in approach to development work.
Evidence
The APNIC Foundation has developed a new strategic plan that speaks about capability building rather than capacity building, recognizing that capabilities enable action while capacity building has been used for three decades without sufficient results.
Major discussion point
Capacity Building vs Capability Development
Topics
Development | Capacity development
Disagreed with
– Maarten Botterman
Disagreed on
Approach to capacity building terminology and focus
Focus should be on outcomes rather than outputs in development projects
Explanation
Development projects should prioritize measuring outcomes that create structural changes rather than just outputs. This requires building supporting ecosystems and making structural changes rather than granular interventions.
Evidence
The APNIC Foundation’s Asia-Pacific’s longest-running innovation fund (16 years) is shifting focus from project outputs to outcomes, emphasizing structural ecosystem building over granular-level interventions.
Major discussion point
Capacity Building vs Capability Development
Topics
Development | Sustainable development
Agreed with
– Franz von Weizsaecker
Agreed on
Need to shift focus from outputs to outcomes in development work
Multiple organizations doing similar work with little actual collaboration despite funding constraints
Explanation
Despite discussions about collaboration, there remains significant duplication of efforts across governmental, non-governmental, private sector, and philanthropic organizations. This inefficiency is particularly problematic given shrinking funding levels.
Evidence
Each organization has specific objectives and goes out to achieve them independently, resulting in duplication of work across multiple types of organizations including governmental, non-governmental, private sector, and philanthropic entities.
Major discussion point
Collaboration and Structural Changes
Topics
Development | Economic
Agreed with
– Sabhanaz Rashid Diya
Agreed on
Collaboration needed despite current duplication of efforts
Online moderator
Speech speed
83 words per minute
Speech length
287 words
Speech time
205 seconds
Internet access in Africa remains slow and expensive due to regulatory and investment environment issues
Explanation
African internet infrastructure faces challenges from licensing auction focus on profit optimization, leading to slower infrastructure rollout and higher costs. Additionally, many government ICT investments lack post-project benefit monitoring and measurement.
Evidence
Licensing auctions focus on profit optimization leading to slower rollout by telcos, subpar infrastructure in remote areas, and increased costs to end users. Government ICT investments often don’t monitor cost-benefit or productivity gains, preventing resource reallocation to new solutions.
Major discussion point
Digital Divide and Connectivity Challenges
Topics
Infrastructure | Economic
Franz von Weizsaecker
Speech speed
144 words per minute
Speech length
1633 words
Speech time
678 seconds
German development funding budget faces 10% decline while some philanthropic funding tries to compensate
Explanation
Official development assistance is declining globally, with Germany experiencing a 10% budget reduction in development funding. Philanthropic organizations like the Gates Foundation are attempting to fill some gaps, but cannot fully compensate for the shortfall.
Evidence
The German government’s draft budget shows about a 10% decline in the German development funding budget of the BMZ ministry, while philanthropic funding from organizations like the Gates Foundation is trying to compensate for some declines.
Major discussion point
Funding Landscape and Resource Constraints
Topics
Development | Economic
Digital transformation should be mainstreamed across all sectors as an enabler for sustainable development goals
Explanation
Digital transformation has become a mainstreaming topic across entire development portfolios, similar to gender and human rights mainstreaming. It serves as a key enabler for achieving sustainable development goals across all sectors including health, education, and economic development.
Evidence
Digital mainstreaming is possibly the most successful mainstreaming topic, with no line ministry or initiative in health, education, or other sectors able to operate without internet as a key enabler for achieving sustainable development goals and economic growth.
Major discussion point
Digital Transformation Approaches and Strategies
Topics
Development | Sustainable development
Agreed with
– Remy Friedmann
– Neeti Biyani
Agreed on
Digital transformation requires mainstreaming across all sectors rather than standalone approach
Lack of competition in telecommunications markets contributes to high costs and poor service
Explanation
Telecommunications markets often lack adequate competition, with incumbent operators dominating. This results in higher prices and poorer service quality, particularly affecting affordability and accessibility in developing regions.
Evidence
Sometimes there’s lack of competition in telecommunications markets with one incumbent operator, and competition from satellite services could help lower prices through increased market competition.
Major discussion point
Market Competition and Infrastructure Development
Topics
Economic | Infrastructure
Low earth orbit satellite infrastructure could provide competitive alternatives if managed as open ecosystem
Explanation
The scramble for low and medium earth orbit space by a few powerful companies and governments risks wasting this global public good. An open ecosystem approach, like the EU’s IRIS² initiative, could better serve global connectivity needs.
Evidence
The EU’s IRIS² initiative aims to create a more open ecosystem approach to satellite connectivity, where satellites that fly over developed regions like Europe automatically also serve less developed areas like Africa, making connectivity more economically feasible.
Major discussion point
Market Competition and Infrastructure Development
Topics
Infrastructure | Development
Government procurement processes need strengthening for sustainable digital solutions
Explanation
Many aid projects take a supply-oriented approach without considering local government procurement processes and operational cost coverage. This results in solutions that disappear once project funding ends, creating an unhealthy dependency cycle.
Evidence
Solutions often come and go as soon as projects are over because they don’t go through national governmental procurement processes or consider how operational costs will be covered, requiring conscious decisions by competent procurement bodies about make-or-buy decisions.
Major discussion point
Market Competition and Infrastructure Development
Topics
Legal and regulatory | Development
Agreed with
– Raj Singh
Agreed on
Need to shift focus from outputs to outcomes in development work
Disagreed with
– Audience member (Norwegian company)
Disagreed on
Role of private sector vs government in digital development solutions
Remy Friedmann
Speech speed
141 words per minute
Speech length
841 words
Speech time
356 seconds
Switzerland emphasizes partnership, inclusiveness, and community-driven solutions in digital development
Explanation
Swiss international cooperation focuses on sustainable development, poverty alleviation, and human rights through partnerships that strengthen digital capacities while addressing risks like digital divides. The approach emphasizes locally grounded, transparent solutions aligned with development goals.
Evidence
Switzerland is a member of the Freedom Online Coalition and supports the donors’ principles for human rights in the digital age, emphasizing transparent, locally grounded solutions aligned with broader development goals through catalytic funding and blended finance mechanisms.
Major discussion point
Digital Transformation Approaches and Strategies
Topics
Development | Human rights
Governments have responsibility to protect human rights while setting digital standards and safeguards
Explanation
Digital transformation requires governments to balance providing equitable internet access with protecting human rights and setting necessary standards. This includes ensuring responsible use of internet and protecting workforce rights in the tech industry.
Evidence
Governments are working together in coalitions like the Freedom Online Coalition to establish principles ensuring the tech industry workforce is rights-respecting, while also protecting the rights of workers and providing access to effective remedy when rights are violated.
Major discussion point
Governance and Human Rights Considerations
Topics
Human rights | Legal and regulatory
Digital exclusion should be mainstreamed within climate, gender, education, and health agendas
Explanation
Rather than treating digital inclusion as a separate infrastructure challenge, it should be integrated across different development streams. This approach recognizes digital exclusion as fundamentally a development issue requiring cross-sectoral integration.
Evidence
Digital inclusion needs to be integrated within climate resilience, gender equity, education, and health agendas rather than being treated as a separate infrastructure challenge, requiring breaking down silos between different development streams.
Major discussion point
Collaboration and Structural Changes
Topics
Development | Sustainable development
Agreed with
– Franz von Weizsaecker
– Neeti Biyani
Agreed on
Digital transformation requires mainstreaming across all sectors rather than standalone approach
Maarten Botterman
Speech speed
133 words per minute
Speech length
1672 words
Speech time
752 seconds
ICANN supports one world, one internet vision through capacity building and regional outreach programs
Explanation
ICANN serves the global internet community with recognition that most users now live in regions like APAC rather than North America. The organization actively supports regions with less internet advancement through targeted capacity building and infrastructure development programs.
Evidence
ICANN’s Coalition for Digital Africa engages with 41 African governments and institutions, installing root server instances, setting up internet connection points with Internet Society, and providing capacity development for governments across 19 countries on internet technical functioning.
Major discussion point
Digital Transformation Approaches and Strategies
Topics
Infrastructure | Development
Local capacity building and understanding crucial for successful digital inclusion initiatives
Explanation
Digital inclusion requires local understanding of needs and local capacity building to be successful. Without local community contact and understanding, global donations and interventions cannot be effectively implemented or sustained.
Evidence
Organizations like Global Forum for Cyber Expertise conduct capacity building workshops that bring global knowledge together with local knowledge and stakeholders, creating action plans that help regions leapfrog by applying solutions already developed elsewhere.
Major discussion point
Capacity Building vs Capability Development
Topics
Development | Capacity development
Agreed with
– Sabhanaz Rashid Diya
– Franz von Weizsaecker
Agreed on
Local ownership and understanding crucial for successful digital development
Disagreed with
– Raj Singh
Disagreed on
Approach to capacity building terminology and focus
Local understanding and ownership crucial for determining digital transformation priorities
Explanation
Digital transformation must be driven by local understanding of what benefits the community rather than what big companies want to push. Government and local councils play key roles in defining strategies and creating demand for digital services.
Evidence
The Western Isles in Scotland example shows how local council strategy for digital transformation, including telework commitment and local community stimulation, led to much higher connectivity by creating local demand and understanding of digital benefits.
Major discussion point
Market Competition and Infrastructure Development
Topics
Development | Sociocultural
Youth engagement and empowerment essential for meaningful digital transformation
Explanation
Young people, through programs like the Youth Ambassadors Program and the AP Youth Program, demonstrate that new generations can make an even greater impact on internet development than previous generations. Starting capacity building young is crucial for sustainable digital transformation.
Evidence
The Youth Ambassadors Program and AP Youth Program show that new generations will make a greater difference than those who built the internet initially, underscoring the importance of starting capacity building and empowerment young.
Major discussion point
Capacity Building vs Capability Development
Topics
Development | Sociocultural
Neeti Biyani
Speech speed
140 words per minute
Speech length
2239 words
Speech time
954 seconds
Rapid decline in official development assistance, philanthropy, and corporate giving exacerbates funding gaps
Explanation
There has been a significant decline across multiple funding sources including official development assistance, philanthropic giving, and corporate giving. This convergence of funding reductions has created substantial gaps in resources available for internet and digital development in the global majority.
Major discussion point
Funding Landscape and Resource Constraints
Topics
Development | Economic
Digital transformation requires whole-of-society and whole-of-government approach to better lives and socio-economic outcomes
Explanation
Digital transformation should be understood as a comprehensive approach involving all sectors of society and government to improve quality of life and advance socio-economic outcomes. This goes beyond just technology implementation to encompass broader societal change.
Evidence
The APNIC Foundation defines digital transformation as using communication technologies, the internet, and digital technologies to better lives, improve quality of life, and further socio-economic outcomes through whole-of-society and whole-of-government approaches.
Major discussion point
Digital Transformation Approaches and Strategies
Topics
Development | Sustainable development
Agreed with
– Franz von Weizsaecker
– Remy Friedmann
Agreed on
Digital transformation requires mainstreaming across all sectors rather than standalone approach
Audience
Speech speed
161 words per minute
Speech length
372 words
Speech time
137 seconds
Transparency in investment portfolios and tracking mechanisms remains challenging
Explanation
Research into ODA donors and major foundations reveals difficulties in understanding where investments are being made, who the collaborating partners are, and tracking investment amounts. This lack of transparency hampers effective collaboration and coordination among stakeholders.
Evidence
Research looking at seven ODA donors in Europe alongside the Bill and Melinda Gates Foundation and the Wellcome Trust found it very difficult to understand where investments were being made, who to talk to, and who the different collaborating partners were.
Major discussion point
Funding Landscape and Resource Constraints
Topics
Development | Economic
Agreements
Agreement points
Digital transformation requires mainstreaming across all sectors rather than standalone approach
Speakers
– Franz von Weizsaecker
– Remy Friedmann
– Neeti Biyani
Arguments
Digital transformation should be mainstreamed across all sectors as an enabler for sustainable development goals
Digital exclusion should be mainstreamed within climate, gender, education, and health agendas
Digital transformation requires whole-of-society and whole-of-government approach to better lives and socio-economic outcomes
Summary
All three speakers agree that digital transformation cannot be treated as a separate infrastructure challenge but must be integrated across all development sectors and government approaches to achieve meaningful outcomes.
Topics
Development | Sustainable development
Local ownership and understanding crucial for successful digital development
Speakers
– Maarten Botterman
– Sabhanaz Rashid Diya
– Franz von Weizsaecker
Arguments
Local capacity building and understanding crucial for successful digital inclusion initiatives
Need to address norm shapers versus norm takers dynamic between Global North and South
Government procurement processes need strengthening for sustainable digital solutions
Summary
Speakers reached consensus that digital development must be locally driven and owned rather than imposed by external actors, with communities determining their own transformation priorities and approaches.
Topics
Development | Human rights
Need to shift focus from outputs to outcomes in development work
Speakers
– Raj Singh
– Franz von Weizsaecker
Arguments
Focus should be on outcomes rather than outputs in development projects
Government procurement processes need strengthening for sustainable digital solutions
Summary
Both speakers emphasize the importance of measuring and achieving sustainable outcomes rather than just project outputs, requiring structural changes and proper implementation frameworks.
Topics
Development | Sustainable development
Collaboration needed despite current duplication of efforts
Speakers
– Raj Singh
– Sabhanaz Rashid Diya
Arguments
Multiple organizations doing similar work with little actual collaboration despite funding constraints
Unhealthy tension exists between digital development and human rights protection
Summary
Both speakers acknowledge that despite funding constraints, organizations continue to work in silos with significant duplication, and call for better collaboration to address complex challenges.
Topics
Development | Human rights
Similar viewpoints
Both speakers highlight the persistent digital divide with specific statistics showing large populations remain unconnected, particularly in developing regions, despite decades of development efforts.
Speakers
– Sabhanaz Rashid Diya
– Raj Singh
Arguments
32% of world population remains unconnected with stark global north-south disparity
Half of South Asia sub-region still remains unconnected despite progress
Topics
Development | Infrastructure
Both speakers acknowledge the significant decline in development funding across multiple sources, creating substantial resource constraints for digital development initiatives.
Speakers
– Franz von Weizsaecker
– Neeti Biyani
Arguments
German development funding budget faces 10% decline while some philanthropic funding tries to compensate
Rapid decline in official development assistance, philanthropy, and corporate giving exacerbates funding gaps
Topics
Development | Economic
Both speakers emphasize the importance of empowering people with actual capabilities rather than just providing training, with focus on enabling action and meaningful participation.
Speakers
– Maarten Botterman
– Raj Singh
Arguments
Youth engagement and empowerment essential for meaningful digital transformation
Need to shift from capacity building to capability building to enable actual implementation
Topics
Development | Capacity development
Both speakers recognize the critical importance of integrating human rights considerations into digital development, rather than treating them as competing priorities.
Speakers
– Remy Friedmann
– Sabhanaz Rashid Diya
Arguments
Governments have responsibility to protect human rights while setting digital standards and safeguards
Unhealthy tension exists between digital development and human rights protection
Topics
Human rights | Development
Unexpected consensus
Private sector engagement challenges in development contexts
Speakers
– Franz von Weizsaecker
– Audience
Arguments
Government procurement processes need strengthening for sustainable digital solutions
Transparency in investment portfolios and tracking mechanisms remains challenging
Explanation
Unexpected consensus emerged around the challenges private companies face in engaging with development contexts, including procurement difficulties and lack of transparency in investment tracking, suggesting systemic issues in public-private partnerships for development.
Topics
Development | Economic
Infrastructure deployment without ecosystem development
Speakers
– Raj Singh
– Franz von Weizsaecker
Arguments
Submarine cables are being deployed without supporting ecosystems to leverage their potential
Low earth orbit satellite infrastructure could provide competitive alternatives if managed as open ecosystem
Explanation
Unexpected alignment on the issue that infrastructure deployment alone is insufficient – both speakers emphasize the need for supporting ecosystems and open approaches to maximize infrastructure benefits, whether for submarine cables or satellite networks.
Topics
Infrastructure | Development
Overall assessment
Summary
Strong consensus exists around the need for locally-driven, mainstreamed approaches to digital development that prioritize outcomes over outputs, integrate human rights considerations, and require better collaboration among stakeholders. Speakers also agree on persistent connectivity challenges and declining funding landscapes.
Consensus level
High level of consensus on fundamental principles and challenges, with implications suggesting a shared understanding of systemic issues requiring coordinated, rights-based, and locally-owned solutions. The agreement spans technical, governance, and development perspectives, indicating potential for unified approaches to addressing digital development gaps.
Differences
Different viewpoints
Role of private sector vs government in digital development solutions
Speakers
– Franz von Weizsaecker
– Audience member (Norwegian company)
Arguments
Government procurement processes need strengthening for sustainable digital solutions
Private companies face skepticism and zero trust when trying to offer solutions to developing countries
Summary
Franz emphasizes that digital solutions must go through proper governmental procurement processes to avoid dependency and ensure sustainability, while the audience member from a Norwegian company highlights the challenges private companies face due to skepticism and lack of trust, making it difficult to offer even low-cost solutions to developing regions.
Topics
Legal and regulatory | Development | Economic
Approach to capacity building terminology and focus
Speakers
– Raj Singh
– Maarten Botterman
Arguments
Need to shift from capacity building to capability building to enable actual implementation
Local capacity building and understanding crucial for successful digital inclusion initiatives
Summary
Raj argues for a fundamental shift from ‘capacity building’ to ‘capability building’ after three decades of insufficient results, while Maarten continues to emphasize traditional capacity building approaches through local understanding and community engagement.
Topics
Development | Capacity development
Unexpected differences
Measurement focus in development projects
Speakers
– Raj Singh
– Audience member (Molly)
Arguments
Focus should be on outcomes rather than outputs in development projects
Transparency in investment portfolios and tracking mechanisms remains challenging
Explanation
While both recognize measurement challenges, Raj advocates for shifting from output-focused to outcome-focused metrics, while the audience member emphasizes the need for better transparency and tracking of existing investment portfolios. This represents different priorities in addressing measurement challenges.
Topics
Development | Economic
Overall assessment
Summary
The discussion revealed relatively low levels of fundamental disagreement among speakers, with most conflicts arising around implementation approaches rather than core goals. Main areas of disagreement included the role of private sector versus government-led solutions, terminology and approaches to capacity building, and specific mechanisms for achieving collaboration.
Disagreement level
Low to moderate disagreement level. The speakers largely agreed on fundamental goals of digital inclusion, bridging digital divides, and the need for collaboration, but differed on specific approaches, mechanisms, and priorities. This suggests a mature field where stakeholders share common objectives but are still working out optimal implementation strategies. The disagreements are constructive and focused on ‘how’ rather than ‘what’ or ‘why’, which is positive for advancing the field.
Partial agreements
Takeaways
Key takeaways
Digital transformation requires a shift from capacity building to capability building, focusing on outcomes rather than outputs to create sustainable impact
Collaboration among stakeholders is essential but currently lacking, with significant duplication of efforts despite shrinking funding resources
Local ownership and understanding are crucial for successful digital transformation – solutions cannot be imposed from outside but must be community-driven and context-sensitive
Digital inclusion should be mainstreamed across all development sectors (health, education, climate, gender) rather than treated as a separate infrastructure challenge
The tension between digital development and human rights protection needs to be resolved, with Global South countries having more agency in defining their own development priorities
Structural ecosystem changes are more important than granular interventions – supporting infrastructure like submarine cables needs accompanying ecosystems to be effective
Government procurement processes and regulatory environments need strengthening to ensure sustainable digital solutions and competitive markets
Meaningful access encompasses not just connectivity but also affordability, quality, and the ability to leverage digital tools for socio-economic improvement
Resolutions and action items
APNIC Foundation committed to shifting focus from outputs to outcomes in their grant-making and project evaluation
Raj Singh offered to share expertise on outcome-focused metrics and evaluation with other organizations including ICANN
APNIC Foundation expressed openness to continued collaboration and conversations with different stakeholders and communities
Panelists agreed on the need to move away from traditional donor-receiver models toward more sovereign, government-led digital development approaches
Unresolved issues
How to effectively coordinate and reduce duplication among the numerous organizations working in digital development
How to address the transparency gap in tracking investments and understanding where funding is being allocated across different initiatives
How to balance private sector profit motives with digital inclusion goals, particularly in licensing auctions and infrastructure deployment
How to create effective marketplace or coordination mechanisms for private companies offering solutions to developing countries
How to ensure sustainable funding models as traditional ODA and philanthropic giving continues to decline
How to address the growing digital divides that continue to emerge even as some connectivity gaps are being filled
How to manage the ‘scramble for space’ in low earth orbit to ensure satellite infrastructure serves global development rather than just commercial interests
Suggested compromises
Leveraging private capital through blended finance mechanisms and de-risking strategies while maintaining focus on development outcomes
Using development banks and Global Gateway-type initiatives to bridge public and private funding while ensuring community ownership
Adopting a ‘make or buy’ approach in government procurement that balances local capacity building with necessary external solutions
Creating open ecosystem approaches for satellite infrastructure that allow developed economy investments to automatically benefit less developed regions
Establishing competitive infrastructure frameworks that balance profit optimization with digital inclusion goals
Moving toward international cooperation models rather than traditional development assistance to respect sovereignty while maintaining support
Thought provoking comments
We seem to be creating new digital divides constantly; we’re not stopping… we were talking about stuff 30, 20 years ago in a slightly different guise, it was ICT4D; we’re still talking about the same issues, and some of those issues have not been solved
Speaker
Raj Singh
Reason
This comment reframes the entire discussion by challenging the assumption of progress in digital development. Instead of celebrating advances, Singh highlights the cyclical nature of digital exclusion and questions whether the development community is actually solving problems or just creating new forms of inequality.
Impact
This shifted the conversation from a focus on solutions to a more critical examination of systemic issues. It prompted other panelists to address structural challenges and led to discussions about the need for outcome-focused rather than output-focused approaches.
There’s also this culture of imposition that has also come over the years, where some of the norm shapers are kind of also deciding what gets done in some of the Global South countries… this unhealthy tension between digital development and human rights
Speaker
Sabhanaz Rashid Diya
Reason
This comment introduces a power dynamics perspective that challenges the traditional donor-recipient model. Diya identifies a fundamental tension between development goals and human rights, while highlighting how Global North actors often impose solutions rather than enabling local ownership.
Impact
This comment deepened the conversation by introducing questions of agency, sovereignty, and power imbalances. It led Franz to acknowledge the ‘supply-oriented approach’ problem and influenced the discussion toward more collaborative, locally-driven solutions.
It’s no longer for us… it’s not about building capacity anymore, it’s about building capabilities. Because when you have capabilities, then you can do things… I am no longer interested in the outputs that those projects create. I want to see outcomes.
Speaker
Raj Singh
Reason
This distinction between capacity and capabilities, and outputs versus outcomes, represents a fundamental shift in how development impact should be measured and achieved. It challenges the traditional metrics-driven approach to development work.
Impact
This comment influenced the entire panel’s approach to discussing solutions, with subsequent speakers adopting more outcome-focused language. It also prompted discussions about structural changes and ecosystem building rather than project-based interventions.
We are about to waste a huge global public good, which is the lower earth orbit and the medium earth orbit by the scramble for space that is happening, driven by a couple of companies… it will not be used, that this resource will not be put to the best possible use
Speaker
Franz von Weizsaecker
Reason
This comment introduces a completely new dimension to the digital divide discussion by framing satellite infrastructure as a global commons issue. It connects digital inclusion to resource allocation and corporate monopolization in space.
Impact
This shifted the conversation from terrestrial infrastructure challenges to broader questions of equitable resource distribution and opened up discussion about alternative connectivity solutions for underserved regions.
Digital transformation and human rights actually can work hand in hand. And when people in the norm receiving parts of the world are able to start defining the conditions in which they’re going to take something or reject something… Then we see a little bit more of that negotiation, a little bit more of that empowerment and that ownership happening
Speaker
Sabhanaz Rashid Diya
Reason
This comment offers a constructive resolution to the tension she earlier identified between development and rights. It reframes the relationship from opposition to synergy, while emphasizing local agency in determining development pathways.
Impact
This comment helped steer the conversation toward more collaborative approaches and influenced subsequent discussions about government sovereignty and locally-driven digital transformation strategies.
There’s a lot of submarine cables being deployed all across the world. The problem is, it’s the cables that are being deployed. There’s no supporting ecosystem that’s being set up at the same time… There’s no clear answer yet… on why that focus is not there yet
Speaker
Raj Singh
Reason
This comment reveals a critical gap between infrastructure investment and actual capability building. It exposes how geopolitical motivations for infrastructure development don’t necessarily align with local development needs.
Impact
This observation led to deeper discussions about the disconnect between technical infrastructure and meaningful digital transformation, influencing the conversation toward ecosystem thinking and structural changes.
Overall assessment
These key comments fundamentally shifted the discussion from a traditional development narrative focused on funding gaps and technical solutions to a more critical examination of power dynamics, structural inequalities, and the need for locally-driven transformation. The most impactful interventions challenged basic assumptions about progress, questioned whose voices are centered in development decisions, and reframed success metrics from outputs to outcomes. The conversation evolved from problem identification to systemic critique, ultimately arriving at calls for genuine collaboration, local ownership, and recognition that digital transformation must be determined by communities themselves rather than imposed by external actors. The discussion became increasingly sophisticated as speakers built on each other’s critical insights, moving beyond traditional donor-recipient frameworks toward more equitable partnership models.
Follow-up questions
How can we collectively address the growing gap and ensure that digital progress benefits everyone, not just a few?
Speaker
Remy Friedmann
Explanation
This is a fundamental question about creating inclusive digital transformation that was posed to stimulate discussion on collective action frameworks
How can a private company get in and market or talk about its solutions when nobody wants to discuss who they are collaborating with – could a marketplace or another mechanism address these challenges?
Speaker
Norwegian company representative (audience)
Explanation
This addresses the practical challenge of private companies trying to offer solutions to developing countries but facing skepticism and lack of clear pathways for engagement
How can we work together as funders and other stakeholders, with transparency in investment amounts, where they’re being invested and who we’re working with?
Speaker
Molly (Digital Health and Rights Project)
Explanation
This highlights the need for better transparency and coordination mechanisms in development funding, particularly around tracking and M&E for investment portfolios
How can we construct a collective impact framework that builds on existing knowledge, centering on local actors, bringing together funders, implementers and communities into strategic alignment?
Speaker
Remy Friedmann
Explanation
This seeks to identify practical mechanisms for creating coordinated approaches to digital development that avoid duplication and center local ownership
In six months, have we gone any step forward or not in terms of collaboration and avoiding duplication?
Speaker
Raj Singh
Explanation
This is a follow-up to previous discussions about the need for better coordination, questioning whether progress has been made since the last IGF
How does the government procure digital services in their local national legal system and how will operational costs be covered after project completion?
Speaker
Franz von Weizsaecker
Explanation
This addresses the sustainability challenge of digital solutions in developing countries and the need for proper procurement processes
How can we establish frameworks for more open ecosystem approaches to satellite connectivity that allow global investments to automatically benefit low GDP areas?
Speaker
Franz von Weizsaecker
Explanation
This explores how to prevent the waste of global public goods like orbital space and ensure satellite infrastructure benefits underserved regions
What is the role that government interventions can play vis-a-vis what is the role that the market itself and competition can play in digital transformation?
Speaker
Neeti Biyani
Explanation
This seeks to clarify the balance between government regulation/intervention and market-driven solutions in achieving digital transformation goals
How can we ensure that digital transformation serves the people it’s intended for, with empowerment and the ability to shape transformation in ways that make sense for local communities?
Speaker
Sabhanaz Rashid Diya
Explanation
This addresses the need for community-driven approaches to digital transformation that go beyond just connectivity to meaningful empowerment
Disclaimer: This is not an official session record. DiploAI generates these resources from audiovisual recordings, and they are presented as-is, including potential errors. Due to logistical challenges, such as discrepancies in audio/video or transcripts, names may be misspelled. We strive for accuracy to the best of our ability.
Lightning Talk #65 Enhancing Digital Trust From Rigidity to Elasticity
Session at a glance
Summary
This discussion at the UN Internet Governance Forum focused on enhancing digital trust by transitioning from rigid to elastic frameworks in global cybersecurity and data governance. The session was organized by multiple Chinese organizations and moderated by Susan Ning, bringing together perspectives from China and international business communities.
The Deputy Director General of China’s Cybersecurity Association opened by arguing that traditional rigid approaches to digital trust, which rely on strict rule-setting and network isolation, are becoming inadequate in today’s complex digital environment. She advocated for resilient digital trust systems that emphasize flexibility, adaptability, and collaboration to respond swiftly to technological changes while accommodating different countries’ distinct digital ecosystems. China’s efforts in this area include developing comprehensive legal frameworks through the Data Security Law and the Personal Information Protection Law, promoting blockchain technology applications, and launching the Global Initiative on Data Security to foster international cooperation.
Arne Byberg from Weibo Japan provided a business perspective, highlighting how multinational companies struggle with the lack of predictable regulatory frameworks. He noted that while Europe offers more cohesive AI regulation, the US lacks federal AI standards, forcing businesses to develop multiple compliance strategies across different regions, which increases costs and inefficiency. CUI Jie from the China Internet Development Foundation discussed their organization’s work in digital adoption, internet security education, and digital village construction projects.
The session concluded with emphasis on establishing a robust global data compliance system built on universal principles of privacy, security, transparency, and accountability, requiring international cooperation to create seamless and secure data governance frameworks.
Keypoints
**Major Discussion Points:**
– **Shift from rigid to elastic digital trust frameworks**: The need to move away from traditional rigid rule-setting and strict network isolation toward more flexible, adaptable, and collaborative digital trust systems that can respond to rapidly evolving technologies and diverse global contexts.
– **Regulatory fragmentation and business challenges**: The lack of harmonized international regulations, particularly in AI governance, creates significant challenges for multinational businesses who must navigate different regulatory frameworks across regions, leading to costly duplicate compliance efforts.
– **China’s comprehensive digital governance approach**: China’s multi-faceted strategy including legal frameworks (Data Security Law, Personal Information Protection Law), technological innovation (blockchain applications), and international cooperation initiatives (Global Initiative on Data Security).
– **Digital inclusion and social responsibility**: Efforts to bridge the digital divide through programs targeting vulnerable populations like the elderly and youth, including anti-fraud education, digital literacy initiatives, and responsible internet usage programs.
– **Global data compliance system necessity**: The urgent need for universally recognized principles governing data privacy, security, transparency, and accountability, supported by international cooperation and harmonized regulations while respecting national differences.
**Overall Purpose:**
The discussion aimed to explore strategies for building enhanced digital trust in an increasingly complex global digital environment, focusing on the transition from rigid regulatory approaches to more flexible, collaborative frameworks that can accommodate diverse national contexts while maintaining security and promoting international cooperation.
**Overall Tone:**
The discussion maintained a consistently formal, diplomatic, and collaborative tone throughout. Speakers emphasized cooperation, shared challenges, and mutual benefit rather than competition or conflict. The tone was forward-looking and solution-oriented, with participants presenting their perspectives as complementary contributions to a shared global challenge rather than competing viewpoints.
Speakers
– **Susan Ning(F)**: Session moderator
– **Deputy Director General for China Cybersecurity Association**: Deputy Director General for China Cybersecurity Association (also referred to as “Madame Duanyin”)
– **Arne Byberg**: From Oslo Law Firm, Weibo Japan Tech Practice, serves multinational customers
– **CUI Jie**: Deputy Secretary General from China Internet Development Foundation
Additional speakers:
– No additional speakers were identified beyond those listed above.
Full session report
# Comprehensive Report: Enhancing Digital Trust Through Elastic Frameworks in Global Cybersecurity and Data Governance
## Executive Summary
This discussion at the UN Internet Governance Forum examined the critical transition from rigid to elastic frameworks in global cybersecurity and data governance to enhance digital trust. The session was organized by the China Cybersecurity Association, China Internet Development Foundation, China Daily, and the law firm King & Wood Mallesons, and moderated by Susan Ning. The discussion brought together perspectives from China’s cybersecurity establishment and the international business community to address the evolving challenges of digital governance in an increasingly interconnected world.
The central thesis of the discussion revolved around the inadequacy of traditional rigid approaches to digital trust, which rely heavily on strict rule-setting and network isolation. Speakers advocated for a paradigm shift towards resilient digital trust systems that emphasize flexibility, adaptability, and international collaboration while maintaining security and accommodating diverse national digital ecosystems.
## Key Participants and Their Perspectives
### Deputy Director General for China Cybersecurity Association (Madame Duanyin)
The Deputy Director General opened the discussion by establishing the theoretical foundation for the session’s central theme. She argued that traditional approaches to building digital trust, characterized by “rigid rule-setting, strict network isolation, and one-way regulatory measures,” are becoming increasingly inadequate as global connectivity deepens and digital environments grow more complex.
Her presentation emphasized that “the shift toward a resilient digital trust framework has become an inevitable trend,” highlighting the need for systems that can respond swiftly to technological changes while accommodating different countries’ distinct digital ecosystems. She positioned China’s approach as comprehensive and forward-thinking, encompassing legal frameworks through the Data Security Law and Personal Information Protection Law, technological innovation through blockchain applications, and international cooperation through initiatives such as the Global Initiative on Data Security.
### Arne Byberg – International Business Perspective
Byberg, from Weibo Japan Tech Practice, provided a crucial business perspective that grounded the theoretical discussion in practical realities. His contribution highlighted the significant challenges multinational companies face due to regulatory fragmentation, particularly in artificial intelligence governance. He noted that “businesses are increasingly looking for predictability,” while acknowledging that “the technology is moving really, really fast, faster than the regulation.”
His most striking observation concerned the economic inefficiencies created by inconsistent regulatory frameworks: “We see businesses starting to build up double and triple AI initiatives simply to cope with the various regulations of the different regions. And as everyone understands, that is costly and a lot of time wasted, actually.” This comment effectively illustrated how the lack of harmonized international regulations creates tangible costs and operational complexity for global enterprises.
Byberg’s analysis revealed contrasts between regulatory approaches across regions, noting differences between Europe’s AI regulation approach and the United States, where federal AI standards have been affected by recent political changes, forcing businesses to develop multiple compliance strategies across different jurisdictions.
### CUI Jie – Digital Development and Social Responsibility
CUI Jie, Deputy Secretary General of the China Internet Development Foundation, delivered his presentation in Chinese and focused on the practical implementation of digital inclusion initiatives and social responsibility programs. The China Internet Development Foundation, established in June 2015 as a 5A-rated national foundation working in internet philanthropy, has undertaken comprehensive efforts to bridge the digital divide.
His presentation covered three main areas: supporting digital adoption plans to help elderly citizens overcome digital barriers, implementing youth internet anti-addiction programs and cybersecurity education initiatives, and investing in digital village construction projects in locations including Lingyuan in Liaoning and Fuping in Shaanxi to integrate smart agriculture and e-commerce platforms for rural development.
### Susan Ning – Moderator
Susan Ning served as the session moderator, providing brief introductions and transitions between speakers. Her role was limited to facilitating the discussion, with contributions such as “Ladies and gentlemen, it’s my great honor to moderate this session,” “Thank you very much, Madame Duanyin. Our next speaker is Mr. Arne Byberg,” and “Thank you, Mr. Byberg. Our next speaker will be Mr. CUI Jie.”
## Major Discussion Points and Arguments
### The Paradigm Shift from Rigid to Elastic Frameworks
The discussion’s central theme focused on the fundamental inadequacy of traditional rigid approaches to digital trust. The Deputy Director General for China Cybersecurity Association established this framework by arguing that conventional methods relying on strict rule-setting and network isolation are becoming obsolete in today’s complex digital environment.
The proposed alternative—resilient digital trust frameworks—emphasizes flexibility, adaptability, and collaboration as essential characteristics for responding to rapid technological changes. This approach recognizes that different countries have distinct digital ecosystems that require tailored solutions while maintaining interoperability and security standards.
### Regulatory Fragmentation and Business Challenges
A significant portion of the discussion addressed the practical challenges created by inconsistent regulatory frameworks across different regions. Byberg’s contribution was particularly valuable in highlighting how regulatory fragmentation creates substantial costs and inefficiencies for multinational businesses.
The lack of harmonized international regulations, especially in AI governance, forces companies to develop multiple compliance strategies, leading to duplicated efforts and wasted resources. This challenge is exacerbated by the varying approaches taken by different regions and the changing political landscape affecting regulatory stability.
### China’s Comprehensive Digital Governance Strategy
The discussion highlighted China’s multi-faceted approach to digital governance, which encompasses legal, technological, and international cooperation dimensions. The comprehensive legal framework includes the Data Security Law and Personal Information Protection Law, while technological innovation focuses on blockchain applications and other emerging technologies.
China’s international cooperation efforts, particularly through the Global Initiative on Data Security, represent an attempt to foster collaborative approaches to global digital governance while promoting Chinese perspectives and solutions.
### Digital Inclusion and Social Responsibility
The discussion addressed the importance of ensuring that digital development benefits all segments of society. CUI Jie’s presentation highlighted specific programs targeting vulnerable populations, including elderly citizens struggling with digital barriers and youth requiring protection from internet addiction.
These initiatives reflect a broader understanding that digital governance must address social equity and inclusion alongside technical and regulatory challenges. The digital village construction projects represent an attempt to extend digital benefits to rural areas through integrated smart agriculture and e-commerce platforms.
## Key Insights and Observations
### Economic Impact of Regulatory Fragmentation
Byberg’s revelation about businesses building “double and triple AI initiatives” to cope with various regional regulations effectively illustrated the concrete costs of regulatory fragmentation. This insight demonstrated how the lack of harmonized international regulations creates tangible economic consequences and operational complexity for global enterprises.
### Paradigm Shift Recognition
The Deputy Director General’s observation about the inevitable shift from rigid to resilient digital trust frameworks provided the conceptual foundation for the entire discussion. This insight reframed digital governance challenges as requiring fundamental paradigm shifts rather than incremental improvements to existing approaches.
### Comprehensive Approach to Digital Development
CUI Jie’s presentation demonstrated how digital governance extends beyond technical and regulatory considerations to encompass social responsibility and inclusion. The China Internet Development Foundation’s work illustrates practical approaches to ensuring digital development benefits all segments of society.
## Areas Requiring Further Development
### Implementation Mechanisms
While speakers agreed on the need for international cooperation and more flexible regulatory frameworks, specific mechanisms for achieving these goals across different legal systems and cultural contexts require further development.
### Business-Regulatory Balance
The tension between the need for regulatory flexibility to accommodate rapid technological change and businesses’ requirements for predictable regulatory environments remains an ongoing challenge requiring creative solutions.
### Technical Standards
The discussion highlighted the need for interoperable technologies and advanced cybersecurity measures but did not address specific technical standards or implementation details required for practical deployment.
## Conclusion
This discussion at the UN Internet Governance Forum successfully identified critical challenges in contemporary digital governance while proposing conceptual frameworks for addressing them. The session brought together diverse perspectives from regulatory authorities, business communities, and civil society organizations to examine the transition from rigid to elastic approaches in digital trust frameworks.
The speakers demonstrated consensus on fundamental principles, including the need for international cooperation, comprehensive legal frameworks, and inclusive approaches to digital development. However, the discussion also revealed important challenges, particularly the tension between regulatory flexibility and business predictability, and the practical difficulties of implementing harmonized approaches across diverse national contexts.
The session’s emphasis on building flexible, secure, and inclusive digital frameworks provides a constructive foundation for future policy development. The recognition that traditional rigid approaches are inadequate for contemporary digital challenges, combined with practical insights about business needs and social inclusion requirements, offers valuable perspectives for advancing digital governance discussions.
*Note: The transcript indicates that an additional speaker may have presented content about global data compliance systems, but the attribution of this content is unclear from the available materials.*
Session transcript
Susan Ning(F): Ladies and gentlemen, it’s my great honor to moderate this session to discuss enhancing digital trust, from rigidity to elasticity. This session is organized by the China Cybersecurity Association, the China Internet Development Foundation, China Daily, and the law firm King & Wood Mallesons. Now our first speaker is Madame Duanyin, who is the Deputy Director General of the China Cybersecurity Association.
Deputy Director General for China Cybersecurity Association: Ladies and gentlemen, good afternoon. On behalf of the Cybersecurity Association of China, it gives me great honor to attend the IGF and have an exchange with you all on the important topic, enhancing digital trust from rigidity to elasticity. In today’s world, digital technologies are reshaping economic and social life at speed and scale. However, alongside the opportunities come growing challenges to digital trust. Traditional approaches to building trust tend to rely on rigid rule-setting, strict network isolation, and one-way regulatory measures. Such mechanisms have proven necessary and effective at a certain stage of development and in specific contexts. However, as global connectivity continues to deepen, the digital environment is becoming increasingly complex, and modern technologies are rapidly evolving and becoming deeply integrated. Differences in digital governance philosophies, legal frameworks, and technology capabilities among countries and regions are becoming evident, and rigid digital trust architectures are revealing their limitations. As a result, the shift toward a resilient digital trust framework has become an inevitable trend and a pressing challenge for global digital governance. A resilient digital trust system emphasizes flexibility, adaptability, and collaboration, such as responding swiftly to changes brought about by new technologies and applications. The trust framework should align with the distinct characteristics of each country or region’s digital ecosystem. Given the different stages of digital economic development, cultural backgrounds, and legal foundations around the world, a resilient system should meet these needs through differentiated trust strategies and governance mechanisms. This underscores the importance of cooperation among all stakeholders in building digital trust on a global scale. 
China has undertaken extensive efforts and explorations in advancing the development of a resilient digital trust framework. China has been continually improving its policy and regulatory system for digital governance while also placing emphasis on maintaining policy flexibility in the area of data security. The Data Security Law and the Personal Information Protection Law provide the foundational legal framework for data governance. On the technological innovation front, China has been actively promoting the research and development of key digital trust technologies, such as applications of blockchain in supply chain finance. In terms of international cooperation, China has taken an active role in global rulemaking for digital governance. The Global Initiative on Data Security upholds the principles of openness, equality, and mutual benefit to build a peaceful, secure, open, and cooperative cyberspace. This initiative provides an important cooperation framework for the global development of a resilient digital trust system. As a national, industry-based, non-profit social organization formed voluntarily by institutions, enterprises, and individuals engaged in cybersecurity-related industry, education, research, and application in China, the Cybersecurity Association of China maintains broad collaborative connections with all sectors involved in cyberspace security. We remain committed to promoting technological application, improving industry self-regulation, and facilitating exchange and cooperation in order to create a more trustworthy digital world. Let us work together to build a safer, more trusted, and more prosperous global cyberspace with resilient digital trust as our shared foundation. Thank you.
Susan Ning(F): Thank you very much, Madame Duanyin. Our next speaker is Mr. Arne Byberg from the Oslo Law Firm, Weibo Japan. Arne. Is this on?
Arne Byberg: Yeah, it is. Thanks for having me. So, I’m Arne Byberg from the Weibo Japan tech practice. We basically serve multinational customers. And I think what I’d like to share, and add to the Chinese perspective here, is that businesses are increasingly looking for predictability. The technology is moving really, really fast, faster than the regulation. We see in the US, for instance, that there is still no federal AI regulation. They had one starting to come along, but it was revoked by the current president. Hence, we are basically stuck with sectoral regulation and state regulation, which makes it hard to navigate. In Europe, it’s a little bit easier to navigate because at least there is some cohesive regulation when it comes to the AI Act and the Council of Europe Convention, et cetera. So that is helpful. But what we see is that there is still a lot of dependency on humans in the chain. So having AI actually perform at a level where you can derive direct value is going to be challenging going forward. In terms of international businesses, the multinational businesses, we see them struggle. They are certainly looking for some predictability. Whether that can be achieved through governance in the internet space and organizations and events like this, or collaborations on the AI front, I don’t know. But we hope for and would welcome developments in that space, because currently we see businesses starting to build up double and triple AI initiatives simply to cope with the various regulations of the different regions. And as everyone understands, that is costly and a lot of time wasted, actually. So, in short, that is my input. Thank you. Cheers.
Susan Ning(F): Thank you, Mr. Byberg. Our next speaker will be Mr. CUI Jie, who is the Deputy Secretary General from China Internet Development Foundation. Now Mr. Cui, yeah.
CUI Jie: I speak in Chinese. Hello, everyone. It is a great honor to be able to discuss with you at the UN Internet Governance Forum. On behalf of the China Internet Development Foundation, I would like to express my sincere gratitude to the UN Internet Governance Forum and my sincere greetings to all of you. As digitalization, networking, and intelligence continue to advance, public welfare is becoming deeply integrated with the internet. Internet philanthropy is a new form of public welfare that is based on the internet and carried out through internet technology. The China Internet Development Foundation, the leading social organization in the field of internet philanthropy, was established in June 2015 and is a 5A-rated national foundation. We have always insisted on studying and implementing Xi Jinping Thought on Socialism with Chinese Characteristics for a New Era, especially Xi Jinping’s thought on culture and his important thinking on building China into a cyber power, focusing on internet content construction, internet security, informatization, and global internet governance. We have established special funds to support the healthy development of China’s internet industry, promote the transmission of positive energy online, advance national internet security and informatization, promote internet-related exchanges internationally and with the Hong Kong, Macao, and Taiwan regions, and make active contributions to building China into a cyber power. With this opportunity, I would like to share with you some of our key work. In terms of improving digital adoption, we are committed to combining digital philanthropy with national development concepts to support the implementation of digital adoption and philanthropy plans, combining online and offline channels to promote anti-fraud education and help the elderly cross the digital divide. 
This year, we will also implement the Youth Internet Anti-Addiction Plan to help young people establish a correct view of the internet and develop good habits of internet use. In terms of serving internet security, we will support the implementation of the Internet Security College Student Innovation Support Plan to encourage universities to support innovation and industrial development and to improve students’ ability to solve practical problems. In addition, we will support the National Internet Security Standards Week to promote broad agreement on internet security standards, improve the quality of standards, promote their implementation, and advance the standardization and high-quality development of internet security. We will also support the cultivation of innovative internet security talent. In terms of promoting digital village construction, we will invest a special fund to support the implementation of digital village demonstration actions. We will build demonstration villages in ten locations, such as Lingyuan in Liaoning and Fuping County in Shaanxi Province. Fuping County relies on its good ecological tourism resources and regional advantages. We will focus on the Lengshuiyu and Mifeng farming bases in Longcaoping Village and the e-commerce incubation base in Yingchang Village, build integrated smart-agriculture production and sales projects and comprehensive improvement projects, establish industrial development models, and drive the rapid transformation and upgrading of the whole industry. Ladies and gentlemen, these are the key points that I would like to share with you. With this opportunity, I would like to introduce a key project, the China Internet Public Service Action. 
This project aims to unite the power of the internet industry to promote the development of internet public service, spread positive energy online, and use the resources and advantages of public service projects to serve the internet industry and advance national internet security and informatization. Last year, we successfully held the first China Internet Public Service Action. Through technological adaptation, ecological construction, and regional practice, we created innovative forms of artificial intelligence for public service, demonstrated the potential of artificial intelligence technology, and provided valuable experience for the digital transformation of the internet public service sector. This year, we will focus on two groups, the elderly and the young, refining and consolidating our experience to help the elderly better use the internet and help young people develop good habits of internet use, so that both can better enjoy the fruits of digital development. Here, we call on everyone to pay attention to and participate in the 2025 China Internet Public Service Action. We hope that everyone will engage with internet public service from different angles and jointly write a new chapter for public service in the digital age. Thank you.
Susan Ning(F): The rapid proliferation of data has also brought unprecedented challenges, including privacy violations, the misuse of personal information, and data breaches. It is imperative that we establish a robust global data compliance system to address these challenges and ensure that data serves humanity in a responsible and ethical way. First and foremost, the foundation of a global data compliance system must be built upon a set of universally recognized principles. These principles shall include data privacy, data security, transparency, accountability, and the protection of human rights. Every country, regardless of its size or level of development, must adhere to these principles to create a level playing field for all stakeholders in the digital ecosystem. To achieve this, international cooperation is essential. Data does not recognize national borders. It flows freely across the globe, connecting people and systems in ways we could never have imagined. Therefore, we must work together to harmonize our data protection laws and regulations. This does not mean that we should eliminate all the differences, but rather that we should strive for compatibility and interoperability. By doing this, we can create a seamless and secure environment for the data economy. Moreover, we need to foster a culture of data responsibility. This involves educating individuals, businesses, and governments about the importance of data compliance. It means promoting best practices in data management and encouraging the development of ethical guidelines for data use. We must also empower citizens to take control of their own data, ensuring that they have the right to access, correct, and delete their own personal information. In addition, the role of technology cannot be overstated. As we develop new technologies, we must also develop the means to secure and protect the data. 
This includes investing in advanced cybersecurity measures, promoting the use of privacy computing, and encouraging the development of privacy-enhancing technologies. We must also stay ahead of emerging technologies such as artificial intelligence and machine learning, which can either enhance or undermine data security and privacy. Ladies and gentlemen, the global data compliance system is not just a regulatory framework. It is a collective commitment to safeguard the digital future of our planet. It’s a commitment to ensure that data is used for the greater good, that it respects the rights and the freedom of individuals, and that it fosters innovation and economic growth in a sustainable and responsible manner. As we move forward, let’s remember that we are all stakeholders in this digital revolution. Governments, businesses, civil society, and individuals all have a role to play. Together, we shall build a global data compliance system that is resilient, inclusive, and forward-looking. Together, we can create a digital world that is both secure and open, where data flows freely but responsibly. Thank you so much. Thank you.
Deputy Director General for China Cybersecurity Association
Speech speed
92 words per minute
Speech length
501 words
Speech time
324 seconds
Traditional rigid approaches to digital trust are becoming inadequate as global connectivity deepens and digital environments become more complex
Explanation
The speaker argues that traditional trust-building mechanisms that rely on rigid rule-setting, strict network isolation, and one-way regulatory measures are showing limitations. As global connectivity increases and digital environments become more complex with rapidly evolving technologies, these rigid approaches are no longer sufficient for modern digital governance needs.
Evidence
Evidence includes the growing complexity of digital environments, rapid evolution of modern techniques, and evident differences in digital governance philosophies, legal frameworks, and technology capabilities among countries and regions
Major discussion point
Digital Trust Framework Evolution
Topics
Cybersecurity | Legal and regulatory
A shift toward resilient digital trust frameworks emphasizing flexibility, adaptability, and collaboration is necessary to respond to new technologies and applications
Explanation
The speaker advocates for moving from rigid to resilient digital trust systems that can adapt to changes and collaborate across different contexts. This approach should align with distinct characteristics of each country’s digital ecosystem and accommodate different stages of development, cultural backgrounds, and legal foundations.
Evidence
The need is supported by the requirement to respond swiftly to changes from new technologies, accommodate different stages of digital economic development, cultural backgrounds, and legal foundations around the world
Major discussion point
Digital Trust Framework Evolution
Topics
Cybersecurity | Legal and regulatory
Agreed with
– Susan Ning(F)
Agreed on
Need for international cooperation in digital governance
Disagreed with
– Arne Byberg
Disagreed on
Regulatory approach – flexibility vs predictability
China has developed comprehensive legal frameworks including data security law and personal information protection law while promoting blockchain applications and international cooperation through initiatives like the global data security framework
Explanation
The speaker presents China’s multi-faceted approach to building resilient digital trust, which includes establishing foundational legal frameworks for data governance, investing in technological innovation, and leading international cooperation efforts. This demonstrates practical implementation of the resilient trust framework concept.
Evidence
Specific evidence includes China’s data security law and personal information protection law as foundational legal frameworks, blockchain applications in supply chain finance, and the global initiative on data security that upholds principles of openness, equality, and mutual benefit
Major discussion point
Digital Trust Framework Evolution
Topics
Legal and regulatory | Human rights | Cybersecurity
Agreed with
– Susan Ning(F)
Agreed on
Importance of comprehensive legal frameworks for digital governance
Arne Byberg
Speech speed
128 words per minute
Speech length
293 words
Speech time
137 seconds
Businesses are struggling with lack of predictability due to inconsistent AI regulation across regions, with the US having no federal AI regulation and Europe providing more cohesive frameworks
Explanation
The speaker highlights the regulatory fragmentation problem where multinational businesses face uncertainty due to inconsistent AI regulations across different regions. While Europe offers more cohesive regulation through the AI Act and Council of Europe Convention, the US lacks federal AI regulation, relying instead on sectoral and state-level regulations.
Evidence
The US has no federal AI regulation after the previous one was revoked by the current president, leaving businesses to navigate sectoral and state regulations, while Europe has more cohesive regulation with the AI Act and Council of Europe Convention
Major discussion point
Regulatory Challenges and Business Predictability
Topics
Legal and regulatory | Economic
Disagreed with
– Deputy Director General for China Cybersecurity Association
Disagreed on
Regulatory approach – flexibility vs predictability
Multinational companies are forced to build multiple AI initiatives to comply with different regional regulations, resulting in increased costs and wasted resources
Explanation
The speaker explains that the lack of regulatory harmonization forces businesses to create duplicate or triplicate AI systems to meet varying regional requirements. This regulatory fragmentation leads to inefficient resource allocation and increased operational costs for multinational corporations.
Evidence
Businesses are building double and triple AI initiatives to cope with various regulations of different regions, which is costly and results in wasted time
Major discussion point
Regulatory Challenges and Business Predictability
Topics
Economic | Legal and regulatory
CUI Jie
Speech speed
146 words per minute
Speech length
756 words
Speech time
308 seconds
The China Internet Development Foundation focuses on combining digital development with national concepts, supporting digital adoption plans and helping elderly citizens overcome digital barriers
Explanation
The speaker describes the foundation’s approach to digital inclusion by integrating digital development with national development concepts. They specifically focus on helping elderly citizens cross the digital divide through combined online and offline programs, including anti-fraud education and digital literacy initiatives.
Evidence
Implementation of digital adoption and internet philanthropy plans, combining online and offline approaches to promote anti-fraud education and help the elderly cross the digital divide
Major discussion point
Digital Development and Public Service Integration
Topics
Development | Sociocultural
Implementation of youth internet anti-addiction programs and cybersecurity education initiatives to promote healthy internet usage and develop cybersecurity talent
Explanation
The speaker outlines educational initiatives targeting both youth internet addiction prevention and cybersecurity talent development. These programs aim to establish proper online habits among young people while building cybersecurity capabilities through university partnerships and innovation support.
Evidence
Youth Internet Anti-Addiction Plan to help youth establish correct online concepts and develop good internet habits, Internet Security College Student Innovation Support Plan to promote university innovation and improve students’ practical problem-solving abilities
Major discussion point
Digital Development and Public Service Integration
Topics
Cybersecurity | Sociocultural | Development
Investment in digital village construction projects across multiple provinces to integrate smart agriculture and e-commerce platforms for rural development
Explanation
The speaker describes comprehensive rural digitalization efforts through the digital village construction initiative. This involves building demonstration villages that integrate smart agriculture, e-commerce platforms, and comprehensive development projects to drive industrial transformation and upgrade in rural areas.
Evidence
Building demonstration villages in ten locations including Lingyuan in Liaoning and Fuping in Shaanxi, focusing on smart agriculture, integrated production and sales, and comprehensive improvement projects to build industrial development models
Major discussion point
Digital Development and Public Service Integration
Topics
Development | Economic
Susan Ning (F)
Speech speed
97 words per minute
Speech length
575 words
Speech time
353 seconds
A robust global data compliance system must be established based on universally recognized principles including data privacy, security, transparency, accountability, and human rights protection
Explanation
The speaker argues for the necessity of establishing a comprehensive global framework for data governance built on fundamental principles. This system should ensure that all countries, regardless of size or development level, adhere to these core principles to create equitable conditions for all digital ecosystem stakeholders.
Evidence
The foundation must include data privacy, data security, transparency, accountability, and protection of human rights, with every country adhering to these principles to create a level playing field
Major discussion point
Global Data Compliance System
Topics
Human rights | Legal and regulatory | Cybersecurity
Agreed with
– Deputy Director General for China Cybersecurity Association
Agreed on
Importance of comprehensive legal frameworks for digital governance
International cooperation is essential to harmonize data protection laws while maintaining compatibility and interoperability across different regulatory frameworks
Explanation
The speaker emphasizes that since data flows freely across borders, international collaboration is crucial for creating compatible regulatory frameworks. Rather than eliminating all differences, the goal should be achieving compatibility and interoperability to enable a seamless and secure global data economy.
Evidence
Data flows freely across the globe connecting people and systems, requiring harmonized data protection laws and regulations while striving for compatibility and interoperability rather than eliminating all differences
Major discussion point
Global Data Compliance System
Topics
Legal and regulatory | Human rights
Agreed with
– Deputy Director General for China Cybersecurity Association
Agreed on
Need for international cooperation in digital governance
Technology development must include advanced cybersecurity measures and privacy-enhancing technologies to address emerging threats from AI and machine learning
Explanation
The speaker stresses that technological advancement must be accompanied by corresponding security and privacy protection measures. This includes investing in cybersecurity, promoting privacy computing, and staying ahead of emerging threats posed by AI and machine learning technologies that can both enhance and undermine data security.
Evidence
Need for advanced cybersecurity measures, privacy computing, privacy-enhancing technologies, and staying ahead of emerging threats from artificial intelligence and machine learning, which can both enhance and undermine data security and privacy
Major discussion point
Global Data Compliance System
Topics
Cybersecurity | Human rights | Infrastructure
Agreed with
– Deputy Director General for China Cybersecurity Association
Agreed on
Technology development must be accompanied by security measures
Agreements
Agreement points
Need for international cooperation in digital governance
Speakers
– Deputy Director General for China Cybersecurity Association
– Susan Ning (F)
Arguments
A shift toward resilient digital trust frameworks emphasizing flexibility, adaptability, and collaboration is necessary to respond to new technologies and applications
International cooperation is essential to harmonize data protection laws while maintaining compatibility and interoperability across different regulatory frameworks
Summary
Both speakers emphasize the critical importance of international collaboration in building effective digital governance frameworks, whether through resilient trust systems or harmonized data protection laws
Topics
Legal and regulatory | Cybersecurity
Technology development must be accompanied by security measures
Speakers
– Deputy Director General for China Cybersecurity Association
– Susan Ning (F)
Arguments
China has developed comprehensive legal frameworks including data security law and personal information protection law while promoting blockchain applications and international cooperation through initiatives like the global data security framework
Technology development must include advanced cybersecurity measures and privacy-enhancing technologies to address emerging threats from AI and machine learning
Summary
Both speakers agree that technological advancement must be paired with robust security frameworks and legal protections to ensure safe digital development
Topics
Cybersecurity | Legal and regulatory | Infrastructure
Importance of comprehensive legal frameworks for digital governance
Speakers
– Deputy Director General for China Cybersecurity Association
– Susan Ning (F)
Arguments
China has developed comprehensive legal frameworks including data security law and personal information protection law while promoting blockchain applications and international cooperation through initiatives like the global data security framework
A robust global data compliance system must be established based on universally recognized principles including data privacy, security, transparency, accountability, and human rights protection
Summary
Both speakers advocate for comprehensive legal frameworks that establish clear principles and regulations for digital governance, whether at national or global levels
Topics
Legal and regulatory | Human rights | Cybersecurity
Similar viewpoints
Both speakers recognize that traditional approaches to digital governance are insufficient for current challenges and advocate for new, more comprehensive frameworks
Speakers
– Deputy Director General for China Cybersecurity Association
– Susan Ning (F)
Arguments
Traditional rigid approaches to digital trust are becoming inadequate as global connectivity deepens and digital environments become more complex
A robust global data compliance system must be established based on universally recognized principles including data privacy, security, transparency, accountability, and human rights protection
Topics
Legal and regulatory | Cybersecurity | Human rights
Both speakers identify regulatory fragmentation as a major challenge and implicitly support the need for more harmonized approaches to digital regulation
Speakers
– Arne Byberg
– Susan Ning (F)
Arguments
Businesses are struggling with lack of predictability due to inconsistent AI regulation across regions, with the US having no federal AI regulation and Europe providing more cohesive frameworks
International cooperation is essential to harmonize data protection laws while maintaining compatibility and interoperability across different regulatory frameworks
Topics
Legal and regulatory | Economic
Unexpected consensus
Flexibility in digital governance approaches
Speakers
– Deputy Director General for China Cybersecurity Association
– Arne Byberg
Arguments
A shift toward resilient digital trust frameworks emphasizing flexibility, adaptability, and collaboration is necessary to respond to new technologies and applications
Businesses are struggling with lack of predictability due to inconsistent AI regulation across regions, with the US having no federal AI regulation and Europe providing more cohesive frameworks
Explanation
Despite representing different perspectives (regulatory authority vs. business), both speakers converge on the need for more adaptive and flexible approaches to digital governance, though from different angles – one advocating for resilient frameworks and the other highlighting the problems of rigid, inconsistent regulation
Topics
Legal and regulatory | Economic
Human-centered approach to digital development
Speakers
– CUI Jie
– Susan Ning (F)
Arguments
The China Internet Development Foundation focuses on combining digital development with national development concepts, supporting digital adoption plans and helping elderly citizens overcome digital barriers
A robust global data compliance system must be established based on universally recognized principles including data privacy, security, transparency, accountability, and human rights protection
Explanation
Both speakers, despite focusing on different aspects (domestic digital inclusion vs. global data governance), share a human-centered approach that prioritizes protecting and empowering individuals in the digital space
Topics
Human rights | Development | Sociocultural
Overall assessment
Summary
The speakers demonstrate significant consensus on the need for international cooperation, comprehensive legal frameworks, and human-centered approaches to digital governance, while recognizing the inadequacy of traditional rigid regulatory approaches
Consensus level
High level of consensus on fundamental principles with complementary perspectives from different stakeholders (regulatory authorities, business, and civil society). This strong alignment suggests potential for collaborative solutions in global digital governance, particularly around building flexible, secure, and inclusive digital frameworks that balance innovation with protection of rights and interests.
Differences
Different viewpoints
Regulatory approach – flexibility vs predictability
Speakers
– Deputy Director General for China Cybersecurity Association
– Arne Byberg
Arguments
A shift toward resilient digital trust frameworks emphasizing flexibility, adaptability, and collaboration is necessary to respond to new technologies and applications
Businesses are struggling with lack of predictability due to inconsistent AI regulation across regions, with the US having no federal AI regulation and Europe providing more cohesive frameworks
Summary
The Chinese representative advocates for flexible and adaptive regulatory frameworks that can respond to technological changes, while the business representative emphasizes the need for predictable, cohesive regulation that businesses can rely on for planning and compliance
Topics
Legal and regulatory | Economic
Unexpected differences
Scale and scope of regulatory solutions
Speakers
– Arne Byberg
– Susan Ning (F)
Arguments
Multinational companies are forced to build multiple AI initiatives to comply with different regional regulations, resulting in increased costs and wasted resources
International cooperation is essential to harmonize data protection laws while maintaining compatibility and interoperability across different regulatory frameworks
Explanation
While both speakers acknowledge the problems caused by regulatory fragmentation, they propose different solutions – Byberg focuses on the business impact and need for predictable frameworks, while Susan Ning proposes a comprehensive global compliance system. This disagreement is unexpected because both should theoretically support harmonization, but they approach it from different perspectives and with different priorities
Topics
Legal and regulatory | Economic
Overall assessment
Summary
The main areas of disagreement center around regulatory philosophy (flexibility vs predictability), the scope of solutions (national vs global approaches), and priorities in digital governance (business efficiency vs comprehensive protection)
Disagreement level
The level of disagreement is moderate but significant for policy implications. While speakers generally agree on the need for better digital governance and international cooperation, their different approaches – China’s emphasis on flexible national frameworks, business community’s need for predictability, and calls for universal global principles – reflect fundamental tensions between sovereignty, economic efficiency, and universal standards in digital governance
Takeaways
Key takeaways
Digital trust frameworks must evolve from rigid, rule-based approaches to flexible, resilient systems that can adapt to rapidly changing technologies and diverse global contexts
There is an urgent need for international regulatory harmonization, particularly in AI governance, as inconsistent regulations across regions create significant compliance burdens and costs for multinational businesses
China has established comprehensive legal frameworks for digital governance while emphasizing flexibility and international cooperation through initiatives like the global data security framework
Digital inclusion efforts must address both ends of the demographic spectrum – helping elderly citizens overcome digital barriers while protecting youth from internet addiction and promoting healthy online habits
A global data compliance system based on universal principles (privacy, security, transparency, accountability, human rights protection) is essential for responsible data governance in an interconnected world
Technology development must be coupled with robust cybersecurity measures and privacy-enhancing technologies to address emerging threats from AI and machine learning
Resolutions and action items
Call for participation in the 2025 Internet of China Public Service Action to promote digital public service integration
Continued implementation of digital village construction projects across multiple Chinese provinces
Ongoing support for cybersecurity education initiatives and youth internet programs
Promotion of international cooperation frameworks for digital governance and data security
Unresolved issues
How to achieve practical regulatory harmonization across different legal systems and cultural contexts while respecting national sovereignty
Specific mechanisms for creating predictable regulatory environments for multinational businesses operating across multiple jurisdictions
Technical standards and implementation details for interoperable privacy-enhancing technologies
Concrete steps for establishing universal data compliance principles that can be effectively enforced globally
How to balance the need for regulatory flexibility with business requirements for predictability and consistency
Suggested compromises
Striving for compatibility and interoperability in data protection laws rather than complete harmonization, allowing for regional differences while maintaining functional cooperation
Maintaining policy flexibility in data security areas while providing foundational legal frameworks
Balancing openness with security in cyberspace governance through collaborative international initiatives
Thought provoking comments
Traditional approaches to build trust tend to rely on the rigid rule-setting, strict network isolation, and one-way regulatory mirrors… However, as global connectivity continues to deepen, the digital environment is becoming increasingly complex… the shift toward a resilient digital trust framework has become an inevitable divergence.
Speaker
Deputy Director General for China Cybersecurity Association
Reason
This comment is insightful because it frames the entire discussion by identifying a fundamental paradigm shift in digital governance – moving from rigid, static approaches to flexible, adaptive frameworks. It acknowledges that traditional security models are becoming inadequate for our interconnected digital reality and introduces the core theme of ‘rigidity to elasticity.’
Impact
This comment established the conceptual foundation for the entire session, setting up the central tension between security and flexibility that subsequent speakers would address. It provided the theoretical framework that other participants could build upon or respond to.
Businesses are increasingly looking for predictability. The technology is moving really, really fast, faster than the regulation… we see businesses starting to build up double and triple AI initiatives simply to cope with the various regulations of the different regions. And as everyone understands, that is costly and a lot of time wasted, actually.
Speaker
Arne Byberg
Reason
This comment is particularly thought-provoking because it introduces the practical business perspective and highlights a critical inefficiency in the current system. It reveals how the lack of harmonized regulation is creating real economic costs and operational complexity for multinational companies, adding a concrete dimension to the abstract discussion of digital trust.
Impact
Byberg’s comment shifted the discussion from theoretical policy frameworks to practical implementation challenges. It introduced the business stakeholder perspective and highlighted the economic consequences of fragmented regulatory approaches, which influenced the moderator’s subsequent focus on global harmonization and practical solutions.
Data does not recognize national borders. It flows freely across the globe, connecting people and systems in ways we could never have imagined. Therefore, we must work together to harmonize our data protection laws and regulations. This does not mean that we should eliminate all the differences, but rather we should strive for compatibility and interoperability.
Speaker
Susan Ning (Moderator)
Reason
This comment is insightful because it captures the fundamental challenge of governing a borderless digital world with nation-state regulatory frameworks. The distinction between ‘harmonization’ and ‘elimination of differences’ is particularly nuanced, suggesting a middle path that respects sovereignty while enabling cooperation.
Impact
This comment synthesized the previous speakers’ points and elevated the discussion to a more strategic level. It moved beyond identifying problems to proposing a specific approach – seeking compatibility rather than uniformity – which provided a constructive framework for thinking about international cooperation in digital governance.
The global data compliance system is not just a regulatory framework. It is a collective commitment to safeguard the digital future of our planet. It’s a commitment to ensure that data is used for the greater good, that it respects the rights and the freedom of individuals, and that it fosters innovation and economic growth in a sustainable and responsible manner.
Speaker
Susan Ning (Moderator)
Reason
This comment is thought-provoking because it reframes data compliance from a technical/legal issue to a moral and civilizational imperative. It connects digital governance to broader themes of planetary stewardship and collective responsibility, elevating the stakes of the discussion.
Impact
This comment provided a philosophical capstone to the discussion, transforming it from a technical policy debate into a broader conversation about shared values and collective responsibility. It unified the various practical concerns raised by previous speakers under a larger moral framework.
Overall assessment
These key comments shaped the discussion by creating a clear progression from problem identification to solution frameworks. The Chinese representative established the theoretical foundation by identifying the paradigm shift needed in digital trust. Byberg’s business perspective grounded the discussion in practical realities and economic consequences. The moderator then synthesized these perspectives, first by proposing a nuanced approach to international cooperation that balances harmonization with sovereignty, and finally by elevating the entire discussion to a moral and civilizational level. Together, these comments created a comprehensive narrative arc that moved from technical challenges through practical implications to philosophical imperatives, demonstrating how effective moderation can weave together diverse perspectives into a coherent and progressively deeper conversation.
Follow-up questions
How can predictability be achieved for multinational businesses navigating different AI regulations across regions?
Speaker
Arne Byberg
Explanation
Businesses are struggling with inconsistent regulations across different regions, leading to costly duplicate AI initiatives, and there’s uncertainty about whether governance through internet organizations can provide solutions
How can international cooperation effectively harmonize data protection laws while maintaining compatibility and interoperability?
Speaker
Susan Ning
Explanation
She emphasized the need to work together to harmonize data protection laws without eliminating all differences, but the specific mechanisms for achieving this compatibility remain unclear
How can we stay ahead of emerging threats from AI and machine learning that can both enhance and undermine data security?
Speaker
Susan Ning
Explanation
She identified AI and machine learning as dual-nature technologies that present both opportunities and threats to data security, requiring proactive approaches that weren’t fully explored
What specific governance mechanisms can effectively address the limitations of rigid digital trust architectures?
Speaker
Deputy Director General for China Cybersecurity Association
Explanation
While she identified that traditional rigid approaches are showing limitations in complex digital environments, the specific alternative mechanisms for resilient trust frameworks need further development
How can different countries align their distinct digital ecosystems while maintaining their unique characteristics?
Speaker
Deputy Director General for China Cybersecurity Association
Explanation
She noted that trust frameworks should align with each country’s distinct digital ecosystem characteristics, but the practical methods for achieving this alignment require further exploration
Disclaimer: This is not an official session record. DiploAI generates these resources from audiovisual recordings, and they are presented as-is, including potential errors. Due to logistical challenges, such as discrepancies in audio/video or transcripts, names may be misspelled. We strive for accuracy to the best of our ability.
Parliamentary Roundtable Safeguarding Democracy in the Digital Age Legislative Priorities and Policy Pathways
Session at a glance
Summary
This discussion focused on safeguarding democracy in the digital age through legislative priorities and policy pathways, bringing together members of parliament from around the world to address how digital technologies impact democratic institutions. The session was part of the parliamentary track at the Internet Governance Forum (IGF) 2025, with participants from Norway, Kenya, California, Barbados, and Tajikistan sharing their national experiences and challenges.
Key speakers emphasized the urgent need to balance freedom of expression with combating misinformation and disinformation, particularly as artificial intelligence technologies blur the lines between fact and fiction. Norwegian MP Grunde Almeland highlighted concerns about truth becoming less relevant in political discourse, while advocating for strengthening independent media organizations as a crucial countermeasure. Kenyan Senator Catherine Mumma outlined her country’s comprehensive legal framework including cybercrime and data protection acts, while noting ongoing challenges with hate speech and electoral misinformation that sometimes leads to violence.
California Assembly Member Rebecca Bauer-Kahan discussed the state’s pioneering role in privacy legislation and AI regulation, including requirements for watermarking AI-generated content and disclosure in political advertisements. She emphasized the need for “technology for good” and increased investment in academic institutions to compete with profit-driven tech companies. Barbados MP Marsha Caddle shared experiences with deepfakes targeting political leaders and stressed the importance of democratic literacy and creating a culture of evidence-based information.
Several participants raised concerns about technological dumping by advanced economies onto developing nations, comparing it to historical patterns of exploitation. The discussion concluded with calls for stronger international cooperation, similar to nuclear weapons treaties, to hold big tech companies and advanced nations accountable for their global impact on democratic processes and human rights.
Keypoints
## Major Discussion Points:
– **Balancing Freedom of Expression with Combating Misinformation**: Parliamentarians discussed the challenge of protecting free speech while addressing the spread of false information, particularly how AI and deepfakes are blurring the lines between fact and fiction in democratic discourse.
– **Legislative Frameworks and International Cooperation**: Panel members shared their countries’ approaches to digital governance, from Kenya’s comprehensive legal framework to California’s privacy legislation, emphasizing the need for harmonized international standards rather than fragmented national approaches.
– **Electoral Integrity and Democratic Trust**: Significant focus on protecting elections from AI-generated disinformation, deepfakes, and manipulation, with examples ranging from deepfakes of political leaders to concerns about electronic voting systems across different jurisdictions.
– **Technology for Good vs. Profit-Driven Solutions**: Discussion of the need to invest in academic institutions and civil society to develop beneficial AI tools, rather than leaving technology development solely to well-funded private companies focused on profit.
– **Global Digital Divide and Technological Responsibility**: Strong emphasis on addressing “technological dumping” where advanced economies and big tech companies export harmful practices to developing nations, with calls for accountability similar to nuclear weapons treaties or climate agreements.
## Overall Purpose:
The discussion aimed to bring together parliamentarians from different countries to share legislative approaches and policy solutions for safeguarding democratic institutions in the digital age, while fostering international cooperation on digital governance frameworks.
## Overall Tone:
The discussion maintained a serious but collaborative tone throughout. It began with formal opening remarks emphasizing urgency and responsibility, evolved into practical sharing of national experiences and challenges, and concluded with passionate calls for global accountability and cooperation. While acknowledging significant challenges, the tone remained constructive and solution-oriented, with participants demonstrating mutual respect and shared commitment to democratic values despite representing diverse jurisdictions and political systems.
Speakers
**Speakers from the provided list:**
– **Nikolis Smith** – Founder and President of StratAlliance Global, a strategic advisory firm supporting public-private partnerships and technology policy engagement; Session moderator
**Junhua Li** – Undersecretary General of the UN Department of Economic and Social Affairs
– **Martin Chungong** – Secretary General of the Inter-Parliamentary Union (appeared via video message)
– **Catherine Mumma** – Senator from Kenya
**Rebecca Bauer-Kahan** – California Assembly Member, Chair of the Privacy and Consumer Protection Committee
– **Grunde Almeland** – Member of Parliament from Norway
– **Marsha Caddle** – Member of Parliament from Barbados, former Minister of Innovation and Technology
– **Zafar Alizoda** – Member of Parliament from Tajikistan
– **Audience** – Various audience members who asked questions during Q&A sessions
**Additional speakers:**
– **Kenneth Pugh** – Senator from Chile, South America
– **Mounir Souri** – Member of Parliament from the Kingdom of Bahrain
– **Hugo Carneiro** – Member of Parliament from Portugal
– **John K.J. Kiarie** – Member of Parliament from Kenya
– **Anna Luhmann** – Member of Parliament from Germany
Full session report
# Safeguarding Democracy in the Digital Age: A Parliamentary Perspective on Legislative Priorities and Policy Pathways
## Executive Summary
This comprehensive discussion brought together parliamentarians from across the globe to address the challenge of protecting democratic institutions in an era of rapid digital transformation. The session, moderated by Nikolis Smith of StratAlliance Global as part of the Internet Governance Forum (IGF) 2025 parliamentary track, featured representatives from Norway, Kenya, California, Barbados, Tajikistan, and other jurisdictions sharing their national experiences and legislative approaches to digital governance.
The discussion revealed both the universal nature of digital threats to democracy and the diverse approaches being taken to address them. From deepfakes targeting world leaders to sophisticated misinformation campaigns undermining electoral processes, participants shared practical experiences and legislative solutions while emphasizing the need for international cooperation and balanced approaches that protect both democratic processes and fundamental rights.
## Opening Context and Urgency
The session began with a video message from Martin Chungong, Secretary General of the Inter-Parliamentary Union, who emphasized that digital technologies have fundamentally altered the information landscape, creating an environment where governments struggle to distinguish fact from fiction and electoral processes face constant manipulation. He highlighted how artificial intelligence has transformed the misinformation landscape through deepfakes, AI-generated content, and algorithmic amplification.
Junhua Li, Undersecretary General of the UN Department of Economic and Social Affairs, reinforced the need for global cooperation on combating misinformation, noting that fragmented approaches risk undermining democratic discourse worldwide.
## National Experiences and Legislative Approaches
### Kenya’s Comprehensive Framework and Challenges
Senator Catherine Mumma from Kenya provided a detailed overview of her country’s approach to digital governance. Kenya has established a comprehensive legal framework including the Computer Misuse and Cyber Protection Act, the Data Protection Act, and the Media Council Act. However, she acknowledged significant gaps, particularly in addressing misinformation and disinformation specifically.
“We don’t have a law that specifically addresses misinformation and disinformation,” Mumma explained, noting the challenge of “hitting the balance between protection of human rights and regulating and also allowing innovation to progress unhinged, is something that is beyond legislation, is something that sometimes is beyond the politics of the day.”
She described how Kenya faces particular challenges with misinformation and disinformation on social media during electoral periods, which sometimes escalates to violence and ethnic tensions. Mumma emphasized that electoral integrity depends largely on neutral electoral management bodies rather than just technology, highlighting the importance of institutional frameworks alongside technological solutions.
### California’s Pioneering Regulatory Approach
California Assembly Member Rebecca Bauer-Kahn shared her state’s experience as a pioneer in both privacy legislation and AI regulation. She detailed California’s Consumer Privacy Act (CCPA) and ongoing efforts to implement watermarking requirements and disclosure laws for AI-generated political content.
Bauer-Kahn focused heavily on technological solutions, emphasizing watermarking technology and device-level authentication for distinguishing reality from AI-generated content. She described California’s push for embedded authentication technology in cameras and requirements for platforms to implement watermarking systems.
She acknowledged constitutional constraints, noting that “constitutional protections of free speech create challenges in regulating misinformation while requiring creative solutions like disclosure requirements.” Her approach emphasized “technology for good,” advocating for increased funding for academic institutions to compete with large AI companies and ensure democratic alternatives to profit-driven technological development.
### Norway’s Media-Centric Strategy
Grunde Almeland from Norway presented his country’s approach, which centers on strengthening independent media organizations as a crucial countermeasure to misinformation. He detailed Norway’s legislative measures that prevent media owners from interfering with editorial decisions and provide extensive public funding for media organizations.
“Truth is becoming less relevant,” Almeland observed, explaining that AI-powered content creation enables people to remain in confirmation bias bubbles where they engage only with information that confirms their existing beliefs. He argued that this makes it “hard and harder to pierce with factual debate and true, well, facts.”
Almeland’s perspective was notably pragmatic, arguing that “most things are already quite heavily legislated” and that international cooperation is often more important than creating new legislation. He referenced an expert group on AI and elections and emphasized providing people with fundamental information to make their own decisions rather than making judgments for them.
### Barbados’ Transparent Parliamentary Process
Marsha Caddle from Barbados shared her country’s experience with implementing cybercrime legislation through transparent parliamentary processes that included extensive citizen input. She described Barbados’s transparency measures, including broadcasting parliamentary committee meetings and Prime Minister speeches.
Caddle provided a compelling example of the real-world impact of deepfakes: “The deepfake was about the prime minister saying something in relation to another major world power. Now that has the potential to completely, especially in this current global political environment, to completely put at risk a lot of what a country is doing with respect to policy and global engagement.”
She emphasized the responsibility of platforms to implement better verification methods while balancing accessibility concerns, and called for building local tech ecosystems that can create tools to fight misinformation while promoting innovation.
### Central Asian Perspectives
Zafar Alizoda from Tajikistan highlighted challenges facing Central Asian countries, particularly how global platforms apply different policies to different regions. He noted that while EU citizens benefit from GDPR protections, developing countries often lack the same priority in platform policies, creating unequal protection standards globally.
His intervention highlighted the unequal treatment of different regions by global technology platforms and the need for more equitable international standards.
## Critical Interventions and Broader Perspectives
### Addressing Technological Inequality
A significant intervention came from John K.J. Kiarie, a Member of Parliament from Kenya, who challenged assumptions about technological equality through a post-colonial lens. “To imagine that countries in places like Africa will at one point be at par with Silicon Valley is a fallacy,” Kiarie stated. “To imagine that such advanced economies do not have responsibilities is also wrong.”
He drew explicit parallels to historical exploitation: “what will happen with this AI is that my people will be condemned to digital plantations, just like they were condemned with sugar cane and with coffee and with all these other things that happened in slave trade.”
This intervention prompted discussion about the responsibilities of advanced economies and technology companies, with speakers acknowledging the need for more equitable approaches to global digital governance.
### Audience Engagement and Practical Concerns
The session included substantial audience participation, with questions covering electronic voting security, financial scams targeting vulnerable populations, and age verification challenges. These interventions highlighted practical implementation challenges beyond the policy frameworks discussed by panelists.
Questions about children’s rights and access to information in the context of age restrictions revealed tensions between protection and access that remain unresolved in many jurisdictions.
## Areas of Common Ground
Despite representing diverse jurisdictions, participants found common ground on several key principles:
### International Cooperation
Speakers consistently emphasized that digital governance challenges require coordinated international responses. Almeland suggested that IGF could serve as a platform for developing shared rules, while Mumma described how African parliamentarians have formed regional caucuses to share experiences and develop common approaches.
### Supporting Independent Media and Verification
There was broad agreement on the importance of independent media and verification technologies, though speakers proposed different approaches. The discussion covered various verification methods, from technological solutions like watermarking to institutional approaches focused on editorial independence.
### Balanced Approaches
All speakers emphasized the need to balance protection of democratic processes with preservation of fundamental rights like freedom of expression, though they acknowledged this remains challenging in practice.
## Ongoing Challenges and Future Directions
The discussion identified several unresolved challenges:
### Implementation Gaps
Multiple speakers acknowledged that policy development moves slower than technological advancement, creating persistent gaps between emerging challenges and regulatory responses.
### Cross-Border Enforcement
The global nature of digital platforms creates significant enforcement challenges, with existing international cooperation mechanisms often inadequate for addressing sophisticated cross-border digital manipulation.
### Technological Inequality
The discussion highlighted fundamental questions about ensuring equitable access to digital technologies and preventing the reproduction of historical patterns of exploitation in digital forms.
## Practical Outcomes and Commitments
The session produced several concrete commitments:
– Participants agreed to carry IGF 2025 outcomes back to their respective countries to drive policy coherence
– California committed to continuing legislative efforts on watermarking requirements and embedded authentication technology
– African parliamentarians indicated they would continue using regional caucuses to develop common approaches
– Speakers agreed to explore developing codes of conduct for social media platforms
## Conclusion
The discussion demonstrated both the complexity of challenges facing democratic institutions in the digital age and the potential for meaningful international cooperation. While participants represented different political systems and levels of technological development, they found substantial common ground on fundamental principles while acknowledging that implementation must be adapted to local contexts.
The conversation revealed that effective digital governance requires moving beyond purely regulatory approaches to encompass investment in beneficial technologies, strengthening of democratic institutions like independent media, and genuine international cooperation that addresses power imbalances. The parliamentarians’ commitment to continuing engagement through regional and international forums suggests potential for meaningful progress, though significant challenges around enforcement, technological inequality, and the pace of change remain to be addressed.
Session transcript
Nikolis Smith: Good afternoon, everyone, and welcome back. I trust that everyone was able to get a bite to eat and their stomachs are replenished and ready for more IGF 2025. So today’s session, the title of today’s session is Safeguarding Democracy in the Digital Age, Legislative Priorities and Policy Pathways. This session gathers members of parliament from across the globe to discuss how digital technologies are impacting democracy and what legislative and policy actions are being taken to preserve democratic institutions and trust. Again, my name is Nikolis Smith. I’m the founder and president of StratAlliance Global. StratAlliance is a strategic advisory firm supporting public-private partnerships and technology policy engagement. Now, before we call our distinguished panel to the floor, we have to start by recognizing a familiar face that I’m going to call in just a second. He’s been an advocate for the IGF since its existence. Please welcome Mr. Junhua LI, Undersecretary General of the UN Department of Economic and Social Affairs.
Junhua LI: Your Excellencies, distinguished members of parliament, dear colleagues, good afternoon. It is my great pleasure to welcome you all to the parliamentary track of the IGF 2025 in Lillestrøm. As we convene this important meeting, our purpose is very clear: to bring legislators together with all the other stakeholders in shaping digital policies and legislative frameworks to ensure an open, inclusive, and secure Internet for all. Under the overarching theme of the IGF 2025, Building Digital Governance Together, we will focus on the critical need for international digital cooperation to address today’s digital challenges. Among the most urgent of these is the dual imperative to protect the freedom of expression while combating the rampant spread of misinformation and disinformation. The ability to speak freely, access accurate information, and engage in open online discourse is the bedrock of democratic societies. Yet these fundamental rights are being tested, not only by disinformation and censorship, but also by the rise of powerful technologies like generative AI that further blur the lines between fact and fiction, challenging our very understanding of truth. We face profound challenges, from the false narratives that erode trust in public institutions to the targeted disinformation campaigns that threaten peace and stability. The digital environment demands new approaches that uphold human rights while preserving civic space. At the same time, we must ensure that the responses to these threats do not infringe upon the very freedoms we seek to protect. As members of parliament, your role in navigating this complex terrain is pivotal. You have the authority to craft legislation that safeguards freedom of expression and access to information, promotes media and information literacy, and strengthens the resilience of democratic discourse.
You can foster a digital environment where the right to express diverse views is protected and respected, and where reliable, fact-based information is prioritized over manipulation and distortion. This is how we can ensure that innovation and inclusion advance in lockstep with human dignity and safety. By actively engaging in this forum, you are not only contributing to a vital global dialogue on digital policies, but also shaping the national frameworks that reflect these shared values. I appeal to and urge all of you to carry the outcomes of our discussions here at IGF 2025 back to your respective parliaments, driving continued momentum and policy coherence at both the national and regional levels. Over the past years, we have seen encouraging progress in expanding parliamentary engagement in national and regional IGFs. From West Africa to the Asia-Pacific, this localization of our global conversation is essential. We are eager to learn from your insights and national experiences and identify new avenues for collaboration. Let us strengthen this engagement and champion digital governance that respects the freedom of expression, addresses information integrity, and supports an open, inclusive, and rights-based digital space. I extend my sincere thanks to the Inter-Parliamentary Union, the Norwegian Parliament, and our partners for their invaluable collaboration on the parliamentary track, and for their commitment to integrating parliamentary voices into the UN processes. I wish you a very fruitful exchange and impactful outcomes. Thank you.
Nikolis Smith: Thank you very much. Mr. Li, thank you very much for those words of encouragement. As we go through the challenges with Internet governance, now, this would not be a proper parliamentary track session if we did not hear from a very respected person that we all know, Mr. Martin Chungong, Secretary General of the Inter-Parliamentary Union, and as a prominent advocate for resilient democratic institutions, we have a video message that we would like to show you now.
Martin Chungong: Mr. Under-Secretary General, Distinguished Parliamentarians and IGF Participants, I have great pleasure in welcoming you to this Parliamentary Roundtable at the 20th Internet Governance Forum. This session provides a unique platform for parliamentarians, policymakers and digital governance experts to build consensus on one of the most pressing challenges, safeguarding democratic institutions in the digital age. At a time when democratic norms face unprecedented pressure and public trust continues to erode, global cooperation on combating misinformation is more crucial than ever. A fragmented approach to information integrity risks undermining the very foundations of democratic discourse and exacerbating the crisis of trust that threatens our societies. The rapid spread of misinformation through digital technologies has fundamentally altered the information landscape in which our democracies operate. Governments struggle to distinguish fact from fiction, electoral processes face manipulation through coordinated disinformation campaigns, and democratic institutions find their legitimacy questioned based on false narratives. The rise of artificial intelligence has fundamentally transformed the misinformation landscape, with deepfakes, AI-generated content and algorithmic amplification creating unprecedented challenges for democratic discourse. Yet, within this challenge lies profound opportunity. By working together across borders and political systems, we can develop common principles that preserve both free expression and democratic integrity. Parliaments as the voice of the people have a pivotal role in ensuring that digital transformation strengthens rather than weakens democratic governance. In our response, we are guided by the Global Digital Compact, an emerging international consensus on information integrity. 
And while the Global Digital Compact represents an important foundation, there is still much work to transform its vision into effective safeguards for democracy. I encourage all participants to actively engage in these discussions, recognizing that the frameworks we develop today will determine whether democratic institutions emerge stronger from the digital transformation. Together we can ensure that democracy not only survives the digital age, but emerges more resilient, transparent, and responsive to the citizens we serve. Thank you.
Nikolis Smith: Okay, thank you Mr. Chungong for those remarks. Now it is my deep honor to introduce our panel that’s going to be with us this afternoon. First we have Senator Catherine Mumma from Ghana, I mean, from Kenya. Then we have Rebecca Bauer-Kahn, California Assembly Member, Chair of the Privacy and Consumer Protection Committee. Grunde Almeland from Norway, Member of Parliament. Marsha Caddle from Barbados, also a Member of Parliament and former Minister of Innovation and Technology. Zafar Alizoda from Tajikistan, Member of Parliament as well. I’d like to welcome them to the stage. Full disclosure, everyone, I made the first mistake, said that one of our first speakers is actually from Ghana, but she’s from Kenya. My apologies. Wanted to get that out there first and foremost. Okay, so here we are. We’re back. This is the parliamentary track session. We have a lot to talk about over this next hour. So what we’re going to do is we’re going to have our distinguished panelists here. We’ll go through a series of questions. We will also leave time for the audience to ask questions because that’s very important. And then we’ll have some closing remarks as well. So let me start first with the host of this year’s IGF, Grunde Almeland. Norway recently concluded the work of an expert group on AI and elections that it was tasked with. What are the biggest challenges that you see, Grunde, for Norwegian democracy, and how is it facing those challenges? What are you doing exactly in parliament? Because I know, you know, when we think of AI, it’s in every discourse that we see, you know, from not only just the IGF, but other bodies. But I know that it’s very important for Norway and the parliament. So if you can just kind of enlighten us on kind of where you guys are at this point.
Grunde Almeland: Of course, first of all, it’s an honor to host this event here and it’s an honor for all of us in Parliament as well that this event is taking place here in our country. But to your question, I think what worries me the most is one of the key findings in the report you’re referencing, and that is that truth is becoming less relevant. And with that, the report that went through all these different elections in 2024, how AI is superpowering content, creating so much more content for people to engage with, we see that truth is becoming less and less important because what you engage with, what you look at, is content that is already confirming your held beliefs and is kind of helping you stay in this comfortable bubble that it’s hard and harder to pierce with factual debate and true, well, facts, so to say. And I don’t want to be all doom and gloom, because there is a lot that we can do. And one of the, you know, they look at a lot of different measures in that report. You know, they look at how you can build competency, you know, how to implement stuff in schools, how you, you know, you should advance research. But one of the key measures is supporting and strengthening independent media organizations. And I think this is the measure that I want to focus on in the beginning now, because it is such an important measure in order to have something that can kind of combat this reality that is being created in a lot of different bubbles. And there is such a connection between our trust as politicians, trust in us as politicians, and people having access to true information, and having access to also media or content that is being edited by a professional, well, independent media, edited media, so to say that they know that what we are doing as politicians is being checked, you know, that we are being transparent about what we are doing and this is where the media comes in.
And I think for Norway, independent media has been an important political issue across the aisle for a lot of years. I’m very happy to say that we are number one on the press freedom index and it’s partially because of what we are doing in parliament, but of course hugely what the reporters are doing every day on their work. But looking at what the parliaments can do, you know, starting with a strong legislative foundation, and that is having, we are having an act that ensures independence, editorial independence, that ensures that owners of a newspaper cannot go in and challenge what the editor or the journalist are reporting on, making sure that we as politicians are not, we do not have access, even though we do allocate a lot of funds to the media, we are not able to influence independent decisions on what is being reported. No owner of a newspaper can require to see a journalist’s work before it’s being published. You know, these kinds of legislative measures are really important to have a strong foundation. And then comes, you know, funding, and we do quite extensively fund the media in Norway. I think this is very important in order to have all these, not only national newspapers that would be there, that could thrive in almost any kind of society, but also having those small local news outlets that can also check what the politicians and staff are doing on a local level. And I think having this kind of built-up media system ensures that people, you know, they know that they can access information on what we’re actually doing. And there is a lot to be said for this, you know, we have a lot to work on in order to become more transparent, especially when you see a shift in how we also communicate as politicians going from, you know, simpler days of writing letters to each other and going now on to all these different channels of technology, you know, there are a lot of things to be done.
But I think this is a good starting point and I’ll end on this note saying that the report also, you know, is also highlighting one last thing and that is that we have to be level-headed and not exaggerate the impact of AI because exaggerating it, you know, and trying to fearmonger as politicians is also a way of making that kind of misinformation have a stronger meaning in itself.
Nikolis Smith: Thank you for those remarks and you’re absolutely right about keeping the level, right, because we don’t want to approach everything with fear, right? We have to remember that AI is a tool that was invented by humans, right? But there’s benefits, right, to AI and what we’re going to talk about today will speak to that. And so I appreciate those introductory remarks. I wanted to turn now to you, Senator, my good friend from Kenya. You’re very well known in circles on the African continent. You’re very active in the regional things that are happening as it relates to the IGF. Can you help us kind of take us through what are you seeing as kind of the emerging threats in Kenya? And kind of what are the countermeasures that you’re taking to this point?
Catherine Mumma: Thank you very much. First to say that Kenya has embraced matters relating to digital technology in a profound way. We recognize that this is where the world is headed, not just on actually all democratic matters, including both politics and development. So Kenya has kind of anticipated this, and I would want to say that we have a good legal framework that currently supports the growth of internet and digital technological advances. We have a very facilitative constitution that protects freedoms of expression, the right to access information, but also provides for protection of human rights. It is very strong on human rights. It also provides for protection of consumer rights. As a result, we have a number of laws that actually guide or regulate issues relating to matters of internet and digital technology. We have the Computer Misuse and Cyber Protection Act. We have the Data Protection Act. We have the Media Council Act. We have the Copyrights Act that protects intellectual property. And we have the National Cohesion and Integration Act that set up a commission to deal with matters relating to hate speech. But we still have challenges when it comes to misinformation and disinformation using social media. And we don’t have a law that specifically addresses misinformation and disinformation, not because the law is somewhere and needs to quickly come, as you will appreciate from the conversations we’ve had since this morning, hitting the balance between protection of human rights and regulating and also allowing innovation to progress unhinged, is something that is beyond legislation, is something that sometimes is beyond the politics of the day. Because as politicians, a lot of the misuse, the disinformation and misinformation, is particularly during electoral times. And for us in Kenya, every time is election time. We actually finish elections today, and the next day we are competing for the next, we are already campaigning.
So there is a lot of disinformation and use of hate speech in our part of the country, or a part of where our country is. We have been, or we have suffered post-election violence following hate speech that was negatively, I mean, that used negative ethnicity. And that’s how we came up with the Cohesion Commission. But now with matters digital technology, we’ve had a lot of misinformation that is used by political competing, politically competing groups to actually use demonizing language, misinformation around maybe national policies that are happening, to try and demonize a government, or information to try and demonize an opposition leader. And it’s happening to a stage where it’s ending up in violence. And I would want to say that our challenge, really, is on how we can regulate that without looking as if we are over-correcting or over-enforcing. There is also the challenge of the possibility of abuse of office, misuse by government of some of the privileges. How would we use, for instance, surveillance around matters digital content, to the advantage of the government and to abuse rights of opponents, political opponents. So I would say we have a good legislative framework, not complete enough in the sense that we still have to find ways of protecting rights, including rights of children. I think we have a lot of, we have a lot of access for children on the internet that is actually harming their health, including matters relating to pornography and so on. Now, we would need to think through to find out how do we, beyond the Kenyan parliament and Kenyan legislation, how would we think through a violator that is situated in another jurisdiction? What kind of conversation can we have in forums like the IGF to ensure that beyond national legislation, we are able to come up, whether with codes of conduct, with something that would hold accountable those in charge of these platforms, as well as ensure that the freedom to advance in digital technology happens.
There is also, when you’re talking about human rights, we also need to think about, beyond the issue of information and disinformation, how do we include more people? In our area, I think one of the things we need to do is have greater investment. Beyond regulation, we need to give some financial investment in the necessary public digital infrastructure that would see those in rural areas equally participating in the benefits of the digital space and technology, to see more women participating in this space, to see other vulnerable and minority groups participating in this space. So as a country, I believe beyond protecting against disinformation, there is also the issue of inclusion, which is a human rights issue that we need to look at. So as we discuss this issue, beyond just discussing the regulation, we need to discuss how best to invest more in order for more people to participate in this space.
Nikolis Smith: Well, let me just say that, hats off to all the work that you’re doing in Kenya, because you listed a long list of laws that you’ve been able to implement, so that’s progress, right? Obviously, you made the point clear that there’s still more progress to be done, but I think that Kenya’s in the right direction, they’re going in the right direction, and I commend you guys for that. Thank you. You know, on this same topic, I wanted to move now to… Assemblymember Rebecca Bauer-Kahn of California. You know, I lived in California as a kid as well, so there’s a little bit of priority here with that. But California has been very active in this space. Probably more active than other states in the country. So tell us, you know, what are the approaches now? Knowing what you did back in 2018 when CCPA was passed, and we’re looking now into the future, where are we going forward? And when we think about regulatory approaches, what’s being explored to ensure information integrity, right? Because now, as the Senator mentioned, it’s elections, right? And we just got through one election, right? And there’s gonna be more on the horizon. So in terms of integrity, where do we go now?
Rebecca Bauer-Kahn: Well, thank you so much for this conversation. It’s my first time at the IGF, and I have to say that one of my takeaways so far this morning is that, despite the fact that all of our jurisdictions are so different, we’re really all struggling with this same issue of information integrity. And for those that don’t know, the law that was cited is our privacy law. We were the first state in the United States to pass a privacy protective piece of legislation shortly after the European Union passed their privacy laws. And some states have followed, but we’re still not nearly as protective as the European Union. And, you know, I should ground this in being from California. We are home to 32 of the top 50 AI companies. We are home to all of the major social media companies. So, you know, these are the people I represent. They make this technology. They proliferate it to the world. And with that, I think we feel a great responsibility and sitting there this morning, listening to what is happening across the world as a result of some of this, it’s intense what these companies are doing to change the global ecosystem. And we have the federal government. I have long believed the federal government is in the best position to regulate these technologies for us as a country, but they won’t, and they don’t. And so the states are taking it upon themselves to protect our constituents and to try to push these companies in the direction of responsibility. But as I’m sure many people in the room are aware, we too have constitutional protections of freedom of speech. Our constitutional protections say that we may not stop people from speaking. 
It also says we cannot force people to speak, which is an interesting dynamic, because one of the ways we have tried to combat mis- and disinformation under our First Amendment, our protection of freedom of speech, is to require more speech: to say you have to disclose when you’re using AI in a political advertisement, so people know they’re seeing something that’s AI-generated. That’s tied up in the courts right now because the courts are seeing it as forced speech. So we have a very complicated dynamic around how we get at this issue of mis- and disinformation when we have such strong protections around speech. But that’s one way, and we continue to try to do it. We’ve passed legislation that requires those disclosures and that requires the platforms to take down serious misinformation in the political context, although political speech is even more protected than your average speech in America, so that is a really challenging thing to do. The next step we’re taking is pushing forward on watermarking, which I know the European Union has pushed for as well. This ability to tell reality from fiction is, I think, fundamental to protecting our democracies, so watermarking and the technology that will go along with it is critical. And with the EU pushing on watermarking and California pushing (we are the fourth largest economy in the world, with a lot of tech companies in our backyard), can we really make sure that technology comes to fruition, so that around the world we can all require it? Right now the technology is not yet where we want it to be, but if it gets there, maybe it will give constituents the ability to know what is real and what is not, and I think that would be game-changing. Just a few hours ago someone asked me about what’s happening in California.
Many have seen in the worldwide news what is happening in my home state as it relates to our friction with the federal government right now, and one of the things we’ve faced is massive disinformation: so many deepfakes about what is happening on the streets of Los Angeles. I was there just a week ago. It is incredibly peaceful. That is not what you’re seeing on the social media sites, because of all the deepfakes being generated, and when people cannot tell that from reality, it leads to serious outcomes in our elections and in our society, and we have to do more. So California is going to continue to push, although I will say that right now the federal government is moving what would be a 10-year ban on state enforcement of artificial intelligence laws, which would stop most of California’s efforts. So when I talk about friction between California and the federal government right now, I can’t overstate it.
Nikolis Smith: Thank you for that. As a former federal employee, I’m not going to start any more friction right now, especially since we’re both from the same state. I do want to turn, though, to another region of the world, and for everybody here who does not speak Russian, please use your headsets. I want to turn now to our friend from Tajikistan, Mr. Alizoda. On the Central Asian response to information manipulation and the steps that are being taken, can you talk to us about the measures you are taking to build that type of institutional resilience, as it relates to everything you’ve heard so far?
Zafar Alizoda: Thank you, Nikolis. I would like to speak about information integrity and comprehensive information security in the Central Asian countries, including the protection of personal data, which is one of the most important tasks arising from the application of digital technology in the region. Each country has its own laws regulating the collection, storage and use of personal data. In the Central Asian countries, personal data falls under the broader concept of the right to protection of private life and privacy; under the legislation of these countries, privacy is a personal right. Responsibility for violations related to personal data depends on the specific circumstances, as well as on the legislative norms and rules of the country. Sensitive personal data is a category of personal data covering a person’s most private and confidential information. It can include data such as racial or ethnic origin, political views, religious and philosophical beliefs, professional affiliation, medical information, biometric data, information on finances and credit history, and so on. Personal data of this category requires special protection and handling, as its disclosure or misuse can lead to discrimination, stigmatization and other negative consequences for a person. Forming legislation that comprehensively regulates relations connected with the protection of personal data is one of the most complex tasks of the state. The Central Asian countries are currently actively developing a legal institute for the protection of personal data. However, it should be noted that the legislation leaves many issues in this field unregulated.
Among them, national legislation on the protection of personal data often lacks measures for reacting quickly when personal data is leaked and for minimizing the consequences of such a leak, as well as an obligation on the data owner or operator to notify the authorities. Ensuring digital privacy is a complex problem affecting the rights and legal interests of both the public and private sectors. According to the assessment of national experts, it is necessary to revise the law on the protection of personal data and amend it to reflect the modern application of advanced technologies. The development of the digital economy, even with all its good intentions, must not come at the cost of abandoning the protection of human rights and freedoms. Any current or proposed business practice should include an assessment of its consequences for the inviolability of private life, so that there is an opportunity to consider how policy and technology mitigate the risks to privacy. Drawing on European data protection law, as our colleague has already said, legislators in the Central Asian countries should consider introducing a legal mechanism for risk assessment along the lines of the General Data Protection Regulation, the GDPR. It should be noted that the data protection impact assessment procedure is not always used, even in cases where the processing of data is associated with a high risk of violating the rights and legal interests of citizens. For technology to develop conscientiously and effectively, society needs modern, effective legal instruments and independent oversight of compliance with the human rights to inviolability of private life and confidentiality of personal data. It should also be noted that, at the legislative level, there is an important role for the promotion of an information policy and a strategy of legal measures that increase the integrity and balance of the information space. The Central Asian countries are member states of the Inter-Parliamentary Assembly of the CIS, and their national parliaments jointly develop proposals for cooperation on issues of mutual interest in this direction. Thank you.
Nikolis Smith: As you mentioned, I think our panelists would agree that the first task, as you go through this process domestically, is to have a framework for formulating a risk assessment. It’s key. I think using the IGF to discuss these issues is a great opportunity, so hopefully you’ll be able to take what you hear here back to Tajikistan and continue those efforts. Last but not least, from the island of Barbados, Ms. Marsha Caddle: what steps are underway in Barbados to rebuild public trust in elections and the democratic system?
Marsha Caddle: Thanks. That’s a big question.
Nikolis Smith: It’s loaded.
Marsha Caddle: Let me just set some context by saying that Barbados is a small island in the Caribbean, with a population of 270,000 and declining, which is another of the existential threats that we’re facing: a falling and aging population. Barbados has always had, since independence, a history of stable, free and fair elections and a high degree of political stability. I think it’s important to set that context, because the circumstances in which we are talking about these issues of maintaining democracy and democratic participation are against a backdrop of expectations of stability and truth. The other thing that we have in Barbados is extremely high internet and digital penetration. You’ll see our numbers say something like 114% mobile penetration, so we are kind of over the maximum, right? People have very immediate access to information and high expectations about that information. So then the question becomes about not just access, but meaningful access and use, as we talk about these issues. Just before I got on a plane to come here, the office of the prime minister in Barbados had to urgently push out warnings about a deepfake that had just been circulating. And I want to share the example because it highlights not just the domestic issues when it comes to democratic participation and trust, but also the potential to destabilize global relations, international relations, and foreign policy. The deepfake had the prime minister saying something in relation to another major world power, claiming untruthfully that Barbados was taking a certain diplomatic stance with respect to that country. Especially in the current global political environment, that has the potential to completely put at risk a lot of what a country is doing with respect to policy and global engagement. So we’re not just talking about domestic trust; we’re talking about international trust and a country’s global position in the world.
And so I wanted to say a little bit in answer to your question about what we are doing. One of the things I think is very important is this issue of democratic literacy. How do people interact with policy conversations, with electoral processes? So one of the things that we’ve tried to do, simply, is to push out as much truth and transparency as we can, to start to get people accustomed to an environment of truth and evidence, because that, even before we started talking about technology, is perhaps something that hasn’t been as strong as we would like. So for example, we have these joint select committees of parliament that consider issues before they’re passed, consider legislation that has to do with governance, with social issues, and so on, and they’re broadcast. There are very few things that the Prime Minister says, speeches, engagements, that are not broadcast either in real time or recorded and shared. And why? Because we want to get people used to the idea of truth: this is the original source, and this is where you can find it. You can find it on these channels, you can find it in these ways. The other thing that we are working on is investing in a tech ecosystem that can balance this, that can build tools to fight against misinformation, because there are others who are investing very heavily in misinformation. What can we do to invest in tech creators who are going to combat that with things that promote truth? One thing we think is very important, though, is encouraging platforms to return to more robust methods of verification. We think that is critical. And I’ll say very quickly, I think it was Rebecca who said earlier that political speech is very protected in general in your jurisdiction.
The interesting thing is that while political speech is protected, and while I can sit in one jurisdiction, in a country like Barbados, and see things proliferating about political actors in my space, on the other hand, as a political actor on a social media platform sitting in Barbados, I am not trusted to generate content. As soon as I try to generate content as a politician, I’m told: well, actually, you’re a politician, and we’re not sure you should be able to say these things on our platform. So in trying to combat misinformation I am also constrained by some of the rules that platforms are generating in other parts of the world, rules that impact the way they operate here and the way I can talk to my constituents. So I think these are some of the ways we’ve started to really try and encourage an environment of real evidence and truth. There is legislation. I was the minister who brought cybercrime legislation. We took it to the Joint Select Committee. We heard evidence. We heard pushback. We heard concerns on human rights from citizens, and we amended the legislation. And so I think this is a healthy way to get people into the conversation and make sure we realize that democratic participation and adherence to truth and evidence is everybody’s concern.
Nikolis Smith: Thank you very much. So we’ve talked through an array of different areas in terms of what we’re doing within our governments and the challenges that we’re facing. What else? We’ll start back with you, Senator. What are the other gaps that we’re missing? We can recognize the existing challenges, but are there areas we’re not focusing on that could help us bring this together? And the second piece of that is on the non-legislative side: are there areas where we can collaborate? Obviously the IGF is a great platform, but are there other things we could be doing on an international front, since we’ve been looking at this from a domestic lens, to make sure that we come together? So I’ll start with you.
Catherine Mumma: Thank you very much. Now, because we are parliamentarians, I have noticed that we tend to focus on the impact of technology in the political space. But I would want us to think broader and imagine the innovation in the health sector, for instance, with digital technology. What will telemedicine look like? And how should parliament anticipate the possibilities of human rights violations with the advancement of digital technology in the health sector? And therefore, what would that law look like? So we should not be fixated on a particular law that would deal with matters of digital technology. We need to think broadly: would we need to look at the laws in the health sector? Do we need to tweak something in the health sector, in the water sector, in the other sectors? The dangers that we are seeing now around democratic spaces could actually extend and have even more profound implications for the common person. So we need to think broadly around that and agree on how best to deal with this. So I think laws on digital technology are not about a particular piece of legislation. It’s cross-cutting, and we need to think beyond this and allow our professionals in all sectors to help us think it through. Now, when it comes to what to do internationally and nationally: first, to thank the IGF for the proactive way in which they are moving this agenda and getting us to learn and discuss more within their forums. In the Africa space, African parliamentarians have taken the liberty to form African parliamentary caucuses, Africa-wide, in West Africa, in East Africa, so that we can compare notes, knowing that what happens in Tanzania will affect us in Kenya, will affect those in Malawi, will affect those in Nigeria. So we need to start borrowing from each other and listening to each other and learning to grow on this.
And beyond legislation, we need to find out how the mechanisms we have in place could be built upon. Somebody in the morning talked about a mechanism for auditing information; what would that look like? Kenya has the Data Protection Commissioner. It also has the Media Council. Should we add to their mandates some more clauses that would help us monitor this area better? Do we need, maybe, an African Union or East African Community mechanism that would help us check the situation further? So there are all these opportunities. Thank you.
Nikolis Smith: Thank you. We have about 10 minutes left of this section, then we’re gonna go into some Q&A. So I’m just gonna go down the line. Rebecca, I’ll turn to you next.
Rebecca Bauer-Kahn: So I think that, where I’m sitting, one of the things that is missing is technology for good. We see technology in the hands of very few players right now that are, for better or worse, profit-driven. How do we push technology to be the solution in the technology age? I think that’s something we really need to be working on, both locally in California and globally. So part of what we’re trying to figure out is how we can fund that. How can we put more money into our academic institutions so they have the compute power to compete with the largest AI companies? Right now, the only entities able to build large language models are these very well-funded companies, and our academics need to be in that space. Our civil society needs to be playing in this space to create technology for good. There’s one example for us in the United States, on the intellectual property front: the University of Chicago created an AI tool that allows you to embed something in your copyrighted material so that, if a model is trained on your material, the model will effectively refuse it. That’s not a legal protection, it’s a technology protection. And those are the kinds of tools that I think we need, to allow us to battle against, as you said, the misinformation and disinformation ecosystem that is growing and that can, I believe, be solved in part by better technology for good. We have a real role in doing so. Part of the reason we believe we’re home to so many of these companies is our academic institutions and the training they provide. And if we’re investing in technology for good at our academic institutions, are we then putting people into the world to create companies for good? How we create that ecosystem, I think, is really important. And then I’ll say, on the global landscape, I think it is this kind of collaboration. It is understanding that we’re all trying different things.
We’re all out there in a world that was created over the last decade, trying to find solutions to very new problems. And as has been said many times today, this technology is moving faster than the policy. So to the extent that we can listen to each other, hear what is working in your jurisdiction, and work out how I can bring it home to help the people where I live, I think we’re better off, because we have to move fast in order to protect our societies, and the only way to do that is in collaboration and partnership.
Nikolis Smith: Thank you, Rebecca. Grunde?
Grunde Almeland: Well, I think I’ll pick up on technology for good. As you were talking about democratic literacy: we know very well that being able to adapt digitally does not necessarily translate into democratic literacy or media literacy. But one of the reasons I wanted to focus on independent media in the beginning is that the example of Norway is also an example of how digital adaptability enabled us to retain quite a high level of trust in media, and how that technology was actually able to create a foundation for people to access independent media and to have the high level of media literacy that we are very fortunate to have. And I think it’s such a good example, because in the months and years ahead, using that same kind of inspiration, taking technology as a tool to enable more transparency, and making sure that we adopt technology that strengthens the kinds of institutions we want to uphold, that strengthens democracy, is such an opportunity. It is so easy to point at all the challenges, because they are so evident and apparent to us, but there are also such big possibilities in having tools that can create more transparency. Just a small example to end on: we get a lot of complaints from journalists in Norway about how much time we spend reviewing their applications for access to information in government institutions. It’s a small example of how you can simplify a lot of these processes as well, making the whole process simpler, more easily accessible for journalists, and more transparent for the public.
Nikolis Smith: Thank you.
Marsha Caddle: Yes. So I think that creating this culture of evidence, so that people ask, should I propagate something if I cannot show that it is true, is something that has certainly helped. But also investing, certainly in global majority countries, in the kind of learning that will allow our people to create tools that they find useful, that they generate themselves, and that they themselves are able to trust. One of the things we started in Barbados when I was minister of technology was to train people in areas like data analytics and data science, and not just in formal academic institutions, but by partnering with companies. There’s one, for example, that does a lot of work on the continent of Africa, called Zindi, that we’re working with so that people can learn some of these skills and be able to create tools and play in that space. We think there’s an AI value chain, which means that for some countries in the global majority it may not be practical to say we are going to build these large language models, but we can create at some part of that value chain and start to build some of these technology tools. So I do think that the culture of evidence to support strong legislation, and establishing sources of truth that people see they can trust, is a part of the puzzle as well.
Nikolis Smith: Thank you. Headsets on for translation, please.
Zafar Alizoda: I would like to add that the policies of global platforms differ across countries and regions. If, for example, the data of EU citizens is protected by the GDPR, then small developing countries in the regions of Asia are deprived of such a priority. The legislation of Tajikistan, for example, is close to the GDPR, but many issues remain unresolved, such as cross-border data transfer and the transfer of data to third parties for the purpose of improving a company’s product. It is possible to improve the legislation of Tajikistan, and the legislators of Tajikistan are always working on these issues. However, there is the question of enforcement in practice: it is difficult to control implementation, because the enforcement levers are limited given the small market the country represents for global platforms. In this regard, I think it is necessary for global platforms to improve their policies for all users, regardless of the user’s country.
Nikolis Smith: You can go to the microphone. Make sure to state your name and affiliation, please.
Audience: Thank you. Hi, I will ask my question in Spanish, so put the headphones on. I am with the Ethics Tribunal of the Peruvian Press Council. In electoral processes there is a basic problem, which is the massive push toward digital voting. We have already had a couple of problems in Latin America: one with data transmission, not in the electoral process itself but in the transmission of results, and the other in electoral processes as such, as was the case in Bolivia and Venezuela a few years ago. There is a movement of retrogression, declaring electronic voting unconstitutional in various countries, not only in the Latin American region but also in other countries around the world. From the legislative point of view of your countries, how can we avoid the misuse of these electronic systems, especially electronic voting, so that democratic processes are not affected? And this is in addition to the processes prior to the elections, as happened in Romania, where disinformation, or misinformation if you want to put it that way, was used to affect an electoral process. So we have, on the one hand, the problems prior to electoral processes, and on the other, the problems of the electoral process itself. Thank you very much. I’m Senator Kenneth Pugh from Chile, South America, and I would like to ask the panel about one issue. We are human beings. As humans, we have human rights, and that’s Article 19 in the chapter on human rights. The problem is that in order to build confidence, we need to know each other, we need to talk, we need to have a will, and then we will trust. So human beings need to be in contact. How are we going to achieve that in the digital environment, with digital trust, when we are granting one of the most important rights, freedom of expression, to artificial intelligences, which are not human?
How are we going to define who has a human identity? It doesn’t just mean having the ID provided by the government. How are we going to differentiate humans from non-humans in cyberspace? How will we know if they are minors or not? In real life we can see them, see that it’s a young boy or girl, but how will we do it in cyberspace? If you have anything to share, I will be very grateful. Thank you very much.
Nikolis Smith: Thank you. So why don’t we pause there and start addressing those before we go through the whole line and I lose track of all the questions. Anybody on stage want to go first?
Marsha Caddle: Yeah, I think Rebecca will end up talking a lot longer, so let me take the last one, about how you differentiate the origin of an idea or a certain set of information. Really, this is about intent. I think it is less about origin and more about output; that is what we are going to end up having to regulate and legislate, because origin is going to be very difficult, just as we often cannot see the author behind something now. We may be able to eventually verify, but that takes time, and more and more people are very impatient when it comes to information. But I think one of the things we’re going to have to concern ourselves with is verification of what is generated, as well as being able to tell where it was generated from: to require that we can see that a particular output used AI, but also to be able to use different pieces of legislation to govern output. So for example, we’ve seen cases recently where coercion was used to get young people to do certain things, and this came from an AI actor. What that jurisdiction, and I don’t want to mention which country it was in, ended up having to look at is: what is the kind of content, no matter the source, that can make its way into a space where vulnerable people are present? They started to use kinds of keyword technology and authentication technology to say, look, this content cannot be allowed here because of the nature of the people who are here, whether they’re young people and children and so on. So I think that, as you say, it’s going to be more and more difficult to police the difference between the two, but identifying the content, and then regulating or being able to direct the content and the outcome, is going to be more and more the kind of work we’re going to have to do.
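The keyword screening described here can be sketched in a few lines of Python. This is a minimal illustration only: the flagged patterns, function name, and the audience rule are hypothetical, and real moderation systems combine curated, regularly updated lists with machine-learning classifiers rather than a static regex set.

```python
import re

# Hypothetical flagged patterns; real systems use curated, maintained lists.
FLAGGED_PATTERNS = [r"\bself[- ]harm\b", r"\bgift\s*card\b"]

def screen_content(text: str, audience: str) -> bool:
    """Return True if content may be shown to the given audience.

    Spaces serving vulnerable audiences (e.g. minors) get stricter
    screening, regardless of whether the content came from a human
    or an AI actor, mirroring the "no matter the source" approach
    described in the discussion.
    """
    if audience != "minors":
        return True  # only the protected space is screened in this sketch
    return not any(
        re.search(pattern, text, re.IGNORECASE)
        for pattern in FLAGGED_PATTERNS
    )
```

For example, `screen_content("please buy a gift card", "minors")` is rejected while the same text passes in an unprotected space; the point is that the rule keys off the content and the audience, not the author.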
Nikolis Smith: Thank you for that.
Catherine Mumma: On the first question, I think, about what parliament can do about electoral laws and electoral systems where there is misuse of technology, for rigging elections I suppose: my view is that a good electoral system depends largely on the electoral management body, because for the fraud to happen, whether with AI or anything else, it usually takes place with some collusion by people within the electoral management body, whether through the vendor they procure to carry out the elections or through their own IT systems. Kenya has usually gone to the Supreme Court over the issue of manipulation of the electronic transmission of the presidential results, and we’ve had issues where, for instance, in the last election, the electoral management body completely refused to open the servers for an audit of the electoral system that transmitted the results. So I believe it’s not so much the technology as a corrupt set of minds behind the whole thing, and I believe that if electoral management bodies remain neutral, then elections, whether digitally driven or not, will remain credible.
Nikolis Smith: I know both, I was looking at Grunde and Rebecca, you guys are both vying for one, so I’ll let you go first.
Rebecca Bauer-Kahn: I just want to start there, which is: I love that perspective. In the United States, every state runs its own elections, and so it’s done differently across our whole country. And I think this is a question not just of whether election integrity is real, but of whether people believe it, right? Part of what holds our democracy up is an agreed principle that our elections are free and fair, and that’s been a challenge. So I do think one of the things that’s critical, personally, is a paper trail: even if you’re using a machine, you get a receipt, so there is a way to audit it, which I think is so critically important. So I’ll just add that on. Somebody asked about our watermarking legislation in California. Last year we did pass a law, signed by the governor, that requires the platforms to show a watermark within a few years. We did that because we wanted to signal to the innovation economy that California was going to require this, because we knew that although the technology isn’t there today, if we required it, the brilliant minds out there would create the technology, because there would be a market for it. So I believe that’s coming, which is very exciting, I think, for the whole world. This year, we’re moving a piece of legislation that would require the devices, so your camera, to have embedded in them the technology that would authenticate. We, as a legislature, were actually one of the entities that used Adobe’s technology for the first time, where every image we took in-house was run through their watermarking technology, so that when we then put it out into the world, we could trace it back; we could prove whether it was a real image or a doctored image. So that technology is coming, and we’ve seen it in practice, and I think it’s really exciting, because it is one of the things that will enable us to tell fiction from reality.
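The trace-it-back idea can be illustrated with a toy integrity check: the publisher issues a cryptographic tag over the image bytes at capture time, and anyone can later verify that the bytes are unchanged. This is a sketch only, not how Adobe’s content credentials or any C2PA-style system actually works (those embed signed manifests and use public-key certificates); the key and function names here are hypothetical.

```python
import hashlib
import hmac

# Hypothetical signing key held by the publishing institution; real
# provenance schemes use public-key certificates, not a shared secret.
SIGNING_KEY = b"legislature-demo-key"

def sign_image(image_bytes: bytes) -> str:
    """Issue a provenance tag for an image at capture/publication time."""
    return hmac.new(SIGNING_KEY, image_bytes, hashlib.sha256).hexdigest()

def verify_image(image_bytes: bytes, tag: str) -> bool:
    """Check whether the image still matches the tag issued earlier.

    Any doctoring of the bytes changes the digest, so the check fails.
    """
    return hmac.compare_digest(sign_image(image_bytes), tag)
```

The design point mirrors the testimony: the proof of authenticity travels with the image from the moment of capture, so a doctored copy can be distinguished from the original without anyone having to adjudicate its content.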
Rebecca Bauer-Kahn: There was a question about financial scams; that’s something that has come up. We have not moved legislation yet on AI and financial scams, but I think it’s so important, and I think the foundation of that is privacy legislation. Part of why the financial scams are getting so sophisticated is that there is so much access to information about every single one of us. When they call and they say they’re your aunt or your uncle and they know your children’s names, you fall for that scam in a way you wouldn’t if they didn’t have that much information about you. So part of it is protecting privacy, which I think is critical. The second piece is making sure that, as you said, this isn’t just about AI legislation; it’s about legislation. We already have laws that outlaw these types of scams. So how do we say that it’s as much a violation of the law if it’s done by a real person as if it’s done by an AI tool, and make sure that the laws that protect our communities extend to all AI actors? Which goes perfectly to the last question, about whether AIs are humans. That’s an interesting question in the United States, because we just had a court, for the first time, have to struggle with it. There was a case in Florida where a young boy died by suicide after a chatbot told him to take his own life, and the mother has sued the company. The company claimed they had a First Amendment-protected right to speak, that the chatbot could say whatever it wanted, and the court said no: chatbots do not have constitutional rights like humans do. That was a huge win. It’s one court, but I think it’s a step in the right direction toward saying these AI systems are not humans. They are not the same as you and I; they do not have the rights that we do. Really pushing that forward, I think, will be critical to making sure we have the protections necessary to protect people from these AI tools.
Grunde Almeland: I think it's important for us as politicians and legislators to remember that we should not meet this whole new world of technology with panic, believing that we have nothing legislated already, because most things are already quite heavily legislated. Sometimes existing law has to be amended, and sometimes we need to come up with new legislation, but most things are already legislated; we just have to see how technology fits into them. And I think this relates to a lot of the different questions that were put forward here. Talking about the scams as well: AI used in scams, I think, if you look at any country's legislative space, would fall under what is already in the criminal code. But the issue is really that there is a lack of cooperation between countries to tackle these new challenges. We have a really good example that came out in Norwegian media just a few weeks ago, of a system called Magicat. It's a great piece of journalism that is available in English as well (you can Google it), and it shows how sophisticated these kinds of scams are, and how poorly prepared Norwegian society was, in this case, to actually tackle these kinds of international scams. So I think international cooperation is often more the answer than coming up with exactly the right new legislation. And just a quick remark on the watermarking side of it. We have a good case study in Norwegian society as well: the media landscape in Norway came together to create this technology, and is cooperating with the BBC, the New York Times, and a lot of these big media outlets to have good watermarking technology in place and implemented in journalism. But the key component of that is not having this kind of verification check; it's making the information accessible to people: who took the photo, where is it from, the kind of essential information that gives people an opportunity to make their own decision about the content, rather than trying to force a decision on them by saying this is real or this is not real. So I think this is something for us as politicians to remember as well: we need to give people the fundamental information, not always try to decide for them.
Nikolis Smith: So we are running out of time, but I want to see if we can be really efficient in the queue line. I know that there are some members of Parliament who are also looking to ask questions. So can we do really short questions, making sure they're not too long, so we can get some quick responses before we go into the closing part? Please.
Audience: Assalamu alaikum. I am Mounir Souri from the Kingdom of Bahrain, a member of the Parliament. Can you hear me, or do you want me to speak in English? Is it okay in Arabic? English would be great. I think it's very important that we have the power to control legislation in order to protect society. We say today that it's difficult: the technology evolves day by day, and legislation struggles to keep up. If we make a law today, the technology will evolve tomorrow, and it will be difficult to keep that law relevant. My question today is this: are there powers other than legislation that can protect privacy while preserving freedom? We are caught between transparency and freedom. If we want to protect society, can we make AI control the content, so that we don't depend on humans? AI is producing information, and people are misusing it. Can we, as legislators and officials, make the content balanced? Can AI itself help protect the content? Thank you.
Audience: My name is Hugo Carneiro, and I'm an MP from Portugal. My questions are these. Social networks should verify news, fake news, and misinformation. But following the US elections, we became aware that, for example, Facebook, Meta, will stop doing that kind of verification. What do you think we should do? Should regulation be the solution for these kinds of cases, or should we trust these big companies to verify this information? Second question: a colleague asked before about ID. When we want to open a bank account, for example, even using a cell phone, we have face recognition, and we can take a photo of our ID. Should we implement these solutions when someone wants to open an account on a social network? Because there are a lot of fake profiles, and I don't see another solution unless we take a stronger step on this.
And one last question: the French president, Emmanuel Macron, announced that he will probably enact laws to prohibit youngsters under 15 years old from having a social network account. There is a lot of misinformation and fake news influencing the political views that our youngsters are forming. Do you believe that a solution, or a path forward, would be to prohibit youngsters from having social network accounts? Thank you.
Nikolis Smith: So we're going to have time for one more question from each side, and we'll do our best with a kind of lightning round as we respond. I wish we had more time. And I would encourage you: at the conclusion of this, there will be a reception this evening, where you'll be able to engage with some of the MPs if you don't get a chance to have your question answered here now. So one more on each side, thank you.
Audience: Thank you very much. John K.J. Kiarie, Member of Parliament in Kenya, and my question is to all the people on the panel, including the moderator. To your mind, what are practical, pragmatic steps that IGF can take to place responsibilities not only on big tech developers, but also on advanced economies, on jurisdictions so advanced that they are feeding all the other countries with their technology, such that we have so much technological dumping? To imagine that countries in places like Africa will at some point be at par with Silicon Valley is a fallacy. To imagine that such advanced economies do not have responsibilities is also wrong. We are here at IGF. It takes a lot for even some of these countries to be represented at IGF, and we have real cases of technological dumping that do not respect even basic human rights. In Kenya, for example, we had a company walk into the country and start collecting biometric data, scanning people's irises and inducing vulnerable populations with tokens, in the name of Worldcoin, and the behavior they brought to Kenya involved things they would never do in their own countries, even with the existing laws. But whenever the countries in the southern hemisphere raise this, we are told to go and develop our own laws. So I'm asking: is IGF practically able to rein this in, so that we can place responsibilities not only on big tech, but also on the countries that develop this technology, to the extent that they carry a responsibility to bring everyone along? Because as we speak today, even as we talk about the Internet, nothing about the Internet is manufactured in Africa. We do not manufacture the fiber optic cables. We do not manufacture the devices. We do not have a single satellite of our own. To imagine that we are all at the same table, working on the same laws at IGF and working on the same conventions, would be a fallacy.
So I am asking, practically: is IGF able to do what the world did at the onset of nuclear weapons? Because that was fast. We are here at the 20th IGF, but when we look at nuclear, the bomb was invented in 1945, and by 1957 there were already treaties placing responsibilities on the developers and the inventors of that technology. When will that happen for the internet? When will that happen for social media? When will that happen for big tech? When will that happen for the countries that are so advanced? Because if we do not do anything right now, we will end up exacerbating the divisions and disparities that exist, and what will happen with this AI is that my people will be condemned to digital plantations, just as they were condemned with sugar cane and coffee and all the other things that happened in the slave trade. To imagine that we will all work together as a world is a big fallacy. What practical examples can we take out of IGF? What practical actions can we take to put responsibility where it belongs? Because to imagine that we are all okay in that regard would be a big fallacy. Thank you very much.
Nikolis Smith: Thank you very much. So, for the last question, what we’re gonna do, we’ll do something a little bit different instead of having everybody respond to all those questions. Why don’t we package that in our closing remarks? And we’ll start with Mr. Alizoda because he didn’t get a chance to respond, and that way we can still finish. One last question? Yes. One second.
Audience: I'll try to be very brief.
Nikolis Smith: Thank you.
Audience: And a concrete question, because I would be very interested in a Kenyan perspective on this issue. You said earlier that you hope IGF will help to facilitate a code of conduct for social media platforms. I'm wondering if you really think that a voluntary code of conduct for social media organisations will be enough, or whether we rather need standards and regulation, plus alternative platforms that actually work for democracy instead of undermining it, that work for freedom of speech instead of restricting it. And what is the Kenyan perspective on these kinds of new social media platforms: who could build them, how, and what would you want to see there? My name is Anna Luhmann. I'm a member of parliament from Germany. Thank you.
Nikolis Smith: Thank you. Thank you, everybody. As I said, we'll start with Mr. Alizoda. I know you didn't get a chance to respond in the first round, so as you're thinking about your closing remarks, try to contextualise them in a way that responds to some of these questions from the audience. Thank you.
Zafar Alizoda: Thank you, Nikolis. I agree with the proposal of the Kenyan representative. Indeed, the efforts of parliamentarians and experts from all countries in discussing the protection of information once again confirm that no country can be an outsider in this matter. As equal participants in these interrelationships on the Internet, we must respect the general conditions and measures for regulating and preserving the integrity of information as a whole, and also harmonize the regulations in each country with global principles and standards. Thank you.
Marsha Caddle: And this is really a repeat of the climate conversation, right? We saw the dumping, in that case of greenhouse gas emissions; we experienced it, we suffered from it, and then we started slowly trying to regulate a global system that would see the polluting countries start to invest in adaptation and mitigation, and it has been a long, arduous process that has not settled. And so for me, we have to learn the lessons of nuclear regulation, we have to learn the lessons of climate that we are still experiencing now, and be able to say: look, these are the things that we require of major tech countries and major tech companies. For example, you'll see that major social network companies and creators can benefit hugely without even having to physically come to a country and collect data; they can benefit hugely just from pushing information into a jurisdiction where there is little control. So I agree with you. I cannot speak to whether IGF is that place, but I do think that we have to learn the lessons of the last three decades in climate and in other areas, and rather than it taking another three decades to come to a global compact about accountability, that needs to happen now. We already have models for it. We already know what it looks like. It's just time to act in a global way.
Grunde Almeland: Well, IGF certainly can be a space where we are able to find this kind of common ground, and I really hope it will be, because this is such an international question. Trying to regulate this in our national jurisdictions is just creating a lot of Swiss cheese for these companies, and while Swiss cheese is delicious, it's not always good for you. I really do believe that we need to find these spaces where we work together internationally in order to find common ground, a common set of rules. And there are a lot of challenges, and I think a lot of the questions point to those challenges. In terms of verification, for example: having a set rule on verification can also exclude vulnerable people in vulnerable situations, and making sure that people in those situations are able to actually speak up is also important. Requiring age verification for children to access networks, while it is an active discussion in Norway as well, still carries dilemmas: children also have fundamental rights to access information and to participate; they are not small people to be put in a room until they become adults. They should be an active part of society. These are all dilemmas that we have to navigate, while we still try to protect, but not overprotect.
Nikolis Smith: Thank you so much.
Rebecca Bauer-Kahn: Yeah, I think what is being said is really at the crux of all of it, which is global cooperation. We talk about so much of what the world has done, and we've gone in different directions. I mean, I don't know if we have MPs from Australia here, but they have banned social media for young people. How is that going for them? Is it having the problems you describe? I think we can learn so much from one another and really move the ball forward, because as the gentleman from Bahrain said, policy moves slower than technology, and I think only through that collaboration can we really move forward in a way that protects our communities. There was a question about privacy versus some of these society-protecting tools, and I think we can figure this out together. We're moving a piece of legislation this year that would require devices to be able to verify your identity so that you don't have to share it with the platforms; there is a way to do that technologically in a privacy-protective manner. And if we do that together, I think we can move the world forward, not have Swiss cheese, and have societies that are protected from some of the ills we've talked about today.
I want to acknowledge that I live in one of those jurisdictions that is responsible for these tech companies, and the weight of that is real, and you also can imagine how it affects our electoral politics, especially in a country where you can buy elections, and that’s perfectly legal now in America, and so we live in a very complicated political dynamic, but I will say that this topic of technology and its impact on society is becoming one of the most agreed-upon political topics, because I think that we are living in a reality where we see the downsides, whether it be for our children and how their mental health is being affected, or our democracies and truth, and so I’m hopeful that even in the complicated country and democracy I live in, we’ll be able to move forward solutions that will be protective, not just of our own people, but of the world. Thank you.
Nikolis Smith: Thank you.
Catherine Mumma: Now, codes of ethics are the first tool of self-regulation for a lot of professionals and professional associations and organizations. So I think the first self-regulation opportunity lies in codes of ethics. And since big tech may not necessarily sit in an association where we can say, as an association, come up with a code of ethics, I'm thinking IGF could be a good space to initiate this. I would suggest that you look at the IPU resolution on AI, and also at the IPU draft code of ethics on science and technology, which might give some suggestions on what could happen. That would actually extend an opportunity for big tech companies to realize that they may have crossed the lines in terms of freedoms, and that what they're doing may be harming very vulnerable populations, especially in countries that may not be as enabled. And that brings me to the point my colleague KJ just raised, about the more developed countries and the big tech companies taking responsibility for the negative sides of tech that are playing out in developing countries. I would want to say, first, that the international protective mechanism around human rights is breaking, as far as I'm concerned. We've seen what has happened in Gaza, and we are all helpless, or the world seems to be helpless, as a lot of human rights violations happen, not just in Gaza but in Sudan, in Ukraine, and elsewhere. I would first query whether we need to reimagine what international cooperation was supposed to be, and whether that cooperation can be rethought and reimagined to truly provide the protections it is supposed to provide. In the meantime, I think as the small countries, we may need to do what we have to do.
One of the things I take from the morning session is that, in protection of our vulnerable populations, we must start attaching conditionalities to the licenses that we give, until such time as the tech companies realize that ruining our young people, by facilitating access to what they wouldn't allow in their own countries, is a violation of human rights; and that distorting, or facilitating and enabling the distortion of, elections in our countries, so that we end up with the wrong governments, is a violation of our rights. So even as we place responsibility with the United Nations and the international community, we must start looking inward and determine the incremental kind of arrangement that we will have with these companies, to ensure that, for the very vulnerable, we attach conditionalities to the licenses before we issue them to the big tech companies. Thank you very much.
Nikolis Smith: Wow, I’m seeing the flashing red. We should be done already. I just want to say, can we just give a round of applause to this great panel, this discussion? Thank you all. Two things that I just want to make sure that I underscore here. Number one is that this is still day zero. Day one is actually tomorrow, so this track will continue tomorrow morning. So make sure that you’re looking at the schedule. You’ll have more opportunities throughout the week to talk to some of these people on stage through other sessions. So make sure you take advantage of that, especially the folks that didn’t get a chance to as they were queuing to ask questions. One more thing before we close for all members of Parliament that will be going to an event and reception at the Parliament this evening. What you’re going to do when we leave here, we’ll exit out and go to the left and down. There will be some folks waiting for you guys to take you. The bus, I believe, that’s going to escort you leaves at 1600. So 4 p.m. If you have any questions, you can come talk to some of us as we come off the stage as well. But again, thank you so much and enjoy the rest of your week. Thank you. Yeah.
Grunde Almeland

Speech speed: 155 words per minute
Speech length: 1737 words
Speech time: 671 seconds

Truth is becoming less relevant as AI superpowers content creation, leading people to engage only with information confirming their existing beliefs
Explanation: Almeland argues that AI is creating so much more content that people primarily engage with information that confirms their held beliefs, making them stay in comfortable bubbles that are increasingly difficult to pierce with factual debate and true facts. This represents one of the key findings from a report on AI and elections that examined different elections in 2024.
Evidence: Referenced a report that went through different elections in 2024 and analyzed how AI is superpowering content creation
Major discussion point: Misinformation and Disinformation Challenges
Topics: Human rights | Sociocultural

Independent media organizations are crucial for combating misinformation, requiring strong legislative foundations ensuring editorial independence and public funding
Explanation: Almeland emphasizes that supporting and strengthening independent media is a key measure to combat the reality being created in different information bubbles. He argues there is a strong connection between trust in politicians and people having access to true information from professional, independent media that can check what politicians are doing.
Evidence: Norway is number one on the press freedom index, has legislative measures ensuring owners cannot challenge editorial decisions, and provides extensive public funding for media, including local outlets
Major discussion point: Supporting Independent Media and Transparency
Topics: Human rights | Legal and regulatory
Agreed with: Marsha Caddle
Agreed on: Supporting independent media and transparency mechanisms

Norway's success stems from legislative measures preventing owners from interfering with editorial decisions and extensive media funding
Explanation: Almeland details Norway's approach, which includes acts ensuring editorial independence, preventing owners from interfering with independent editorial decisions, and prohibiting owners from requiring to see journalists' work before publication. This is combined with extensive public funding to support both national and local media outlets.
Evidence: Norway ranks number one on the press freedom index, has specific legislative protections for editorial independence, and funds media extensively, including small local news outlets
Major discussion point: Supporting Independent Media and Transparency
Topics: Human rights | Legal and regulatory

Technology should be used as a tool to strengthen democratic institutions and create more transparency
Explanation: Almeland argues that while there are evident challenges with technology, there are also big possibilities in using these tools to create more transparency and strengthen the institutions that uphold democracy. He emphasizes the opportunity to adopt technology that strengthens democratic institutions rather than focusing only on the problems.
Evidence: Example of simplifying processes for journalists' access to government information; Norway's experience using digital adaptability to maintain high trust in media
Major discussion point: Technology for Good and Innovation
Topics: Human rights | Infrastructure
Agreed with: Rebecca Bauer-Kahn, Marsha Caddle
Agreed on: Technology should be leveraged for democratic good and transparency

Most issues are already legislated and need adaptation rather than entirely new laws, with international cooperation being more crucial than new legislation
Explanation: Almeland argues that politicians and legislators should not approach new technology with panic, believing they have nothing legislated already. Most things are already heavily legislated and sometimes need amendment or new legislation, but often it's about seeing how technology fits into existing frameworks.
Evidence: Example of AI-enabled scams falling under existing criminal codes; Norwegian case study of the Magicat scam system showing the need for international cooperation rather than new legislation
Major discussion point: Legislative and Regulatory Approaches
Topics: Legal and regulatory | Cybersecurity
Disagreed with: Catherine Mumma, Zafar Alizoda
Disagreed on: Legislative approach – new laws versus adapting existing frameworks

Verification should provide fundamental information allowing people to make their own decisions rather than forcing judgments about what is real
Explanation: Almeland describes Norway's approach to watermarking technology in media, which involves cooperation with major outlets like the BBC and the New York Times. The key component is not just verification checks, but making essential information accessible about who took photos and where they're from, giving people the opportunity to make their own decisions.
Evidence: Norwegian media landscape cooperation with the BBC and the New York Times on watermarking technology; focus on providing source information rather than declaring content real or fake
Major discussion point: Verification and Authentication Solutions
Topics: Human rights | Sociocultural
Agreed with: Rebecca Bauer-Kahn, Marsha Caddle
Agreed on: Importance of verification and authentication solutions
Disagreed with: Rebecca Bauer-Kahn
Disagreed on: Approach to content regulation and verification

IGF can serve as a platform for finding common ground and developing shared rules rather than creating fragmented national regulations
Explanation: Almeland believes IGF can be a space for finding common ground on international digital governance issues. He argues that trying to regulate these issues in national jurisdictions creates 'Swiss cheese' for companies, and international cooperation is needed to find common rules.
Evidence: Metaphor of Swiss cheese regulation being delicious but not always good for you; emphasis on the need for a common set of rules internationally
Major discussion point: International Cooperation and Global Standards
Topics: Legal and regulatory | Human rights
Agreed with: Martin Chungong, Catherine Mumma, Rebecca Bauer-Kahn, Zafar Alizoda, Audience
Agreed on: Need for international cooperation and global standards rather than fragmented national approaches
Martin Chungong

Speech speed: 86 words per minute
Speech length: 319 words
Speech time: 220 seconds

Digital technologies have fundamentally altered the information landscape, with governments struggling to distinguish fact from fiction and electoral processes facing manipulation
Explanation: Chungong argues that the rapid spread of misinformation through digital technologies has fundamentally changed how democracies operate. Governments face challenges distinguishing fact from fiction, electoral processes are manipulated through coordinated disinformation campaigns, and democratic institutions find their legitimacy questioned based on false narratives.
Evidence: References to coordinated disinformation campaigns affecting electoral processes and false narratives undermining institutional legitimacy
Major discussion point: Misinformation and Disinformation Challenges
Topics: Human rights | Sociocultural

The rise of AI has transformed the misinformation landscape with deepfakes, AI-generated content, and algorithmic amplification creating unprecedented challenges
Explanation: Chungong emphasizes that artificial intelligence has fundamentally transformed the misinformation landscape through deepfakes, AI-generated content, and algorithmic amplification. These technologies create unprecedented challenges for democratic discourse by blurring the lines between fact and fiction.
Evidence: Specific mention of deepfakes, AI-generated content, and algorithmic amplification as new technological challenges
Major discussion point: Misinformation and Disinformation Challenges
Topics: Human rights | Sociocultural

Global cooperation on combating misinformation is crucial as fragmented approaches risk undermining democratic discourse
Explanation: Chungong argues that at a time when democratic norms face unprecedented pressure and public trust continues to erode, global cooperation on combating misinformation is more crucial than ever. A fragmented approach to information integrity risks undermining the foundations of democratic discourse and exacerbating the crisis of trust.
Evidence: Reference to the Global Digital Compact as emerging international consensus on information integrity
Major discussion point: International Cooperation and Global Standards
Topics: Human rights | Legal and regulatory
Agreed with: Grunde Almeland, Catherine Mumma, Rebecca Bauer-Kahn, Zafar Alizoda, Audience
Agreed on: Need for international cooperation and global standards rather than fragmented national approaches
Catherine Mumma

Speech speed: 118 words per minute
Speech length: 1910 words
Speech time: 969 seconds

Kenya has established a comprehensive legal framework including Computer Misuse and Cyber Protection Act, Data Protection Act, and Media Council Act, though gaps remain in addressing misinformation specifically
Explanation: Mumma explains that Kenya has embraced digital technology and established a facilitative constitutional framework protecting freedom of expression and human rights. The country has implemented several relevant laws but still lacks specific legislation addressing misinformation and disinformation, particularly during electoral periods.
Evidence: Listed specific laws: Computer Misuse and Cyber Protection Act, Data Protection Act, Media Council Act, Copyrights Act, National Cohesion and Integration Act; mentioned constitutional protections for freedom of expression and access to information
Major discussion point: Legislative and Regulatory Approaches
Topics: Legal and regulatory | Human rights
Disagreed with: Grunde Almeland, Zafar Alizoda
Disagreed on: Legislative approach – new laws versus adapting existing frameworks

Kenya faces challenges with misinformation and disinformation on social media, particularly during electoral periods, leading to violence and ethnic tensions
Explanation: Mumma describes how Kenya experiences continuous electoral competition with significant misuse of social media for disinformation and hate speech. The country has suffered post-election violence following hate speech using negative ethnicity, and digital technology has amplified politically motivated misinformation that sometimes leads to violence.
Evidence: Kenya's experience with post-election violence following hate speech, establishment of Cohesion Commission, continuous electoral campaigning environment
Major discussion point: Misinformation and Disinformation Challenges
Topics: Human rights | Sociocultural

Beyond regulation, need greater investment in public digital infrastructure to ensure rural areas, women, and vulnerable groups can participate in digital spaces
Explanation: Mumma argues that beyond protecting against disinformation, there's a human rights issue of inclusion that requires financial investment in necessary public digital infrastructure. This would enable greater participation by those in rural areas, women, and other vulnerable and minority groups in the benefits of digital space and technology.
Evidence: Emphasis on need for investment in public digital infrastructure for rural areas, women, and vulnerable groups
Major discussion point: Digital Inclusion and Infrastructure
Topics: Development | Human rights

Electoral integrity depends largely on neutral electoral management bodies rather than just technology
Explanation: Mumma argues that electoral fraud, whether with AI or other means, usually requires collusion by people within electoral management bodies. She believes that if electoral management bodies remain neutral, then elections will remain credible regardless of whether they are digitally driven or not.
Evidence: Kenya's experience with Supreme Court cases on electronic transmission of presidential results, electoral management body's refusal to allow audit of electoral transmission systems
Major discussion point: Supporting Independent Media and Transparency
Topics: Human rights | Legal and regulatory

Need to think broadly about digital technology impacts across all sectors including health, water, and other areas beyond just political spaces
Explanation: Mumma emphasizes that parliamentarians tend to focus on technology's impact in political spaces, but need to think more broadly about innovations in the health sector, telemedicine, and other sectors. Laws on digital technology should be cross-sectoral rather than focused on particular legislation.
Evidence: Examples of telemedicine and digital technology applications in health, water, and other sectors
Major discussion point: Digital Inclusion and Infrastructure
Topics: Development | Legal and regulatory

African parliamentarians have formed regional caucuses to share experiences and develop common approaches
Explanation: Mumma explains that African parliamentarians have proactively formed African Parliamentary Caucuses across West Africa and East Africa to compare notes and learn from each other. They recognize that what happens in one country affects others, so they need to borrow from each other and learn together.
Evidence: Formation of Africa-wide, West African, and East African parliamentary caucuses for sharing experiences on digital governance
Major discussion point: International Cooperation and Global Standards
Topics: Legal and regulatory | Development
Agreed with: Grunde Almeland, Martin Chungong, Rebecca Bauer-Kahn, Zafar Alizoda, Audience
Agreed on: Need for international cooperation and global standards rather than fragmented national approaches

Need for international mechanisms similar to nuclear weapons treaties to place responsibilities on big tech developers and advanced economies
Explanation: Mumma supports the need for international protective mechanisms, noting that current human rights protections are breaking down. She argues for reimagining international cooperation to truly provide protections, while also suggesting that smaller countries should place conditionalities on licenses given to tech companies to protect vulnerable populations.
Evidence: Comparison to nuclear weapons treaties developed quickly after 1945 bomb invention, examples of human rights violations in Gaza, Sudan, Ukraine
Major discussion point
International Cooperation and Global Standards
Topics
Legal and regulatory | Human rights
Agreed with
– Grunde Almeland
– Martin Chungong
– Rebecca Bauer-Kahn
– Zafar Alizoda
– Audience
Agreed on
Need for international cooperation and global standards rather than fragmented national approaches
Rebecca Bauer-Kahn
Speech speed
189 words per minute
Speech length
2003 words
Speech time
634 seconds
California has passed privacy legislation (CCPA) and is working on watermarking requirements and disclosure laws for AI-generated political content
Explanation
Bauer-Kahn explains that California was the first US state to pass privacy-protective legislation, following the EU, and is home to major AI and social media companies. The state is taking responsibility by requiring disclosure of AI use in political advertisements and mandating that platforms take down serious political misinformation, though constitutional free speech protections create challenges.
Evidence
California is home to 32 of top 50 AI companies and all major social media companies; passed CCPA after EU privacy laws; legislation requiring AI disclosure in political ads currently in courts
Major discussion point
Legislative and Regulatory Approaches
Topics
Legal and regulatory | Human rights
Need to invest in technology for good through academic institutions and civil society to compete with profit-driven companies
Explanation
Bauer-Kahn argues that technology is currently in the hands of a very few profit-driven players, and that technology itself must be pushed to become the solution in the technology age. This requires funding academic institutions so they have the compute power to compete with the largest AI companies, and bringing civil society into the space to create technology for good.
Evidence
University of Chicago example of a tool that lets creators of copyrighted material block its use in training AI models; emphasis on academic institutions as the source of tech company talent
Major discussion point
Technology for Good and Innovation
Topics
Development | Economic
Agreed with
– Grunde Almeland
– Marsha Caddle
Agreed on
Technology should be leveraged for democratic good and transparency
Watermarking technology and device-level authentication are critical for distinguishing reality from AI-generated content
Explanation
Bauer-Kahn describes California’s approach to requiring watermarking technology, first from platforms and then from devices themselves. The state legislature used Adobe’s technology to watermark their own images, demonstrating the technology’s capability to trace and authenticate content.
Evidence
California passed watermarking law for platforms, moving legislation for device-level authentication, California legislature’s use of Adobe watermarking technology for their own images
Major discussion point
Verification and Authentication Solutions
Topics
Legal and regulatory | Infrastructure
Agreed with
– Grunde Almeland
– Marsha Caddle
Agreed on
Importance of verification and authentication solutions
Disagreed with
– Grunde Almeland
Disagreed on
Approach to content regulation and verification
Privacy legislation is fundamental to preventing sophisticated financial scams that exploit personal information
Explanation
Bauer-Kahn argues that financial scams are becoming sophisticated because there is so much access to personal information about individuals. When scammers call knowing children’s names and family details, people fall for scams they wouldn’t otherwise. Privacy protection is therefore foundational to preventing these scams.
Evidence
Example of scammers calling with detailed family information making scams more believable
Major discussion point
Privacy and Human Rights Protection
Topics
Human rights | Cybersecurity
Constitutional protections of free speech create challenges in regulating misinformation while requiring creative solutions like disclosure requirements
Explanation
Bauer-Kahn explains that US constitutional protections prevent stopping people from speaking and also prevent forcing people to speak. This creates complications when trying to combat misinformation through required disclosures, as courts may see disclosure requirements as forced speech, particularly for political speech which is even more protected.
Evidence
Court challenges to California’s AI disclosure requirements in political advertisements being seen as forced speech
Major discussion point
Privacy and Human Rights Protection
Topics
Human rights | Legal and regulatory
Disagreed with
– Marsha Caddle
Disagreed on
Role of government versus platforms in content moderation
Paper trails and audit capabilities are essential for maintaining election integrity and public trust
Explanation
Bauer-Kahn emphasizes that election integrity requires not just real integrity but public belief in that integrity. She advocates for paper trails even when using electronic voting machines, so there are receipts and ways to audit elections, which is critically important for maintaining democratic legitimacy.
Evidence
US system where every state runs elections differently, emphasis on agreed principle that elections are free and fair
Major discussion point
Verification and Authentication Solutions
Topics
Human rights | Legal and regulatory
Marsha Caddle
Speech speed
151 words per minute
Speech length
1749 words
Speech time
693 seconds
Deepfakes are creating serious problems, including false diplomatic statements that could destabilize international relations
Explanation
Caddle describes how Barbados had to urgently warn about a deepfake of the Prime Minister making false statements about the country’s diplomatic stance toward a major world power. This highlights how deepfakes risk not just domestic trust but can completely destabilize a country’s global position and international relations.
Evidence
Specific example of deepfake about Barbados Prime Minister’s diplomatic statements requiring urgent government response
Major discussion point
Misinformation and Disinformation Challenges
Topics
Human rights | Sociocultural
Barbados has implemented cybercrime legislation developed through transparent parliamentary processes with citizen input
Explanation
Caddle explains that, as the minister who brought the cybercrime legislation, she took it to a Joint Select Committee, heard evidence and pushback, including human rights concerns from citizens, and amended the legislation accordingly. This represents a healthy way to bring people into the conversation and ensure democratic participation in addressing truth and evidence.
Evidence
Personal experience as minister bringing cybercrime legislation through Joint Select Committee with public hearings and amendments based on citizen feedback
Major discussion point
Legislative and Regulatory Approaches
Topics
Legal and regulatory | Human rights
Promoting transparency through broadcasting parliamentary proceedings and providing original sources helps establish a culture of truth and evidence
Explanation
Caddle describes Barbados’s approach of broadcasting joint select committee proceedings and most Prime Minister speeches either in real time or recorded. The goal is to get people accustomed to an environment of truth and evidence by providing access to original sources and establishing where people can find accurate information.
Evidence
Broadcasting of parliamentary joint select committees and Prime Minister’s speeches, emphasis on providing original sources
Major discussion point
Supporting Independent Media and Transparency
Topics
Human rights | Sociocultural
Agreed with
– Grunde Almeland
Agreed on
Supporting independent media and transparency mechanisms
Investment in tech ecosystems that can build tools to fight misinformation while promoting innovation is essential
Explanation
Caddle argues for investing in tech ecosystems that can balance or build tools to fight against misinformation, since others are investing heavily in creating misinformation. The focus should be on investing in tech creators who will combat misinformation with tools that promote truth.
Evidence
Emphasis on need to invest in tech creators to combat misinformation with truth-promoting tools
Major discussion point
Technology for Good and Innovation
Topics
Development | Economic
Agreed with
– Grunde Almeland
– Rebecca Bauer-Kahn
Agreed on
Technology should be leveraged for democratic good and transparency
Countries should invest in training people in data analytics and science to create their own tools rather than just consuming technology
Explanation
Caddle describes Barbados’s approach of training people in data analytics and data science, not just in formal academic institutions but partnering with companies. The goal is to enable people to create tools they find useful and can trust, participating in the AI value chain even if not building large language models.
Evidence
Partnership with Zindi company for data analytics training, concept of AI value chain allowing participation at different levels
Major discussion point
Technology for Good and Innovation
Topics
Development | Economic
High internet penetration creates expectations for meaningful access and use of information
Explanation
Caddle notes that Barbados has extremely high internet and digital penetration (114% mobile penetration), which creates high expectations of immediate access to information. The challenge becomes not just access, but meaningful access and use.
Evidence
Barbados has a 114% mobile penetration rate, indicating more than one mobile subscription per person
Major discussion point
Digital Inclusion and Infrastructure
Topics
Development | Infrastructure
Platforms should return to more robust verification methods while balancing accessibility concerns
Explanation
Caddle argues that encouraging platforms to return to more robust methods of verification is critical. However, she notes the contradiction where political actors are constrained from generating content on platforms due to restrictions, while misinformation about them proliferates freely.
Evidence
Personal experience as politician being restricted from generating content on platforms while misinformation about political actors spreads
Major discussion point
Verification and Authentication Solutions
Topics
Human rights | Legal and regulatory
Agreed with
– Grunde Almeland
– Rebecca Bauer-Kahn
Agreed on
Importance of verification and authentication solutions
Disagreed with
– Rebecca Bauer-Kahn
Disagreed on
Role of government versus platforms in content moderation
Zafar Alizoda
Speech speed
142 words per minute
Speech length
884 words
Speech time
371 seconds
Central Asian countries need to revise personal data protection laws and consider GDPR-like risk assessment mechanisms
Explanation
Alizoda explains that Central Asian countries are actively developing legal frameworks for personal data protection, but current legislation leaves many issues in this field unregulated. He argues for revising personal data protection laws and making adjustments that reflect modern applications of advanced technologies, including considering GDPR-like risk assessment procedures.
Evidence
Current gaps in Central Asian legislation including absence of national personal data protection laws, lack of breach notification requirements, need for Data Protection Impact Assessment procedures like GDPR
Major discussion point
Legislative and Regulatory Approaches
Topics
Legal and regulatory | Human rights
Disagreed with
– Grunde Almeland
– Catherine Mumma
Disagreed on
Legislative approach – new laws versus adapting existing frameworks
Personal data protection requires special attention for sensitive categories and comprehensive legislative frameworks
Explanation
Alizoda details that sensitive personal data includes race, ethnic origin, political beliefs, religious beliefs, professional affiliation, medical information, biometric data, and financial information. This category requires special protection and processing as disclosure can lead to discrimination, stigmatization and other negative consequences.
Evidence
Detailed categorization of sensitive personal data types and their potential negative consequences if disclosed
Major discussion point
Privacy and Human Rights Protection
Topics
Human rights | Legal and regulatory
Global platforms have different policies for different regions, with developing countries lacking the same protections as EU citizens under GDPR
Explanation
Alizoda argues that global platforms apply different policies to different countries and regions: EU citizens are protected by the GDPR, while small developing countries in Asia do not enjoy the same priority. Even when national legislation is close to the GDPR, many issues remain unresolved and enforcement is difficult due to limited market leverage.
Evidence
Comparison between EU GDPR protections and lack of similar protections for developing countries, specific mention of Tajikistan’s legislation being close to GDPR but with enforcement challenges
Major discussion point
International Cooperation and Global Standards
Topics
Legal and regulatory | Human rights
Agreed with
– Grunde Almeland
– Martin Chungong
– Catherine Mumma
– Rebecca Bauer-Kahn
– Audience
Agreed on
Need for international cooperation and global standards rather than fragmented national approaches
Audience
Speech speed
142 words per minute
Speech length
1539 words
Speech time
647 seconds
Need for international mechanisms similar to nuclear weapons treaties to place responsibilities on big tech developers and advanced economies
Explanation
John K.J. Kiarie from Kenya argues that the IGF should take practical steps to place responsibilities on big tech developers and advanced economies, similar to how nuclear weapons were quickly regulated after 1945. He emphasizes that technological dumping occurs when companies engage in practices in developing countries that they would never adopt in their home countries, and that imagining all countries can be on par with Silicon Valley is a fallacy.
Evidence
Nuclear weapons example where treaties were established by 1957, just 12 years after the 1945 invention of the bomb; Worldcoin example in Kenya, collecting biometric data in exchange for tokens in ways not done in its home country; Africa’s complete dependence on imported internet infrastructure
Major discussion point
International Cooperation and Global Standards
Topics
Legal and regulatory | Development
Agreed with
– Grunde Almeland
– Martin Chungong
– Catherine Mumma
– Rebecca Bauer-Kahn
– Zafar Alizoda
Agreed on
Need for international cooperation and global standards rather than fragmented national approaches
Nikolis Smith
Speech speed
161 words per minute
Speech length
2307 words
Speech time
856 seconds
AI is a tool invented by humans that has benefits alongside challenges
Explanation
Smith emphasizes the importance of maintaining a balanced perspective on AI, acknowledging that while there are legitimate concerns about its impact on democracy and society, AI fundamentally remains a human-created tool that offers significant benefits. He advocates against approaching AI with fear and instead focusing on both its positive potential and necessary safeguards.
Evidence
Reminder that AI was invented by humans and has benefits that should be recognized alongside challenges
Major discussion point
Technology for Good and Innovation
Topics
Human rights | Sociocultural
The IGF provides a valuable platform for international collaboration on digital governance issues
Explanation
Smith positions the IGF as an important forum for bringing together parliamentarians, policymakers, and digital governance experts to build consensus on pressing challenges like safeguarding democratic institutions. He emphasizes that the IGF offers opportunities for continued dialogue and learning throughout the week beyond individual sessions.
Evidence
Organization of parliamentary track sessions, facilitation of panel discussions with MPs from multiple countries, provision of networking opportunities
Major discussion point
International Cooperation and Global Standards
Topics
Legal and regulatory | Development
Junhua LI
Speech speed
104 words per minute
Speech length
541 words
Speech time
310 seconds
The IGF 2025 aims to bring legislators together with stakeholders to shape digital policies ensuring an open, inclusive, and secure Internet for all
Explanation
Li emphasizes that the parliamentary track of IGF 2025 has a clear purpose of bringing legislators together with other stakeholders to shape digital policies and legislative frameworks. Under the theme ‘Building Digital Governance Together,’ the focus is on international digital cooperation to address today’s digital challenges while ensuring an open, inclusive, and secure Internet for all.
Evidence
IGF 2025 theme ‘Building Digital Governance Together’ and focus on international digital cooperation
Major discussion point
International Cooperation and Global Standards
Topics
Legal and regulatory | Human rights
The dual imperative of protecting freedom of expression while combating misinformation and disinformation is among the most urgent digital challenges
Explanation
Li identifies the need to balance protecting freedom of expression with combating the spread of misinformation and disinformation as one of the most critical challenges facing digital governance today. He argues that the ability to speak freely, access accurate information, and engage in open online discourse forms the bedrock of democratic societies, but these rights are being tested by disinformation, censorship, and AI technologies that blur the lines between fact and fiction.
Evidence
Reference to generative AI blurring lines between fact and fiction, false narratives eroding trust in public institutions, targeted disinformation campaigns threatening peace and stability
Major discussion point
Misinformation and Disinformation Challenges
Topics
Human rights | Sociocultural
Parliamentarians have pivotal authority to craft legislation that safeguards freedoms while strengthening democratic resilience
Explanation
Li emphasizes that members of parliament have unique authority and responsibility to navigate the complex terrain of digital governance. They can craft legislation that safeguards freedom of expression and access to information, promotes media and information literacy, and strengthens the resilience of democratic discourse while ensuring that responses to digital threats do not infringe upon the very freedoms they seek to protect.
Evidence
Parliamentary authority to craft legislation, promote media literacy, and strengthen democratic discourse
Major discussion point
Legislative and Regulatory Approaches
Topics
Legal and regulatory | Human rights
Expanding parliamentary engagement in national and regional IGFs is essential for localizing digital governance conversations
Explanation
Li highlights the encouraging progress in expanding parliamentary engagement in national and regional IGFs across different regions from West Africa to Asia-Pacific. He argues that this localization of digital governance conversations is essential and that learning from national experiences and identifying new avenues for collaboration strengthens the overall framework for digital governance.
Evidence
Examples of parliamentary engagement from West Africa to Asia-Pacific, emphasis on localization of digital governance conversations
Major discussion point
International Cooperation and Global Standards
Topics
Legal and regulatory | Development
Agreements
Agreement points
Need for international cooperation and global standards rather than fragmented national approaches
Speakers
– Grunde Almeland
– Martin Chungong
– Catherine Mumma
– Rebecca Bauer-Kahn
– Zafar Alizoda
– Audience
Arguments
IGF can serve as a platform for finding common ground and developing shared rules rather than creating fragmented national regulations
Global cooperation on combating misinformation is crucial as fragmented approaches risk undermining democratic discourse
African parliamentarians have formed regional caucuses to share experiences and develop common approaches
Need for international mechanisms similar to nuclear weapons treaties to place responsibilities on big tech developers and advanced economies
Global platforms have different policies for different regions, with developing countries lacking the same protections as EU citizens under GDPR
Need for international mechanisms similar to nuclear weapons treaties to place responsibilities on big tech developers and advanced economies
Summary
All speakers agreed that digital governance challenges require coordinated international responses rather than isolated national efforts, with IGF serving as a key platform for developing common standards and approaches
Topics
Legal and regulatory | Human rights | Development
Technology should be leveraged for democratic good and transparency
Speakers
– Grunde Almeland
– Rebecca Bauer-Kahn
– Marsha Caddle
Arguments
Technology should be used as a tool to strengthen democratic institutions and create more transparency
Need to invest in technology for good through academic institutions and civil society to compete with profit-driven companies
Investment in tech ecosystems that can build tools to fight misinformation while promoting innovation is essential
Summary
Speakers agreed that technology should be actively developed and deployed to strengthen democratic institutions and combat misinformation, rather than being left solely to profit-driven entities
Topics
Development | Economic | Human rights
Importance of verification and authentication solutions
Speakers
– Grunde Almeland
– Rebecca Bauer-Kahn
– Marsha Caddle
Arguments
Verification should provide fundamental information allowing people to make their own decisions rather than forcing judgments about what is real
Watermarking technology and device-level authentication are critical for distinguishing reality from AI-generated content
Platforms should return to more robust verification methods while balancing accessibility concerns
Summary
All three speakers emphasized the critical need for verification and authentication technologies, though with different approaches – from watermarking to providing source information to enable informed decision-making
Topics
Legal and regulatory | Infrastructure | Human rights
Supporting independent media and transparency mechanisms
Speakers
– Grunde Almeland
– Marsha Caddle
Arguments
Independent media organizations are crucial for combating misinformation, requiring strong legislative foundations ensuring editorial independence and public funding
Promoting transparency through broadcasting parliamentary proceedings and providing original sources helps establish a culture of truth and evidence
Summary
Both speakers agreed that independent media and transparent government processes are fundamental to combating misinformation and maintaining democratic trust
Topics
Human rights | Legal and regulatory | Sociocultural
Similar viewpoints
Both emphasized the need for rapid international action similar to nuclear weapons regulation, with specific focus on addressing technological dumping and ensuring big tech companies are held accountable for practices in developing countries that they wouldn’t engage in at home
Speakers
– Catherine Mumma
– Audience
Arguments
Need for international mechanisms similar to nuclear weapons treaties to place responsibilities on big tech developers and advanced economies
Need for international mechanisms similar to nuclear weapons treaties to place responsibilities on big tech developers and advanced economies
Topics
Legal and regulatory | Human rights | Development
Both emphasized the importance of building local capacity and alternative technology ecosystems that serve democratic purposes rather than just profit motives, though Bauer-Kahn focused on academic institutions while Caddle emphasized practical skills training
Speakers
– Rebecca Bauer-Kahn
– Marsha Caddle
Arguments
Need to invest in technology for good through academic institutions and civil society to compete with profit-driven companies
Countries should invest in training people in data analytics and science to create their own tools rather than just consuming technology
Topics
Development | Economic
Both recognized that existing legal frameworks can address many digital challenges but require adaptation and creative approaches, particularly when balancing free speech protections with the need to combat misinformation
Speakers
– Grunde Almeland
– Rebecca Bauer-Kahn
Arguments
Most issues are already legislated and need adaptation rather than entirely new laws, with international cooperation being more crucial than new legislation
Constitutional protections of free speech create challenges in regulating misinformation while requiring creative solutions like disclosure requirements
Topics
Legal and regulatory | Human rights
Unexpected consensus
Balanced approach to AI regulation without fear-mongering
Speakers
– Grunde Almeland
– Nikolis Smith
Arguments
Most issues are already legislated and need adaptation rather than entirely new laws, with international cooperation being more crucial than new legislation
AI is a tool invented by humans that has benefits alongside challenges
Explanation
Despite the serious concerns about AI’s impact on democracy, both speakers emphasized avoiding panic and fear-mongering, instead advocating for measured responses that recognize both challenges and opportunities. This balanced perspective was unexpected given the gravity of the democratic threats discussed
Topics
Human rights | Sociocultural | Legal and regulatory
Cross-sectoral approach to digital governance beyond just political applications
Speakers
– Catherine Mumma
– Zafar Alizoda
Arguments
Need to think broadly about digital technology impacts across all sectors including health, water, and other areas beyond just political spaces
Personal data protection requires special attention for sensitive categories and comprehensive legislative frameworks
Explanation
Both speakers from very different regions (Kenya and Tajikistan) independently emphasized that digital governance must extend beyond political concerns to encompass healthcare, personal data protection, and other sectors. This holistic view was unexpected in a session focused on democratic safeguards
Topics
Legal and regulatory | Human rights | Development
Overall assessment
Summary
Strong consensus emerged around the need for international cooperation, technology for democratic good, verification solutions, and supporting independent media. Speakers from diverse regions and political systems found common ground on fundamental principles while acknowledging different implementation approaches.
Consensus level
High level of consensus on core principles with recognition that implementation must be adapted to local contexts. The agreement suggests potential for meaningful international collaboration on digital governance frameworks, though speakers acknowledged significant challenges in enforcement and ensuring equitable treatment across different jurisdictions.
Differences
Different viewpoints
Approach to content regulation and verification
Speakers
– Grunde Almeland
– Rebecca Bauer-Kahn
Arguments
Verification should provide fundamental information allowing people to make their own decisions rather than forcing judgments about what is real
Watermarking technology and device-level authentication are critical for distinguishing reality from AI-generated content
Summary
Almeland advocates for providing information and letting people decide for themselves what is real, while Bauer-Kahn emphasizes the need for technological solutions like watermarking to definitively distinguish reality from AI-generated content
Topics
Human rights | Legal and regulatory
Legislative approach – new laws versus adapting existing frameworks
Speakers
– Grunde Almeland
– Catherine Mumma
– Zafar Alizoda
Arguments
Most issues are already legislated and need adaptation rather than entirely new laws, with international cooperation being more crucial than new legislation
Kenya has established a comprehensive legal framework including Computer Misuse and Cyber Protection Act, Data Protection Act, and Media Council Act, though gaps remain in addressing misinformation specifically
Central Asian countries need to revise personal data protection laws and consider GDPR-like risk assessment mechanisms
Summary
Almeland believes most issues are already covered by existing laws that need adaptation, while Mumma and Alizoda emphasize the need for new specific legislation to address gaps in digital governance
Topics
Legal and regulatory | Human rights
Role of government versus platforms in content moderation
Speakers
– Rebecca Bauer-Kahn
– Marsha Caddle
Arguments
Constitutional protections of free speech create challenges in regulating misinformation while requiring creative solutions like disclosure requirements
Platforms should return to more robust verification methods while balancing accessibility concerns
Summary
Bauer-Kahn focuses on government regulatory approaches constrained by constitutional protections, while Caddle emphasizes the responsibility of platforms themselves to implement better verification
Topics
Human rights | Legal and regulatory
Unexpected differences
Trust in electoral management versus technology solutions
Speakers
– Catherine Mumma
– Rebecca Bauer-Kahn
Arguments
Electoral integrity depends largely on neutral electoral management bodies rather than just technology
Paper trails and audit capabilities are essential for maintaining election integrity and public trust
Explanation
This disagreement is unexpected because both speakers are concerned with election integrity, but Mumma emphasizes human institutional factors while Bauer-Kahn focuses on technological safeguards. This reveals different cultural and systemic approaches to the same problem
Topics
Human rights | Legal and regulatory
Overall assessment
Summary
The main areas of disagreement center on regulatory approaches (new laws vs. adapting existing ones), the balance between government regulation and platform responsibility, verification methods (information provision vs. technological authentication), and institutional vs. technological solutions for election integrity
Disagreement level
The level of disagreement is moderate and constructive. Speakers share common goals of protecting democracy and human rights in the digital age, but differ on implementation strategies. These disagreements reflect different national contexts, constitutional frameworks, and development levels rather than fundamental philosophical differences. The implications are positive as they provide multiple pathways for addressing digital governance challenges, allowing different jurisdictions to adopt approaches suited to their specific circumstances while maintaining overall coherence in global digital governance efforts.
Partial agreements
Similar viewpoints
Both emphasized the need for rapid international action similar to nuclear weapons regulation, with specific focus on addressing technological dumping and ensuring big tech companies are held accountable for practices in developing countries that they wouldn’t engage in at home
Speakers
– Catherine Mumma
– Audience
Arguments
Need for international mechanisms similar to nuclear weapons treaties to place responsibilities on big tech developers and advanced economies
Topics
Legal and regulatory | Human rights | Development
Both emphasized the importance of building local capacity and alternative technology ecosystems that serve democratic purposes rather than just profit motives, though Bauer-Kahn focused on academic institutions while Caddle emphasized practical skills training
Speakers
– Rebecca Bauer-Kahn
– Marsha Caddle
Arguments
Need to invest in technology for good through academic institutions and civil society to compete with profit-driven companies
Countries should invest in training people in data analytics and science to create their own tools rather than just consuming technology
Topics
Development | Economic
Both recognized that existing legal frameworks can address many digital challenges but require adaptation and creative approaches, particularly when balancing free speech protections with the need to combat misinformation
Speakers
– Grunde Almeland
– Rebecca Bauer-Kahn
Arguments
Most issues are already legislated and need adaptation rather than entirely new laws, with international cooperation being more crucial than new legislation
Constitutional protections of free speech create challenges in regulating misinformation while requiring creative solutions like disclosure requirements
Topics
Legal and regulatory | Human rights
Takeaways
Key takeaways
Truth is becoming less relevant in the digital age as AI-powered content creation enables people to remain in confirmation bias bubbles, making factual debate harder to achieve
Supporting and strengthening independent media organizations is crucial for combating misinformation, requiring strong legislative foundations that ensure editorial independence and adequate funding
Most digital governance challenges can be addressed through existing legislation that needs adaptation rather than entirely new laws, with international cooperation being more important than creating new regulations
A comprehensive approach is needed that balances protecting human rights and freedom of expression while regulating harmful content and allowing innovation to progress
Technology for good initiatives must be prioritized, including investment in academic institutions and civil society to create tools that compete with profit-driven platforms
Global platforms apply different policies to different regions, with developing countries lacking the same protections as more developed jurisdictions
Watermarking and authentication technologies are critical for distinguishing between real and AI-generated content, requiring both legislative mandates and technological development
Electoral integrity depends more on neutral electoral management bodies than on technology itself, though paper trails and audit capabilities remain essential
Digital inclusion requires investment in public infrastructure to ensure rural areas, women, and vulnerable groups can meaningfully participate in digital spaces
Resolutions and action items
Parliamentarians should carry IGF 2025 outcomes back to their respective countries to drive policy coherence at national and regional levels
Continue expanding parliamentary engagement in national and regional IGFs, particularly in West Africa and Asia-Pacific regions
California will continue pushing watermarking requirements for platforms and devices, with legislation requiring embedded authentication technology in cameras
African parliamentarians will continue using regional caucuses (Africa-wide, West Africa, East Africa) to share experiences and develop common approaches
Countries should consider implementing conditionalities on licenses given to big tech companies to protect vulnerable populations
IGF should explore developing codes of conduct for social media platforms, potentially building on IPU resolutions on AI and draft codes of ethics on science and technology
Parliamentarians should look beyond political spaces to consider digital technology impacts across all sectors including health, water, and other areas
Unresolved issues
How to effectively regulate misinformation and disinformation without appearing to over-correct or enabling government abuse of surveillance powers
How to place meaningful responsibilities on big tech developers and advanced economies that export technology to developing countries without adequate protections
Whether voluntary codes of conduct for social media platforms will be sufficient or if mandatory standards and regulations are needed
How to verify human identity versus AI-generated content in digital spaces while protecting privacy and avoiding exclusion of vulnerable populations
How to handle cross-border enforcement when violators are situated in different jurisdictions from those being harmed
Whether and how to restrict social media access for minors while respecting children’s fundamental rights to information and participation
How to balance age verification requirements with privacy protection and inclusion of vulnerable populations
How to address the challenge that policy moves slower than technology development
How to ensure meaningful international cooperation when existing international protective mechanisms for human rights appear to be failing
Suggested compromises
Focus on regulating output and content rather than trying to identify the origin of AI-generated material, as verification of source becomes increasingly difficult
Provide fundamental information through watermarking and verification systems that allow people to make their own decisions rather than forcing judgments about what is real or fake
Use existing criminal codes and legislation to address AI-enabled crimes like scams, rather than creating entirely new legal frameworks
Implement incremental arrangements with tech companies through licensing conditionalities while working toward broader international cooperation
Require disclosure and transparency (such as watermarking AI-generated political content) as an alternative to restricting speech, though this faces constitutional challenges
Invest in both regulation and digital infrastructure to ensure broader participation while protecting against harmful uses
Combine legislative approaches with investment in technology for good and media literacy education
Use keyword technology and authentication to regulate content based on the nature of vulnerable populations in specific spaces rather than blanket restrictions
Thought provoking comments
Truth is becoming less relevant… what you engage with, what you look at, is things, content that is already confirming your held beliefs and are kind of helping you stay in this comfortable bubble that it’s hard and harder to pierce with factual debate and true, well, facts, so to say.
Speaker
Grunde Almeland
Reason
This comment cuts to the philosophical heart of the democratic crisis in the digital age – not just that misinformation exists, but that truth itself is losing its currency as people retreat into confirmation bias bubbles. It reframes the problem from technical to epistemological.
Impact
This observation set the tone for the entire discussion by establishing that the challenge isn’t just about regulating technology, but about fundamental changes in how societies relate to truth. It influenced subsequent speakers to focus on building trust and verification mechanisms rather than just content moderation.
We don’t have a law that specifically addresses misinformation and disinformation, not because the law is somewhere and needs to quickly come… hitting the balance between protection of human rights and regulating and also allowing innovation to unhinged, to progress unhinged, is something that is beyond legislation, is something that sometimes is beyond the politics of the day.
Speaker
Catherine Mumma
Reason
This comment reveals the profound complexity of democratic governance in the digital age – acknowledging that some challenges transcend traditional legislative solutions and require deeper societal consensus-building.
Impact
This shifted the conversation from a focus on what laws to pass toward a more nuanced discussion about the limits of legislation and the need for multi-stakeholder approaches, influencing other panelists to discuss non-legislative solutions like media literacy and international cooperation.
The deepfake was about the prime minister saying something in relation to another major world power… Now that has the potential to completely, especially in this current global political environment, to completely put at risk a lot of what a country is doing with respect to policy and global engagement. So we’re not just talking about domestic trust, but we’re talking about international and a country’s global position in the world.
Speaker
Marsha Caddle
Reason
This comment expanded the scope of the discussion beyond domestic democratic concerns to international relations and diplomacy, showing how digital manipulation can destabilize global governance systems.
Impact
This observation elevated the urgency of the discussion by demonstrating that digital threats to democracy have immediate geopolitical consequences, leading other speakers to emphasize the need for international cooperation and shared standards.
To imagine that countries in places like Africa will at one point be at par with Silicon Valley is a fallacy. To imagine that such advanced economies do not have responsibilities is also wrong… what will happen with this AI is that my people will be condemned to digital plantations, just like they were condemned with sugar cane and with coffee and with all these other things that happened in slave trade.
Speaker
John K.J. Kiarie (audience member from Kenya)
Reason
This powerful intervention reframed the entire discussion through a post-colonial lens, challenging the assumption that all countries are equal participants in digital governance and drawing explicit parallels to historical exploitation.
Impact
This comment fundamentally shifted the conversation’s power dynamics, forcing panelists to confront issues of technological colonialism and global inequality. It led to more substantive discussion about the responsibilities of developed nations and tech companies, with multiple panelists acknowledging the validity of this critique in their closing remarks.
We see sort of technology in the hands of very few players right now that are, for better or worse, profit-driven. And how do we push technology to be the solution in the technology age?… how can we fund that? How can we put more money into our academic institutions to have the compute power to compete with the largest AI companies?
Speaker
Rebecca Bauer-Kahn
Reason
This comment identified a structural problem – the concentration of technological power – and proposed a concrete alternative pathway through public investment in ‘technology for good,’ moving beyond regulatory responses to proactive solutions.
Impact
This shifted the discussion from defensive measures (regulating harmful technology) to offensive strategies (developing beneficial alternatives), inspiring other panelists to discuss investment in local tech ecosystems and capacity building.
Most things are already quite heavily legislated… sometimes it has to be amended, and sometimes we need to come up with new legislation, but most of things are already legislated. We just have to see how technology fits into it… I think international cooperation is often more the answer than, you know, coming up with the exact new legislation.
Speaker
Grunde Almeland
Reason
This comment challenged the prevailing assumption that new technologies require entirely new legal frameworks, suggesting instead that existing laws need better enforcement and international coordination.
Impact
This pragmatic perspective helped ground the discussion in practical governance realities, leading other speakers to focus more on implementation challenges and international cooperation mechanisms rather than drafting new legislation.
Overall assessment
These key comments fundamentally shaped the discussion by progressively expanding its scope and depth. The conversation began with technical concerns about AI and elections but evolved into a sophisticated analysis of global power structures, epistemological challenges to democracy, and the limits of traditional governance approaches. The intervention by the Kenyan MP was particularly transformative, forcing the panel to confront uncomfortable truths about technological inequality and historical patterns of exploitation. This led to a more honest and substantive discussion about the responsibilities of developed nations and the need for truly equitable international cooperation. The comments collectively moved the discussion from a narrow focus on content moderation and election security to broader questions about truth, power, and justice in the digital age, ultimately producing a more nuanced understanding of the challenges facing democratic governance in the 21st century.
Follow-up questions
How can we regulate digital technology violations across different sectors beyond just political spaces (health, water, etc.)?
Speaker
Catherine Mumma
Explanation
She emphasized the need to think broadly about digital technology laws across all sectors, not just focusing on political/democratic spaces, as violations could have profound implications in healthcare, water management, and other critical areas.
What mechanisms can be developed for auditing information at regional and international levels?
Speaker
Catherine Mumma
Explanation
She suggested exploring mechanisms like expanding mandates of existing bodies (Data Protection Commissioner, Media Council) and creating African Union or East African community mechanisms for better monitoring.
How can we invest in technology for good and fund academic institutions to compete with large AI companies?
Speaker
Rebecca Bauer-Kahn
Explanation
She highlighted the need for more funding for academic institutions to have compute power to build large language models and create technology solutions that serve public good rather than just profit.
How can watermarking technology be improved and implemented globally?
Speaker
Rebecca Bauer-Kahn
Explanation
She noted that while California and EU are pushing for watermarking requirements, the technology needs further development to be effective globally in distinguishing real from AI-generated content.
How can international cooperation be improved to tackle cross-border digital crimes and scams?
Speaker
Grunde Almeland
Explanation
He emphasized that most digital crimes fall under existing criminal codes but lack of international cooperation makes enforcement difficult, citing sophisticated international scam operations.
How can global platforms improve their policies for all users regardless of country?
Speaker
Zafar Alizoda
Explanation
He pointed out that platform policies differ by region, with EU citizens protected by GDPR while developing countries lack such priority, creating unequal protection standards.
How can we prevent technological dumping and ensure advanced economies take responsibility for technology impacts in developing countries?
Speaker
John K.J. Kiarie
Explanation
He raised concerns about advanced countries and big tech companies engaging in practices in developing countries that they wouldn’t do in their own jurisdictions, calling for practical IGF actions to address this disparity.
What are practical steps to prevent electronic voting system misuse and maintain electoral integrity?
Speaker
Audience member from Peru
Explanation
The question addressed concerns about data transmission problems and electoral process manipulation in electronic voting systems, seeking legislative solutions to protect democratic processes.
How can we differentiate between human and AI-generated content/actors in digital spaces?
Speaker
Senator Kenneth Pugh from Chile
Explanation
He raised fundamental questions about human identity verification in cyberspace and how to maintain human rights protections when AI systems are given freedom of expression capabilities.
Should social media platforms be required to verify user identity similar to banking systems?
Speaker
Hugo Carneiro from Portugal
Explanation
He questioned whether stronger identity verification requirements for social media accounts could help combat fake profiles and misinformation.
What is the effectiveness of age restrictions for social media access for minors?
Speaker
Hugo Carneiro from Portugal
Explanation
He referenced France’s proposed ban on social media for under-15s and asked whether such restrictions are effective solutions for protecting young people from misinformation.
Are voluntary codes of conduct sufficient for social media platforms or do we need mandatory regulation and alternative platforms?
Speaker
Anna Luhmann from Germany
Explanation
She questioned whether voluntary self-regulation would be adequate or if stronger regulatory measures and democracy-supporting alternative platforms are needed.
Disclaimer: This is not an official session record. DiploAI generates these resources from audiovisual recordings, and they are presented as-is, including potential errors. Due to logistical challenges, such as discrepancies in audio/video or transcripts, names may be misspelled. We strive for accuracy to the best of our ability.
Networking Session #26 Transforming Diplomacy for a Shared Tomorrow
Session at a glance
Summary
This networking session focused on the transformative role of AI and AI-powered tools in modern diplomacy, hosted by the Data Innovation Lab of the German Federal Foreign Office. Sebastian Blum, the program manager, opened by explaining how AI is already reshaping diplomatic practices through enhanced capacity, predictive analytics, and comprehensive data processing capabilities. He emphasized that major foreign ministries are rapidly adopting AI technologies, making investment in AI capacity and data literacy essential for maintaining competitive diplomatic positions.
Claire Patzig, a junior data associate and computer scientist at the lab, presented the technical perspective on developing AI tools for diplomats. She stressed that their approach focuses on enriching rather than replacing human diplomatic work, recognizing diplomacy as a discipline with centuries of tradition involving sensitive government communications. Patzig outlined three current use cases: virtual embassies for public diplomacy in remote or politically challenging areas, negotiation tools for international conferences like COP summits, and capacity-building prototypes to train young diplomats in various scenarios.
The German approach involves establishing data labs across all federal ministries, creating a bottom-up technological development strategy tailored to specific user needs. This structure promotes collaboration while addressing interoperability challenges through peer-to-peer networks and shared projects with organizations like GIZ. The discussion highlighted the importance of cultural sensitivity, international partnerships, and responsible AI implementation in diplomatic contexts.
Questions from the audience addressed technical interoperability concerns and the potential for providing diplomatic training tools to underserved regions, particularly small island states in the Pacific where elected officials often lack formal diplomatic training. The session emphasized that successful AI integration in diplomacy requires continuous communication, respect for existing expertise, and focus on enhancing rather than replacing human diplomatic connections.
Keypoints
**Major Discussion Points:**
– **AI’s transformative role in diplomacy**: The discussion covers how AI tools are already reshaping diplomatic practices through NLP algorithms for processing diplomatic records, predictive analytics for geopolitical forecasting, and the urgent need for foreign ministries to adopt AI capabilities to remain competitive internationally.
– **Germany’s bottom-up approach to AI development**: The speakers explain Germany’s strategy of establishing data labs in every federal ministry to develop AI tools from the ground up, working directly with diplomats rather than imposing technology solutions, ensuring tools meet actual user needs and use cases.
– **Specific AI applications in diplomatic work**: Three concrete use cases are presented – virtual embassies for public diplomacy in remote or politically challenging areas, negotiation support tools for international processes like COP climate negotiations, and training systems for young diplomats to practice argumentation and decision-making.
– **Interoperability and collaboration challenges**: Discussion of technical and organizational challenges in ensuring different government AI systems can communicate effectively, the importance of peer-to-peer networks among data scientists, and the need for clear communication and respect between technical and diplomatic communities.
– **Capacity building for underserved regions**: A participant from the Cook Islands raises concerns about smaller nations lacking diplomatic training resources, highlighting the potential for AI tools to support countries with limited diplomatic education infrastructure.
**Overall Purpose:**
The discussion aims to showcase the German Federal Foreign Office’s Data Innovation Lab work in integrating AI into diplomatic practices, promote dialogue about responsible AI use in diplomacy, and explore partnerships for developing shared tools that can benefit the global diplomatic community.
**Overall Tone:**
The tone is collaborative and educational throughout, with speakers emphasizing partnership over competition. The presenters maintain a humble, service-oriented approach, repeatedly stressing that AI should enrich rather than replace human diplomatic work. The tone becomes more interactive and practical when addressing audience questions, particularly around technical interoperability and capacity building for developing nations, demonstrating genuine interest in inclusive solutions.
Speakers
– **Sebastian Blum**: Program manager at the Data Innovation Lab by the Federal Foreign Office, moderator for the session
– **Claire Patzig**: Junior data associate at the Data Innovation Lab of the German Federal Foreign Office (FFO), computer scientist involved in developing software for diplomats
– **Audience**: Multiple audience members including:
– Tauga: Works with the German International Cooperation Agency (GIZ)
– Maureen Hilliard: From the Cook Islands, one of the small island states in the Pacific
Additional speakers:
None identified beyond those in the speakers names list.
Full session report
# Report: AI-Powered Tools in Modern Diplomacy – A Networking Session by the German Federal Foreign Office Data Innovation Lab
## Executive Summary
This networking session, hosted by the Data Innovation Lab of the German Federal Foreign Office, explored the role of artificial intelligence in contemporary diplomatic practices. The discussion was moderated by Sebastian Blum (program manager) and featured Claire Patzig (Junior Data Associate and Computer Scientist). Originally planned as a panel with multiple government representatives, the session was restructured due to cancellations. The speakers examined how AI technologies are being integrated into diplomatic work while emphasizing human-centered development approaches and Germany’s bottom-up strategy for AI implementation across federal ministries.
## Opening Context and AI’s Role in Diplomacy
Sebastian Blum opened the session by establishing the context for AI adoption in diplomatic institutions. He outlined how AI tools are reshaping diplomatic practices through enhanced capacity via natural language processing, predictive analytics using historical and real-time data, and improved data processing capabilities for reporting and analysis.
Blum emphasized that major foreign ministries worldwide are adopting AI technologies, creating an environment where investment in AI capacity and data literacy has become important for maintaining effective diplomatic positions. He noted the competitive aspect of this technological adoption in the diplomatic landscape.
## Germany’s Human-Centered Approach
Claire Patzig provided insight into Germany’s approach to AI development in diplomatic contexts, emphasizing that their methodology focuses on enriching rather than replacing human diplomatic work. She stated: “It’s not about technology being arrogant and saying we can in any way replace human beings but instead it’s about enriching what we are currently already doing.”
Patzig acknowledged diplomacy as a discipline with centuries of accumulated expertise and stressed that technology development must understand the people it serves—diplomats who navigate complex international relationships through sophisticated communication strategies.
## Bottom-Up Development Strategy
The German approach involves establishing data labs across federal ministries, creating what Patzig described as a bottom-up technological development strategy. This decentralized structure ensures that AI tools are developed based on specific user needs and actual use cases rather than imposing top-down solutions.
Patzig mentioned that this approach involves collaboration between ministries and organizations such as the German International Cooperation Agency (GIZ), though she noted challenges in ensuring interoperability between different systems.
## Three Key Applications
Patzig outlined three applications currently being developed:
### Virtual Embassies
Digital platforms that provide services to citizens abroad and people wanting to learn about Germany in areas where physical embassies cannot operate due to various constraints. Patzig mentioned a “make-a-thon” involving students that focused on developing virtual embassy concepts.
### Negotiation Support Tools
AI systems designed to support diplomats during complex international negotiations, particularly in multi-stakeholder processes. Patzig noted they are “currently working together with other huge federal ministries and GIZ on a negotiation tool” for upcoming proceedings.
### Training Systems
AI-powered platforms to help diplomats develop skills through simulated scenarios and practice opportunities that would be difficult to replicate through traditional training methods.
## Technical and Implementation Challenges
The discussion addressed several challenges in implementing AI solutions within diplomatic contexts:
### Communication and Data Literacy
Patzig emphasized the importance of clear communication when introducing technical solutions to established diplomatic processes, noting the need to respect existing expertise while introducing technological enhancements.
### Evaluation Complexity
Unlike fields with clearly defined success metrics, Patzig noted that diplomatic AI applications face unique evaluation challenges: “If we are talking about diplomacy, we don’t have those clearly defined goals, right? It’s a very broad approach and here we really have to see how it actually can enhance diplomacy instead of big noise that just takes away from the human connection that is at the core of it.”
She contrasted this with medical applications like cancer detection, where accuracy can be clearly measured.
## International Collaboration and Capacity Building
Patzig discussed Germany’s commitment to international partnerships and mentioned plans for “open sourcing them and sort of levelling the playing field.” She emphasized the importance of avoiding complete dependency on single solutions and supporting shared values through international cooperation.
A significant moment came when Maureen Hilliard, representing the Cook Islands, raised concerns about capacity building needs in small island states. She explained that in Pacific island nations, government officials are often elected from small communities without formal diplomatic training: “People in the Pacific, the government is actually elected by the small communities, people that, you know, someone within the community, no experience whatsoever with government sort of processes… But there is no training.” Her question was cut off mid-sentence in the transcript.
## Responsible AI Implementation
Throughout the discussion, speakers emphasized principles for responsible AI implementation in diplomatic contexts, including cultural sensitivity and preservation of human connections in diplomacy. Patzig noted that diplomatic AI applications must respect diverse traditions and communication styles while enhancing rather than replacing human diplomatic capabilities.
She also mentioned considerations around sovereignty and European independence in AI development, though this topic was not extensively elaborated.
## Current AI Usage
Patzig mentioned that they have been “using large language models probably since [unclear – possibly ChatGPT] for about nine years now,” indicating ongoing experience with AI technologies in their work.
## Session Limitations and Conclusion
*Note: The transcript appears to have quality issues in several places, with repetitive sections and unclear audio. The session ended with an incomplete question from the audience member representing the Cook Islands.*
This networking session provided insight into Germany’s thoughtful approach to integrating AI technologies into diplomatic practice. The German Federal Foreign Office’s Data Innovation Lab presented a model that emphasizes enhancement rather than replacement of human capabilities, with focus on bottom-up development and international collaboration. However, several challenges remain unresolved, particularly regarding capacity building for underserved regions, interoperability between systems, and developing appropriate evaluation frameworks for diplomatic AI applications.
The discussion highlighted both the potential benefits of AI in diplomacy and the complexity of implementing such technologies in a field that relies fundamentally on human relationships and cultural understanding.
Session transcript
Sebastian Blum: So, welcome everyone to our networking session on transforming diplomacy for a Shared Tomorrow. My name is Sebastian Blum. I’m a program manager at the Data Innovation Lab by the Federal Foreign Office and I’m more than excited to be your moderator for today’s session. As said before, this session is brought to you by the Data Innovation Lab and today we’re discussing a crucial topic which is the transformative force of AI and AI-powered tools in diplomacy. In this session we wanted to showcase some of the potential that AI and AI-powered tools bring in foreign policy and we initially wanted to offer a platform for open dialogue among different government representatives from different regions. As you can see, there have been some minor changes in the setting of the panel because we had some cancellations, so we were actually thinking to focus a little bit more on the work of the Data Innovation Lab in the following 25 minutes. Having said that, let me briefly set the stage a bit about how AI is actually already reshaping the practice of diplomacy and talk a bit about the changes in our work that we’re already witnessing at the Data Innovation Lab itself. So firstly, we can definitely see that AI tools amplify diplomatic capacity. For example, NLP algorithms can extract key insights from transcripts, meeting minutes and other diplomatic records, facilitating the creation of comprehensive and timely reports. Also, the huge amounts of data available mean AI holds a huge capacity for predictive analytics, for example by leveraging historical and real-time data to forecast geopolitical trends and potential outcomes for policy decisions. And lastly, we’re also already witnessing major foreign ministries racing ahead with AI adoption, and this is definitely highlighting that the integration of AI for sure is urgent and vital, and therefore investing in AI capacity and, more importantly, data literacy is not optional but necessary.
Diplomats may otherwise risk losing ground in international forums. Having clarified this and with this context in mind, I would like to turn to our speaker and my dear colleague, Claire Patzig, who is a junior data associate at the Data Innovation Lab of the German Federal Foreign Office, and I'm really honored to have this exciting panel with you today. Before I hand over, a small procedural note: we're having a short introduction by Claire with some inputs, and afterwards, given the circumstances under which we're having this panel, we want to open the floor to you to share your experiences and leave some space for your questions. So Claire, the floor is yours. You might share some of the challenges and opportunities in integrating AI into diplomatic practices.
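The insight-extraction capability Sebastian describes can be illustrated with a toy keyword extractor over a handful of records. This is a minimal sketch, not the Lab's actual tooling: the corpus, record names, and the simple TF-IDF scoring are all invented for illustration.

```python
import math
from collections import Counter

# Toy stand-ins for diplomatic records (all text invented for illustration).
records = {
    "meeting_minutes": "delegation proposes ceasefire monitoring and humanitarian corridors",
    "cable": "ministry reviews sanctions policy and humanitarian exemptions",
    "transcript": "negotiators discuss ceasefire terms and verification mechanisms",
}

def tokenize(text):
    return text.lower().split()

# Document frequency: in how many records each term appears.
df = Counter()
for text in records.values():
    df.update(set(tokenize(text)))

def key_terms(name, top_n=3):
    """Rank the terms of one record by TF-IDF against the toy corpus,
    so corpus-wide filler (e.g. 'and') scores zero and drops out."""
    tf = Counter(tokenize(records[name]))
    n_docs = len(records)
    scores = {t: c * math.log(n_docs / df[t]) for t, c in tf.items()}
    return [t for t, _ in sorted(scores.items(), key=lambda kv: -kv[1])[:top_n]]

print(key_terms("meeting_minutes"))
```

A production system would of course use far richer NLP than word counts, but the shape is the same: surface what is distinctive in each record so a timely report can be drafted from it.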
Claire Patzig: Thank you so much for the kind introduction. I'm going to speak about this topic more from the view of a computer scientist, and of someone who is actually developing the software our diplomats are supposed to use, and therefore I want to highlight one thing that is similar for every software development process: it's about the people you are developing for, and in this case we are talking about diplomats. The practice of diplomacy is, at the end of the day, something every single person engages in: we are negotiating every single day. But here we are talking about a discipline with lots of tradition, and about something where governments are dealing with the most sensitive issues currently happening worldwide. It's about core values of people, and it's about communication. We are talking about something that people have hundreds of years of experience with, and, at a large scale, we have been using large language models, probably since GPT, for about nine years now. So it's not about technology being arrogant and saying we can in any way replace human beings; instead it's about enriching what we are already doing, and I think this approach is really shown in the take that the German government has taken. In this approach, every German federal ministry has a data lab, so it works from the ground up: technology is really developed for the people who are actually using it and for their use cases. The German Federal Foreign Office also has a data lab, and the Data Innovation Lab serves to communicate those findings to the outside world and connect with other partners, which is why we are organizing this networking session, as well as to develop prototypes itself. And therefore I want to bring you three current use cases we are working on, to really showcase how rich diplomacy is.
So we just had a makeathon last week with students on virtual embassies. Here we are talking about public diplomacy, so it's not about the practice of diplomacy itself, but rather about our citizens abroad, as well as people in other countries who want to learn about Germany or maybe come to Germany. The use case we see is that we might have an embassy in the capital, but there are remote areas where people might have issues traveling to the capital to reach an embassy. So we are really thinking about a system of virtual embassies, where we might be able to offer our services even in countries where, due to the current political environment or natural catastrophes, we might not be able to open an embassy at all, and still provide those services to those people. And then, if you look at the practice of diplomacy, negotiations are really at the heart of it. So we had a huge challenge last year on developing tools that really support diplomats, again focusing on working with diplomats instead of just working for them, and currently we are working together with other large federal ministries as well as the GIZ on developing a negotiation tool for the upcoming COP. Here it's also about providing a service not only for Germany but open-sourcing it and, in a way, leveling the playing field. Most of the people in this room might be following the WSIS process, and just following this one process and keeping an eye on everything that is going on, or what the other institutions publish on it, just what the UN puts out in reports, is a lot. If you look at governments, they are obviously forced to follow everything that is going on, and even for a somewhat large country like Germany, this is a lot. Here we really see a use case for AI in combining all of that knowledge and making it easier for diplomats to stay on top of everything.
And last but not least, it's also about capacity building. Obviously our academy knows what it is doing, but supporting young diplomats in their journey is something where you really need individual support. We are currently developing, and I'm doing that myself, a prototype to train young diplomats by putting them into many different situations so they are able to come up fast with arguments. We really think this is again a use case that is interesting to every ministry of foreign affairs globally, and where we again want to stress working together and coming up with tools that support each other. So here it's really about having you as an audience also reach out to us, share what you are working on, and have some interactive dialogue.
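The scenario-drill idea Claire sketches could look something like the following in its simplest form. This is purely an illustrative sketch: the scenario bank is invented, and the actual prototype would score a trainee's replies (for example with an LLM), which is deliberately left out here.

```python
import random

# Invented scenario bank; the real prototype's content is not public.
SCENARIOS = [
    ("climate finance", "Your counterpart rejects loss-and-damage funding."),
    ("trade", "Your counterpart threatens retaliatory tariffs."),
    ("peacekeeping", "Your counterpart questions the mission mandate."),
]

def run_drill(seed: int = 0, rounds: int = 2) -> list[str]:
    """Deal out negotiation prompts for a trainee. A real tool would then
    evaluate the trainee's argument; here we only deal the prompts."""
    rng = random.Random(seed)  # seeded so a drill can be replayed
    return [f"[{topic}] {prompt}" for topic, prompt in
            (rng.choice(SCENARIOS) for _ in range(rounds))]

for line in run_drill():
    print(line)
```

The point of such a drill is individualized repetition: the trainee is confronted with varied situations and has to produce an argument quickly, which is exactly the skill Claire describes wanting to build.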
Sebastian Blum: Well, thank you so much. I think you really drew out the point that the Data Innovation Lab itself actually resulted from the Data and AI Lab of the Federal Foreign Office. Coming back to that, you were also referring to different partnerships and initiatives. How do initiatives such as the Data Innovation Lab help to promote the responsible use of AI in diplomacy?
Claire Patzig: I think the responsible use here is that we are not talking about something like, for example, cancer detection, where we can benchmark a model and say we have an accuracy of, I don't know, 98%. If we are talking about diplomacy, we don't have those clearly defined goals; it's a very broad field, and here we really have to see how AI can actually enhance diplomacy instead of becoming noise that just takes away from the human connection that is at the core of it. And there again, it's about cultural sensitivity: German diplomats might have a different focus than diplomats in other parts of the world. And again, we learn from shared partnerships in this area, because it's about the practice, not really about the content of whatever negotiation, whether it's climate or peace or something else. It's not just about AI, it's about the discipline and learning from each other again.
Sebastian Blum: Thank you so much. Having said that, I would really like to open the floor, including to our online participants. Feel free to ask any questions about the Data Innovation Lab, to share your own experiences with AI in diplomacy, or to introduce yourself; we are also checking whether there are questions from our online audience. In May, we had a meeting on spaces for AI, and we hosted a data talk. So we are not only focused on developing tools but also on the softer aspects of this work: we run workshops for embassies and partners worldwide, we are present at conferences, and we invite speakers to our data talks. We also try to include people from different areas.
Claire Patzig: So we try to make sure that the data talks aren't connected only to certain countries, and we are able to host them ourselves, and we try to take that into account in our own work. And then, again, it's about partnerships. We are currently hearing a lot about sovereignty and about Europe being sovereign itself, and I think that's a very important discussion, because the Internet doesn't exist on its own within just one nation. We are at the Internet Governance Forum; we are talking about deep interconnections. We have to take that into account and be realistic about what we are developing, and in some cases, if it's not about something very sensitive, we can still work with other partners and don't have to do everything on our own. That matters because many countries and many ministries of foreign affairs might still not have the capacity to develop tools themselves. And there again, I think it comes back to the same point again and again: partnerships, with countries that support the same values developing together. Obviously that creates a dependency, but it is also an enrichment to what we are building.
Sebastian Blum: Thank you very much. Please, the next question.
Audience: Hi, my name is Tauga, I work with GIZ, the German international cooperation agency. Previously, you told us that basically every German ministry has its own data lab. We're a government-owned company in Germany, we also have our data lab, and we also work on solutions like this. So what I'm interested in is how you look at the issue of interoperability in such a bottom-up approach, from the technical perspective. How do you ensure that all these solutions being developed are able to communicate with each other, and to link up even to the rest of the world? Thank you.
Claire Patzig: Yes, so there we come back again to PLAIN; the point of that platform was really to make it easier for the data labs to share data within government and to work together. There is a peer-to-peer network established so that our data scientists are able to communicate and discuss what they are currently focusing on. Then again, whenever there is a new ministry within the government, this will always be an issue that you constantly have to work on, and I think there's a lot you can do in that regard. For example, together with GIZ and other ministries, we are currently working on one shared project, and that is simply about discussing clear responsibilities and having everyone on board. With a shared goal in mind, it's obviously very much possible to work together, because we have one single focus. And here, especially as the Ministry of Foreign Affairs, because we are not solely focused on one issue but cover many, many topics, we are used to working together with other ministries and the experts working on those topics. So it's about bringing that into the digital sphere and instilling it in our data scientists as a mindset. That is, I think, very important to us as a government.
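One lightweight way to picture the interoperability contract Claire describes is a shared, jointly validated exchange format: if every lab runs the same check before exchanging data, the solutions can talk to each other regardless of who built them. The field names and allowed values below are hypothetical, not PLAIN's actual schema.

```python
from dataclasses import dataclass, asdict
import json

# Hypothetical classification levels two labs might agree on.
ALLOWED_CLASSIFICATIONS = {"public", "internal"}

# Hypothetical minimal record format for inter-lab exchange.
@dataclass
class LabRecord:
    source_lab: str
    topic: str
    classification: str  # must be one of ALLOWED_CLASSIFICATIONS
    payload: dict

def validate(record: LabRecord) -> bool:
    """The shared validator, not any single system, is the contract:
    both sides run the identical check before sending or accepting data."""
    return bool(record.source_lab) and record.classification in ALLOWED_CLASSIFICATIONS

rec = LabRecord("ffo-data-lab", "COP negotiation prep", "internal", {"note": "draft"})
print(validate(rec), json.dumps(asdict(rec)))
```

In practice such contracts are usually published as formal schemas (JSON Schema, OpenAPI and the like), but the principle is the same: agree on the format once, then each lab can build its tools independently.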
Sebastian Blum: I think this was particularly interesting because we were talking more about the technological aspects. But what are the organizational challenges in our work that you're witnessing, coming from a very specific position inside the organization of the Federal Foreign Office?
Claire Patzig: I think it's always about the same thing: communication, and, as perhaps the more technical community, clearly communicating respect for the work that is already happening. It's not about replacing anything that's going on, but about enriching it. At the same time, people have lots of experience and they know what they are doing, and yet they have to adapt to new processes. Something everyone struggles with in all areas, for example, is how you collect data in a format that works for the technical community but that might not be relevant for the people currently working as experts on those topics. And there it's about capacity building and data literacy, and about coming back to the basics again and again.
Sebastian Blum: Thank you so much. I see we have another question on the floor. You can just talk into the microphone. Hi. Is it on? Yes, it is on. We can hear you clearly.
Audience: My name is Maureen Hilliard. I'm actually from an underserved region: the Cook Islands, one of the small island states in the Pacific. One of the things that I've found in the 20 years that I've lived on that little island is that we would be really appreciative of tools for developing new diplomats. In the Pacific, the government is elected by the small communities: someone within the community, with no experience whatsoever of government processes and things like that, seems a nice person, so they get elected to government. But there is no training; I do know there's no training of the whole diplomacy-school sort, the sort of schools that many of these people get sent overseas from to work with other ministries in their area. And I was just wondering, does your hub provide training facilities for underserved regions where government officials really do require them? Those schools would be really, really important. I mean, I sit and listen, and when I think of how wonderful some of the governments are and the work that they're doing, I know that our guys go
Sebastian Blum
Speech speed
167 words per minute
Speech length
955 words
Speech time
341 seconds
AI tools amplify diplomatic capacity through NLP algorithms that extract insights from diplomatic records and facilitate comprehensive reporting
Explanation
Sebastian argues that natural language processing algorithms can analyze diplomatic documents like transcripts and meeting minutes to extract key insights, making it easier to create thorough and timely reports. This represents a significant enhancement to traditional diplomatic documentation processes.
Evidence
NLP algorithms can extract key insights from transcripts, meeting minutes and other diplomatic records, facilitating the creation of comprehensive and timely reports
Major discussion point
AI’s Transformative Role in Diplomacy
Topics
Development | Infrastructure
Agreed with
– Claire Patzig
Agreed on
AI should enhance rather than replace human diplomatic capabilities
AI enables predictive analytics by leveraging historical and real-time data to forecast geopolitical trends and policy outcomes
Explanation
Sebastian contends that AI’s ability to process vast amounts of historical and current data provides diplomats with predictive capabilities for understanding future geopolitical developments. This analytical capacity can inform better policy decision-making by anticipating potential outcomes.
Evidence
AI holds a huge capacity for predictive analytics, for example by leveraging historical and real-time data to forecast geopolitical trends and potential outcomes for policy decisions
Major discussion point
AI’s Transformative Role in Diplomacy
Topics
Development | Infrastructure
Agreed with
– Claire Patzig
Agreed on
Importance of partnerships and collaboration in AI development for diplomacy
Major foreign ministries are racing to adopt AI, making investment in AI capacity and data literacy necessary to avoid losing ground internationally
Explanation
Sebastian emphasizes the competitive nature of AI adoption in diplomacy, arguing that countries must invest in AI capabilities and data literacy to remain relevant on the international stage. He suggests that failing to adopt AI could result in diplomatic disadvantage.
Evidence
we’re also already witnessing major foreign ministries racing ahead with AI adoption and this is definitely highlighting that the integration of AI for sure is urgent and vital and therefore investing in AI capacity and more importantly data literacy is not optional but necessary
Major discussion point
AI’s Transformative Role in Diplomacy
Topics
Development | Capacity development
Agreed with
– Claire Patzig
Agreed on
Urgent need for AI adoption and capacity building in diplomatic institutions
Claire Patzig
Speech speed
170 words per minute
Speech length
1600 words
Speech time
561 seconds
AI should enrich existing diplomatic practices rather than replace human beings, focusing on supporting people with hundreds of years of diplomatic experience
Explanation
Claire emphasizes that AI technology should complement and enhance human diplomatic capabilities rather than substitute for human judgment and experience. She stresses the importance of respecting the long tradition and expertise that exists in diplomatic practice.
Evidence
It’s not about technology being arrogant and saying we can in any way replace human beings but instead it’s about enriching what we are currently already doing
Major discussion point
Human-Centered Approach to AI Development in Diplomacy
Topics
Development | Capacity development
Agreed with
– Sebastian Blum
Agreed on
AI should enhance rather than replace human diplomatic capabilities
Technology development must be grounded in understanding the people it serves – diplomats who deal with sensitive issues and core values through communication
Explanation
Claire argues that successful AI development in diplomacy requires deep understanding of the end users and their work context. She emphasizes that diplomats handle the most sensitive global issues and core human values, requiring technology solutions that respect this responsibility.
Evidence
one thing is similar for every software development process. It’s about the people you are developing for and in this case we are talking about diplomats… governments are dealing with the most sensitive issues currently happening worldwide
Major discussion point
Human-Centered Approach to AI Development in Diplomacy
Topics
Development | Capacity development
The German approach involves every federal ministry having a data lab to develop technology from the ground up for actual users and their specific use cases
Explanation
Claire describes Germany’s decentralized approach to AI development where each ministry operates its own data lab. This bottom-up strategy ensures that technology solutions are tailored to the specific needs and use cases of the people who will actually use them.
Evidence
every German federal ministry has a data lab so it’s from the ground up so technology is really developed for the people who are actually using that and their use cases
Major discussion point
Human-Centered Approach to AI Development in Diplomacy
Topics
Development | Infrastructure
Virtual embassies can provide diplomatic services in remote areas or countries where physical embassies cannot operate due to political or natural circumstances
Explanation
Claire presents virtual embassies as a practical AI application that extends diplomatic reach beyond traditional physical limitations. This technology can serve citizens and interested parties in areas where establishing physical diplomatic presence is impossible or impractical.
Evidence
we might have an embassy in the capital but there are remote areas where people might have issues traveling to the capital for an embassy… we might be able to offer our services in countries where we might not even have due to the current political environment or natural catastrophes
Major discussion point
Practical AI Applications in Diplomatic Work
Topics
Development | Digital access
AI tools can support diplomatic negotiations by helping diplomats stay current with multiple international processes and institutional reports
Explanation
Claire identifies information management as a key challenge for diplomats who must track numerous international processes and reports. AI can help synthesize and organize this vast amount of information, making it more manageable for diplomatic professionals.
Evidence
just following this one process and keeping an eye on everything that is going on, or what the other institutions publish on it, just what the UN puts out in reports, is a lot… here we really see a use case for AI in combining all of that knowledge
Major discussion point
Practical AI Applications in Diplomatic Work
Topics
Development | Infrastructure
AI can assist in capacity building by training young diplomats through simulated scenarios to develop quick argumentation skills
Explanation
Claire describes an AI application for diplomatic training that provides individualized support to new diplomats. The system creates various scenarios to help trainees practice developing arguments quickly, addressing the need for personalized diplomatic education.
Evidence
we are currently developing, I'm doing that myself, a prototype really to train young diplomats, to put them into many different situations, being able to come up fast with arguments
Major discussion point
Practical AI Applications in Diplomatic Work
Topics
Development | Capacity development | Online education
Responsible AI use in diplomacy requires focusing on enhancing human connection rather than replacing it, considering cultural sensitivity across different regions
Explanation
Claire argues that responsible AI implementation must preserve the human elements that are central to diplomatic practice while being sensitive to cultural differences. She emphasizes that diplomacy lacks clearly defined success metrics like other fields, requiring a more nuanced approach.
Evidence
we don’t have those clearly defined goals, right? It’s a very broad approach and here we really have to see how it actually can enhance diplomacy instead of big noise that just takes away from the human connection that is at the core of it
Major discussion point
Responsible AI Implementation and Partnerships
Topics
Human rights | Cultural diversity
International partnerships are essential for developing tools that support shared values while avoiding complete dependency on single solutions
Explanation
Claire advocates for collaborative international development of AI tools among countries that share similar values. She acknowledges the reality of interdependence while emphasizing the importance of partnerships to avoid over-reliance on any single solution or provider.
Evidence
countries that support the same values and developing together. And obviously, we have a dependency, but that’s something that’s important, that is enrichment to what we are building
Major discussion point
Responsible AI Implementation and Partnerships
Topics
Development | Infrastructure
Agreed with
– Sebastian Blum
Agreed on
Importance of partnerships and collaboration in AI development for diplomacy
Interoperability challenges in bottom-up approaches require peer-to-peer networks among data scientists and clear communication of responsibilities in shared projects
Explanation
Claire addresses the technical challenges of ensuring different AI systems can work together in a decentralized development approach. She describes solutions including networks for data scientists to communicate and clear project management for collaborative efforts.
Evidence
there we are coming back again to PLAIN, this was really the point to make it easier for also data labs to share data within government and to work together. We see that there is a peer to peer network established
Major discussion point
Responsible AI Implementation and Partnerships
Topics
Infrastructure | Digital standards
Communication and respect for existing expertise are crucial when introducing technical solutions to diplomatic processes
Explanation
Claire identifies organizational communication as a key challenge, emphasizing the need for technical teams to respect existing diplomatic expertise. She stresses that AI should enhance rather than replace current practices and that successful implementation requires adapting to established processes.
Evidence
it’s about always the same thing. It’s about communication. And clearly communicating as maybe the more technical community, respect for the work that is already happening
Major discussion point
Organizational and Capacity Building Challenges
Topics
Development | Capacity development
Data collection and formatting present ongoing challenges requiring capacity building and data literacy training
Explanation
Claire highlights the practical difficulties of collecting and formatting data in ways that work for both technical systems and domain experts. She identifies this as a universal challenge requiring ongoing education and skill development across organizations.
Evidence
something everyone struggles with in all areas, for example, is how you sort of collect data to have that in a format that works for the technical community, but that might not be relevant for the people who are currently working as experts on those topics
Major discussion point
Organizational and Capacity Building Challenges
Topics
Development | Capacity development | Data governance
Agreed with
– Sebastian Blum
Agreed on
Urgent need for AI adoption and capacity building in diplomatic institutions
Audience
Speech speed
157 words per minute
Speech length
343 words
Speech time
130 seconds
There is significant need for diplomatic training tools in underserved regions where government officials lack formal diplomatic education
Explanation
An audience member from the Cook Islands describes how small island states often elect community members to government positions without any diplomatic training or experience. This highlights a significant gap in diplomatic capacity building for underserved regions that could benefit from AI-powered training tools.
Evidence
people in the Pacific, the government is actually elected by the small communities, people that, you know, someone within the community, no experience whatsoever with government sort of processes and things like that… But there is no training
Major discussion point
Organizational and Capacity Building Challenges
Topics
Development | Capacity development | Digital access
Agreements
Agreement points
AI should enhance rather than replace human diplomatic capabilities
Speakers
– Sebastian Blum
– Claire Patzig
Arguments
AI tools amplify diplomatic capacity through NLP algorithms that extract insights from diplomatic records and facilitate comprehensive reporting
AI should enrich existing diplomatic practices rather than replace human beings, focusing on supporting people with hundreds of years of diplomatic experience
Summary
Both speakers agree that AI’s role in diplomacy should be to augment and support human diplomatic work rather than substitute for human judgment and expertise. They emphasize that AI tools should amplify existing capabilities while respecting the traditional knowledge and experience of diplomatic professionals.
Topics
Development | Capacity development
Urgent need for AI adoption and capacity building in diplomatic institutions
Speakers
– Sebastian Blum
– Claire Patzig
Arguments
Major foreign ministries are racing to adopt AI, making investment in AI capacity and data literacy necessary to avoid losing ground internationally
Data collection and formatting present ongoing challenges requiring capacity building and data literacy training
Summary
Both speakers recognize the critical importance of building AI capabilities and data literacy within diplomatic organizations. They agree that this is not optional but necessary for maintaining competitive diplomatic effectiveness.
Topics
Development | Capacity development
Importance of partnerships and collaboration in AI development for diplomacy
Speakers
– Sebastian Blum
– Claire Patzig
Arguments
AI enables predictive analytics by leveraging historical and real-time data to forecast geopolitical trends and policy outcomes
International partnerships are essential for developing tools that support shared values while avoiding complete dependency on single solutions
Summary
Both speakers emphasize the collaborative nature of diplomatic AI development, recognizing that partnerships are essential for creating effective tools while maintaining sovereignty and shared values.
Topics
Development | Infrastructure
Similar viewpoints
Both speakers share a user-centered approach to AI development in diplomacy, emphasizing that technology must be designed with deep understanding of diplomatic professionals’ needs and the sensitive nature of their work.
Speakers
– Sebastian Blum
– Claire Patzig
Arguments
AI tools amplify diplomatic capacity through NLP algorithms that extract insights from diplomatic records and facilitate comprehensive reporting
Technology development must be grounded in understanding the people it serves – diplomats who deal with sensitive issues and core values through communication
Topics
Development | Capacity development
Both recognize the critical need for diplomatic training and capacity building, particularly for new or inexperienced government officials, and see AI as a potential solution to address training gaps.
Speakers
– Claire Patzig
– Audience
Arguments
AI can assist in capacity building by training young diplomats through simulated scenarios to develop quick argumentation skills
There is significant need for diplomatic training tools in underserved regions where government officials lack formal diplomatic education
Topics
Development | Capacity development | Digital access
Unexpected consensus
Bottom-up approach to AI development in government
Speakers
– Sebastian Blum
– Claire Patzig
Arguments
Major foreign ministries are racing to adopt AI, making investment in AI capacity and data literacy necessary to avoid losing ground internationally
The German approach involves every federal ministry having a data lab to develop technology from the ground up for actual users and their specific use cases
Explanation
Despite the competitive pressure for rapid AI adoption that Sebastian mentions, both speakers surprisingly agree on a methodical, decentralized approach where each ministry develops its own solutions. This consensus on taking time for proper, user-centered development despite competitive pressures is unexpected.
Topics
Development | Infrastructure
Cultural sensitivity and respect for traditional diplomatic practices
Speakers
– Sebastian Blum
– Claire Patzig
Arguments
AI enables predictive analytics by leveraging historical and real-time data to forecast geopolitical trends and policy outcomes
Responsible AI use in diplomacy requires focusing on enhancing human connection rather than replacing it, considering cultural sensitivity across different regions
Explanation
While Sebastian focuses on the technical capabilities and competitive advantages of AI, there’s unexpected consensus with Claire’s emphasis on cultural sensitivity and preserving human connections. This shows alignment between technical advancement and humanistic values.
Topics
Human rights | Cultural diversity | Development
Overall assessment
Summary
The speakers demonstrate strong consensus on fundamental principles: AI should augment rather than replace human diplomatic capabilities, capacity building is essential, partnerships are crucial for development, and user-centered design must guide implementation. There’s also agreement on practical applications like training tools and the need for cultural sensitivity.
Consensus level
High level of consensus with significant implications for diplomatic AI development. The agreement suggests a mature, thoughtful approach to AI integration that balances technological advancement with respect for diplomatic tradition and human expertise. This consensus could facilitate more coordinated international efforts in developing responsible AI tools for diplomacy.
Differences
Different viewpoints
Unexpected differences
Overall assessment
Summary
The discussion shows minimal disagreement, with speakers generally aligned on AI’s role in diplomacy. The main difference lies in emphasis rather than fundamental disagreement – Sebastian focuses more on AI’s transformative potential and competitive necessity, while Claire emphasizes human-centered development and respect for existing diplomatic expertise.
Disagreement level
Very low level of disagreement. This was a collaborative presentation rather than a debate, with speakers complementing each other’s perspectives. The slight difference in emphasis (technological capabilities vs. human-centered approach) actually strengthens their overall argument for responsible AI implementation in diplomacy. The audience question about training needs for underserved regions was supportive rather than challenging, indicating broad consensus on the topic’s importance.
Partial agreements
Similar viewpoints
Both speakers share a user-centered approach to AI development in diplomacy, emphasizing that technology must be designed with deep understanding of diplomatic professionals’ needs and the sensitive nature of their work.
Speakers
– Sebastian Blum
– Claire Patzig
Arguments
AI tools amplify diplomatic capacity through NLP algorithms that extract insights from diplomatic records and facilitate comprehensive reporting
Technology development must be grounded in understanding the people it serves – diplomats who deal with sensitive issues and core values through communication
Topics
Development | Capacity development
Takeaways
Key takeaways
AI should augment rather than replace human diplomatic expertise, focusing on enriching existing practices while respecting the traditional discipline of diplomacy
Germany’s bottom-up approach with data labs in every federal ministry enables technology development tailored to specific user needs and use cases
Three key AI applications in diplomacy are emerging: virtual embassies for underserved regions, negotiation support tools for international processes like COP, and capacity building through simulated training scenarios
International partnerships and open-source development are essential for creating equitable AI tools that serve countries with varying technological capacities
Responsible AI implementation requires cultural sensitivity, clear communication between technical and diplomatic communities, and focus on enhancing human connection
Data literacy and capacity building are fundamental organizational challenges that must be addressed alongside technical development
Resolutions and action items
Continue developing the negotiation tool for upcoming COP in collaboration with other federal ministries and GIZ
Open-source AI tools to level the playing field for countries with limited technological resources
Maintain peer-to-peer networks among data scientists across German federal ministries to facilitate communication and collaboration
Provide workshops and training to embassies and international partners on AI integration
Encourage audience members to reach out and share their own AI development work for collaborative dialogue
Unresolved issues
How to effectively address the training needs of underserved regions like small island states where government officials lack formal diplomatic education
Technical interoperability challenges between different data labs and AI systems across ministries and international partners
Specific methods for collecting and formatting data that works for both technical communities and diplomatic experts
Balancing national sovereignty concerns with the need for international collaboration in AI development
Defining clear benchmarks for success in diplomatic AI applications, unlike more measurable fields like medical diagnosis
Suggested compromises
Work with partners on non-sensitive applications while maintaining sovereignty over critical diplomatic tools
Accept some level of dependency on international partnerships as enrichment rather than weakness, particularly for countries lacking development capacity
Adapt existing diplomatic processes to accommodate new data collection requirements while respecting traditional expertise
Focus on shared values and common practices in diplomacy rather than content-specific applications to enable broader international collaboration
Thought provoking comments
It’s not about technology being arrogant and saying we can in any way replace human beings but instead it’s about enriching what we are currently already doing… We are talking about something that people have hundreds of years of experience with and I mean on a large scale we are using large language models probably since JCPT for about nine years now.
Speaker
Claire Patzig
Reason
This comment is insightful because it reframes the entire AI-diplomacy discussion by emphasizing humility and augmentation rather than replacement. It acknowledges the deep historical tradition of diplomacy while contextualizing AI as a relatively new tool that should complement rather than compete with human expertise.
Impact
This comment established the philosophical foundation for the entire discussion, shifting the conversation away from potential fears about AI replacing diplomats to a more collaborative framework. It set the tone for all subsequent discussions about practical applications and partnerships.
If we are talking about diplomacy, we don’t have those clearly defined goals, right? It’s a very broad approach and here we really have to see how it actually can enhance diplomacy instead of big noise that just takes away from the human connection that is at the core of it.
Speaker
Claire Patzig
Reason
This observation is particularly thought-provoking because it highlights a fundamental challenge in applying AI to diplomacy – the lack of clear, measurable outcomes unlike in fields like medical diagnosis. It emphasizes that diplomatic success cannot be easily quantified and that human connection remains central.
Impact
This comment deepened the technical discussion by introducing the complexity of evaluation metrics in diplomatic contexts. It led Sebastian to probe further about responsible AI use and shifted the conversation toward the nuanced challenges of implementing AI in inherently human-centered fields.
How do you look at the issue of interoperability in such a bottom up approach from the technical perspective? How do you ensure that all these solutions that are being developed are able to communicate with each other and to link up even to the rest of the world?
Speaker
Tauga (GIZ representative)
Reason
This question is insightful because it identifies a critical systemic challenge that wasn’t previously addressed – the risk of creating isolated solutions in a bottom-up approach. It demonstrates sophisticated understanding of both technical and organizational challenges in government digitalization.
Impact
This question significantly shifted the discussion from theoretical applications to practical implementation challenges. It forced the speakers to address real-world coordination problems and led to a more detailed explanation of inter-ministerial collaboration mechanisms.
People in the Pacific, the government is actually elected by the small communities, people that, you know, someone within the community, no experience whatsoever with government sort of processes… But there is no training. I do know there’s no training for the whole diplomacy school.
Speaker
Maureen Hilliard (Cook Islands representative)
Reason
This comment is profoundly thought-provoking because it introduces a completely different perspective on diplomatic capacity building – the reality of small island states where community members become diplomats without formal training. It highlights the global inequality in diplomatic resources and training.
Impact
This comment fundamentally broadened the scope of the discussion from developed country AI implementation to global diplomatic capacity building. It challenged the implicit assumptions about diplomatic training and resources, though the transcript cuts off before showing the full response and impact.
We are currently working together with other huge federal ministries as well as the GIZ on developing a negotiation tool for the upcoming COP and here it’s also about sort of providing a service not only for Germany but open sourcing them and sort of leveling the playing field.
Speaker
Claire Patzig
Reason
This comment is insightful because it reveals a strategic approach to AI diplomacy that goes beyond national advantage to global capacity building. The concept of ‘leveling the playing field’ through open-source diplomatic tools represents a significant philosophical stance on international cooperation.
Impact
This comment introduced the theme of international cooperation and equity in AI diplomatic tools, setting up the later discussion about partnerships and the eventual question from the Cook Islands representative about capacity building for underserved regions.
Overall assessment
These key comments shaped the discussion by establishing a collaborative, human-centered framework for AI in diplomacy while progressively expanding the scope from technical implementation to global equity concerns. Claire Patzig’s philosophical grounding prevented the discussion from becoming overly technocratic, while the audience questions from Tauga and Maureen Hilliard introduced crucial practical and equity dimensions that challenged the speakers to think beyond their immediate context. The progression from theory to technical challenges to global capacity building created a comprehensive exploration of AI diplomacy that acknowledged both opportunities and systemic inequalities. The discussion evolved from a presentation format to a more interactive dialogue that revealed the complexity of implementing AI solutions in diverse diplomatic contexts worldwide.
Follow-up questions
How do you ensure interoperability between different data labs’ solutions from a technical perspective?
Speaker
Tauga (German International Cooperation Agency)
Explanation
This addresses a critical technical challenge in Germany’s bottom-up approach where every ministry has its own data lab, requiring solutions to communicate and link with each other and globally
What are the organizational challenges in implementing AI tools within the Federal Foreign Office structure?
Speaker
Sebastian Blum
Explanation
Understanding internal organizational barriers is crucial for successful AI integration in diplomatic institutions beyond just technological considerations
Does the Data Innovation Lab provide training facilities for underserved regions where government officials lack diplomatic training?
Speaker
Maureen Hilliard (Cook Islands)
Explanation
This highlights the capacity building needs of small island states and underserved regions whose elected officials often lack formal diplomatic training or experience
How to effectively collect and format data that works for technical communities while remaining relevant for subject matter experts?
Speaker
Claire Patzig
Explanation
This represents a fundamental challenge in bridging the gap between technical requirements for AI systems and the practical needs of diplomatic practitioners
How to balance AI sovereignty concerns with international partnerships in diplomatic AI tool development?
Speaker
Claire Patzig
Explanation
This addresses the tension between national/regional AI sovereignty and the interconnected nature of diplomacy that requires international collaboration
How to measure success and define clear goals for AI applications in diplomacy unlike other fields with benchmarkable outcomes?
Speaker
Claire Patzig
Explanation
Unlike medical AI with clear accuracy metrics, diplomatic AI applications lack clearly defined success criteria, making evaluation and improvement challenging
Disclaimer: This is not an official session record. DiploAI generates these resources from audiovisual recordings, and they are presented as-is, including potential errors. Due to logistical challenges, such as discrepancies in audio/video or transcripts, names may be misspelled. We strive for accuracy to the best of our ability.
Open Forum #39 From Research to Action Cybercrime Justice in Africa
Session at a glance
Summary
This discussion focused on the launch of a research report titled “Access to Justice in the Digital Age: Empowering Victims of Cybercrime in Africa,” co-organized by UNICRI and ALT Advisory. The session examined cybercrime trends, challenges, and solutions across four African countries: South Africa, Namibia, Sierra Leone, and Uganda. Tina Power, the report’s main author, presented four key findings: cybercrime is rising in Africa but remains under-documented due to significant under-reporting, there’s a disparity in understanding between financial and personal cybercrimes, certain cybercrimes are inherently gendered in nature, and deep structural barriers continue to hinder access to justice.
Michael Ilishebo, a digital forensic analyst from Zambia Police, highlighted major law enforcement challenges including lack of technical capacity, increased use of AI by criminals, encryption technologies that delay investigations, and jurisdictional issues. He emphasized that most African countries are not party to the Budapest Convention, limiting international cooperation. Sandra Aceng from Women of Uganda Network discussed obstacles victims face, particularly women and gender minorities, including social stigma, limited digital literacy, dismissive police attitudes, and insufficient legal frameworks specifically addressing technology-facilitated gender-based violence.
The speakers provided recommendations including strengthening legal frameworks with victim-centered approaches, enhancing law enforcement capacity through training, improving public-private partnerships with tech companies, and developing comprehensive support services for victims. Senator Shweba Fola Besalisu from Nigeria emphasized the need for continental coordination and bringing civil society organizations into the conversation. The discussion concluded that addressing cybercrime in Africa requires a multi-stakeholder approach combining robust legal frameworks, institutional capacity building, and coordinated regional responses to effectively protect victims and ensure access to justice.
Keypoints
## Major Discussion Points:
– **Rising cybercrime in Africa with significant under-reporting challenges**: The discussion highlighted that while cybercrime is increasing across African countries, there’s a substantial gap in understanding the full scope due to victims not knowing they’ve been affected by crimes, lacking knowledge of reporting procedures, or facing stigma when reporting personal harms like non-consensual sharing of intimate images.
– **Gender-based dimensions of cybercrime and online harms**: A key focus was on how certain cybercrimes disproportionately affect women and LGBTQI+ communities, including revenge porn, cyber-stalking, and doxxing. The discussion emphasized how victims face cultural stigma, fear of public shaming, and dismissive attitudes from law enforcement when reporting these crimes.
– **Law enforcement capacity and technical challenges**: Speakers addressed the significant obstacles facing African law enforcement agencies, including lack of technical skills for digital forensics, inadequate legal frameworks, jurisdictional issues in cross-border crimes, and the increasing sophistication of criminals using AI and encryption technologies.
– **Legislative gaps and the need for comprehensive legal frameworks**: The conversation emphasized the importance of having well-crafted, human rights-based cybercrime laws that address both financial and personal harms, with clear definitions and accountability mechanisms. The discussion included recommendations for legislative audits and model laws tailored to African contexts.
– **Multi-stakeholder collaboration and capacity building needs**: Speakers stressed the necessity of partnerships between government, civil society, private sector (especially big tech companies), and international organizations to effectively combat cybercrime, along with comprehensive training for justice system actors and digital literacy programs for communities.
## Overall Purpose:
The discussion aimed to launch UNICRI’s research report “Access to Justice in the Digital Age: Empowering Victims of Cybercrime in Africa” and translate research findings into actionable recommendations. The session sought to identify common trends in cybercrime across African countries, understand barriers to justice for victims, and develop practical solutions through multi-stakeholder dialogue involving law enforcement, civil society, legal experts, and policymakers.
## Overall Tone:
The discussion maintained a professional, collaborative, and solution-oriented tone throughout. While speakers acknowledged serious challenges and shared concerning case studies of victims, the atmosphere remained constructive and focused on finding practical remedies. The tone was respectful and inclusive, with speakers building upon each other’s points rather than disagreeing. There was a sense of urgency about addressing the issues, but also optimism about the potential for positive change through coordinated efforts. The conversation became slightly more animated during the Q&A session when participants raised concerns about balancing civil liberties with victim protection and the need for continental approaches to regulation.
Speakers
– **Ottavia Galuzzi**: Social expert at UNICRI (United Nations Interregional Crime and Justice Research Institute), session moderator
– **Tina Power**: Director of ALT Advisory, Attorney of the High Court of South Africa, main author of the report being launched
– **Michael Ilishebo**: Digital Forensic Analyst and Cybercrime Investigator at the Zambia Police
– **Sandra Aceng**: Executive Director at WOGNET (Women of Uganda Network)
– **Audience**: Various audience members who asked questions during the Q&A session
**Additional speakers:**
– **Odhran McCarthy**: Program Management Officer and Liaison Officer in New York at UNICRI, online moderator
– **Senator Shweba Fola Besalisu**: Nigerian Senator, Chair of the Nigerian Senate Committee on ICT and Cyber Security, Chairman of the West African Parliamentarians Network on Internet Governance
– **Juri Bokovoy**: Member of the Finnish Green Party
– **Belford Doherty**: Cyber Policy Analyst and Head of IT at the Financial Intelligence Agency in Sierra Leone (mentioned as expected speaker but did not participate in the recorded portion)
Full session report
# Report Launch: Access to Justice in the Digital Age – Empowering Victims of Cybercrime in Africa
## Executive Summary
This discussion centred on the launch of a research report titled “Access to Justice in the Digital Age: Empowering Victims of Cybercrime in Africa,” co-organised by UNICRI (United Nations Interregional Crime and Justice Research Institute) and ALT Advisory. The session brought together law enforcement officials, civil society representatives, legal experts, and policymakers to examine cybercrime trends and challenges across four African countries: South Africa, Namibia, Sierra Leone, and Uganda.
## Key Participants
**Ottavia Galuzzi**, Social Expert at UNICRI, served as moderator and explained UNICRI’s work stream on cybercrime and online harms, including terrorism, violent extremism, gender-based violence, child abuse and hate speech.
**Tina Power**, Director of ALT Advisory and Attorney of the High Court of South Africa, presented the report’s main findings as its principal author.
**Michael Ilishebo**, Digital Forensic Analyst and Cybercrime Investigator at the Zambia Police, provided law enforcement perspectives on operational challenges and technical capacity gaps.
**Sandra Aceng**, Executive Director at WOGNET (Women of Uganda Network), contributed civil society perspectives on victim experiences, particularly focusing on gender-based dimensions of cybercrime.
**Senator Shweba Fola Besalisu**, Nigerian Senator and Chair of the Nigerian Senate Committee on ICT and Cyber Security, as well as chairman of the West African Parliamentarians Network on Internet Governance, offered parliamentary perspectives on legislative challenges.
## Major Research Findings
### Rising Cybercrime with Under-Reporting
Tina Power highlighted that whilst cybercrime is increasing across African countries, significant under-reporting exists because victims often don’t recognise they’ve been affected by crimes, lack knowledge of reporting procedures, or face stigma when reporting personal harms such as non-consensual sharing of intimate images.
### Disparity Between Financial and Personal Cybercrimes
A key finding was the stark difference in how authorities respond to different types of cybercrimes. Financial crimes typically receive quicker responses from law enforcement, whilst personal crimes—particularly those affecting women—are often dismissed as intimate or domestic matters not warranting serious investigation.
### Legal Framework Gaps
The research identified inadequate legal frameworks across the studied countries, consisting of either patchwork legislation, outdated laws, or complete absence of relevant cybercrime statutes. Tina Power noted that “good laws matter as they provide clear definitions, guidance, and legal basis for victims to seek redress.”
## Law Enforcement Challenges
### Technical and Resource Constraints
Michael Ilishebo detailed significant obstacles facing African law enforcement agencies, including lack of technical skills for digital forensics, inadequate training programmes, insufficient resources for cybercrime investigations, and limited access to specialised equipment and software.
### Artificial Intelligence and Evolving Threats
Michael Ilishebo identified a concerning trend: “We have seen an increase in terms of the cyber crime that is happening due to the use of AI. We have seen people that may not even actually know how to code… They are using AI to enhance their criminal skills.”
### International Cooperation Challenges
The cross-border nature of cybercrime creates jurisdictional challenges. Michael Ilishebo noted that most African countries are not party to the Budapest Convention, limiting international cooperation opportunities. He mentioned the upcoming UN Convention against cybercrime to be signed in October in Vietnam as a potential solution.
### Africa Cryptocurrency Working Group
Michael Ilishebo described his involvement in the Africa Cryptocurrency Working Group under the US Department of Justice, which brings together African law enforcement agencies to address cryptocurrency-related crimes and share intelligence.
## Victim Experiences and Barriers
### Gender-Based Dimensions
Sandra Aceng provided insights into obstacles victims face when seeking justice, particularly women and gender minorities. She shared specific case examples, including a 2022 case from northern Uganda and a recent case reported to Kampala Central Police Station, illustrating how victims face social stigma and dismissive attitudes from law enforcement.
### Institutional Response Gaps
Sandra Aceng noted that law enforcement frequently lacks specific training for technology-facilitated gender-based violence, resulting in inadequate responses. She observed that police officers often don’t understand the serious impact of cybercrimes involving personal or intimate harms.
## Legal Successes and Practical Applications
Tina Power provided specific examples of successful legal interventions using South Africa’s Cybercrime Act and Protection from Harassment Act, demonstrating how well-crafted legislation can provide effective remedies for victims when properly implemented.
## Public-Private Partnership Challenges
### Platform Non-Cooperation
Michael Ilishebo identified frequent refusal of social media platforms to cooperate with African law enforcement agencies. Platforms often refuse to provide information or remove content, citing community guidelines over local laws.
### Continental Coordination Proposal
Senator Besalisu proposed a continental approach to address platform non-cooperation, suggesting that “Africa as a continent come together, does the same thing that happens in Europe. You violate the law in a part of Europe, the entire Europe will fine you.”
## Civil Society and Legislative Tensions
Senator Besalisu expressed frustration with civil society opposition to cybercrime legislation amendments, stating: “civil society organizations, every time there’s an attempt to amend the cyber crime law to protect citizens, the civil society organizations are always in arms, believing and thinking that it’s about to constrain the space.”
Tina Power acknowledged this “very fine line” and emphasised the importance of grounding legislation in human rights principles to address these concerns whilst protecting victims.
## Key Recommendations
### Victim-Centred Legal Frameworks
Tina Power emphasised that “laws must be victim-centered, grounded in human rights, and well-communicated to both public and justice system actors.”
### Capacity Building and Training
Speakers emphasised the need for comprehensive training programmes for law enforcement, judiciary, and other justice system actors, addressing both technical skills and gender-sensitive approaches.
### Specialised Response Units
Sandra Aceng recommended establishing specialised technology-facilitated gender-based violence response desks at police stations, staffed by trained female officers.
### Multi-Stakeholder Partnerships
Sandra Aceng emphasised the need for “multi-stakeholder partnerships including engagement of boys, men, and communities in prevention efforts.”
### Digital Literacy and Awareness
Speakers highlighted the importance of comprehensive digital literacy programmes targeting diverse audiences through multiple channels, including community workshops, radio programmes, and social media campaigns. Sandra Aceng mentioned successful initiatives including comic books and other educational materials.
## Conclusion
The discussion successfully launched UNICRI’s research report whilst generating substantive dialogue about cybercrime challenges in Africa. Key themes included the need for victim-centred approaches, enhanced institutional capacity, improved public-private partnerships, and better coordination between stakeholders. The session highlighted both the complexity of addressing cybercrime across diverse African contexts and the potential for collaborative solutions that balance victim protection with human rights considerations.
Tina Power’s framing of cybercrime as fundamentally “a justice issue” rather than merely a technical problem provided a central theme that influenced much of the subsequent discussion about legal frameworks, institutional responses, and victim support mechanisms.
Session transcript
Ottavia Galuzzi: Good morning everyone joining us in person and online and good afternoon or evening to those connecting from different time zones. For people present in the room you can wear the headset to hear us better, both us and people online. My name is Ottavia and I work as a social expert at UNICRI, the Interregional Crime and Justice Research Institute. We are very thankful for your time and interest in this session co-organized by UNICRI and ALT Advisory titled From Research to Action, Cybercrime and Justice in Africa. Very quickly during this session we will launch our research report just published today titled Access to Justice in the Digital Age, Empowering Victims of Cybercrime in Africa. We’re going to present its main findings, learn from our speakers about many topics from the common trends in cybercrime and online harms targeting African countries, the challenges national law enforcement agencies face in conducting investigations and the obstacles victims encounter in reporting cybercrime and online harms and in seeking fair redress. Then we will aim to discuss recommendations, what gaps we can fill in as a multi-stakeholder community and what practical solutions we can improve or develop as next steps. We will conclude the session with participant feedback and expertise, including of course questions for our speakers. This falls within UNICRI’s work stream on cybercrime and online harms, which aims to explore the interplay of different cyber threats and harmful behaviours, seeking to develop inclusive and rights-based solutions to address the convergence of cybercrime and online harms, including terrorism, violent extremism online, gender-based violence, child abuse and exploitation, and hate speech. At UNICRI, we address these threats through action-oriented research on niche thematic areas, but also capacity-building activities involving tech companies and technical assistance to member states and policy-making.
So, of course, I would like to thank our great speakers for their time today and their direct contribution to the report we are launching, we just published. Let me quickly introduce them for you all. So, we have Tina Power, she’s Director of ALT Advisory and Attorney of the High Court of South Africa. Michael Ilishebo, Digital Forensic Analyst and Cybercrime Investigator at the Zambia Police. And then online, we have Sandra Aceng, thanks a lot for joining us online, Executive Director at WOGNET, Women Uganda Network. And we are waiting for another online speaker who might join us soon, Belford Doherty, who is Cyber Policy Analyst and Head of IT at the Financial Intelligence Agency in Sierra Leone. Last but not least, my colleague Odhran McCarthy, Program Management Officer and Liaison Officer in New York, is joining as online moderator. To all online participants, of course, leave your questions and comments and feedback in the chat, and we’re going to go through it during the Q&A. So, logistically speaking, we will, before opening the floor to participants, we will have two rounds of questions for our speakers, to firstly set the scene, and secondly, to discuss recommendations, opportunities. And I think we can just start. So my very first question is for Tina. So as the main author of this report we’re launching today, could you please share with us the main findings of this report? Thank you.
Tina Power: Thanks, Ottavia. Good morning, everyone. Maybe just a quick good morning to all of the early birds who joined us in person. We appreciate it's a slightly chillier morning and it's been a very busy week, so thank you for taking the time to be here. And thank you to everyone online as well; we know some colleagues are joining from very different time zones, so we appreciate that everyone has put in the effort. And a final thank you to the technical team at the back who've helped make all of this possible. We really do appreciate it. Thank you so much. So as a point of departure, it's important to note that while many of the themes we'll be discussing today may be familiar, what sets this report apart is its focus on Africa, which is somewhat unique given that we're launching it in Norway; we don't often speak about what's going on in Africa from a cybercrime perspective. So that is important. It is also victim-centered in nature. We're not talking about hacks of big banks; we're talking about the lived and individual experiences of ordinary people who are victims of cybercrimes. And it is also informed by a diverse set of stakeholders, which is why this panel is also so unique: we've got members of the private sector, members of civil society, UN agencies, and government. So it was truly informed by everyone's perspectives and views. And lastly, it's been geared towards finding meaningful and practical solutions to the present challenges. Quite simply, the report isn't set out to explain to us what cybercrimes are; it's set out to explain how people are being affected and why so many remain without justice. So there's a lot to unpack in the report, but I'm going to spend some time going through four of the overarching findings that we saw across four different countries: South Africa, Namibia, Sierra Leone, and Uganda.
So the first finding is that we know cybercrime is on the rise in Africa, but we don't yet have the full picture. Secondly, there's a significant disparity in how people understand various different forms of cybercrime. Thirdly, certain types of cybercrime are inherently gendered in nature. And finally, we are seeing deep structural barriers that continue to hinder access to justice. So I'll briefly take us through these. The first is that we know cybercrime is on the rise. We've seen this through emerging data, anecdotal evidence, and engagement with stakeholders, but we don't think we have the full picture yet, and this is largely due to significant under-reporting, which I know Sandra is going to touch on shortly. We've seen that a lot of people who are victims of cybercrimes don't know that they have been affected by a crime. Even if they do know it's a crime, they don't know where to report, how to report, or how to phrase what has happened to them. There's also, particularly in relation to the more personal types of online harms, like the non-consensual sharing of intimate images, a sense among many women in particular that there is significant stigma in reporting and that they will be blamed or re-victimized, and that hinders them from reporting. And in many instances, we've also seen that frontline officers at police stations aren't equipped to respond to these types of crimes. They don't know what codes to use; they don't know how to articulate what the crime is, which then hinders the whole justice process. So where data exists, it is very helpful and we can start seeing what the trends are and pick them up. But we do need to do a lot more to better capture what the actual reality is, because it means we're tackling somewhat of an invisible problem. And this is not to say that the harms aren't real; they're just currently not captured in the system.
Secondly, and this links to capturing, the report highlights quite an important distinction between financial harm and personal harm. Financial harms are the harms where you've been scammed, you've been hacked, someone has stolen your money. And then we have personal harms, which are far more intimate: the non-consensual sharing of intimate images, being harassed online, being doxed, being threatened, instances of hate speech. And what is interesting about the distinction is how it is responded to. If you report a financial crime or a theft, there's often a quicker and more immediate response, whereas personal crimes are seen as more intimate or domestic in nature and not always taken as seriously. Thirdly, the gendered nature of cybercrime also came through incredibly strongly in the research and engagements with stakeholders on the ground. Where there are instances of harassment, doxing, and the sharing of intimate images, these are all very gendered in nature, and it is clear that women are disproportionately affected, as are members of the LGBTQI+ community. And finally, all of this culminates in the structural barriers to justice. The legal framework is either a patchwork of laws, non-existent laws, or laws that are outdated or outmoded. We're also seeing that law enforcement and justice systems are often under-capacitated in Africa. Resources and budgeting are not directed towards these institutions, but it will be lovely to hear from Michael, as someone who is playing a critical role in these institutions, as to what type of capacity we need. We're also seeing that cybercrimes are significantly under-reported by both victims and survivors, and that there is a lack of accountability under cybercrimes legislation.
We're also seeing that people don't recognize this as a crime, which then means that they are not reporting. Ultimately, the report has made it clear that cybercrime is not just a technical issue, it's a justice issue. If we don't have correct avenues for justice, victims and survivors will not be able to seek recourse, we will lack accountability, and we will not be able to move forward as a global community.
Ottavia Galuzzi: Thank you, Tina, that was a fantastic contribution, not just to the report we are publishing today, but to the conversation and discussion of today. So now we'd like to move to you, Michael, and get your perspective as law enforcement. Could you please share with us what you think are the main challenges law enforcement authorities face with cybercrime investigations today?
Michael Ilishebo: Thank you. Good morning, Ottavia, and good morning, my fellow panelists and those in this room and those online. So basically, as a law enforcement officer from Africa, and one of the law enforcement officers in this world who is trying at all costs to ensure that we have a safer cyberspace for all, because as you know, cyber has no jurisdictional borders. So basically, from the African perspective, the challenges are many. Cybercrime comes in a multifaceted way. What I mean is, almost every crime that is committed today has elements of digital evidence, and digital evidence is largely policed using the cyber laws. So we find a situation where almost all traditional forms of crime that used to happen in the past are more prominent now because of the use of ICT. Just as the normal traditional crimes are on the rise, we've seen cybercrime also coming on the rise. So among the many challenges that we are facing as African law enforcement agencies, the first is the lack of capacity in terms of technical skills. Because when we talk of cybercrime investigations, the process starts with a person reporting, and here there is also an increase in under-reporting of these cases. Most cases that happen, people fail to report to the police, not because they don't want to report them. First, because they know that, even if they report, they may not get sufficient help in time. Secondly, because, as Tina put it, the crime may not have been committed for financial gain but may be personal in nature. There are people who want to keep their private matters really private, knowing that if they go to the police station they may have to disclose the nature of the incident or something that may compromise their standing in society.
They may end up failing to report these cases. So, once the cases are reported at police station level, what happens next is the suspects are supposed to be arrested and arraigned before the courts of law. But if the police are failing to handle some of these cases due to lack of technical skills, what happens at the courts? The cases may fail, or the perpetrators may eventually get away with just a fine that does not match the acts they committed. We have also seen an increase in the cybercrime that is happening due to the use of AI. We have seen people that may not even know how to code, who may not even know how to frame this and that, using AI to enhance their criminal skills. Also, the use of encryption in order to hide whatever information may be found on their mobile devices, on their cell phones, on their computers; the use of encryption to delay or defeat the enforcement of justice. Thirdly, the use of cryptocurrency. As you have seen, criminals no longer have to rely on fiat cash; in order to commit a crime, they hide behind the cryptocurrency barrier. But of course in Africa we are trying: with the help of the US government, a grouping was formed called the Africa Cryptocurrency Working Group, which is under the US Department of Justice and attached to the AU in Addis Ababa. I think there are about eight countries, and more countries will be joining. We are trying to tackle the issues of cryptocurrency. Lastly, there are the issues of legal frameworks. As you know, to fight cybercrime you start with domestic laws, then you graduate to international conventions.
So most African countries are not party to the Budapest Convention. But fortunately enough, the UN has come up with a draft UN Convention against Cybercrime, and an African country which is a member of the UN may, in one way or the other, find itself signing on to the UN Convention when it is opened for signature. I think it will be signed in October in Vietnam, and then of course the ratification process will start. So that would be the easier way for member states in Africa to be party to the international conventions. Of course, we have the Malabo Convention on data protection and cybersecurity, but it does not address the current increase in crime. There is also the Budapest Convention, which has been around for the last 20 years; the last time I checked, there are only about 15, if not 12, African countries that are party to the Budapest Convention. This in itself has played a critical role in terms of us getting information when it comes to cases or criminals who are outside the jurisdiction of Africa. So that brings us to the issues of jurisdiction, because we've seen an increase in crimes being committed from nations that may not even have cyber laws in place. So what is happening right now is, if a country has poor cybercrime legislation, somebody from another jurisdiction may actually cause financial and emotional harm, and yet in that country it's not a crime. So we've seen so many challenges, such that if I was to list them all here, they would resonate with almost all the challenges law enforcement agencies are facing. But what I've just said, these are some of the most critical challenges that we are facing.
Ottavia Galuzzi: Thanks a lot, Michael, for this very detailed snapshot of the different challenges law enforcement may face in regard to cybercrime investigation. Something interesting you mentioned is the rise of cybercrime with AI. Something we have seen via different research is that AI is not really creating new criminals; it is actually allowing criminals in other fields to move into the cybercrime environment, which is a bit concerning because, yeah, I think we can all agree that AI reduces the barrier to entry for everything that is technical. So thanks for bringing that up as well. Moving forward, I would love to come to you, Sandra. I hope you hear me. Yeah, we see you, that's fantastic. Given your line of work and the important work you do, I would like to focus more on the victim side. We would love to hear from you: what are the obstacles that victims encounter in reporting cybercrime and online harms and in seeking fair redress, particularly from a gender perspective?
Sandra Aceng: Thank you so much for the question. Could you please confirm that you can hear me? Yeah, we can hear you. Okay. It would have been lovely to be in that room, but nonetheless I'm joining virtually, and good morning, good afternoon, and maybe good evening to everyone who is joining as well. I'm very glad to be speaking about this subject, based on the work that we have done at the Women of Uganda Network. Looking at some of the obstacles that victims encounter, especially women and gender minority groups: they face numerous challenges in being able to report and to seek redress, which in most cases is shaped by social stigma, institutional gaps, and also digital illiteracy, as clearly highlighted by Michael. So I will go deep into it. One of the obstacles has been stigma and the fear of public shaming, whereby women who experience tech-facilitated gender-based violence, such as the non-consensual distribution of intimate images, which in the context of Uganda is usually referred to as revenge porn, and also cyber-stalking and doxxing, often really fear being blamed or ridiculed. And then there are also the cultural norms that discourage women from speaking out, especially when the abuse is sexual and intimate in nature. There's a case from 2022 that we documented during our study: a young woman in the northern part of Uganda had intimate images leaked by her ex-partner. Rather than reporting it, she withdrew from social media and isolated herself. When she reported this case to the Women of Uganda Network, she expressed fear of being judged by police and also her community. The case was handled confidentially through WOUGNET's toll-free line, but she never really pursued further justice.
And then, also talking about the aspect of limited literacy and awareness of rights: many victims, especially rural women and youth, do not recognize online harms as crimes, or are unaware of the available legal protections. So a lack of awareness of, for example, the data protection law, the Computer Misuse (Amendment) Act of 2022, or the Uganda Police cybercrime units also limits reporting. To give an example, during a workshop that we conducted in a district in northern Uganda, we found that about 80 percent of the participants, the majority of whom were women, had never heard of the Uganda Data Protection and Privacy Act. Many believed that reporting online abuse was maybe for those who have strong political views or financial connections. So you can see, really, there's also a lack of awareness in that regard. And then, also talking about the weak laws, the enforcement response, and the institutional aspect of it: you find that victims also face very dismissive attitudes from police, especially when the harm is online and not physical. A serious case was recently reported to us through one of our partners, whereby a woman's private and intimate images were unlawfully shared without consent in multiple WhatsApp groups, which we were also able to identify. She strongly believed that these images were taken by her former boyfriend, who was also named, but I'll keep that private here. He was the only person with access to her at that time, and she also asserted that he took these photos without her consent. She reported the case to the central police station in Kampala, and it was registered as an offence of unsolicited information under the Computer Misuse Act, Cap. 96.
She was deeply concerned that more content would be shared, as she and the ex-partner had also engaged in intimate activities at his residence, which had a CCTV camera to which he had sole access, but the police could not help her because she didn't have enough evidence. So we were able to provide some support with the screenshots she was able to share. She did not have direct access to the WhatsApp group herself; a friend belonged to the group, but the content could not be reported because reporting needed to come from someone inside the WhatsApp group, and the friend could not report it. I was actually looking through the rules of the WhatsApp group, which said that if you are quiet in the group you will be excluded, and all that. So you can see that law enforcement still lacks the specific training to handle tech-facilitated gender-based violence cases, and investigations are often slow and also very inconclusive. This requires a lot of training to be done. Linked to that, a university student also reported cyberbullying and defamation via Facebook accounts to her local police post in Lira district, also in northern Uganda. The police asked her to present the suspect, who was anonymous, and then told her to ignore it and deactivate her account. Through WOUGNET's legal network, we were able to link her to Belford Law for digital legal support, but formal justice was not pursued. So you find that, with the insufficient legal frameworks and gender-sensitive provisions that we have, a lot of these issues continue to happen. For the case of Uganda, we have no specific law that addresses technology-facilitated gender-based violence.
We do, however, have the African Commission on Human and Peoples' Rights Resolution 522, on the protection of women against digital violence in Africa, which recognizes online gender-based violence as a human rights issue. This is a step, and it also calls on all the member states to work together to ensure that we have specific laws that address digital violence as a whole. So I would like to stop here and hand back to you, thank you.
Ottavia Galuzzi: Thanks a lot, Sandra, for sharing different, unfortunately different, examples of cases related to cybercrime and online harms with a gender-based violence component. Your perspective is very much needed in this type of conversation. So now we have set the scene a bit of what is happening and what the different challenges are, so perhaps we can move into recommendations, opportunities, and next steps, and then of course open to the floor, both on site and online. Tina, I'm going to start with you, because I think you've already touched on a lot of the different challenges that the different sectors within this topic face. What can you share with us on the legislative aspect? How can we use legislative instruments to really advance the prevention and countering of cybercrime and online harms?
Tina Power: Thanks, Ottavia. If there's ever a question to get a lawyer excited, it's what is the value of laws? So I'll try to keep it brief, but I want to touch on why laws matter, how the right laws can lead to tangible, positive results, and what we can do to make our laws better. So an important finding of the report is that good laws matter for several reasons. They provide clear definitions and guidance as to what the crime is and what options you have. That is essential both for the public's understanding of what the law and justice mean, and for how the justice system will unfold once a crime has been reported. Laws also give victims a legal basis to seek redress and ensure that there is accountability, and a well-drafted piece of legislation lays the foundation for action. It enables us to know what to do from the moment of reporting, which we've now heard multiple times is a clear challenge, through to investigation, where there are also complications, and on to the actual justice system level: once we get to the courts, what is going to happen? So the law gives us that much-needed foundation. Without laws and clear guidance that are grounded in human rights, victims will remain vulnerable and access to justice will remain limited. South Africa's Cybercrimes Act provides quite a useful example of a law. I'm not going to say that it is good, but it does tick many of the boxes. It addresses both financial and personal harms, it provides institutional guidance, and it requires the development of more operational requirements, such as standard operating procedures: how do we gather the evidence, how do we store the evidence, how do we use the evidence in court? Having the right legal framework can tangibly support victims of various forms of cybercrime.
And I'm going to share a few examples from some of the work that we've done recently. We've issued several cease and desist letters, in South Africa we call them letters of demand, in cases involving threats of the disclosure of intimate images. So the image hasn't yet been disclosed, but its disclosure has been threatened. And in the letters, by simply referencing South Africa's Cybercrimes Act and explicitly stating what the criminal sanctions are, we have found an incredibly useful deterrent that stops the harm in its tracks. When people see that there is a real legal risk at play, they have generally tended to stop, in our experience. And so this is quite an important reminder that even the threat of legal accountability can be a strong deterrent. A second successful example was using South Africa's Protection from Harassment Act, which we used to obtain what we call a protection order; in other jurisdictions it's generally known as a restraining order. We achieved this on behalf of a human rights defender who was being relentlessly attacked on X and threatened with real-world violence. Fortunately for us, our act applies to harassment and threats in the physical realm as well as the online realm. So we were able to go to court and explain to the judge both the online harms and the stresses and consequences, as well as how this could unfold in the offline world, and we were able to secure protection for our client in that way. So the law can offer justice, but the law needs to be good. It needs to be well-crafted; the definitions need to be clear, and the accountability and reporting mechanisms also need to be clear. One of the useful things about the report is that we don't just stop there. We don't just say you need good laws; we actually want to figure out how we can make good laws.
So the report proposes a legislative audit at the country level: each country assesses where it is at. Do we have overlapping laws? Are we missing something fundamental? Do we simply not have a law at all, as is the case in Namibia, which hasn't yet passed its law, meaning that victims are completely without redress? We then couple this with a cybercrimes model law that is victim-centered, a model law that can actually be applied. So we structure the legal approach in two parts: the audit assesses where the challenges are, and then we also help you fill the gaps. This is what a model law could look like; apply it to your jurisdiction and see what works. But we also know, and I'm sure other colleagues will discuss it as well, that law alone is not enough. It needs to be coupled with comprehensive training of justice sector actors so they know how to apply the law, and also with public awareness of the legislation, so that ordinary members of the public are able to know when they have been a victim of a cybercrime, how to report it, and what the reporting process looks like. So we want good laws, but we also want people to be able to use the laws. It is very unhelpful having a good piece of legislation that no one knows how to use or how to access. So in summary and in closing, because I see our time is moving on, I will caveat all of this, as lawyers generally tend to do, by saying that the law is not the finish line. For our laws to be effective, they need to be clear, they need to be grounded in human rights, and they need to be well communicated, both to members of the public and to those within the justice system.
This needs to be equally coupled with institutional capacity, and capacity has come up time and time again, both from Sandra and Michael. So we know there is a clear challenge there, and laws need to be able to support that and enable that. So when all is said and done, as a lawyer, I see huge potential and value in the law, and it is one of the solutions. It is an important one, and it does provide us with what we need to take us forward, but it needs to be coupled with everything else that we’ve discussed today as well. Thank you, Ottavia.
Ottavia Galuzzi: Thanks to you, Tina. I think the message that we need not just a legislative framework, but the right legislative framework, is key here. Michael, back to you again, from the law enforcement perspective: where do you think we should focus our efforts in enhancing law enforcement capabilities towards tackling cybercrime? And perhaps you could give us your views on one of the recommendations in the report, the use of clear coding systems for cybercrime investigations. If you could share anything on this, that would be very helpful. Thank you.
Michael Ilishebo: So basically, I'll pick it up from the previous speaker. You see, when we embark on know-your-rights campaigns to focus on victim-specific laws, what we've discovered, from my experience, from the Zambian experience, is that in my country those laws that actually protect the victim are there. But unfortunately, civil society comes up and says that some of these laws are meant to stifle freedom of expression. So as much as the government and the laws want to speak life into the protection of victims, somehow others feel their freedom of expression is being hampered. That is one of the challenges; we need to find a balance. So coming to the question: some of the major recommendations or working solutions that we need in order to address, fight, and mitigate cybercrime, because of course you cannot avoid it entirely. The first part, as I alluded to, is capacity building, not only for the police but for the whole justice system: the wheel of justice starts with law enforcement and ends with the courts. Once we tailor courses that are specific to the needs of these players in the justice system, we'll at least be able to address and mitigate some of these issues. The courts, too, need to be ready to navigate these cases, so that matters are not just passed through but decided in ways we can rely on. Secondly, we need cooperation from the private sector. So I give an example.
The social media platforms are where some of these cases, financial and non-financial, happen. Sometimes we request information from these service providers, I won't name them, you know them, and they will tell you that the act is only an offence in a certain jurisdiction. So, in that case, they may not actually give you the information you are seeking. An image may be posted of someone online; you request for it to be pulled down, and they will still say this does not go against our community guidelines. So we need a clear balance. We need cohesion. We need to reach consensus. Even where content may not go against their guidelines, as long as the victim feels unsafe, harmed, or publicly shamed by virtue of that post or image, the social media platform needs to bring down that content. Also, when law enforcement agencies request information, there is a need for them to act swiftly and give the law enforcers the information they are after, as long as we meet the basic criteria of a subpoena, a court order, or anything else that they demand. So once we address the issue of private-public partnership, we will have solved half the problem. There is also the issue of enhanced legal frameworks. I will give you an example from Zambia. Just two months ago, we amended our Cybersecurity and Cybercrimes Act, making the Cybersecurity Act a single piece of legislation and also enhancing the Cybercrimes Act. We have come a long way in terms of addressing cybercrime. But again, while the current law is able to address the challenges we are facing now, that does not mean that in two, three, four years' time the same piece of legislation will be as effective as it is now. So the process of enhancing legal frameworks always has to be continuous, because a cyber law has a shelf life.
It does not endure like the criminal procedure code or the Penal Code, the mighty Penal Code, which was probably enacted 50 years ago and still applies today. So we need enhanced legal frameworks that mirror each other, meaning that if we have mirrored cybercrime acts in Africa, an offence in Benin, an offence in Kenya, and an offence in Zambia speak to the same facts, unlike now, where something is an offence in Zambia but you go to Kenya and it's not an offence. The perpetrator might be in another jurisdiction, in this case Kenya, and the victim is in Zambia; the damage has been done in Zambia, but the perpetrator is in Kenya. And yet when you compare the cyber legislation, you discover that indeed in Zambia it's an offence and in Kenya it's not. So there it becomes more of a bilateral kind of arrangement. But if the laws are clear in terms of speaking life into each other, if they are mirrored, then I can assure you we are going to reduce cybercrime by a huge percentage, because anyone in any African country will know that what they are doing is an offence where they are, even if the victim they are targeting is in South Africa or anywhere else. So among these recommendations is that we need mirrored cyber legislation. I also already alluded to the aspect of international conventions, and the low rate at which African countries are becoming party to the Budapest Convention, not to talk of the UN Convention, which is yet to be signed. I think there is a need for us to push for African countries to be party to most of these international conventions, because cyber knows no border, cyber knows no jurisdiction, cyber is no respecter of anyone. Infrastructure here in Norway can be messed up by somebody who is in Colombia or Zambia. So the way to address these issues is for us to meet somewhere through an international treaty. Thank you.
Ottavia Galuzzi: Thanks to you, Michael, and thanks for the detailed overview and for the recommendations in general. I also really appreciate you introducing the public-private partnership debate. It is something that is very hard to tackle; I think probably everybody in the room, and also online, has faced this issue in one way or another. Before moving into questions and feedback from participants, we would like to come back to you, Sandra, as we are looking into recommendations. Perhaps you could share your view on how we can effectively support victims of these types of crimes, again with particular attention to the needs of women and girls. Thank you.
Sandra Aceng: Thank you for the question. Can you hear me? Yes. Okay, so I will start from where Tina Power stopped. She said the law is not enough, and I like to say that you cannot fix what is broken by tech using the law alone. However, we also need the law. So for my recommendations, I would say there is a need, as Michael also highlighted, to strengthen the legal and policy framework: enact specific laws that speak to online gender-based violence, or technology-facilitated gender-based violence. These laws should explicitly cover non-consensual intimate images or videos, cyber-stalking, doxing, and online harassment, and, more importantly, AI-generated sexual content, referred to as deepfakes. We now also have cheap fakes, and gendered disinformation that is very much linked to AI-generated content. Still on the law, we can also look at how to integrate gender-sensitive provisions into ICT legislation to ensure that women's rights are protected. Then there is building institutional capacity and accountability. How do we ensure that we train law enforcement, prosecutors, and judicial officers on digital crimes with a gender lens; include modules on trauma-informed survivor support; teach online investigation techniques; establish dedicated technology-facilitated gender-based violence response desks, especially at police posts, with female officers and digital crime experts; and monitor enforcement to ensure that justice for survivors is guaranteed rather than further victimization? There is also a need to expand access to survivor-centered support services. How do we develop and scale up the support platforms that are available?
For instance, we have the online gender-based violence web portal, like the one WOUGNET has, which offers digital security tips and referrals, as well as a toll-free line for reporting and emotional support. So how do we scale that? We should also have mobile outreach clinics, especially for psychosocial services in rural areas, because these are some of the areas most often assumed not to face these issues, and ensure multi-language access and disability-inclusive services so that no survivor is left out. We should also facilitate free legal aid and digital security support through partnerships with civil society organizations and legal tech firms. Again, for us to be able to reduce cybercrimes against women and girls, and I would not say end, because we cannot end them, we need multi-stakeholder partnerships, which are very key. We should also promote digital literacy and awareness: ensure that women and girls receive education on online safety, with trainings that focus on managing privacy settings, recognizing that online scams and online harassment are real, knowing the reporting channels, and learning how to document evidence. We should conduct regular community-based workshops and use radio and social media. Recently we explored the use of comic books, which was really very impactful: going to the community and reading to them is a fun way for them to learn digital security tips while becoming aware that online gender-based violence, or cybercrime, is real. Integrating cyber safety and digital rights skills into the school curriculum, especially for women and marginalized learners, is also key. I would like to conclude by saying that for us to be able to reduce online gender-based violence, we have to start thinking about how to engage boys, men, and communities in prevention.
How do we challenge harmful gender norms and online misogyny? Let us have facilitated dialogues, especially on respectful digital behavior and content, and support male champions and peer educators in schools and community groups to be role models who promote positive engagement, so that we can really mitigate the increasing rate of online gender-based violence. Thank you.
Ottavia Galuzzi: Thanks to you, Sandra, for sharing all these recommendations and fantastic ideas. The use of comic books, for instance, is a great example of how to involve communities as well. We have a bit more than 10 minutes left, so it would be fantastic to come to you all. Thanks again for your interest and time in joining our session. Are there any questions or comments from the room? Yes, please. I think you would need to go to the mic. Thanks a lot.
Audience: Thank you and good morning. Can you hear me? I am Senator Shweba Fola Besalisu from Nigeria. I chair the Nigerian Senate Committee on ICT and Cyber Security, and I am also the chairman of the West African Parliamentarians Network on Internet Governance. We are very delighted to be here, listening to the conversation over the last hour or so. You could as well have been speaking about the Nigerian situation; listening to Zambia and to Uganda, I have come to the conclusion that the issues are fairly the same, particularly on the African continent. The gentleman spoke about civil society organizations, and I think that is why this conversation needs to extend to the civil society organizations as well. Nigeria first enacted its cybercrime law in 2015. Last year we amended it, and we are in the process of amending it again, particularly because of the UN Convention that was adopted in December. But the challenge is that every time there is an attempt to amend the cybercrime law to protect citizens, the civil society organizations are up in arms, believing that it is about to constrain the civic space, forgetting that sometimes where your own rights stop is where the rights of somebody else start. So I think we need to bring the civil society organizations into the conversation. The second point for me, and I spoke about this two days ago at the parliamentary track, is that we must bring big tech to the table. I suggest that Africa as a continent come together and do the same thing that happens in Europe: if you violate the law in one part of Europe, the entire Europe will fine you. Unless and until all of those big tech companies know that if they violate the law in South Africa, their sanction is not limited to South Africa, that the same sanction will be applicable in Côte d'Ivoire, in Ghana, in Nigeria, that is the only time they will respect our national laws and our national values.
As of now, we almost appear helpless. When we make the laws, they devalue the laws. The same content that they will bring down in less than one day in Europe, you will be shuffling papers up and down for the next two weeks, while the victims continue to suffer. Sometimes people go into depression on account of that. I think we need a continental approach: we come to the table and say, if you violate the law and do not respect the norms here, that sanction will be applicable across the African continent. I would like to thank you again for this beautiful conversation. You speak to the African issues and to the African continent, and I think now we need to move from talking about it to having a concerted African position.
Ottavia Galuzzi: Thanks a lot, Senator. What you are sharing is very important: both involving civil society and bringing big tech to the table. I don't know if the speakers would like to comment.
Michael Ilishebo: So basically, I will engage the Senator after this session. As the Senator put it, our challenges are almost the same: what Nigeria is going through is what any other African country is going through. As I said earlier, we will continue the conversation beyond the panel session.
Tina Power: I just have one brief comment; I see there are more questions. But to note, on the question about civil society responding to legislation: there is a very fine line. What the report recommends and proposes is that when we are drafting legislation, it must be grounded in human rights. So where there are tensions around freedom of expression, or there is the potential misuse of an act to curb other people's rights, we need to strike that balance. That is why the report very much suggests that any law reform efforts must be grounded in human rights and must align with international law. I appreciate the tension. It is a difficult one, and it is a fine line to navigate, but that is why we want our laws to be grounded in human rights, so that all human rights are taken into account when drafting the legislation. I see there are some more questions.
Ottavia Galuzzi: We can take one more question from the floor, and then we have a few questions online. Please go ahead.
Audience: Juri Bokovoy from the Finnish Green Party. It is good that it has been highlighted that big tech still washes its hands of any responsibility for monitoring its platforms for crime everywhere in the world. But my comment is mostly about the same point that was raised about civil society. I originally come from Belarus, which struggles with exactly the same issues, but there civil society has basically been crushed so that the government can do whatever it wants. I just want to remind people that civil society can be seen as a symptom of public distrust in the government's capability and resources to handle these issues, and it should really be worked with, even if it is annoying to compromise with on cyber law. But my question is mostly about your view, Michael, as a prosecutor and law enforcer, on the weight of public trust in these institutions and in the ways they handle these situations. You highlighted that trust affects under-reporting quite heavily, but is there any specific way you can see that being improved significantly?
Michael Ilishebo: …and various engagements to ensure that, in any given situation at any police station, officers may start a case and do something, but when it goes beyond their technical capabilities they are able to reach us at the force headquarters, which is the police headquarters. So we are trying, and I hope within the next year or two we will get there, but it is not something that is easy.
Ottavia Galuzzi: I think we can have more questions; I can respond to people afterwards. Sorry, I am just going to take the two questions online very quickly, if that is okay with you. We have one question for you, Sandra, specifically on existing mechanisms to support victims of crime. The second question is for you, Sandra, but also for the other speakers: do we have examples of countries that have enacted and robustly implemented laws against tech-facilitated gender-based violence, and if yes, how do they go about popularizing and implementing them? Perhaps, Sandra, you would like to start. Thank you.
Sandra Aceng: Yeah, so on the existing mechanisms: where is the person from, from which country?
Ottavia Galuzzi: I think it was Emi Okwir Oguele who asked the question online, but I'm not sure about the country.
Sandra Aceng: Oh, so he was just saying he was happy to learn about some of the existing mechanisms I highlighted. So it wasn't a question.
Ottavia Galuzzi: And then there is the question about if you know about countries that have enacted and robustly implemented laws against tech-facilitated gender-based violence.
Sandra Aceng: Okay, so that question was asked by Dr. Wakabi, who is also my friend and our partner from Uganda. It is a hard question, but there are some notable examples of countries that have enacted and actively implemented laws addressing technology-facilitated gender-based violence. Of course, no legal framework is perfect or universally comprehensive, but some countries have really made meaningful strides in legislation. Maybe Tina Power could talk about the South African one, the Cybercrimes Act of 2020, and some of its key provisions, especially around criminalizing malicious communications, including those intended to cause mental, psychological, and emotional harm. As far as I know, they have implemented a strategy of partnering with women's rights organizations to disseminate information, and also created cybercrime apps under the South African Police Service. But maybe Tina Power would like to explore more on that. Another country I know of is Australia, which has the Online Safety Act of 2021, which really focuses on technology-facilitated gender-based violence, including image-based abuse, cyberbullying, and serious online harassment.
Ottavia Galuzzi: Sorry, Sandra, we are out of time. If you can just quickly close it, that would be fantastic, sorry.
Sandra Aceng: Yes, so they also have the eSafety Commissioner, which is very important. And there are some examples in Kenya that I know Dr. Wakabi knows about, but explicitly they do not have policies that specifically address technology-facilitated gender-based violence, only provisions that relate to it. Thank you, and over to you.
Ottavia Galuzzi: Thanks, thanks a lot. I'm sorry we don't have more time for this, but you can come to us to share any comments or questions you might have. Apologies for running out of time. I would really like to thank all the speakers, Tina, Michael, and Sandra, for your fantastic contributions. Thanks also to my colleague, Odhran McCarthy, who has stayed up late to follow us. Please feel free to come to us with any comments and questions. Thanks again for your time. Thank you.
Tina Power
Speech speed
212 words per minute
Speech length
2400 words
Speech time
678 seconds
Cybercrimes are rising in Africa but full picture unclear due to significant under-reporting
Explanation
While emerging data and anecdotal evidence shows cybercrime is increasing across Africa, the true scope remains hidden because many victims don’t report incidents. This under-reporting occurs because victims often don’t recognize they’ve been affected by a crime, don’t know where or how to report, or fear stigma and re-victimization.
Evidence
Victims don’t know that they have been affected by a crime, don’t know where to report or how to phrase what happened, fear stigma particularly for personal harms like non-consensual sharing of intimate images, and frontline police officers aren’t equipped to respond with proper codes or articulation of crimes
Major discussion point
Cybercrime Trends and Challenges in Africa
Topics
Cybercrime | Capacity development | Gender rights online
Agreed with
– Michael Ilishebo
– Sandra Aceng
Agreed on
Cybercrime is rising in Africa but full scope remains unclear due to significant under-reporting
Significant disparity exists between how financial crimes versus personal/intimate crimes are treated and responded to
Explanation
The report highlights a clear distinction between financial harms (scams, hacking, money theft) and personal harms (intimate images, harassment, doxing, threats). Financial crimes typically receive quicker and more immediate responses, while personal crimes are often seen as intimate or domestic matters and not taken as seriously by authorities.
Evidence
If you report a financial crime or theft, there’s often a quicker response, whereas reporting personal crimes like harassment, doxing, intimate images are seen as more intimate or domestic in nature and not always taken as seriously
Major discussion point
Victim Experiences and Barriers to Justice
Topics
Cybercrime | Gender rights online | Human rights principles
Certain types of cybercrimes are inherently gendered with women disproportionately affected, especially regarding harassment, doxing, and intimate images
Explanation
The research found that specific forms of cybercrime disproportionately target women and LGBTQI+ community members. These include harassment, doxing, and non-consensual sharing of intimate images, which are clearly gendered in nature and impact these communities more severely.
Evidence
Instances of harassment, doxing, intimate images are all very gendered in nature and it is clear that women are disproportionately affected as are members of the LGBTQI plus community
Major discussion point
Gender Dimensions of Cybercrime
Topics
Gender rights online | Cybercrime | Human rights principles
Agreed with
– Sandra Aceng
Agreed on
Gender-based cybercrimes disproportionately affect women and require specialized approaches
Deep structural barriers hinder access to justice including patchwork of laws, non-existing laws, or outdated legislation
Explanation
The legal framework across African countries is inadequate, consisting of either fragmented legislation, complete absence of relevant laws, or outdated regulations that don’t address current cybercrime realities. This is compounded by under-capacitated law enforcement and justice systems that lack proper resources and training.
Evidence
Legal framework is either a patchwork of laws, non-existing laws, or laws that are outdated or outmoded. Law enforcement and justice systems are often undercapacitated in Africa with insufficient resources and budgeting
Major discussion point
Legal Framework Gaps and Solutions
Topics
Legal and regulatory | Cybercrime | Capacity development
Agreed with
– Michael Ilishebo
– Sandra Aceng
Agreed on
Legal frameworks across Africa are inadequate, fragmented, or non-existent for addressing cybercrime
Good laws matter as they provide clear definitions, guidance, and legal basis for victims to seek redress
Explanation
Well-crafted legislation is essential because it provides clear definitions of crimes, guidance for both public understanding and justice system procedures, and establishes a legal foundation for victims to seek accountability. Laws enable the entire justice process from reporting through investigation to court proceedings.
Evidence
South Africa’s Cyber Crimes Act provides useful example addressing both financial and personal harms, provides institutional guidance, and requires development of operational procedures. Cease and desist letters referencing the Act have been effective deterrents, and Protection from Harassment Act was used successfully to obtain protection orders for online harassment cases
Major discussion point
Legal Framework Gaps and Solutions
Topics
Legal and regulatory | Cybercrime | Human rights principles
Laws must be victim-centered, grounded in human rights, and well-communicated to both public and justice system actors
Explanation
Effective cybercrime legislation must prioritize victims’ needs, align with human rights principles, and be clearly communicated to ensure both the public and justice system understand how to use them. The report recommends conducting legal audits to assess gaps and providing model laws to help countries develop appropriate legislation.
Evidence
Report includes recommendations for legal audits to assess where challenges are at country level, coupled with victim-centered cyber crimes model law to help fill gaps and structure legal approaches
Major discussion point
Recommendations for Improvement
Topics
Legal and regulatory | Human rights principles | Cybercrime
Disagreed with
– Michael Ilishebo
– Audience (Senator)
Disagreed on
Civil society role in cybercrime legislation
Michael Ilishebo
Speech speed
178 words per minute
Speech length
2161 words
Speech time
728 seconds
Law enforcement faces capacity challenges including lack of technical skills and inadequate training
Explanation
African law enforcement agencies struggle with insufficient technical skills to handle cybercrime investigations, which now involve digital evidence in almost every crime. This capacity gap extends throughout the justice system, from police officers who don’t know how to properly code and handle cybercrime reports to courts that lack understanding of these complex cases.
Evidence
Almost every crime committed today has elements of digital evidence. Cases are under-reported because people know that even if they report, they may not get sufficient help. Police failing to handle cases due to technical skills leads to cases being dismissed in courts
Major discussion point
Cybercrime Trends and Challenges in Africa
Topics
Cybercrime | Capacity development | Legal and regulatory
Agreed with
– Tina Power
– Sandra Aceng
Agreed on
Law enforcement and justice systems lack adequate capacity and training to handle cybercrime cases
AI is enabling criminals from other fields to move into cybercrime by lowering technical barriers
Explanation
The rise of artificial intelligence has made cybercrime more accessible to individuals who previously lacked technical coding skills. This has led to an increase in cybercrime as AI tools enable people without traditional technical expertise to enhance their criminal capabilities and commit sophisticated cybercrimes.
Evidence
We have seen people that may not even actually know how to code, may not even know how to frame this and that, using AI to enhance their criminal skills
Major discussion point
Cybercrime Trends and Challenges in Africa
Topics
Cybercrime | Digital business models
Use of encryption and cryptocurrency by criminals complicates investigations and delays justice
Explanation
Criminals increasingly use encryption to hide information on their devices and cryptocurrency to conduct transactions, making it much harder for law enforcement to gather evidence and trace criminal activity. While efforts like the Africa Cryptocurrency Working Group are being established, these technologies continue to present significant challenges for investigators.
Evidence
Use of encryption to hide information on mobile devices, computers delays justice enforcement. Use of cryptocurrency means criminals no longer rely on fiat cash. Africa Cryptocurrency Working Group formed under US Department of Justice with AU in Addis Ababa with about eight countries participating
Major discussion point
Cybercrime Trends and Challenges in Africa
Topics
Cybercrime | Cryptocurrencies | Encryption
Most African countries are not party to the Budapest Convention, limiting international cooperation
Explanation
The lack of participation in international cybercrime conventions severely hampers cross-border cooperation in investigations. With less than 15 African countries party to the Budapest Convention, law enforcement struggles to obtain information and assistance when criminals operate across jurisdictions, especially when crimes are committed from countries with poor or non-existent cybercrime laws.
Evidence
Less than 15 African countries are party to the Budapest Convention. This has played a critical role in terms of getting information when cases or criminals are out of Africa’s jurisdiction. If a country has poor cybercrime legislation, someone from another jurisdiction may cause harm and yet in that country, it’s not a crime
Major discussion point
Legal Framework Gaps and Solutions
Topics
Legal and regulatory | Jurisdiction | Cybercrime
Agreed with
– Tina Power
– Sandra Aceng
Agreed on
Legal frameworks across Africa are inadequate, fragmented, or non-existent for addressing cybercrime
Need for mirrored cybercrime legislation across African countries to address jurisdictional challenges
Explanation
African countries need harmonized cybercrime laws where the same acts constitute offenses across all jurisdictions. Currently, an act may be criminal in one country but legal in another, creating safe havens for criminals and complicating cross-border investigations and prosecutions.
Evidence
If we have mirrored cybercrimes act in Africa, where an offence in Benin, an offence in Kenya, an offence in Zambia speak to the same facts, unlike where it’s an offence in Zambia, you go to Kenya, it’s not an offence. Example given of perpetrator in Kenya targeting victim in Zambia where damage is done in Zambia but different legal frameworks create complications
Major discussion point
Legal Framework Gaps and Solutions
Topics
Legal and regulatory | Jurisdiction | Cybercrime
Enhanced legal frameworks must be continuously updated as cyber laws have limited shelf life unlike traditional criminal codes
Explanation
Unlike traditional criminal laws that can remain effective for decades, cybercrime legislation becomes outdated quickly due to rapidly evolving technology and criminal methods. Countries must continuously review and update their cyber laws to address new challenges and maintain effectiveness.
Evidence
Zambia amended Cybersecurity and Cybercrimes Act two months ago, enhancing the legal framework. Current law addresses present challenges but may not be effective in 2-4 years. Cyber law has a shelf life unlike Penal Code that was enacted 50 years ago and still applies today
Major discussion point
Recommendations for Improvement
Topics
Legal and regulatory | Cybercrime
Social media platforms often refuse to provide information or remove content, citing community guidelines over local laws
Explanation
Law enforcement faces significant challenges when requesting information from social media platforms or asking for content removal. These platforms often prioritize their own community guidelines over local jurisdiction laws, refusing to cooperate even when presented with subpoenas or court orders, which hampers investigations and victim protection.
Evidence
When requesting information from service providers, they will tell you that what you consider an offence in your jurisdiction does not go against their community guidelines. Images posted online that victims want removed are refused because they don’t violate platform guidelines, even when victims feel unsafe or publicly shamed
Major discussion point
Public-Private Partnership Challenges
Topics
Legal and regulatory | Liability of intermediaries | Content policy
Sandra Aceng
Speech speed
123 words per minute
Speech length
2078 words
Speech time
1009 seconds
Victims face stigma, fear of public shaming, and cultural norms that discourage reporting, especially for sexual/intimate abuse
Explanation
Women experiencing tech-facilitated gender-based violence often fear being blamed, ridiculed, or publicly shamed when reporting incidents. Cultural norms particularly discourage women from speaking out about sexual or intimate abuse, leading many to withdraw from social media and isolate themselves rather than seek justice.
Evidence
Case in 2022 of young woman in northern Uganda who had intimate images leaked by ex-partner. Rather than reporting, she withdrew from social media and isolated herself, expressing fear of being judged by police and community when she eventually contacted Women of Uganda Network
Major discussion point
Victim Experiences and Barriers to Justice
Topics
Gender rights online | Cybercrime | Human rights principles
Agreed with
– Tina Power
Agreed on
Gender-based cybercrimes disproportionately affect women and require specialized approaches
Many victims don’t recognize online harms as crimes or are unaware of available legal protections
Explanation
There is significant lack of awareness about cybercrime laws and available legal protections, particularly among rural women and youth. Many victims don’t understand that online harassment constitutes a crime or don’t know about relevant legislation like data protection laws or cybercrime units, believing that legal recourse is only available to those with political connections or financial resources.
Evidence
During workshop in northern Uganda district, about 80% of participants (majority women) had never heard of Uganda Data Protection and Privacy Act. Many believed online abuse reporting was only for those with strong political views or financial connections
Major discussion point
Victim Experiences and Barriers to Justice
Topics
Digital access | Gender rights online | Capacity development
Agreed with
– Tina Power
– Michael Ilishebo
Agreed on
Cybercrime is rising in Africa but full scope remains unclear due to significant under-reporting
Police often display dismissive attitudes toward online harms and lack specific training for tech-facilitated gender-based violence
Explanation
Law enforcement frequently shows dismissive attitudes toward victims of online crimes, especially when the harm is digital rather than physical. Police lack proper training to handle tech-facilitated gender-based violence cases, leading to slow and inconclusive investigations that often require victims to present anonymous suspects or result in advice to simply ignore the abuse.
Evidence
Case of woman whose intimate images were shared in WhatsApp groups – police couldn’t help because she didn’t have enough evidence despite screenshots. University student reporting cyberbullying was told by police to present the anonymous suspect and advised to ignore it and deactivate her account
Major discussion point
Victim Experiences and Barriers to Justice
Topics
Cybercrime | Gender rights online | Capacity development
Agreed with
– Tina Power
– Michael Ilishebo
Agreed on
Law enforcement and justice systems lack adequate capacity and training to handle cybercrime cases
Women and gender minorities face numerous obstacles shaped by social stigma, institutional gaps, and digital illiteracy
Explanation
The intersection of social stigma, inadequate institutional responses, and limited digital literacy creates multiple barriers for women and gender minorities seeking justice for cybercrimes. These interconnected challenges prevent effective reporting and redress, particularly affecting marginalized communities who face additional discrimination.
Major discussion point
Gender Dimensions of Cybercrime
Topics
Gender rights online | Digital access | Human rights principles
Need for specific laws addressing technology-facilitated gender-based violence including non-consensual intimate images and AI-generated sexual content
Explanation
Current legal frameworks lack specific provisions for technology-facilitated gender-based violence. New legislation should explicitly cover non-consensual intimate images, cyber-stalking, doxing, online harassment, and emerging threats like AI-generated sexual content (deepfakes) and gender disinformation.
Evidence
Uganda has no specific law addressing technology-facilitated gender-based violence. Reference to African Commission of Human and People’s Rights Resolution 522 on protection of women against digital violence, calling for member states to enact specific laws addressing digital violence
Major discussion point
Gender Dimensions of Cybercrime
Topics
Legal and regulatory | Gender rights online | Cybercrime
Agreed with
– Tina Power
– Michael Ilishebo
Agreed on
Legal frameworks across Africa are inadequate, fragmented, or non-existent for addressing cybercrime
Multi-stakeholder partnerships needed including engagement of boys, men, and communities in prevention efforts
Explanation
Reducing online gender-based violence requires comprehensive prevention strategies that engage all community members, particularly boys and men. This includes challenging harmful gender norms, promoting respectful digital behavior, and supporting male champions and peer educators to model positive engagement in schools and communities.
Major discussion point
Recommendations for Improvement
Topics
Gender rights online | Sociocultural | Human rights principles
Digital literacy and awareness programs essential, including community-based workshops and integration into school curricula
Explanation
Comprehensive digital literacy programs are crucial for prevention and response, including education on online safety, privacy settings, recognizing scams and harassment, and reporting channels. These programs should use diverse methods like community workshops, radio, social media, and innovative approaches like comic books, while being integrated into school curricula.
Evidence
Recently explored use of comic books which was very impactful, going to communities and reading to them in a fun way to educate on digital security tips and raise awareness that online gender-based violence is real
Major discussion point
Recommendations for Improvement
Topics
Online education | Digital access | Capacity development
Audience
Speech speed
142 words per minute
Speech length
668 words
Speech time
282 seconds
Need for continental approach where violations in one African country result in sanctions applicable across the continent
Explanation
African countries should adopt a unified approach similar to Europe, where violating laws in one country results in continent-wide sanctions. This would force big tech companies to respect African national laws and values, as currently they can ignore individual country requests while responding quickly to European demands.
Evidence
Content that big tech will take down in less than a day in Europe can leave African victims shuffling papers for two weeks while they continue to suffer and sometimes fall into depression. Suggests Africa needs a concerted continental position similar to the European approach
Major discussion point
Public-Private Partnership Challenges
Topics
Legal and regulatory | Liability of intermediaries | Jurisdiction
Disagreed with
– Michael Ilishebo
– Audience (Senator)
– Tina Power
Disagreed on
Civil society role in cybercrime legislation
Civil society organizations sometimes oppose cybercrime law amendments believing they constrain freedom of expression
Explanation
There is tension between protecting victims and preserving civil liberties, as civil society organizations often resist cybercrime legislation amendments out of concern that they will restrict freedom of expression. This creates challenges for governments trying to strengthen victim protection laws while balancing competing rights and interests.
Evidence
Nigeria enacted its cybercrime law in 2015, amended it last year, and is amending it again due to the UN Convention. Civil society organizations are always up in arms when there is an attempt to amend the cybercrime law to protect citizens, believing it will constrain civic space
Major discussion point
Public-Private Partnership Challenges
Topics
Freedom of expression | Legal and regulatory | Human rights principles
Disagreed with
– Michael Ilishebo
– Audience (Senator)
– Tina Power
Disagreed on
Civil society role in cybercrime legislation
Ottavia Galuzzi
Speech speed
145 words per minute
Speech length
1510 words
Speech time
624 seconds
AI is reducing technical barriers and enabling criminals from other fields to move into cybercrime
Explanation
Artificial intelligence is not creating entirely new criminals but is allowing existing criminals from other domains to transition into cybercrime by lowering the technical skill requirements. This trend is concerning because AI tools make sophisticated cybercrimes accessible to individuals who previously lacked the necessary technical expertise.
Evidence
Research shows that AI is not creating new criminals but allowing criminals in other fields to move into the cybercrime environment, which is concerning because AI lowers the technical barrier to entry
Major discussion point
Cybercrime Trends and Challenges in Africa
Topics
Cybercrime | Digital business models
UNICRI’s work addresses cyber threats through action-oriented research and capacity-building involving multiple stakeholders
Explanation
UNICRI’s cybercrime and online harms work stream explores the interplay of different cyber threats and harmful behaviors, seeking to develop inclusive and rights-based solutions. The organization addresses these threats through targeted research on niche areas, capacity-building with tech companies, and technical assistance to member states for policy-making.
Evidence
UNICRI addresses cyber threats including terrorism, violent extremism online, gender-based violence, child abuse and exploitation, and hate speech through action-oriented research, capacity-building activities involving tech companies, and technical assistance to member states and policy-making
Major discussion point
Organizational Approaches to Cybercrime
Topics
Cybercrime | Capacity development | Human rights principles
The report being launched is unique in its Africa-focused, victim-centered approach informed by diverse stakeholders
Explanation
The Access to Justice in the Digital Age report stands out because it specifically focuses on Africa’s cybercrime challenges, centers on individual victim experiences rather than institutional hacks, and incorporates perspectives from private sector, civil society, UN agencies, and government. The report aims to provide practical solutions rather than just explaining what cybercrimes are.
Evidence
Report is unique in its focus on Africa, victim-centered nature focusing on lived experiences of ordinary people rather than hacks of big banks, informed by diverse stakeholders including private sector, civil society, UN agencies and government, and geared towards finding meaningful and practical solutions
Major discussion point
Research Methodology and Approach
Topics
Cybercrime | Human rights principles | Capacity development
Agreements
Agreement points
Cybercrime is rising in Africa but full scope remains unclear due to significant under-reporting
Speakers
– Tina Power
– Michael Ilishebo
– Sandra Aceng
Arguments
Cybercrimes are rising in Africa but full picture unclear due to significant under-reporting
Law enforcement faces capacity challenges including lack of technical skills and inadequate training
Many victims don’t recognize online harms as crimes or are unaware of available legal protections
Summary
All speakers agree that cybercrime is increasing across Africa, but the true extent is hidden by massive under-reporting caused by victims not recognizing crimes, lack of awareness of reporting mechanisms, and inadequate law enforcement capacity to handle cases properly.
Topics
Cybercrime | Capacity development | Digital access
Law enforcement and justice systems lack adequate capacity and training to handle cybercrime cases
Speakers
– Tina Power
– Michael Ilishebo
– Sandra Aceng
Arguments
Deep structural barriers hinder access to justice including patchwork of laws, non-existing laws, or outdated legislation
Law enforcement faces capacity challenges including lack of technical skills and inadequate training
Police often display dismissive attitudes toward online harms and lack specific training for tech-facilitated gender-based violence
Summary
There is unanimous agreement that law enforcement agencies across Africa lack the technical skills, training, and resources needed to effectively investigate cybercrime cases, leading to dismissive attitudes and inadequate responses to victims.
Topics
Cybercrime | Capacity development | Legal and regulatory
Legal frameworks across Africa are inadequate, fragmented, or non-existent for addressing cybercrime
Speakers
– Tina Power
– Michael Ilishebo
– Sandra Aceng
Arguments
Deep structural barriers hinder access to justice including patchwork of laws, non-existing laws, or outdated legislation
Most African countries are not party to the Budapest Convention, limiting international cooperation
Need for specific laws addressing technology-facilitated gender-based violence including non-consensual intimate images and AI-generated sexual content
Summary
All speakers acknowledge that current legal frameworks are insufficient, consisting of either patchwork legislation, outdated laws, or complete absence of relevant cybercrime legislation, particularly for gender-based online violence.
Topics
Legal and regulatory | Cybercrime | Gender rights online
Gender-based cybercrimes disproportionately affect women and require specialized approaches
Speakers
– Tina Power
– Sandra Aceng
Arguments
Certain types of cybercrimes are inherently gendered with women disproportionately affected, especially regarding harassment, doxing, and intimate images
Victims face stigma, fear of public shaming, and cultural norms that discourage reporting, especially for sexual/intimate abuse
Summary
Both speakers agree that women face unique challenges in cybercrime victimization, experiencing disproportionate targeting for intimate and personal harms, combined with cultural barriers that prevent reporting and seeking justice.
Topics
Gender rights online | Cybercrime | Human rights principles
Similar viewpoints
Both speakers emphasize the critical importance of well-crafted, up-to-date legislation as the foundation for addressing cybercrime, with Tina focusing on victim-centered legal frameworks and Michael highlighting the need for continuous updates due to evolving technology.
Speakers
– Tina Power
– Michael Ilishebo
Arguments
Good laws matter as they provide clear definitions, guidance, and legal basis for victims to seek redress
Enhanced legal frameworks must be continuously updated as cyber laws have limited shelf life unlike traditional criminal codes
Topics
Legal and regulatory | Cybercrime | Human rights principles
Both speakers highlight the inadequate institutional responses to cybercrime, with Michael focusing on platform non-cooperation and Sandra on police dismissiveness, both contributing to victims’ inability to access justice.
Speakers
– Michael Ilishebo
– Sandra Aceng
Arguments
Social media platforms often refuse to provide information or remove content, citing community guidelines over local laws
Police often display dismissive attitudes toward online harms and lack specific training for tech-facilitated gender-based violence
Topics
Legal and regulatory | Liability of intermediaries | Cybercrime
Both speakers advocate for comprehensive, rights-based approaches that involve multiple stakeholders and prioritize victim needs, emphasizing the importance of community engagement and human rights principles in addressing cybercrime.
Speakers
– Tina Power
– Sandra Aceng
Arguments
Laws must be victim-centered, grounded in human rights, and well-communicated to both public and justice system actors
Multi-stakeholder partnerships needed including engagement of boys, men, and communities in prevention efforts
Topics
Human rights principles | Gender rights online | Capacity development
Unexpected consensus
AI is enabling criminals from other fields to transition into cybercrime by lowering technical barriers
Speakers
– Michael Ilishebo
– Ottavia Galuzzi
Arguments
AI is enabling criminals from other fields to move into cybercrime by lowering technical barriers
AI is reducing technical barriers and enabling criminals from other fields to move into cybercrime
Explanation
It’s unexpected that both a law enforcement practitioner and a research organization representative would specifically identify the same emerging trend about AI democratizing cybercrime capabilities, showing sophisticated understanding of how technology is reshaping the criminal landscape.
Topics
Cybercrime | Digital business models
Need for continental approach to cybercrime legislation and enforcement
Speakers
– Michael Ilishebo
– Audience (Senator)
Arguments
Need for mirrored cybercrime legislation across African countries to address jurisdictional challenges
Need for continental approach where violations in one African country result in sanctions applicable across the continent
Explanation
The convergence between a law enforcement officer’s technical perspective and a parliamentarian’s policy perspective on the need for harmonized African cybercrime laws demonstrates unexpected alignment between operational and legislative viewpoints.
Topics
Legal and regulatory | Jurisdiction | Cybercrime
Tension between civil society concerns about freedom of expression and victim protection needs
Speakers
– Michael Ilishebo
– Audience (Senator)
– Tina Power
Arguments
Law enforcement faces capacity challenges including lack of technical skills and inadequate training
Civil society organizations sometimes oppose cybercrime law amendments believing they constrain freedom of expression
Laws must be victim-centered, grounded in human rights, and well-communicated to both public and justice system actors
Explanation
Unexpected consensus emerged on the challenge of balancing victim protection with civil liberties, with speakers from different sectors acknowledging this tension and the need for human rights-grounded solutions.
Topics
Human rights principles | Freedom of expression | Legal and regulatory
Overall assessment
Summary
Strong consensus exists on core challenges: rising cybercrime with massive under-reporting, inadequate legal frameworks, insufficient law enforcement capacity, and disproportionate impact on women. Speakers also agree on fundamental solutions including better laws, enhanced capacity building, and multi-stakeholder approaches.
Consensus level
High level of consensus with remarkable alignment across different sectors (law enforcement, civil society, legal experts, policymakers) on both problem identification and solution directions. This strong agreement suggests a mature understanding of cybercrime challenges in Africa and creates a solid foundation for coordinated action. The consensus spans technical, legal, and social dimensions, indicating comprehensive shared understanding of the issue’s complexity.
Differences
Different viewpoints
Civil society role in cybercrime legislation
Speakers
– Michael Ilishebo
– Audience (Senator)
– Tina Power
Arguments
Civil society organizations sometimes oppose cybercrime law amendments believing they constrain freedom of expression
Need for continental approach where violations in one African country result in sanctions applicable across the continent
Laws must be victim-centered, grounded in human rights, and well-communicated to both public and justice system actors
Summary
There’s tension between law enforcement/government officials who see civil society as obstructing victim protection laws, and the legal perspective that emphasizes balancing human rights including freedom of expression. The Senator and Michael view civil society opposition as problematic, while Tina acknowledges the need to strike a balance and ground laws in human rights principles.
Topics
Freedom of expression | Legal and regulatory | Human rights principles
Unexpected differences
Approach to big tech accountability
Speakers
– Michael Ilishebo
– Audience (Senator)
Arguments
Social media platforms often refuse to provide information or remove content, citing community guidelines over local laws
Need for continental approach where violations in one African country result in sanctions applicable across the continent
Explanation
While both speakers identify big tech non-compliance as a problem, they propose different solutions. Michael focuses on improving public-private partnerships and meeting platforms’ criteria for information sharing, while the Senator advocates for a more confrontational continental sanctions approach similar to Europe’s model. This represents a tactical disagreement on engagement versus enforcement strategies.
Topics
Legal and regulatory | Liability of intermediaries | Jurisdiction
Overall assessment
Summary
The discussion shows remarkable consensus on problem identification but reveals nuanced disagreements on solutions and implementation strategies. Main areas of disagreement center on balancing human rights with victim protection, the role of civil society in legislation, and strategies for big tech accountability.
Disagreement level
Low to moderate disagreement level with high strategic alignment. The disagreements are primarily tactical rather than fundamental, focusing on implementation approaches rather than core objectives. This suggests strong potential for collaborative solutions that incorporate different perspectives, though careful navigation of civil society relations and human rights balance will be crucial for effective policy development.
Takeaways
Key takeaways
Cybercrime is a growing justice issue in Africa, not just a technical problem, requiring comprehensive legal frameworks and institutional capacity building
There is significant under-reporting of cybercrimes due to victims not recognizing crimes, lack of awareness of reporting mechanisms, stigma (especially for gender-based violence), and inadequate police response
Gender-based cybercrimes disproportionately affect women and LGBTQI+ communities, with personal/intimate crimes receiving less serious treatment than financial crimes
Legal frameworks across Africa are inadequate – either non-existent, outdated, or fragmented – and most African countries are not party to international conventions like Budapest Convention
Law enforcement faces critical capacity gaps including lack of technical skills, training, and resources to handle cybercrime investigations effectively
AI is lowering barriers to cybercrime entry, enabling criminals from other fields to move into cyber offenses
Public-private partnerships with tech companies are problematic, with platforms often refusing to cooperate with African law enforcement or remove harmful content
Effective solutions require victim-centered, human rights-based approaches that balance crime prevention with protection of fundamental rights like freedom of expression
Resolutions and action items
UNICRI and ALT Advisory published and launched the research report ‘Access to Justice in the Digital Age: Empowering Victims of Cybercrime in Africa’
The report provides a cybercrime legislative audit tool for countries to assess their current legal frameworks and identify gaps
A victim-centered cybercrime model law is being developed to help countries fill legislative gaps
Recommendation to establish specialized technology-facilitated gender-based violence response desks at police stations with trained female officers
Proposal for African countries to adopt a continental approach to cybercrime sanctions, similar to Europe’s coordinated response
Call for enhanced international cooperation through increased African participation in cybercrime conventions, particularly the upcoming UN Convention
Development of clear coding systems for cybercrime investigations to improve data collection and case management
Unresolved issues
How to effectively balance cybercrime legislation with freedom of expression concerns raised by civil society organizations
Lack of cooperation from major tech platforms in providing information to African law enforcement and removing harmful content
Jurisdictional challenges when perpetrators and victims are in different countries with varying legal frameworks
Insufficient funding and resources for law enforcement capacity building and training programs
Limited public trust in institutions’ ability to handle cybercrime cases effectively
How to scale up victim support services, particularly in rural areas with limited digital literacy
The challenge of keeping cybercrime laws current as technology evolves rapidly, unlike traditional criminal codes
Addressing the root causes of gender-based online violence through community engagement and changing social norms
Suggested compromises
Grounding all cybercrime legislation in human rights principles to address civil society concerns about freedom of expression while protecting victims
Developing multi-stakeholder partnerships that include government, civil society, private sector, and international organizations in crafting solutions
Creating mirrored cybercrime legislation across African countries to address jurisdictional issues while respecting national sovereignty
Balancing the need for swift content removal with due process rights by establishing clear criteria for when platforms must act on law enforcement requests
Implementing graduated training approaches that build capacity at different levels of the justice system rather than trying to address all gaps simultaneously
Using multiple communication channels (community workshops, radio, social media, comic books) to reach diverse populations with cybercrime awareness
Thought provoking comments
Ultimately, the report has made it clear that cybercrime is not just a technical issue, it’s a justice issue. If we don’t have correct avenues for justice, victims and survivors will not be able to seek recourse, we will lack accountability, and we will not be able to move forward as a global community.
Speaker
Tina Power
Reason
This comment reframes the entire cybercrime discourse by shifting focus from technical solutions to justice mechanisms. It challenges the common assumption that cybercrime is primarily a technological problem requiring technological fixes, instead positioning it as fundamentally about access to justice and human rights.
Impact
This insight established the foundational framework for the entire discussion, steering subsequent speakers to focus on victim-centered approaches, legal frameworks, and institutional capacity rather than purely technical solutions. It influenced how other panelists framed their responses, with Michael discussing law enforcement capacity and Sandra emphasizing victim support mechanisms.
We have seen an increase in terms of the cyber crime that is happening due to the use of AI. We have seen people that may not even actually know how to code. People may not even know how to frame this and that. They are using AI to enhance their criminal skills.
Speaker
Michael Ilishebo
Reason
This observation introduces a critical new dimension to cybercrime – that AI is democratizing criminal capabilities by lowering technical barriers. It’s particularly insightful because it identifies AI not as creating new criminals, but as enabling existing criminals to migrate into cybercrime with minimal technical knowledge.
Impact
This comment prompted Ottavia to elaborate that ‘AI is not creating new criminals, is actually allowing criminals in other fields moving into the cybercrime environment,’ deepening the discussion about evolving threat landscapes. It shifted the conversation toward understanding how technological advancement is changing the nature of criminal activity and the implications for law enforcement preparedness.
You cannot fix what is broken by tech using the law. However, we also need the law.
Speaker
Sandra Aceng
Reason
This paradoxical statement captures the complex relationship between technology and legal frameworks in addressing cybercrime. It acknowledges both the limitations of legal solutions for technological problems while affirming their necessity, highlighting the need for multi-faceted approaches.
Impact
This comment reinforced Tina’s earlier point about laws not being sufficient alone, while adding nuance about the inherent limitations of legal remedies for technology-facilitated harms. It helped establish consensus among panelists about the need for comprehensive, multi-stakeholder approaches rather than single-solution thinking.
Unless and until we are able to bring big tech to the table, and I suggest a situation where Africa as a continent comes together and does the same thing that happens in Europe. You violate the law in a part of Europe, the entire Europe will fine you. Unless and until all of those big tech companies know that if you violate the law in South Africa, your sanction is not limited to South Africa. The same sanction will be applicable in Côte d’Ivoire, in Ghana, in Nigeria. That is the only time they will respect our national laws, our national values.
Speaker
Senator Shweba Fola Besalisu
Reason
This comment introduces a geopolitical power analysis to the cybercrime discussion, highlighting how continental unity could create leverage against global tech platforms. It’s insightful because it moves beyond individual country approaches to propose collective action as a solution to the power imbalance between African nations and multinational tech companies.
Impact
This intervention shifted the discussion from national-level solutions to continental strategy, introducing themes of digital sovereignty and collective bargaining power. It prompted Michael to acknowledge that challenges are similar across African countries and suggested the need for continued conversations beyond the panel, indicating this perspective opened new avenues for thinking about regional cooperation.
Civil society organizations, every time there’s an attempt to amend the cybercrime law to protect citizens, the civil society organizations are always up in arms, believing and thinking that it’s about to constrain the space, forgetting that sometimes where your own rights stop is where the rights of somebody else start.
Speaker
Senator Shweba Fola Besalisu
Reason
This comment exposes a fundamental tension in cybercrime legislation between protecting victims and preserving civil liberties. It’s thought-provoking because it challenges civil society’s role and suggests that advocacy for digital rights might sometimes conflict with victim protection, raising questions about balancing competing human rights.
Impact
This comment prompted Tina to acknowledge the ‘very fine line’ and emphasize the need for human rights-grounded legislation, while an audience member from Finland provided a counter-perspective about civil society as a ‘symptom of public distrust.’ This exchange deepened the discussion about the complex relationship between different stakeholders and the challenges of multi-stakeholder governance in cybercrime policy.
The report highlights quite an important distinction between financial harm and personal harm… what is interesting about the distinction is how it is responded to. If you report a financial crime or a theft, there’s often a quicker response and a more immediate response, whereas reporting personal crimes are seen as more intimate or domestic in nature and not always taken as seriously.
Speaker
Tina Power
Reason
This distinction reveals systemic bias in how different types of cybercrimes are prioritized and investigated. It’s insightful because it exposes how traditional gender biases in law enforcement extend into the digital realm, where crimes affecting women disproportionately receive less serious treatment.
Impact
This categorization provided a framework that other speakers built upon throughout the discussion. Sandra’s examples of intimate image sharing and the dismissive police responses directly illustrated this distinction, while Michael’s discussion of law enforcement challenges implicitly acknowledged these different response patterns. It helped structure the conversation around understanding why certain victims face greater barriers to justice.
Overall assessment
These key comments fundamentally shaped the discussion by establishing it as a victim-centered, justice-focused conversation rather than a purely technical cybersecurity discussion. Tina’s reframing of cybercrime as a justice issue set the tone for examining systemic barriers and human rights implications. The insights about AI democratizing criminal capabilities and the distinction between financial and personal harms added analytical depth by revealing how technological and social factors intersect to create new challenges. The Senator’s interventions introduced crucial geopolitical and governance dimensions, highlighting power imbalances and stakeholder tensions that often remain unaddressed in cybercrime discussions. Together, these comments elevated the conversation from a problem-identification exercise to a nuanced exploration of structural inequalities, continental cooperation strategies, and the complex balance between rights protection and victim support. The discussion evolved from individual country experiences to broader systemic analysis, ultimately emphasizing the need for comprehensive, multi-stakeholder, rights-based approaches to cybercrime in Africa.
Follow-up questions
How can we better capture the full picture of cybercrime in Africa given significant under-reporting?
Speaker
Tina Power
Explanation
There’s a gap between known cybercrime trends and actual reality due to victims not knowing they’ve been affected, not knowing where to report, or facing stigma in reporting
What specific technical capacity building is needed for law enforcement institutions in Africa?
Speaker
Michael Ilishebo
Explanation
Law enforcement faces challenges with technical skills for cybercrime investigations, and more detailed understanding of capacity needs would help address these gaps
How can we achieve better balance between protecting victims and preserving freedom of expression in cybercrime legislation?
Speaker
Michael Ilishebo
Explanation
There’s tension between civil society concerns about freedom of expression being stifled and the need for victim-specific laws
How can we improve cooperation between social media platforms and law enforcement for victim protection?
Speaker
Michael Ilishebo
Explanation
Current challenges exist where platforms refuse to remove content or provide information even when victims feel unsafe, citing community guidelines
How can African countries develop mirrored cybercrime legislation across jurisdictions?
Speaker: Michael Ilishebo
Explanation: The cross-border nature of cybercrime requires harmonized laws so that offenses are consistently defined across African countries.
What are effective models for scaling up survivor-centered support services, particularly in rural areas?
Speaker: Sandra Aceng
Explanation: There is a need to expand access to support platforms, toll-free lines, and psychosocial services, especially for underserved communities.
How can we effectively engage boys, men, and communities in preventing online gender-based violence?
Speaker: Sandra Aceng
Explanation: Prevention requires challenging harmful gender norms and promoting positive digital behavior through community engagement.
How can civil society organizations be better integrated into cybercrime law development processes?
Speaker: Senator Shweba Fola Besalisu
Explanation: There is a need to address tensions between civil society concerns about rights restrictions and victim protection needs.
How can Africa develop a continental approach to enforcing cybercrime sanctions against big tech companies?
Speaker: Senator Shweba Fola Besalisu
Explanation: Current enforcement is ineffective because sanctions are limited to individual countries rather than forming part of a coordinated continental response.
How can public trust in law enforcement institutions be improved to increase cybercrime reporting?
Speaker: Juri Bokovoy
Explanation: Under-reporting is significantly affected by a lack of trust in government capabilities to handle cybercrime cases.
Which countries have successfully implemented laws against tech-facilitated gender-based violence, and how have they popularized them?
Speaker: Dr. Wakavi (online participant)
Explanation: Concrete models of successful implementation strategies are needed for other countries to learn from.
Disclaimer: This is not an official session record. DiploAI generates these resources from audiovisual recordings, and they are presented as-is, including potential errors. Due to logistical challenges, such as discrepancies in audio/video or transcripts, names may be misspelled. We strive for accuracy to the best of our ability.
