Towards a Resilient Information Ecosystem: Balancing Platform Governance and Technology
8 Jul 2025 09:00h - 09:45h
Session at a glance
Summary
This UNESCO session focused on building resilient information ecosystems through balanced platform governance and technology, examining challenges and solutions in the digital media landscape. Assistant Director General Tawfik Jelassi opened by explaining UNESCO’s constitutional mission to build peace through education and culture, noting how digital platforms have become sources of misinformation, hate speech, and harmful content that undermine this goal. He highlighted UNESCO’s “For an Internet of Trust” initiative, which developed guidelines for digital platform governance through extensive global consultations involving over 10,000 inputs from 134 countries.
Professor Ingrid Volkmer emphasized that current crises require shifting from focusing solely on information and disinformation to understanding communication more broadly as an ecosystem. She pointed to examples from Ukraine and the Democratic Republic of Congo, where vital connectivity, cloud storage, and AI platforms create complex communication networks dominated by global corporate monopolies that remain largely unregulated during crisis periods. Frédéric Bokobza from France’s ARCOM discussed regulatory challenges, particularly the need to reduce information asymmetry with platforms and implement transparency requirements under the Digital Services Act, while emphasizing the importance of direct cooperation channels with platforms, especially during elections.
Google’s Nadja Blagojevic outlined the company’s approach to user empowerment through provenance technologies like SynthID watermarking, context tools such as “About This Image,” and media literacy programs like “Be Internet Awesome.” Maria Paz Canales from Global Partners Digital highlighted positive uses of AI by civil society, including protecting protesters’ identities, real-time crisis reporting, and preserving collective memory. The discussion concluded with calls for enhanced global cooperation among regulatory authorities and recognition of information as a public good requiring multi-stakeholder solutions.
Keypoints
## Major Discussion Points:
– **Expanding Crisis Communication Beyond Disinformation**: The discussion emphasized moving from a narrow focus on disinformation to a broader communication ecosystem perspective, particularly during crises. This includes vital connectivity infrastructure (like Starlink), cloud storage, AI platforms, and various communication technologies that create a holistic crisis communication environment.
– **Platform Governance and Regulatory Challenges**: Regulatory authorities face significant challenges in governing digital platforms, particularly around transparency, accountability, and reducing information asymmetry with platforms. The discussion highlighted the importance of multi-stakeholder approaches and international cooperation among regulatory networks.
– **User Empowerment and Media Literacy**: The conversation addressed the dual responsibility of platforms to provide transparency tools and users to develop critical thinking skills. This includes technical solutions like digital watermarking (SynthID), context tools, and comprehensive media literacy programs for both children and adults.
– **Civil Society’s Role in Leveraging Technology**: The discussion explored how civil society organizations use AI and digital platforms positively to amplify marginalized voices, support freedom of expression, organize during crises, preserve collective memory, and enhance advocacy efforts despite funding constraints.
– **Building Resilient Information Ecosystems Through Global Cooperation**: The conversation emphasized the need for international collaboration among regulatory authorities, the importance of treating information as a public good, and the challenges of supporting public interest media in a market-driven digital environment.
## Overall Purpose:
The discussion aimed to assess progress on media-related aspects of the World Summit on the Information Society (WSIS) and identify key challenges and actions needed to maintain a resilient information ecosystem. The session was designed to gather multi-stakeholder input for the WSIS revision process, focusing on balancing platform governance with technological advancement.
## Overall Tone:
The discussion maintained a professional, collaborative tone throughout, characterized by constructive problem-solving rather than confrontational debate. Speakers acknowledged both the challenges and opportunities presented by digital platforms and AI, with a balanced approach that recognized the complexity of the issues. The tone remained consistently forward-looking and solution-oriented, emphasizing the need for cooperation among different stakeholder groups rather than adversarial relationships.
Speakers
– **Ana Cristina Ruelas** – UNESCO’s Freedom of Expression and Safety of Journalists Section within the Communication and Information Sector
– **Tawfik Jelassi** – Assistant Director General (ADG) at UNESCO
– **Ingrid Volkmer** – Professor of Media and Communication at the University of Melbourne, member of the i14 Global Knowledge Network
– **Frederic Bokobza** – Deputy Director General at the French Authority for Regulation of Audiovisual and Digital Communication (ARCOM), part of the Francophone Network of Regulatory Authorities (REFRAM)
– **Maria Paz Canales** – Head of Policy and Advocacy at Global Partners Digital
– **Nadja Blagojevic** – Global Go-to-Market Lead and Knowledge and Information Trust Manager at Google
– **Audience** – Various participants asking questions during the session
**Additional speakers:**
None identified beyond the provided speaker names list.
Full session report
# Building Resilient Information Ecosystems: UNESCO WSIS Review Session on Platform Governance
## Executive Summary
This UNESCO session, part of the World Summit on the Information Society (WSIS) review process, examined critical challenges facing global information ecosystems. Moderated by Ana Cristina Ruelas, the discussion brought together regulatory authorities, academia, civil society, and technology sector representatives to assess progress on media-related aspects of WSIS and identify actions needed to maintain resilient information ecosystems. The session explored how digital platforms have transformed the information landscape, creating both opportunities for democratic participation and threats to social cohesion.
## Opening Framework: Information as a Public Good
Assistant Director General Tawfik Jelassi opened by establishing UNESCO’s constitutional mission to build peace through education, science, and culture. He quoted 2021 Nobel Peace Prize laureate Maria Ressa, stating: “without facts, there is no truth. And without truth, there is no trust. And without trust, there is no shared reality upon which we can act.”
Jelassi highlighted UNESCO’s 33 years of convening the World Press Freedom Conference and the World Press Freedom Prize it has awarded for 27 consecutive years, emphasizing that digital platforms, while offering potential for knowledge sharing, have become sources of misinformation and hate speech. He cited an MIT study showing that “lies online travel ten times faster than the truth.”
The Assistant Director General outlined UNESCO’s “For an Internet of Trust” initiative, which developed guidelines for digital platform governance through global consultations involving inputs from 134 countries. He emphasized treating information as a public good rather than a commercial commodity, noting that misinformation and online violence “do not contribute to peace building” but rather “contribute to more hatred, more divisive communities, and sometimes to genocides and to conflicts and wars.”
## Paradigm Shift: From Information to Communication Ecosystems
Professor Ingrid Volkmer from the University of Melbourne argued for expanding focus beyond information and disinformation to understanding communication as an interconnected ecosystem. She stated: “it is not just information or disinformation that is central to crisis, but communication more broadly… we need to shift from this information paradigm into a broader communication paradigm, which is relevant for regulation of crisis communication.”
Volkmer provided concrete examples from Ukraine, where crisis communication involves complex networks including Starlink satellite connectivity, cloud storage platforms, open source software, and Magtag systems for trauma treatment. She highlighted concerns about data sovereignty, noting that crucial national information stored in commercial cloud systems raises questions about access and control during crises.
Her analysis revealed how global corporate monopolies dominate critical communication infrastructure during crisis periods, often remaining largely unregulated despite their essential role in maintaining social cohesion and democratic processes.
## Regulatory Approaches and International Cooperation
Frédéric Bokobza, Deputy Director General at France’s ARCOM and representative of the Francophone Network of Regulatory Authorities (REFRAM), discussed practical regulatory challenges. He emphasized the need to reduce information asymmetry between regulators and platforms while implementing effective transparency requirements.
Bokobza highlighted the Digital Services Act’s significance in establishing transparency obligations for platforms regarding content moderation and algorithmic decision-making. He stressed the importance of direct communication channels with platforms, particularly during critical periods like elections.
He noted that REFRAM and ACRAN convened a meeting in Abidjan last year with four major platforms (Meta, X, TikTok, and Google), securing voluntary commitments on moderation practices, transparency measures, and local language support. This represents a shift from confrontational approaches toward collaborative frameworks that maintain accountability while recognizing the practical complexities of platform governance.
## Technology Solutions and User Empowerment
Nadja Blagojevic, Google’s Global Go-to-Market Lead and Knowledge and Information Trust Manager, outlined technology sector approaches to information ecosystem challenges through user empowerment and technical innovation.
Blagojevic detailed Google’s implementation of provenance technologies, including the SynthID watermarking system for identifying AI-generated content and the Coalition for Content Provenance and Authenticity (C2PA) standards. She described context tools like “About This Image” that provide users with information about image sources and verification status.
She emphasized Google’s media literacy programs, including “Be Internet Awesome,” which builds critical thinking skills across multiple countries. Blagojevic acknowledged the complexity of serving diverse global audiences, noting that “what works in one country or culture may not work in another” and recognizing “differing needs and sometimes divergent interests” in global platform governance.
Google has open-sourced the SynthID watermarking technology for text, signalling a commitment to technical transparency and collaborative solutions.
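To make the watermarking idea more concrete, the toy Python sketch below illustrates the general principle behind keyed statistical text watermarking as discussed in the session: generation is nudged toward a secret, context-dependent subset of the vocabulary, and a detector holding the same key measures how strongly that bias shows up. This is a minimal illustration only, not SynthID’s actual algorithm or API; the key, vocabulary, bias strength, and function names are assumptions made for the example.

```python
import hashlib
import random

# All names and values below are illustrative assumptions, not SynthID itself.
SECRET_KEY = b"demo-watermark-key"        # real systems manage keys securely
NGRAM_LEN = 3                             # how much preceding context seeds each step
VOCAB = [f"tok{i}" for i in range(1000)]  # toy vocabulary standing in for a tokenizer
BIAS = 0.9                                # probability of drawing from the keyed "green" list


def green_list(context: tuple) -> set:
    """Derive a keyed, context-dependent 'green' half of the vocabulary."""
    seed = hashlib.sha256(SECRET_KEY + "|".join(context).encode()).digest()
    rng = random.Random(seed)
    return set(rng.sample(VOCAB, len(VOCAB) // 2))


def generate(length: int) -> list:
    """Toy 'model': sample tokens, preferring the green list (the watermark bias)."""
    out = []
    for _ in range(length):
        greens = green_list(tuple(out[-NGRAM_LEN:]))
        pool = list(greens) if random.random() < BIAS else VOCAB
        out.append(random.choice(pool))
    return out


def green_fraction(tokens: list) -> float:
    """Detection statistic: share of tokens that fall in their context's green list."""
    hits = sum(
        tokens[i] in green_list(tuple(tokens[max(0, i - NGRAM_LEN):i]))
        for i in range(len(tokens))
    )
    return hits / len(tokens)


if __name__ == "__main__":
    watermarked = generate(300)
    unmarked = [random.choice(VOCAB) for _ in range(300)]
    print(f"green fraction, watermarked: {green_fraction(watermarked):.2f}")  # ~0.95
    print(f"green fraction, unmarked:    {green_fraction(unmarked):.2f}")     # ~0.50
```

Text watermarked this way carries no visible marker; only a detector holding the key can measure the statistical bias, a degraded but still detectable version of which typically survives light editing, which is the kind of robustness the session attributed to SynthID.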
## Civil Society Perspectives and Technology Applications
Maria Paz Canales from Global Partners Digital highlighted both positive AI applications and challenges facing civil society organizations. She outlined how civil society uses AI for protecting protesters’ identities, enabling crisis reporting, amplifying marginalized voices, and preserving collective memory through digital archiving.
Canales noted funding constraints that limit civil society’s ability to compete with well-resourced commercial and state actors, while demonstrating how technology enables more effective advocacy and broader audience reach.
She introduced important nuance regarding media support, stating: “I am very careful coming from Latin America about drawing the line between traditional media and public interest media because they are not the same, at least from the region that I come from.” This highlighted how media ownership and political contexts vary dramatically across regions.
## Key Challenges and Audience Concerns
Audience interventions raised critical concerns about the economic sustainability of public interest media, noting that 75% of advertising revenues now flow to digital platforms. This has created a funding crisis for traditional media organizations that requires public sector intervention to maintain diverse media ecosystems.
Questions were raised about supporting media organizations that serve the public interest while addressing the financial challenges created by platform competition for advertising revenue.
## Future Directions and Next Steps
UNESCO announced plans to move forward with pilot implementations of their platform governance guidelines. The Global Forum of Networks established a 2024-2026 roadmap focusing on developing network capacities, knowledge sharing, and action on generative AI and synthetic content.
REFRAM committed to continued technical forum meetings with major platforms to monitor commitments on moderation, transparency, and local language support. The emphasis was on maintaining collaborative approaches while ensuring accountability.
## Conclusion
The session demonstrated evolving approaches to digital governance that emphasize collaboration between stakeholders while addressing fundamental challenges in information ecosystem resilience. Key themes included treating information as a public good, the need for multi-stakeholder cooperation, and balancing platform accountability with recognition of cultural diversity and varying population needs.
Significant challenges remain, including the economic sustainability of public interest media, data sovereignty concerns during crises, and governance of global infrastructure monopolies. The discussion highlighted the need for continued dialogue, practical experimentation with governance mechanisms, and commitment to serving democratic objectives rather than purely commercial interests.
The path forward requires sustained multi-stakeholder cooperation, innovation in both technical solutions and regulatory approaches, and recognition that effective governance must balance global coordination with respect for cultural diversity and national sovereignty.
Session transcript
Ana Cristina Ruelas: Mrs. Maria Paz Canales, Ms. Ingrid Volkmer, Mr. Guilherme Canela de Souza. Thank you, everyone, for being here. I am Ana Cristina Ruelas, and I am part of UNESCO’s Freedom of Expression and Safety of Journalists Section within the Communication and Information Sector. And this session is to talk about how we’re doing in the media line on WSIS, considering that we are in the process of revising these important actions. We have here our Assistant Director General to give us a first thought, a food for thought, as everyone says, to start discussing this issue. And then I will introduce you to all of our speakers. ADG, Tawfik Jelassi, please.
Tawfik Jelassi: Thank you very much, Ana Cristina. Good morning to all of you and thank you for being here on time on this Tuesday morning for this important session on Towards a Resilient Information Ecosystem, Balancing Platform Governance and Technology. As Ana Cristina said about her affiliation, UNESCO has been for more than three decades in charge of freedom of expression, media freedom, safety of journalists, access to information. We have been the designated agency within the United Nations system on these topics. And actually, we also have the UN Observatory for the Killings of Journalists. And for more than 33 years now, we have been organizing every year the World Press Freedom Conference, which is a major event of UNESCO. And for 27 years running, every year UNESCO awards its World Press Freedom Prize. In 2021, 2022 and 2023, the laureate of the UNESCO Prize received six months later the Nobel Peace Prize. And we do this conference always on May 3rd, which is, as you know, the World Press Freedom Day. So this is the context, because why is UNESCO working on digital platforms? Where does that come from? Is that part of the UNESCO mandate? We hear these questions all the time. People think that UNESCO is about culture, education and sciences. Not about digital platforms, not about press freedom, not about media freedom. And here, I said it in my opening remarks yesterday morning at this event, our focus on this topic for the last several years comes from our constitutional mission at UNESCO, which was defined 80 years ago, in 1945, when UNESCO was set up. And at the time, our founding fathers said the mission of UNESCO is to build peace in the minds of men and women. Some people say, well, this is maybe an overblown mission. How can you build peace, UNESCO? Well, we can, through education, through culture, through the sciences, in the minds of men and women. All starts with the mindset of the people from a very young age, like education, but also through culture, which is a very important lever for social cohesion, for solidarity, for intercultural dialogue. So starting with that mission from 1945, if you look at what has been happening through digital platforms over the last several years, we have been seeing an exponential increase of misinformation, hate speech, cyber bullying, online violence, and other forms of harmful online content. This does not contribute to peace building. This contributes to more hatred, more divisive communities, and sometimes to genocides and to conflicts and wars. And we have the evidence. I can show some examples of recent cases where disinformation through platforms have led to genocide and to communities fighting each other. Because once that disinformation is believed by some members of society, once they rush to using arms, that’s it. You cannot take it back. You cannot say, by the way, that was fake information. Let me give you the fact-checked information. Too late. We want information to be a common public good. We don’t want information to be a common hazard, risk, nor a common harm. That’s our starting point. And that’s why, three years ago, we launched a major global initiative called For an Internet of Trust. We want trustworthy information. We want fact-checked information. And a recent study from MIT shows that lies online travel ten times faster than the truth. People say, do you have evidence? Well, here is one study. You can look it up at MIT. So, is it about fact or fake? Is it about truth or lies? 
And let me quote the 2021 Nobel Peace Prize laureate, Maria Ressa, the journalist from the Philippines, who once said, without facts, there is no truth. And without truth, there is no trust. And without trust, there is no shared reality upon which we can act. And that’s why we called it For an Internet of Trust. And we developed, through open global consultations, three of them, which we ran, and we used an inclusive, multi-stakeholder approach. We received over 10,000 inputs from our 194 member states, civil society organizations, academia, research, technical community, the media. 10,000 inputs coming from 134 countries. And we summarized them in this booklet, which we published in November 2023. And now we are about to start pilot implementations of the UNESCO guidelines for the governance of digital platforms. Nobody can stop digital platforms from becoming even more of a mainstream media, more of the number one source of information for hundreds of millions of people, especially the youth worldwide. All we can hope for is a more effective governance of these platforms through a set of principles: transparency, accountability, independent oversight bodies, user empowerment, content curation, content moderation. All these principles are detailed in this booklet, and the outcome here resulted also from the involvement, the engagement of the tech companies and the platform operators. So again, balancing platform governance and technology, the topic of this session, I wanted to give you one example of a contribution, a humble contribution by UNESCO to try to move the needle, and if all the key stakeholders, from regulators to the platform companies, to social media influencers, to digital content creators, if they all adhere to these principles, hopefully we’ll have a safer, more trustworthy cyberspace. And I’m sure that my colleague Ana Cristina will tell you some of the steps we have done recently, in particular with these key stakeholders, the regulators, but also the social media influencers and digital content creators, to say we need all of you, those who supply information online, but also working with those who consume information online, supply and demand, and we have a major program called Media and Information Literacy in the digital era, because we need to educate the users. We need to develop a critical mindset, helping them hopefully to distinguish between falsehood and fact-checked information, or to systematically check the source of information before they like and they share, and they become themselves amplifiers of this information. So, this is meant, my remarks are meant to set the stage. I hope I did so, Ana Cristina, but keep in mind it’s nine o’clock and they didn’t have a good coffee yet. Thank you.
Ana Cristina Ruelas: Thank you very much, ADG, for this opening. So, one of the key elements that UNESCO wanted to highlight in this acknowledgement of the Media Line is that, yes, it is true that we have advanced a lot over the last 20 years when it comes to calling information a public good, and I can say that one of the key achievements is to broaden the participation of different stakeholder groups in these discussions. Right now, we are sitting here between different stakeholder groups that represent the private sector, the social sector, the public sector, which is very, very important for this discussion because when it comes to the digital space, it doesn’t, it’s not anymore a thing that relates only to governments and the, let’s say, the regulated entities which are the platforms or the companies. It comes with a lot of other members of society that have something to say and that have something to do. So, in order to take stock of what we have gained, what we wanted here in this session is to try to acknowledge what is it that still remains as a challenge and what is it that we see now as a challenge given the specific context that we’re living in, but also what are the key elements, the key actions to maintain a resilient information ecosystem, in order to be able to input this new revision in a way that can help us identify what are the actions that we can do from the different perspectives and from the different roles in this multi-stakeholder manner. So for this we invited great, great speakers and I’m going to introduce you to Ingrid Volkmer. She’s Professor of Media and Communication at the University of Melbourne and she’s part and member of the i14 Global Knowledge Network, which is a network that helps us at UNESCO to observe how the guidelines are being implemented. We also welcome Frédéric Bokobza, who is the Deputy Director General at the French Authority for Regulation of Audiovisual and Digital Communication, ARCOM, and who’s also part of the Francophone Network of Regulatory Authorities. He’s joining us online. We have Maria Paz Canales, who is the Head of Policy and Advocacy at Global Partners Digital, and we have Nadja Blagojevic, I’m sorry, Global Go-to-Market Lead and Knowledge and Information Trust Manager at Google. So thank you very much and thank you to all of you that are here. I’m going to start with you, Ingrid, because I think that we need to acknowledge that we are in an ongoing global crisis in between many different conflicts. What new or exacerbated threats do information ecosystems face and what strategies are needed to safeguard and be resilient?
Ingrid Volkmer: Big questions. I will try to answer these big questions in three minutes and I have notes to discipline myself and really stay within the three minutes. Normally I speak freely but I feel I just need to go along my bullet points here. So I wanted to start with reminding ourselves that over the past few years, as I think we all agree, crisis scenarios are increasing, or we feel they are increasing, across world regions, from the pandemic to humanitarian crises, environmental crises, displacements to armed conflicts. Decades ago, as you might agree as well, the regulation of information during a national crisis had a focus on national linear media. Governments tried to create a national narrative, convincing national linear media to join and to promote that narrative. Today, as we just heard, social media platforms and specifically disinformation are seen as central for crisis information and are in focus of regulatory initiatives. Actually, across the world, we are doing a study and we clearly see that governments have a focus on disinformation, propaganda, those kinds of things, fake news, and feel that’s central. So they shift this paradigm from information on linear media to disinformation on social media platforms. However, when you look at current crises, for example, in Ukraine, in the DRC, the Democratic Republic of Congo, and other countries, it becomes clear that it is not just information or disinformation that is central to crisis, but communication more broadly. And it is really important to understand that we need to shift from this information paradigm into a broader communication paradigm, which is relevant, I feel, for regulation of crisis communication. To just use some examples from the Ukrainian case, as you well know, vital connectivity is provided by the LEO satellite system Starlink, which is a global monopoly and has invented that new low earth orbit space for connectivity. Crucial national information is stored on clouds. Open source software is used for military and civilian communications. Technology such as Magtag is developed to treat trauma. AI platforms are used by military personnel and civilians. In addition, good old social platforms are, too, embedded in this whole communication space, and they provide spaces for crisis bloggers, influencers and people engaging with their own communities. If we are looking at that, I think it becomes clear that most services, of course, are provided by major global corporate monopolies with commercial interests, starting with Starlink, owned by SpaceX, to Amazon Web Services, to ChatGPT by OpenAI, and they are widely unregulated in the context of crisis communication. These different dimensions need to be seen, in my view, from a holistic perspective, as they are creating a communication ecosystem of crisis communication, which will also be significant in the future, not just in Ukraine, but in other world regions. And as I said, we do a study in the Democratic Republic of Congo, we see that they are signing up to Starlink as well, and this sort of broader holistic perspective is coming into place there as well. And we have to have a focus on this in terms of regulation, that is my view. So it’s across the global north and south and not just in the global north anymore, because these services are provided by globalized platforms. This ecosystem is no longer national, but situated between transnational networks, actors, Starlink, Elon Musk, and national interests.
As most digital regulation is drafted for periods of peace, it is important to plan for crisis, to do risk assessments of crisis communication, and to consider crisis periods as a regulatory domain of their own.
Ana Cristina Ruelas: Thank you very much, Ingrid. So you have broadened the scope, you know, going beyond only, let’s say, the last layer, the one that we deal with as users every single day when it comes to these information disruptions, in order to try to think broader: how the information is stored, how the information comes, you know, how we can access information, and how things are shifting to a more concentrated space. Frédéric, thank you very much for being here. I don’t see you on the screen, but I hope that I can see you. So, Ingrid has highlighted the different risks that today’s information ecosystem faces in crisis-driven environments. From REFRAM, as part of this network of francophone regulatory authorities, but also as lead of the Arcom regulatory authority, what are the paramount challenges that regulatory authorities face to effectively govern this digital space, particularly digital platforms?
Frederic Bokobza: Yes. Good morning, Ana Cristina. Good morning, all. I hope you can hear me well, even if you can’t see me. Yes, we can. Oh, great. So, first of all, thanks a lot for the invitation. I’m really sorry I couldn’t be with you physically today. Very few quick words on Arcom’s role. Arcom is the French independent public authority regulating audiovisual and now digital communication. And regarding online platforms, what we do is basically we ensure that they fulfill their obligations to moderate content on their services in a transparent, accountable and balanced manner, and this covers the measures that they deploy to combat dissemination of illegal and harmful content, including disinformation of course, and that’s under the Digital Services Act, but also things like terrorist content or copyright infringing content. So to come to your question, what are the major challenges in governing digital platforms? Well, I would say I would start with the challenge to reduce the huge information asymmetry with the platforms’ internal information, so that we can actually make them accountable and go beyond, you know, the usual official soothing talk saying that everything is going well. On this, one significant step is that they have started in Europe to publish, from last year onwards, their first transparency reports, as well as, for the very large online platforms and search engines, their reports on systemic risk detection and the mitigation measures they took, and that’s in application of the Digital Services Act, and that’s for us a really very important first step in establishing this new balance of power with the platforms. Obviously there’s still a lot to do, lots of challenges ahead, which also are lots of opportunities, and to take two examples, well, some of the tools provided for by the DSA have not yet been fully implemented, and I think especially about researchers’ access to data for very large online platforms, which will be instrumental, and also the involvement more broadly of civil society, such as trusted flaggers. That really is a whole environment that we need; so in other words, a multi-stakeholder approach such as the one promoted by UNESCO is particularly important here if we want to succeed. And maybe a last remark, if I may, because I wouldn’t like our approach to be viewed as a mere confrontation with, or even opposition to, digital platforms; we obviously need to have direct communication and cooperation channels with them. On this I’ll take one example, that of elections. Obviously, election periods are crucial. It happens that in France we had three major elections last year, in 2024. What we did, we organised bilateral and multilateral meetings with the main online platforms and search engines in France, also with candidates and campaign teams, and other national authorities responsible for organizing elections and combating disinformation more broadly. So there the platforms could, well, present the resources they had deployed to comply with our recommendations, the difficulties they were facing, the doubts they may have had, and really having this direct communication channel with all these players helped raising awareness, raising trust, and was very useful for early detection and coordinated mitigation of any worrying trend in this context of elections. So, well, there would be much more to be said, but I’ll stop here for the moment to respect the three minutes. Thanks a lot.
Ana Cristina Ruelas: Thank you, Frédéric. That leads me directly to Nadja, because I think that one of the key elements is just the responsibility for platforms to be able to respond and to be transparent and accountable. But on the other side, there’s a lot of responsibility on the users to have control and to be able to engage safely with the different services where they are participating. But there is also a lot of good practice when it comes to platforms giving these users the opportunity and the tools to be able to engage. And in alignment with the UNESCO MIL action plan, Google has a lot of actions that aim to try to create user empowerment. Can you tell us a little bit about those?
Nadja Blagojevic: Yes, very happy to. And thank you so much for having Google here. We’re very happy to be speaking with you all today. When we think about the information ecosystem, I mean, this has been central to Google since our founding 25 years ago. Our mission has always been to help organize the world’s information and make it universally accessible and useful. And so from day one, we really thought about how to help people connect to the information that will matter to them. We have a lot of deep thinking around information quality. So for example, we have Search Quality Rater Guidelines. These are 100-page documents that outline how we think about expertise, experience, authoritativeness, trustworthiness, when we are deciding how to elevate and rank information on Google Search. Because we believe that it is an important responsibility for platforms to be able to surface information, the right information, to people when they need it. But we also, as Ana Cristina was alluding to, want to really make sure that we are providing people, our users, with tools and also skills to help them evaluate the information that they see. And when we think about what it means to have signals of trustworthiness in an AI era, we think about it in three different ways. We believe that first, there need to be strong signals of provenance. When content is created, it’s very important that there are technical mechanisms that allow people to understand whether that content was generated with AI or not. We have a number of different strategies that we use to do this at Google. And one of them is called SynthID, which is a digital watermarking technology, which basically embeds in the pixels of audio, imagery, video, and text, a watermark that identifies that content as being created by AI. This watermark is very difficult to remove, even if an image is cropped or has a filter applied. And this watermark can be a signal for people to understand whether or not that piece of content was created by AI. But whether or not something is created by AI doesn’t necessarily mean it’s trustworthy or not. You know, AI is often used for artistic expression, as you can see from the great exhibits lining the halls today. We also know, though, that AI can be used by actors who have abusive intents in mind. And that’s why we think this provenance technology is so important, and also why we have open-sourced SynthID for text, and also work with the Coalition for Content Provenance
And these tools, for example, called About This Image, allow people to understand for a given image, when was that image first crawled by our search engines? How often does it appear online? Here’s this, for example, a picture of a conflict from five years ago, and it’s in a news article purporting to be from last week. We really think that these kinds of context clues are very important for people to help them understand and sort of think critically about the information that they’re encountering online. And to help them build the skills to be able to do that. We’ve also invested deeply in media literacy programs and initiatives. We have a flagship program called Be Internet Awesome, which helps children understand how to think critically, how to determine fact from fake, and how to practice basic online safety skills. We run this program in more than 20 countries with more than 60 governmental and civil society partners to help build these skills in the next generation of information consumers. And we also run programs that are aimed at adults to help them build their critical thinking, media literacy, and sort of lateral reading skills. We think it’s very important for people to be able to help develop the capabilities and also the mindset to, you know, think about the content that they are encountering online in these ways. And all of these programs and initiatives are very much aimed at both leveraging the expertise and technical capabilities of Google while also working with third parties, civil society, governmental organizations, because this really will take a whole of society approach. Thank you.
Ana Cristina Ruelas: Thank you, Nadja. Maria Paz, Nadja already said that it is true that AI normally is perceived as a threat, but it’s also true that civil society, the ones that we are dealing with all the time and trying to create advocacy with, has also used AI to strengthen their actions and to make sure there’s more engagement and to uphold freedom of expression. So, could you share some compelling stories of how civil society can use AI to be resilient and to make the information ecosystem more resilient?
Maria Paz Canales: Ana Cristina, thank you for the invitation to be here today. Yeah, I think that we have been hearing in the introduction and in some of the intervention more of the kind of downside of the evolution of the technology and the threats of the use of AI. I think that it’s a good reminder also that during the WSIS implementation in these 20 years also, technology has supported the exercise of human rights and has opened the possibility of exercising freedom of expression in a way that was not possible before the spreading of this type of technology. So, one very relevant use that civil society has done of different platforms online and technologies at large has been actually elevating the voices of those traditionally marginalized, those that were not featured by the traditional media, those that were not accessing to being able to challenge the current powers in many places around the world. So, we don’t forget the positive side and the things that technology has brought in terms of opening the space and being able to increase participation in that space, which in no way undermine the challenges that now we are seeing in terms of the use of the technology and that we believe as Global Partners Digital that need to be confronted with governance, with enabling environment, with a digital skill, all the things that also previous speakers have mentioned. But to come to the specific question that you were posing to me, Ana Cristina, we have seen use of artificial intelligence and other technologies, as I said, for supporting the exercise of freedom of assembly, freedom of expression, enhancing the privacy of people that is engaging in public manifestation in many cases. For example, through enabling the report of what is going on in real time in an effective manner but obscuring the identities of the participants of the protest in order to protect them from any harmful consequences later on. Other cases that we have seen of the use of technology, even like use of the platform in a very innovative way during a crisis moment, things that also were referred by previous speakers, is that we can use the platform as civil society for reporting real time what is going on and organizing in a more effective manner the action of the platform around the content moderation when these situations are happening and many information needs to be verified and there we see all the work that has been done consistently by Trusted Flagger but also by traditional human rights organizations that have come to a space and support in this collective action. Other type of more like creative and innovative uses that we have seen of the technology from the civil society side, for example, using the power of artificial intelligence to recreate memories and to have a specific account of stories that have been lost in the time because the people that was part of them didn’t have a chance to record at the moment, they were like too young for the moment, so part of the collective memory of the humanity also have been supported in its reconstruction and support, hopefully that we don’t commit the same errors in the future. More largely, for example, civil society also have leveraged the power of AI technology and other technologies for being more effective in their own action, in their own advocacy, in being able to reproduce content and disseminate content in a more effective manner with less resources because currently we are living in a world in which the support for civil society action is confronting. 
a funding crisis, so being more efficient and leveraging some of these tools in order to be more effective for condensing information, accessing information and enabling that we can reach the audience with the necessary information could be also some positive use that we are seeing in this technology from civil society. I’ll stop there.
Ana Cristina Ruelas: Thank you very much. For the sake of time, I’m going to reduce the second part of the session, because I really want to hear the public and the questions that they have for you. But I want first to go to Frédéric, because he is the one that is not here in person. So Frédéric, can you just tell us how global cooperation, particularly bringing together the Global Forum of Networks of Regulatory Authorities, can help to reduce the regulatory fragmentation and build a global trustworthy digital space?
Frederic Bokobza: Yes, thanks Ana Cristina. So now I remove my Arcom hat and put on my REFRAM hat. REFRAM is the regulatory, the network for regulatory authorities in the francophone world, and it gathers around 30 authorities on three continents, Europe, America, Africa, but in majority in Africa. So obviously legal frameworks in these countries differ pretty much, but we all face the same challenges, right? We all want to achieve a safe online space free from incitement to hatred and violence, free from malicious disinformation, interference, harm to minors, etc. And so in this context, we are convinced that global cooperation frameworks that enable regulators to share experience, expertise, share evidence, really do pave the way for a more coherent international approach to online safety regulation, and that it’s really crucial. So we are committed to helping build such a global trustworthy digital space through cooperation, and we do so, well, first at the EU level, as we cooperate very concretely on a day-to-day basis with our counterparts in the other EU member states as well as with the EU Commission. And we are also committed to doing so at the international level through different networks of regulators, and there, well, the guidelines for the governance of digital platforms and the Global Forum of Networks that UNESCO initiated are a major step forward. We actually took part in the drafting of the guidelines and the launch of the forum a year ago in Dubrovnik, as chair of REFRAM at that time. So the members of the forum are pretty diverse, because we’re talking there about a number of regulators and networks of regulators, to quote a few of them: a network from Africa, ACRAN; two networks mostly based in Europe, the Media Board and EPRA; a global network, GOSRN; a network centered on the Mediterranean area; as well as the Ibero-American Audiovisual Regulators, PRAI; and of course the francophone network REFRAM. So the objective of the forum is to provide a space for collaboration and discussion between the various international regulatory authorities and networks, enabling them to exchange good practices on the latest developments in the governance of digital platforms. And all this with a human-based rights approach, human rights-based approach, sorry, and I really want to strongly insist on this. We also have a roadmap for action spanning from 2024 to 2026, which was adopted in consultation with members, and UNESCO has announced three priorities for the network this year: first, developing network capacities; second, knowledge sharing between members; and third, interestingly, action on generative AI, as was mentioned, and synthetic content. And maybe a last word quickly to try and illustrate achievements through cooperation. That’s an action that we took within REFRAM, together with ACRAN, the African network. We convened in Abidjan last year with representatives of four major platforms, Meta, X, TikTok, and Google. And they voluntarily took concrete commitments on moderation, transparency, use of French language and other local languages, which is so important, as well as cooperation with civil society and a few other items. And we have set up an annual forum to implement and continue this dialogue with the big platforms, and of course monitor how they do improve on the commitments they took. And we will actually hold the first technical forum meeting on Thursday this week in Paris, and our high-level meeting before the end of the year.
So again, that’s only an illustration of what cooperation can achieve, but there are many others. And to conclude, we really are convinced that cooperation among regulators and networks of regulators has a lot of value, and that UNESCO’s initiative, in particular with the Global Forum of Networks, is an excellent example of such valuable initiatives. Thanks.
Ana Cristina Ruelas: Thank you very much, Frédéric. So I’m going to give the floor to you, in case you have any questions.
Frederic Bokobza: I’d be happy to hear the other…
Ana Cristina Ruelas: Go ahead.
Audience: Thank you for the excellent interventions. I think this is such a critical topic, and I really wanted to lead off of Ingrid’s challenge to look at communication as a public good, because I think we need to see this as a comprehensive, market-driven ecosystem. And there is the issue of supporting trusted flaggers, trusted sources of information, and public interest media. In a session yesterday we heard about the many challenges trusted media face to provide trusted information, one of those being the rising cost of technical platforms, cloud computing and packaged services. So, treating information as a public good and looking at regulation, how do we manage the cost factors when the tech companies are just increasing their profits and we have, yes, this monopoly? How do we promote open source and other creators entering into the field? What kind of international landscape can we provide?
Ingrid Volkmer: Another big question, a big question. I think these are the questions we have to ask. The same applies to Starlink, and the same applies to this whole spectrum of different layers that engage in crisis communication, and I highlight crisis communication because, as I said in the beginning, they will increase. Concerning clouds, what we see in Ukraine is that everything has shifted to the cloud. And the question is, how about data privacy, how about access to that crucial national data from the cloud later on? And what sort of regulatory approaches are required in crisis times to ensure that, once the content has been moved to the cloud, it can be moved back? Under what circumstances can it be moved? These are crucial questions.
Audience: On public interest media and information, let’s go to Nobel Prize winner Joseph Stiglitz, who said that if information is a public good, then the public sector has to step in, because we cannot wait forever for tech companies or the private sector to support the financial viability of traditional media. Traditional media have been suffering; as you know, 75% of advertising revenues go online, with two companies mainly benefiting from that, Google and Meta. So again, how can traditional media thrive in this digital era? So the issue of financial viability of media is at the core of all of this. And when you talk about public interest, I don’t see any other solution than for the public sector to step in.
Ana Cristina Ruelas: Maria Paz?
Maria Paz Canales: I’m not sure I can add too much to what I already have been saying. I think that for me, there is a relevant point to make, also connected with my previous intervention, in the sense of understanding this as a challenge for public interest media. I am very careful coming from Latin America about drawing the line between traditional media and public interest media because they are not the same, at least from the region that I come from. So I think that it’s very important to foster a regulation that supports the public interest media, that supports models of dissemination of information that are not exclusively market-driven, such as a platform like Wikimedia and others of the same type. I think that that’s very relevant and we need to be acknowledging the way in which we address regulation. I think that that was very much discussed during the drafting of the UNESCO guidelines on platform governance and it’s something that we continue to work with when we work with partners across different jurisdictions in implementing those guidelines in the regulatory discussions at the national level. I think that it’s very important to make those differences and to really focus on the relevance of information as a public good and the public interest compared to exclusively market-driven solutions. Market-driven solutions can be aligned also with the public interest in many cases. And that’s another thing that we have learned in working with partners at the local level, at the national level. We need to be smart in finding the convergence points that we can address together and use also the possibilities of creating something for the platforms to go in the direction of supporting information as a public good. I will stop there.
Ana Cristina Ruelas: Nadja, I don’t want to leave without asking you something because I think that everyone has that question. Sometimes we hear a lot, for instance, from regulatory authorities or from civil society actors that it’s difficult to engage with industry in different ways. But what for you are the most significant barriers preventing effective collaboration and meaningful inclusion between the different stakeholders? And what concrete strategies do you think could target these barriers?
Nadja Blagojevic: Yes, thank you very much for that question. I think from our perspective at Google, we truly believe that these types of issues are something that no one entity, no matter how big they are, can solve alone. So we very much agree with the idea that we really need to all be working in concert to be addressing these issues. I think as we have heard from a few different people on the panel, these are global issues. And what works in one country or culture may not work in another. We are very much invested in trying to work collaboratively to help understand and develop solutions and be part of coalitions that can truly help address some of these issues in the broadest possible ways. But we will always need to be mindful of the fact that what works for one sector of the population may not work for another, that there are differing needs and sometimes divergent interests, even within a set of stakeholders who may be trying to address the same problem.
Ana Cristina Ruelas: Well, the sessions are 45 minutes, and we’re done. Thank you very much to all of you for your insights. And I’m sorry that we didn’t go through all of the different questions, but thank you very much to you all for being here. Thank you and see you in the next four days.
Ingrid Volkmer
Speech speed
162 words per minute
Speech length
721 words
Speech time
266 seconds
Crisis scenarios are increasing globally, requiring a shift from focusing on national linear media to broader communication ecosystems that include satellite systems, cloud storage, and AI platforms
Explanation
Volkmer argues that traditional crisis communication regulation focused on national linear media is outdated. Modern crises require understanding communication as a holistic ecosystem that includes satellite connectivity, cloud storage, AI platforms, and social media working together.
Evidence
Examples from Ukraine where vital connectivity is provided by Starlink (LEO satellite system), national information is stored on clouds, open source software is used for military and civilian communications, and AI platforms like ChatGPT are used by military personnel and civilians. Similar patterns observed in Democratic Republic of Congo signing up to Starlink.
Major discussion point
Information Ecosystem Threats and Resilience
Topics
Cybersecurity | Infrastructure | Legal and regulatory
Disagreed with
– Frederic Bokobza
Disagreed on
Scope of regulatory focus – information vs. communication paradigm
Tawfik Jelassi
Speech speed
129 words per minute
Speech length
1048 words
Speech time
487 seconds
Misinformation, hate speech, and harmful online content contribute to conflicts and genocides rather than peace-building, requiring trustworthy information as a public good
Explanation
Jelassi contends that the exponential increase of misinformation and hate speech on digital platforms undermines UNESCO’s constitutional mission to build peace. He emphasizes the need for trustworthy, fact-checked information as a common public good rather than a hazard.
Evidence
MIT study showing lies online travel ten times faster than truth; recent cases where disinformation through platforms led to genocide and communities fighting each other; quote from 2021 Nobel Peace Prize laureate Maria Ressa about the connection between facts, truth, trust, and shared reality.
Major discussion point
Information Ecosystem Threats and Resilience
Topics
Human rights | Sociocultural | Cybersecurity
Agreed with
– Maria Paz Canales
– Audience
Agreed on
Information should be treated as a public good rather than a commercial commodity
UNESCO developed guidelines for digital platform governance through multi-stakeholder consultations involving 10,000 inputs from 134 countries
Explanation
Jelassi describes UNESCO’s comprehensive approach to developing platform governance guidelines through inclusive consultations. The guidelines focus on principles like transparency, accountability, independent oversight, user empowerment, and content moderation.
Evidence
Three open global consultations receiving over 10,000 inputs from 194 member states, civil society organizations, academia, research, technical community, and media from 134 countries; guidelines published in November 2023 booklet; involvement of tech companies and platform operators in the process.
Major discussion point
Platform Governance and Regulation
Topics
Legal and regulatory | Human rights | Sociocultural
Agreed with
– Frederic Bokobza
– Nadja Blagojevic
– Ana Cristina Ruelas
Agreed on
Multi-stakeholder approach is essential for addressing digital platform governance challenges
Maria Paz Canales
Speech speed
152 words per minute
Speech length
957 words
Speech time
376 seconds
Civil society uses AI and technology to elevate marginalized voices, support freedom of expression, and enhance privacy protection during protests and crises
Explanation
Canales highlights the positive applications of AI and technology by civil society organizations. These tools enable traditionally marginalized groups to participate in public discourse and allow for safer participation in protests and demonstrations through identity protection.
Evidence
Examples include real-time reporting during protests while obscuring participant identities for protection; use of platforms for organizing content moderation during crises; AI used to recreate lost collective memories; civil society leveraging AI for more effective advocacy and content dissemination with fewer resources.
Major discussion point
Information Ecosystem Threats and Resilience
Topics
Human rights | Sociocultural | Development
Agreed with
– Nadja Blagojevic
Agreed on
Technology can be used positively to support human rights and democratic participation
Technology enables civil society to be more effective in advocacy and content dissemination while facing funding constraints
Explanation
Canales argues that AI and other technologies help civil society organizations overcome resource limitations by making their advocacy work more efficient. This is particularly important given the current funding crisis facing civil society organizations globally.
Evidence
Civil society organizations using AI for condensing information, accessing information, and reaching audiences more effectively with fewer resources during a time of funding crisis for civil society action.
Major discussion point
Technology Solutions and User Empowerment
Topics
Development | Economic | Sociocultural
Important distinction between traditional media and public interest media, especially in regions like Latin America, with need for regulation supporting non-market-driven information models
Explanation
Canales emphasizes that traditional media and public interest media are not synonymous, particularly in Latin America. She advocates for regulation that supports public interest media and non-market-driven information dissemination models like Wikimedia.
Evidence
Reference to platforms like Wikimedia as examples of non-market-driven information models; experience working with partners across different jurisdictions in implementing UNESCO guidelines at national level; finding convergence points between market solutions and public interest.
Major discussion point
Public Interest Media and Information as Public Good
Topics
Human rights | Legal and regulatory | Sociocultural
Agreed with
– Tawfik Jelassi
– Audience
Agreed on
Information should be treated as a public good rather than a commercial commodity
Disagreed with
– Audience
Disagreed on
Definition and support of media – traditional vs. public interest media
Frederic Bokobza
Speech speed
161 words per minute
Speech length
1200 words
Speech time
446 seconds
Major challenge is reducing information asymmetry with platforms through transparency reports and accountability measures under frameworks like the Digital Services Act
Explanation
Bokobza identifies the huge information gap between regulators and platforms as a primary challenge. He sees transparency reports and systemic risk assessments required by the Digital Services Act as important first steps in establishing a new balance of power.
Evidence
Digital Services Act requirements for transparency reports and systemic risk detection/mitigation reports from very large online platforms and search engines; mention of tools like researchers’ access to data and trusted flaggers that need full implementation.
Major discussion point
Platform Governance and Regulation
Topics
Legal and regulatory | Human rights | Cybersecurity
Disagreed with
– Ingrid Volkmer
Disagreed on
Scope of regulatory focus – information vs. communication paradigm
Effective regulation requires direct communication channels with platforms, especially during critical periods like elections
Explanation
Bokobza argues that successful platform governance goes beyond confrontation and requires cooperative communication channels. He emphasizes the importance of coordinated approaches during sensitive periods like elections.
Evidence
France’s experience with three major elections in 2024, organizing bilateral and multilateral meetings with platforms, search engines, candidates, campaign teams, and national authorities; platforms presenting resources deployed, difficulties faced, and doubts, leading to early detection and coordinated mitigation of concerning trends.
Major discussion point
Platform Governance and Regulation
Topics
Legal and regulatory | Human rights | Sociocultural
International regulatory networks like REFRAM facilitate experience sharing and coordinated approaches to online safety across different legal frameworks
Explanation
Bokobza describes how REFRAM, the francophone regulatory network, enables cooperation among 30 authorities across three continents despite different legal frameworks. This cooperation helps achieve common goals of online safety and combating harmful content.
Evidence
REFRAM gathering around 30 authorities on three continents (Europe, America, Africa), mostly in Africa; cooperation at EU level with counterparts and the European Commission; participation in the drafting of UNESCO’s guidelines and in the Global Forum of Networks launch in Dubrovnik.
Major discussion point
Global Cooperation and Multi-stakeholder Approaches
Topics
Legal and regulatory | Human rights | Development
UNESCO’s Global Forum of Networks provides collaboration space for regulatory authorities with human rights-based approaches
Explanation
Bokobza highlights the Global Forum of Networks as a valuable initiative that brings together diverse regulatory networks to exchange good practices and collaborate on digital platform governance with a human rights foundation.
Evidence
Forum includes networks like ACRAN (Africa), Media Board and EPRA (Europe), GOZERN (global), a Mediterranean-centered network, PRAE (Ibero-American), and REFRAM (francophone); roadmap for action 2024-2026 with priorities including network capacity development, knowledge sharing, and action on generative AI; concrete example of REFRAM-ACRAN collaboration in Abidjan with Meta, X, TikTok, and Google resulting in voluntary commitments.
Major discussion point
Global Cooperation and Multi-stakeholder Approaches
Topics
Legal and regulatory | Human rights | Development
Nadja Blagojevic
Speech speed
133 words per minute
Speech length
1036 words
Speech time
466 seconds
Google implements provenance technologies like SynthID watermarking and context tools like “About This Image” to help users identify AI-generated content and understand information sources
Explanation
Blagojevic explains Google’s technical approach to information trustworthiness through provenance signals and contextual tools. SynthID embeds watermarks in AI-generated content, while context tools help users understand the history and authenticity of images and information.
Evidence
SynthID digital watermarking technology that embeds imperceptible watermarks directly into AI-generated audio, imagery, video, and text (for example, in the pixels of an image); watermarks difficult to remove even when content is modified; “About This Image” tool showing when images were first crawled and how often they appear online; open-sourcing SynthID for text (see the illustrative sketch after this block); collaboration with the Coalition for Content Provenance and Authenticity (C2PA).
Major discussion point
Technology Solutions and User Empowerment
Topics
Cybersecurity | Legal and regulatory | Sociocultural
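To make the open-sourcing of SynthID for text mentioned above more concrete, the following is a minimal illustrative sketch, assuming the Hugging Face Transformers integration of SynthID Text (the SynthIDTextWatermarkingConfig available in recent transformers releases). The model name, watermarking keys, and generation parameters below are placeholders chosen for illustration, not details given in the session.
# Minimal sketch: generating watermarked text with the open-sourced SynthID Text
# integration in Hugging Face Transformers (assumed transformers >= 4.46).
from transformers import AutoModelForCausalLM, AutoTokenizer, SynthIDTextWatermarkingConfig

model_name = "google/gemma-2-2b-it"  # placeholder; any causal LM usable with generate() would do
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Private watermarking keys chosen by the deployer (illustrative values only).
watermarking_config = SynthIDTextWatermarkingConfig(
    keys=[654, 400, 836, 123, 340, 443, 597, 160, 57, 29],
    ngram_len=5,  # length of the token n-grams used to seed the watermark
)

prompt = tokenizer("Write a short note on media literacy.", return_tensors="pt")

# Sampling is required: SynthID subtly biases token selection so that a
# statistical signature can later be detected without visibly changing the text.
outputs = model.generate(
    **prompt,
    watermarking_config=watermarking_config,
    do_sample=True,
    max_new_tokens=100,
)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True)[0])
Detection is the complementary step: a detector configured with the same private keys scores how strongly the statistical signature is present in a piece of text, which is what makes the watermark useful for flagging AI-generated content even after moderate edits.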
Media literacy programs like “Be Internet Awesome” help build critical thinking skills in children and adults across multiple countries
Explanation
Blagojevic describes Google’s investment in media literacy education to help users develop skills to evaluate information critically. These programs focus on helping people distinguish between fact and fake information and practice online safety.
Evidence
“Be Internet Awesome” program running in more than 20 countries with more than 60 governmental and civil society partners; programs for both children and adults focusing on critical thinking, distinguishing fact from fake, lateral reading skills, and basic online safety; Search Quality Rater Guidelines (100-page documents) outlining expertise, experience, authoritativeness, and trustworthiness criteria.
Major discussion point
Technology Solutions and User Empowerment
Topics
Sociocultural | Human rights | Development
Global issues require collaborative solutions as no single entity can address these challenges alone, while recognizing different needs across cultures and populations
Explanation
Blagojevic emphasizes that information ecosystem challenges are too complex for any single organization to solve independently. She acknowledges that solutions must account for cultural differences and varying needs across different populations and stakeholder groups.
Evidence
Google’s 25-year mission to organize the world’s information and make it universally accessible; recognition that what works in one country or culture may not work in another; acknowledgment of differing needs and sometimes divergent interests within stakeholder groups trying to address the same problem.
Major discussion point
Global Cooperation and Multi-stakeholder Approaches
Topics
Sociocultural | Development | Human rights
Agreed with
– Tawfik Jelassi
– Frederic Bokobza
– Ana Cristina Ruelas
Agreed on
Multi-stakeholder approach is essential for addressing digital platform governance challenges
Ana Cristina Ruelas
Speech speed
130 words per minute
Speech length
1199 words
Speech time
549 seconds
Broadened participation of different stakeholder groups in digital governance discussions is a key achievement over the past 20 years
Explanation
Ruelas highlights that one of the major accomplishments in the WSIS process has been expanding participation beyond just governments and regulated entities to include private sector, civil society, and other members of society in digital governance discussions.
Evidence
Current session bringing together different stakeholder groups representing private sector, social sector, and public sector; recognition that digital space issues now involve many members of society beyond just governments and platforms.
Major discussion point
Global Cooperation and Multi-stakeholder Approaches
Topics
Legal and regulatory | Human rights | Development
Agreed with
– Tawfik Jelassi
– Frederic Bokobza
– Nadja Blagojevic
Agreed on
Multi-stakeholder approach is essential for addressing digital platform governance challenges
Audience
Speech speed
281 words per minute
Speech length
284 words
Speech time
60 seconds
Traditional media faces financial challenges with 75% of advertising revenues going to digital platforms, requiring public sector intervention to support public interest media
Explanation
An audience member argues that traditional media cannot survive financially as most advertising revenue flows to digital platforms, particularly Google and Meta. They reference Nobel Prize winner Joseph Stiglitz’s view that if information is a public good, public sector intervention is necessary.
Evidence
75% of advertising revenues going online with Google and Meta as main beneficiaries; reference to Nobel Prize winner Joseph Stiglitz stating that if information is a public good, the public sector must step in; rising costs of technical platforms, cloud computing, and package services for trusted media.
Major discussion point
Public Interest Media and Information as Public Good
Topics
Economic | Human rights | Legal and regulatory
Agreed with
– Tawfik Jelassi
– Maria Paz Canales
Agreed on
Information should be treated as a public good rather than a commercial commodity
Disagreed with
– Maria Paz Canales
Disagreed on
Definition and support of media – traditional vs. public interest media
Agreements
Agreement points
Multi-stakeholder approach is essential for addressing digital platform governance challenges
Speakers
– Tawfik Jelassi
– Frederic Bokobza
– Nadja Blagojevic
– Ana Cristina Ruelas
Arguments
UNESCO developed guidelines for digital platform governance through multi-stakeholder consultations involving 10,000 inputs from 134 countries
A multi-stakeholder approach such as the one promoted by UNESCO is particularly important here if we want to succeed
Global issues require collaborative solutions as no single entity can address these challenges alone, while recognizing different needs across cultures and populations
Broadened participation of different stakeholder groups in digital governance discussions is a key achievement over the past 20 years
Summary
All speakers agree that effective digital platform governance requires inclusive participation from governments, civil society, private sector, academia, and other stakeholders. They emphasize that no single entity can solve these complex challenges alone.
Topics
Legal and regulatory | Human rights | Development
Information should be treated as a public good rather than a commercial commodity
Speakers
– Tawfik Jelassi
– Maria Paz Canales
– Audience
Arguments
Misinformation, hate speech, and harmful online content contribute to conflicts and genocides rather than peace-building, requiring trustworthy information as a public good
Important distinction between traditional media and public interest media, especially in regions like Latin America, with need for regulation supporting non-market-driven information models
Traditional media faces financial challenges with 75% of advertising revenues going to digital platforms, requiring public sector intervention to support public interest media
Summary
Speakers converge on the view that information serves the public interest and should not be solely driven by market forces. They advocate for regulatory frameworks that support public interest media and treat information as a common good.
Topics
Human rights | Legal and regulatory | Economic
Technology can be used positively to support human rights and democratic participation
Speakers
– Maria Paz Canales
– Nadja Blagojevic
Arguments
Civil society uses AI and technology to elevate marginalized voices, support freedom of expression, and enhance privacy protection during protests and crises
Google implements provenance technologies like SynthID watermarking and context tools like ‘About This Image’ to help users identify AI-generated content and understand information sources
Summary
Both speakers acknowledge that while technology poses challenges, it also offers significant opportunities to enhance human rights, democratic participation, and information transparency when properly implemented.
Topics
Human rights | Sociocultural | Cybersecurity
Similar viewpoints
Both speakers emphasize the importance of cooperative rather than confrontational approaches between regulators and platforms, advocating for direct communication channels and collaborative problem-solving.
Speakers
– Frederic Bokobza
– Nadja Blagojevic
Arguments
Effective regulation requires direct communication channels with platforms, especially during critical periods like elections
Global issues require collaborative solutions as no single entity can address these challenges alone, while recognizing different needs across cultures and populations
Topics
Legal and regulatory | Human rights | Sociocultural
Both speakers recognize that traditional approaches to information governance are inadequate for current challenges, requiring comprehensive understanding of how information systems impact peace and security.
Speakers
– Ingrid Volkmer
– Tawfik Jelassi
Arguments
Crisis scenarios are increasing globally, requiring a shift from focusing on national linear media to broader communication ecosystems that include satellite systems, cloud storage, and AI platforms
Misinformation, hate speech, and harmful online content contribute to conflicts and genocides rather than peace-building, requiring trustworthy information as a public good
Topics
Cybersecurity | Infrastructure | Human rights
Both speakers see technology as an empowerment tool that can help users and civil society organizations become more effective and informed participants in the digital information ecosystem.
Speakers
– Maria Paz Canales
– Nadja Blagojevic
Arguments
Technology enables civil society to be more effective in advocacy and content dissemination while facing funding constraints
Media literacy programs like ‘Be Internet Awesome’ help build critical thinking skills in children and adults across multiple countries
Topics
Development | Sociocultural | Human rights
Unexpected consensus
Need for transparency and accountability from platforms
Speakers
– Frederic Bokobza
– Nadja Blagojevic
Arguments
Major challenge is reducing information asymmetry with platforms through transparency reports and accountability measures under frameworks like the Digital Services Act
Google implements provenance technologies like SynthID watermarking and context tools like ‘About This Image’ to help users identify AI-generated content and understand information sources
Explanation
It’s notable that both the regulator and the platform representative agree on the importance of transparency measures. The regulator calls for transparency reports and accountability, while the Google representative describes specific technical implementations that provide transparency to users.
Topics
Legal and regulatory | Cybersecurity | Human rights
Crisis communication requires holistic regulatory approaches
Speakers
– Ingrid Volkmer
– Frederic Bokobza
Arguments
Crisis scenarios are increasing globally, requiring a shift from focusing on national linear media to broader communication ecosystems that include satellite systems, cloud storage, and AI platforms
Effective regulation requires direct communication channels with platforms, especially during critical periods like elections
Explanation
The academic researcher and the regulatory authority representative both recognize that crisis periods require special regulatory consideration and coordinated approaches, suggesting convergence between theoretical analysis and practical regulatory experience.
Topics
Legal and regulatory | Cybersecurity | Infrastructure
Overall assessment
Summary
The speakers demonstrate strong consensus on the need for multi-stakeholder approaches, treating information as a public good, and recognizing both the positive potential and risks of technology. There is agreement on the importance of transparency, accountability, and cooperative rather than confrontational approaches to platform governance.
Consensus level
High level of consensus with complementary perspectives rather than fundamental disagreements. The implications suggest a mature understanding of digital governance challenges that transcends traditional adversarial relationships between different stakeholder groups, pointing toward collaborative solutions that balance innovation with public interest protection.
Differences
Different viewpoints
Scope of regulatory focus – information vs. communication paradigm
Speakers
– Ingrid Volkmer
– Frederic Bokobza
Arguments
Crisis scenarios are increasing globally, requiring a shift from focusing on national linear media to broader communication ecosystems that include satellite systems, cloud storage, and AI platforms
Major challenge is reducing information asymmetry with platforms through transparency reports and accountability measures under frameworks like the Digital Services Act
Summary
Volkmer argues for a paradigm shift from information regulation to broader communication ecosystem regulation that includes infrastructure like satellites and cloud storage, while Bokobza focuses on traditional platform governance through transparency and accountability measures within existing regulatory frameworks.
Topics
Legal and regulatory | Infrastructure | Cybersecurity
Definition and support of media – traditional vs. public interest media
Speakers
– Maria Paz Canales
– Audience
Arguments
Important distinction between traditional media and public interest media, especially in regions like Latin America, with need for regulation supporting non-market-driven information models
Traditional media faces financial challenges with 75% of advertising revenues going to digital platforms, requiring public sector intervention to support public interest media
Summary
Canales emphasizes the distinction between traditional and public interest media, advocating for support of non-market-driven models, while the audience member focuses on supporting traditional media through public sector intervention due to financial challenges from platform competition.
Topics
Human rights | Economic | Legal and regulatory
Unexpected differences
Crisis communication regulation scope
Speakers
– Ingrid Volkmer
– Frederic Bokobza
Arguments
Crisis scenarios are increasing globally, requiring a shift from focusing on national linear media to broader communication ecosystems that include satellite systems, cloud storage, and AI platforms
International regulatory networks like REFRAM facilitate experience sharing and coordinated approaches to online safety across different legal frameworks
Explanation
This disagreement is unexpected because both speakers are addressing crisis communication and regulation, but Volkmer calls for a fundamental paradigm shift to regulate entire communication ecosystems including infrastructure, while Bokobza focuses on coordinating existing regulatory approaches across jurisdictions. The disagreement reveals a fundamental tension between revolutionary vs. evolutionary approaches to digital governance.
Topics
Legal and regulatory | Infrastructure | Cybersecurity
Overall assessment
Summary
The main areas of disagreement center on regulatory scope (information vs. communication paradigm), media support approaches (traditional vs. public interest media), and implementation methods for multi-stakeholder governance. While speakers generally agree on goals like trustworthy information ecosystems and collaborative approaches, they differ significantly on methods and scope.
Disagreement level
Moderate disagreement level with significant implications. The disagreements reflect fundamental tensions in digital governance between comprehensive vs. targeted regulation, revolutionary vs. evolutionary approaches, and different regional perspectives on media support. These disagreements could impact the effectiveness of global coordination efforts and the development of coherent regulatory frameworks, particularly in crisis situations where rapid, coordinated responses are needed.
Partial agreements
Takeaways
Key takeaways
Information ecosystems face unprecedented threats from misinformation, hate speech, and harmful content that contribute to conflicts and genocides, requiring a shift toward viewing information as a public good rather than a hazard
Crisis communication now involves a holistic ecosystem including satellite systems (like Starlink), cloud storage, AI platforms, and social media – not just traditional linear media – requiring broader regulatory approaches
Multi-stakeholder collaboration has significantly expanded over 20 years, bringing together government, private sector, civil society, and technical communities in digital governance discussions
Technology and AI serve dual purposes – while posing threats through deepfakes and disinformation, they also empower civil society to elevate marginalized voices, enhance privacy protection, and improve advocacy effectiveness
Effective platform governance requires transparency, accountability, independent oversight, user empowerment, and direct communication channels between regulators and platforms, especially during critical periods like elections
Global cooperation through regulatory networks and frameworks like UNESCO’s guidelines helps address regulatory fragmentation while respecting different legal and cultural contexts
User empowerment through media literacy programs and technical tools for content verification is essential alongside regulatory approaches
Resolutions and action items
UNESCO is moving forward with pilot implementations of their guidelines for digital platform governance developed through global consultations
REFRAM will hold a technical forum meeting with major platforms (Meta, X, TikTok, Google) in Paris to monitor commitments on moderation, transparency, and local language use
UNESCO’s Global Forum of Networks has established a 2024-2026 roadmap focusing on developing network capacities, knowledge sharing, and action on generative AI and synthetic content
Google has open-sourced SynthID watermarking technology for text and continues expanding media literacy programs like ‘Be Internet Awesome’ across multiple countries
Unresolved issues
How to ensure financial viability of public interest media when 75% of advertising revenues go to digital platforms, with calls for public sector intervention remaining unaddressed
How to manage rising costs of technical platforms and cloud computing services for trusted media organizations
How to address data privacy and access concerns when crucial national information is stored on commercial clouds during crises
How to effectively distinguish between traditional media and public interest media in regulatory frameworks, particularly in regions like Latin America
How to balance different stakeholder needs and sometimes divergent interests when developing collaborative solutions
How to ensure effective governance of global monopolies providing critical infrastructure (like Starlink) during crisis situations
Suggested compromises
Finding convergence points where market-driven platform solutions can align with public interest goals rather than viewing them as mutually exclusive
Developing regulatory approaches that support non-market-driven information models while still engaging constructively with commercial platforms
Creating direct communication channels and cooperative frameworks with platforms rather than purely confrontational regulatory approaches
Implementing human rights-based approaches in international regulatory cooperation that can accommodate different legal frameworks while maintaining common principles
Balancing platform accountability measures with recognition that solutions must be adapted to different cultural contexts and population needs
Thought provoking comments
We want information to be a common public good. We don’t want information to be a common hazard, risk, nor a common harm… without facts, there is no truth. And without truth, there is no trust. And without trust, there is no shared reality upon which we can act.
Speaker
Tawfik Jelassi
Reason
This comment reframes the entire discussion by establishing information as a fundamental public good rather than a commodity, and creates a logical chain linking facts to shared reality. It’s philosophically profound because it connects epistemology (how we know truth) to social cohesion and collective action.
Impact
This foundational statement set the tone for the entire discussion, establishing the moral and practical framework that all subsequent speakers referenced. It shifted the conversation from technical platform governance to fundamental questions about truth, trust, and social cohesion in democratic societies.
It is not just information or disinformation that is central to crisis, but communication more broadly… we need to shift from this information paradigm into a broader communication paradigm, which is relevant for regulation of crisis communication.
Speaker
Ingrid Volkmer
Reason
This insight challenges the prevailing focus on content moderation and disinformation by arguing for a holistic view of communication infrastructure. It’s thought-provoking because it expands the scope beyond what people see on their feeds to include connectivity, data storage, and the entire technological stack.
Impact
This comment fundamentally broadened the discussion scope, moving participants away from narrow platform content issues to consider infrastructure monopolies, data sovereignty, and crisis communication as interconnected systems. It influenced subsequent speakers to think more systemically about governance challenges.
I am very careful coming from Latin America about drawing the line between traditional media and public interest media because they are not the same, at least from the region that I come from.
Speaker
Maria Paz Canales
Reason
This comment introduces crucial nuance by challenging the assumption that traditional media automatically serves the public interest. It brings a Global South perspective that recognizes how media ownership and political contexts vary dramatically across regions.
Impact
This intervention prevented the discussion from falling into Western-centric assumptions about media systems. It added complexity to the conversation about supporting ‘trusted’ media by highlighting that trust and public interest are contextual and that regulatory approaches must account for different media landscapes globally.
We have been seeing an exponential increase of misinformation, hate speech, cyber bullying, online violence… This does not contribute to peace building. This contributes to more hatred, more divisive communities, and sometimes to genocides and to conflicts and wars.
Speaker
Tawfik Jelassi
Reason
This comment is particularly powerful because it connects digital platform governance directly to UNESCO’s core mission of peace-building, and explicitly links online harms to real-world violence including genocide. It elevates the stakes of the discussion beyond technical concerns to matters of life and death.
Impact
This stark framing gave moral urgency to the entire discussion and justified UNESCO’s involvement in digital governance. It influenced other speakers to consider the real-world consequences of their work and helped establish why multi-stakeholder cooperation is essential rather than optional.
What works in one country or culture may not work in another. We are very much invested in trying to work collaboratively… but we will always need to be mindful of the fact that what works for one sector of the population may not work for another, that there are differing needs and sometimes divergent interests.
Speaker
Nadja Blagojevic
Reason
This comment acknowledges the fundamental tension in global platform governance – the need for both consistency and cultural sensitivity. It’s insightful because it honestly addresses the complexity of serving diverse global audiences while maintaining coherent policies.
Impact
This comment introduced important nuance to discussions about global cooperation and standardization. It helped ground the conversation in practical realities of implementation and influenced the discussion toward more flexible, adaptive approaches to governance rather than one-size-fits-all solutions.
Overall assessment
These key comments collectively transformed what could have been a technical discussion about platform regulation into a profound examination of information as a foundation for democratic society and peace. Jelassi’s opening established the moral framework, Volkmer expanded the analytical scope beyond content to infrastructure, Canales introduced crucial Global South perspectives that challenged Western assumptions, and Blagojevic acknowledged the practical complexities of implementation. Together, these interventions created a multi-layered discussion that moved from philosophical foundations through systemic analysis to practical implementation challenges, demonstrating why effective digital governance requires both moral clarity and cultural sensitivity.
Follow-up questions
How can crucial national data be moved back from cloud storage during crisis situations and under what circumstances?
Speaker
Ingrid Volkmer
Explanation
This addresses critical concerns about data sovereignty and access to national information stored in cloud systems during crises, particularly regarding regulatory approaches needed to ensure data can be retrieved when needed.
What regulatory approaches are required in crisis times to ensure data privacy and access to crucial national data from cloud storage?
Speaker
Ingrid Volkmer
Explanation
This explores the intersection of data privacy, national security, and crisis management in the context of cloud-based information storage systems.
How can traditional media achieve financial viability in the digital era when 75% of advertising revenues go to Google and Meta?
Speaker
Audience member
Explanation
This addresses the fundamental economic challenge facing traditional media and the need for sustainable funding models in the digital information ecosystem.
How can international cooperation manage cost factors and promote open source alternatives when tech companies are increasing profits and creating monopolies?
Speaker
Audience member
Explanation
This examines the economic barriers to creating a diverse and competitive information ecosystem and the role of international regulatory frameworks.
What role should the public sector play in supporting public interest media as a public good?
Speaker
Audience member
Explanation
This explores the potential for government intervention to support media that serves the public interest, drawing on economic theory about public goods.
How can regulation effectively distinguish between traditional media and public interest media, particularly in different regional contexts?
Speaker
Maria Paz Canales
Explanation
This addresses the need for nuanced regulatory approaches that recognize different types of media organizations and their varying contributions to public discourse.
What are the most significant barriers preventing effective collaboration between different stakeholders in platform governance?
Speaker
Ana Cristina Ruelas
Explanation
This seeks to identify and address obstacles to multi-stakeholder cooperation in governing digital platforms and information ecosystems.
How can crisis communication regulation be developed as a separate category from peacetime digital regulation?
Speaker
Ingrid Volkmer
Explanation
This suggests the need for specialized regulatory frameworks that account for the unique challenges and requirements of information governance during crisis situations.
Disclaimer: This is not an official session record. DiploAI generates these resources from audiovisual recordings, and they are presented as-is, including potential errors. Due to logistical challenges, such as discrepancies in audio/video or transcripts, names may be misspelled. We strive for accuracy to the best of our ability.