WS #42 Combating misinformation with Election Coalitions
Session at a Glance
Summary
This discussion focused on the role of election coalitions in combating misinformation during elections worldwide. Panelists from Google, fact-checking organizations, and journalism backgrounds shared insights on forming and operating these coalitions. They emphasized the importance of collaboration between diverse stakeholders, including media outlets, fact-checkers, and civil society groups, to address misinformation effectively.
The speakers highlighted successful coalition models from various countries, such as Comprova in Brazil and Facts First PH in the Philippines. They stressed the need for building trust among coalition members and maintaining neutrality in leadership. The discussion also touched on the challenges of sustaining momentum beyond election periods and adapting to different cultural and political contexts.
Participants explored the role of technology companies like Google in supporting these coalitions, while also addressing concerns about potential conflicts of interest and the impact of government pressure. The conversation included debates on terminology, with some questioning the use of “misinformation” and suggesting a focus on specific harms instead.
The panel addressed the emergence of AI-generated content and its implications for election integrity, noting both potential risks and opportunities for leveraging AI in fact-checking efforts. They also discussed strategies for engaging young people and non-voters in the fact-checking process.
Overall, the discussion underscored the complexity of combating election-related misinformation and the importance of multi-stakeholder approaches. Panelists agreed that while challenges remain, election coalitions represent a promising model for promoting information integrity and supporting democratic processes globally.
Keypoints
Major discussion points:
– The importance and effectiveness of election coalitions in combating misinformation
– Challenges in maintaining momentum and addressing critiques of election coalitions
– The role of technology companies like Google in supporting election coalitions
– Concerns about government pressure and content moderation in relation to misinformation
– The need for clear policies, transparency, and relationship-building in election coalitions
The overall purpose of the discussion was to explore the role and impact of election coalitions in combating misinformation, sharing best practices and lessons learned from various global examples. The speakers aimed to highlight the importance of collaboration between journalists, fact-checkers, and other stakeholders in promoting election integrity.
The tone of the discussion was generally informative and collaborative, with speakers sharing insights from their experiences. However, it became more pointed and critical when audience members raised concerns about content moderation, government pressure, and the role of large tech companies. The panelists responded professionally to these challenges, maintaining a constructive dialogue while acknowledging the complexity of the issues raised.
Speakers
Speakers from the provided list:
– Mevan Babakar – News and Information Credibility Lead for MENA for Google
– Daniel Bramatti – Investigative journalist from Brazil
– David Ajikobi – Nigeria editor for Africa Check
– Alex Walden – Global Head of Human Rights for Google
– Jim Prendergast – Moderator
– Lena Slachmuijlder – Search for Common Ground and the Council on Tech and Social Cohesion
– Milton Mueller – Internet Governance Project at Georgia Tech
– Claes de Vreese – University of Amsterdam and executive board of the European Digital Media Observatory
Full session report
Election Coalitions and Combating Misinformation: A Global Perspective
This discussion brought together experts from various fields to explore the role of election coalitions in combating misinformation during elections worldwide. Speakers from Google, fact-checking organizations, and journalism backgrounds shared insights on forming and operating these coalitions, emphasizing the importance of collaboration between diverse stakeholders to address misinformation effectively.
Introduction to Election Coalitions and the Elections Playbook
Mevan Babakar, News and Information Credibility Lead for MENA at Google, introduced the concept of election coalitions and presented the Elections Playbook, a comprehensive guide developed to help organizations form and maintain effective coalitions. The playbook outlines two main models for election coalitions: collaborative approaches, where multiple organizations work together, and independent approaches, where a single organization leads the effort.
Key Examples of Election Coalitions
Several successful election coalitions were highlighted during the discussion:
1. Electionland: A U.S.-based coalition that brings together multiple newsrooms to monitor and report on election integrity issues.
2. Comprova: A Brazilian coalition of media organizations that collaboratively fact-check election-related claims.
3. Facts First PH: A Philippine coalition that introduced the MESH concept, combining fact-checking with in-depth explanatory journalism.
These examples demonstrate the diverse approaches to coalition-building across different cultural and political contexts.
Strategies for Combating Misinformation
Speakers discussed various innovative approaches to address misinformation:
1. Pre-bunking: Babakar introduced this proactive strategy to inoculate against expected false narratives before they become viral. She noted successful pre-bunking efforts in Europe.
2. Context-based fact-checking: Daniel Bramatti, an investigative journalist from Brazil, emphasized that fact-checking should add context rather than censor speech.
3. Media literacy: David Ajikobi, Nigeria editor for Africa Check, highlighted the importance of media literacy efforts to engage youth.
4. AI-assisted fact-checking: Babakar discussed the potential of leveraging AI tools to scale fact-checking efforts, while also noting the challenges posed by AI-generated content.
Challenges and Considerations
Despite the overall agreement on the importance of coalitions, speakers acknowledged several challenges:
1. Building trust: Babakar noted that building relationships and trust among coalition members takes time but is critical for success.
2. Funding: Maintaining long-term financial support for coalitions was identified as a significant challenge.
3. Balancing diverse interests: Ajikobi highlighted the difficulties in managing diverse media organizations within coalitions.
4. Leadership: Bramatti stressed the importance of choosing neutral leadership to ensure coalition credibility.
5. Government pressure: Alex Walden, Global Head of Human Rights for Google, pointed out the need to navigate government pressure and legal challenges.
6. Local context: Speakers emphasized the importance of understanding and adapting to local contexts when forming coalitions, particularly in countries with limited civil society or media infrastructure.
The Role of Technology Platforms
The discussion touched on the role of technology platforms in election integrity:
1. Content moderation: Walden emphasized the need for platforms to balance content moderation with free speech concerns.
2. Transparency: Speakers called for increased transparency around content moderation policies and government removal requests.
3. Industry-wide collaboration: Claes de Vreese from the University of Amsterdam suggested that platforms should collaborate on industry-wide coalitions to address election integrity issues collectively.
Evaluating Impact and Future Directions
Speakers discussed various approaches to evaluating the impact of misinformation and coalition efforts:
1. Harm-based framework: Babakar proposed focusing on specific harmful narratives rather than all misinformation, using a harm-based framework to determine when intervention is warranted.
2. Measuring concrete harms: Ajikobi agreed on the importance of measuring tangible impacts, such as election interference percentages.
3. Online and offline impacts: Bramatti highlighted the need to consider both digital and traditional media impacts, particularly noting the importance of radio in African contexts.
Unresolved Issues and Future Considerations
Several unresolved issues emerged from the discussion, including:
1. Balancing content moderation with free speech concerns
2. Determining appropriate thresholds for platform intervention on misleading content
3. Addressing the challenges posed by AI-generated content
4. Adapting coalition models to diverse global contexts
Conclusion
The discussion underscored the complexity of combating election-related misinformation and the importance of multi-stakeholder approaches. While challenges remain, election coalitions represent a promising model for promoting information integrity and supporting democratic processes globally. The conversation highlighted the need for continued dialogue, collaboration, and innovation in addressing the evolving landscape of misinformation in elections, with a particular emphasis on building trust, adapting to local contexts, and leveraging technology responsibly.
Session Transcript
Jim Prendergast: Good morning, everyone. I think we’ll get started. Let me just get to the screen, the appropriate screen. So thanks, everybody, for coming. Good morning, good afternoon, good evening. Whether you’re joining us in person or virtually, welcome. My name is Jim Prendergast and I’m your moderator for today’s session, which is titled Combating Misinformation with Election Coalitions. If this isn’t the session you thought it would be, we’d like you to stay anyway. So 2024 was a watershed year for elections. The UN called it a super year for elections. Sixty-plus countries held elections this year. I believe that’s an all-time record. At a time when elections around the globe are increasingly vulnerable to the spread of misinformation, the stakes couldn’t have been higher. Disinformation campaigns not only undermine electoral integrity, but they also erode trust in institutions, diminish civic participation, and in some cases, polarize societies. But there’s good news, and that’s what we want to talk about. Today, we’re going to focus on the role of election coalitions, essentially partnerships between governments, civil society, the private sector, and fact-checkers, in countering the rising tide of misinformation. These coalitions have emerged as a promising approach to build trust, promote credible information, and strengthen election resilience. But their effectiveness depends on a lot of factors, including strong coordination, shared resources, and clear strategies. I’m excited to be joined by a great group of experts on this topic who bring a diverse set of perspectives and extensive experience to the table, both in person and virtually. First off, let me introduce Alex Walden, who’s the Global Head of Human Rights for Google. She’s seated here at the table with me. Wave to everybody online. Mevan Babakar, who’s the News and Information Credibility Lead for MENA for Google. She is joining us from London. 
David Ajikobi, he’s the Nigeria editor for Africa Check. He is also remote. And then, finally, Daniel Bramatti, who’s joining us from Brazil. He’s an investigative journalist. And he wins the prize for the earliest time zone as a presenter. Before we begin, just a couple of things to point out. There are our speakers. Our session is going to start off with a couple of brief presentations from our speakers. I’ll kick it off with a couple of questions. But we really want this to be interactive. We want this to be highly participatory. So for those of you online and those of you in person, we really encourage questions and conversation and discussion. With that, let’s get started. Alex, could you sort of kick us off and help set the scene with explaining why you think election coalitions are important?
Alex Walden: Sure, thanks. I think this has been a banner year for global elections. And Google has taken it seriously in all of these dozens of elections that have happened around the world this year. And so it’s timely for us to be having this conversation reflecting on the successes of the approaches that industry and our partners have had, and also looking forward to what we need to do to strengthen those. So I’m really glad we’re having this conversation today. I also think it’s appropriate that we would be having this conversation at the IGF, where we are all focused on the multi-stakeholder model and the importance of that. Everything about what we’re doing here at IGF is focused on the necessity of government and civil society and companies working together to ensure that we’re all sort of realizing the benefits of what technology can deliver, and that those relationships and that working together also should inform how we address problems that come before us. And so that’s true across many types of issues. And in particular, that’s true across elections. And so I think my colleagues across the panel today are the best experts to demonstrate and talk through the ways that we’ve seen these successes. But at Google, we have billions of people who come to our products every day. And in particular, in the election context, people are coming to find information, and information about where to vote. And so we have an obligation and responsibility to make sure that we are doing the best to deliver information to those users. But also, it is incumbent upon us to engage with the rest of the ecosystem to make sure that the things that are not necessarily in our power to change entirely, we need to be working with the rest of the ecosystem to ensure that there is integrity in the way that we’re delivering information to all these billions of users around the world in the election context. So again, I’ll stop there. 
I think Google’s really excited to be having this conversation and hear the input from everybody in the room and online about how we do this work going forward.
Jim Prendergast: Great. Thanks a lot, Alex. I’d now like to turn to. Mevan, who’s going to explain to us a project she worked on, something she developed called the Elections Playbook. And Mevan, I’ll be driving the slides for you, so just let me know when you want to advance.
Mevan Babakar: Perfect. Thank you very much. Can you all hear me? Excellent. OK, great. Let me just quickly, OK. Hi, everyone. I’m Mevan. I work at Google as well. I actually work in trust strategy now across knowledge and information. So that touches on search, that touches on ads and other products that we have. But previously, I used to work in the Google News Initiative. And previous to Google, I actually worked in fact-checking for a decade. So I used to work at Full Fact, the UK’s independent fact-checking charity. And in my time at Full Fact and also at Google, I saw the power of election coalitions. And one of the things that became very clear to me is that election coalitions are actually quite a magical way of scaling the work of journalists and campaigners around the world, especially during elections. So I’m going to talk to you today a little bit about the short history of election coalitions, a research project that we’ve done specifically to capture some of the learnings from around the world, how you can form and organize an election coalition, and some of the lessons learned from all of those interviews that we did as well. We’ve got 10 minutes. It’ll be a bit of a whistle-stop tour. But if anyone has any questions, feel free to just jump in and ask them. So next slide, please. So 2024 was a very big election year, as Jim mentioned. More than two billion people voted in over 60 different countries. But as we all know, misinformation is, unfortunately, a big part of elections and has been around for as long as elections and probably longer. There are lots of ways to combat mis- and disinformation, but there is no silver bullet. There are very subtle and sometimes not so subtle nuances between countries that make quite a big difference in how you would combat misinformation. 
Things like public broadcasting, community participation, press freedoms, all of these things actually necessitate a specific country-level intervention. Over the past decade, journalists and fact-checkers have come together to form these election coalitions. What they essentially are, just as a very top line, is when journalists, fact-checkers, community organizations, sometimes lawyers, sometimes researchers, join forces and share resources or share the impact of the work that they do during a specific event like an election. So it might be actually sharing the resources of their media monitoring, their actual research that they do. It might be sharing the learnings of the actual fact-checks or the journalism. It might be sharing the impact or scaling the actual outcomes of the work itself. One of the earliest examples was Electionland in 2016. This was a U.S. coalition that was set up, and it was 1,100 journalists working together. It was a nationwide effort to cover voting rights and election administration in 2016. So there was a narrative going around that basically the election was rigged, and that narrative is one that still exists today, but key claims come up each time that the election was rigged. Historically, at least in 2016, the newsrooms were primarily focused on reporting the outcome of what happened on election day and the run-up to the political ins and outs. Voting issues were sort of relegated to secondary coverage. So a bunch of journalists and newsrooms came together and started Electionland to kind of combat that, especially because in the US, the election laws vary drastically from state to state, and even county by county. So no national newsroom was at the time in a position to cover election administration through a wide lens. So all these newsrooms came together. 
And actually, one of the things that they did that was actually quite new at the time was using social media to alert the local newsrooms and journalists that were taking part on specific claims that were coming up around the election being rigged, so that they could actually localize specific narratives and specific claims to certain regions. And on top of that, they had 1,100 journalists immediately and authoritatively rebut some of the pieces of misinformation that were coming out. It kind of showed at the time that news organizations could work together, and they can collaboratively serve as a watchdog for this crucial democratic moment that was taking place. In 2016, the Electionland project won an Online Journalism Award for its work. And since 2016, there have been at least eight more coalitions, I think probably more like 12 at this point. And they have operated not just for elections, sometimes across multiple elections. Like Comprova, for example, which we’ll hear more about later, has run for multiple years now. And more recently in 2024, although their logo isn’t on this slide, we’ve had the Shakti coalition in India, which is about 40 organizations coming together. And in the EU, Elections 24 Check, which was 45 organizations across Europe, working across 37 countries, who published 3,000-plus fact checks around the EU elections. I think it really shows that when newsrooms come together or when fact-checkers and community groups come together, the impact can scale quite drastically. There’s something quite special in that model. Next slide, thanks. So Google has a long history of supporting these election coalitions and we wanted to understand how to effectively build them and what they should look like to serve the needs of voters in the countries in the run-up to elections. 
But more importantly, there had already been so much learning from the past decade and it felt a bit like every single time everyone was starting from scratch. So we wanted to run a six-month research project and talk to all of the election coalitions that had come before to understand exactly what the best ways of setting them up are, what all the lessons learned over the past decade are, and how we can effectively build them going forward. We ended up talking to 15 global experts, and the countries that we touched on were France, Brazil, Argentina, Mexico, Nigeria and the Philippines, as well as the US. One of the key things that has come out of the learnings is that there’s really no one-size-fits-all approach to building a successful election coalition, because each country is very unique in how it’s set up. There are often different election laws, different voting systems, and different news consumption habits like radio, TV, social media. If these things are turned up or down you’ll need to change how you do your monitoring. There are also different types of misinformation taking place. Sometimes there are one-off instances of claims that maybe are more honest in the sense that someone has misinterpreted something, while sometimes there are types of misinformation and disinformation that are direct foreign interference, and of course these things need different approaches. But having said that, there are some things that are shared across all the successful election coalitions. And by asking the right questions, we can start to build something much quicker and much more viable. So the things that have come up as sort of stages and needs in election coalitions are: to identify the need, to actually understand what it is that you’re trying to do in the first instance. Although they’re called election coalitions, and a lot of them are around mis- and disinformation around elections, 
that model has also been extended to pandemics, for example, so the COVID-19 pandemic and others as well, or epidemics in local regions, kind of share the same model. So figure out what the need is that you’re trying to meet specifically. It’s become clear that we need to identify the lead as well. So a specific organization often takes charge of the larger coalition, not necessarily as the spokesperson, but as the organizing lead for any kind of coalition to take place. And I think it’s really important, this is one of the things that came out of the interviews, it’s really important that the organization that takes the lead in that country context is seen as neutrally as possible, or as balanced as possible. Because a really key point of the election coalitions is that you want a broad spectrum of actors and journalists that meet the needs of voters. And depending on how polarized that ecosystem is, you might want to use it as a means of building trust in institutions or building trust in journalism or fact checking or whatever it is that’s happening in those countries. So having that as a key aim really, really helps, and identifying a lead that is as neutral as possible helps build that bridge. Membership, whether it’s formal or informal, whether you’re focusing on subject experts or technology partners, these are all very important steps, and things to formalise before the actual coalition comes together. And then I think the next two are very important, actually. Implementing capacity building programmes is especially important for an election coalition when there are multiple media organisations working together, because historically, those media organisations work against one another. They’re competitors. And I think that what they’re doing here is actually quite unique. They’re coming together, sharing resources, they’re sharing sometimes outputs. And they’re working in a much more collaborative way. 
So trust building is an incredibly important part of these election coalitions. And trust isn’t something that is earned overnight. It is earned through example, it’s earned through case studies, it’s earned through the experience of working with one another. And the more times that you can bring people in the election coalition together in person, the better it will be for that. And then on top of that, making sure that people kind of have the same skills and resources available to them. Developing clear coalition policies is key. There are actually two models for election coalitions that I have seen so far, we call them the collaborative approach and the independent approach. In the collaborative approach, the organisations actually share resources to do the media monitoring together, they check together, they edit together, it actually becomes one mega newsroom. And actually, they publish the final outputs of the pieces across the multiple media organisations as well. That’s the collaborative approach. In the independent approach, and we see this more sometimes when people don’t have the trust necessarily to jump in together yet. In the independent approach, there’s no commitment to share the output, so often people will maybe share the media monitoring side of things, but then do the check or the article through their own independent editorial processes. And then that’s kind of shared across newsrooms or across a platform, and organizations can choose whether to share it or not, so there’s no commitment to share it. But still, there’s a lot of value there in understanding what are the shared narratives happening in that country, and actually what are the gaps that still need to be filled that haven’t been filled across the ecosystem. And then other things like figuring out the branding of the coalition, etc., the code of ethics and standards and correction policies is incredibly important when many newsrooms come together. Next slide, please. 
So some of the key things that came out were about preparation: starting early and planning for scale is incredibly important. With an election coalition, you can’t start too early. I think that there’s a lot of prep to do for them, and the sooner you build trust, the better. Diversity and collaboration is a really key part. We’ve already mentioned that that scale and that width of partners is very important. But often you have a layer of journalists, and then that intersects with the community as well. And so in some places, you actually get media and civil society organizations taking part as well. And that’s an opportunity to go even broader and more diverse in trying to get the stories out there. And finally, context. So actually understanding how the context of your own country might be changing. In some cases, for example, there might be a growth in AI misinformation, and understanding, do you have tools across your election coalition to actually be able to combat it. Next, please. I’m going to just quickly touch on two case studies. And one of those was CrossCheck in France. In 2017, it brought together over 30 organizations. And it was led by Agence France-Presse (AFP), who took on the editorial leadership of it. They had 37 partners, and across the videos that they all shared, they had 1.2 million video views in total. And they published hundreds of articles between them. Grégoire Lemarchand, the chief editor of the digital verification program at AFP, said: this is, for us, one of the biggest wins in AFP history; CrossCheck will always be special, personally. Sometimes I meet colleagues who took part in this project, and they say, do you remember CrossCheck? That was so great. And I think that’s a really key part, that the trust that it builds across journalists is really important. And it lives beyond the election coalition too. Next slide, please. Then we have Facts First PH, which is one of my favorites. Sorry, everyone else. 
But they had 131 partners working together. And they published 1,400 fact checks. And Gemma Mendoza, the head of research and strategy at Rappler, said: the thing with these is that these are experiments. I wouldn’t say Facts First PH was perfect. At the time we were experimenting. And the reason why we wanted to experiment was because there was a huge challenge. And it’s true, there is a huge challenge. And even when we look at these numbers, 131 partners and 1,400 fact checks, it might not feel like it’s big enough to meet the scale. But I think one of the important things we need to remember is that with misinformation, there are often just a handful of narratives that are the most well-known and well-seen narratives and that cause the most harm. And actually, if we focus efforts on those narratives and those pieces that are being seen the most or are the most harmful, you can actually go quite a long way to kind of interrupting the flow of misinformation in each country. Next slide, please. I think one of the things that the Rappler team did very, very well in the Philippines with Facts First PH is that they introduced something called the MESH. And they had all of these authoritative information sources in blue. So these were journalists, expert institutions, fact-checkers. These were actually producing the research. And then in red, the MESH, they actually had over 100 orgs that were separate to the information providers. And these were influencers, NGOs, communities, trusted people in their communities. And they would then go out and share the outputs of the election coalition more broadly. And I thought that was a really amazing model, and the kind of impact that that had in the Philippines in terms of building trust and showing people that there was an answer to misinformation was actually very, very powerful. And then just more broadly on top of that, there was also research. So taking all the learnings from that, and then finally, accountability and change. 
There are some actors around the world that take the outputs of the authoritative information that’s being found or being introduced into the world to combat harms, and then actually use it as evidence to hold people accountable, for example, in the International Criminal Court sometimes or in legal cases. And I think that that’s also a really important part of the misinformation challenge. It’s not just about combating misinformation. It’s also about looking for that systemic change that might improve the system overall. I’m going to leave it there. I appreciate I’ve been talking for a long time. But I’m really, really pleased to share this with you all today. If you want to learn more about all of the case studies and go into depth in any of this stuff, there’s this election coalitions playbook that we’ve published alongside Anchor Change. And actually, there’s a podcast as well with Claire Wardle, and Daniel, who’s here today, where you can get a summary of everything. So please download it, enjoy it, use it. And if you ever make an election coalition, get in touch. Thank you very much.
Jim Prendergast: Great, thank you very much, Mevan. I think I know the answer to this question before I ask it, but I’ve noticed at least in the room, some people are taking camera shots of the slides. Are you willing to share them with anybody who would like them? Yeah, perfect. That’s what I figured. Okay, great. So we’ve already had a couple of examples, case studies of country election coalitions. I’m going to now ask David to share his experience with election coalitions in Africa. Good morning, David. Are you there?
David Ajikobi: Hi, everyone. Can you hear me? Can you see me?
Jim Prendergast: We can. We can hear and see you, and you’re looking great.
David Ajikobi: Greetings from Lagos, Nigeria. So I think Mevan has already sort of set the tone for the conversation. I think I just wanted to add a few things. Largely for us at Africa Check, we are the continent’s first and leading independent fact-checking organisation. And what we’ve sort of done with election coalition work is to also help other countries around their elections to set up, you know, the coalitions. It can sound very, very pretty when Mevan was saying it, but it can be very problematic, you know, particularly on a continent like Africa, where, you know, historically, the media is often owned by either politically exposed people or politicians or by government. So what we’ve been able to do essentially is to say, look, we would bring everybody to the table and we’ll have a common interest to say, we want our elections to hold, we want integrity in our elections, we do not want disinformation to be the third candidate or the fourth candidate in African elections. And so far so good, we’ve actually established some successes. And I’ll give you an example. I just got back from Ghana, where, you know, we were able to sort of foster an election coalition, comprised of Dubawa Ghana, FactSpace West Africa and other partners. And what we essentially saw was this. Traditionally, people did things in their own different, you know, corner, right? But by coming together, we were able to sort of form a formidable front. And we saw how that panned out in the last elections in Ghana that brought about the election of John Mahama as the new president of the country. And the collaboration also helped because of its inclusive nature, because, for example, in Ghana, there were situation rooms in Accra, in the south, and there were disinformation monitoring rooms in Tamale, which is in the north. 
So what that did was that we were able to map out not just the actors, but also the patterns we were seeing from region to region. And I can say the same thing about the Senegal elections that brought about Bassirou Diomaye Faye, where we had an election coalition, and we had the same thing even in South Africa. If you followed what happened in the South African elections, where for the first time we had a government of national unity, we saw how disinformation played out in all of that. I’ll give you my own context in Nigeria, where we had elections in 2023, which was dubbed one of the biggest elections in Africa, with more than 50 million voters. Having a situation room, having an election coalition, was very valuable, not just for fact-checking or debunking alone, but also for pre-bunking, because we already had experience from the 2019 elections, where we did coalitions and knew what kinds of election disinformation would spread. On election day, we would have things like, oh, one candidate has stepped down for another candidate, so we could actually debunk that. But also, just like what Mevan said, we were able to use the coalitions to introduce AI tools, new tools, capacity building. So for example, the tool developed by Full Fact in collaboration with Africa Check and Chequeado was provided for free to the coalition members in Ghana, Nigeria, and practically all the countries that had election coalitions in Africa. And for us, that is a very, very big step, because naturally those individual organisations might not be able to access that. But with the coalitions, we were able to give it to them for free, onboard them for free, in collaboration with Full Fact. So for us, it wasn’t just about the election. It was the opportunity to collaborate at the largest scale. And I’ll give an example. In my context in Africa, radio plays a very important role.
So you cannot talk about election coalitions without talking about the impact of radio. If you look at the structure that Mevan presented, with its collaborative nature, what we did was that, apart from the fact-checkers, the CSOs, and the media agencies we were working with, we also partnered with radio stations across the continent when we were doing election work. What that did was that our content was able to reach a lot more people, including people who were in news deserts or in underserved communities, what we would call media inclusion. So for us, that was very, very important. And we think that with the elections coming around in 2025, it’s an opportunity to connect and do more. Thank you very much.
Jim Prendergast: David, thank you very much. Turn your camera back on for a second, because I want everybody to note that David gets bonus points for color-coordinating his outfit with the theme color of the IGF. I also want to thank you for calling out the importance of radio. So many people are focused on the next technology, the future of technology, and where these problems are happening that sometimes, myself included, we forget about what’s already there, or how different environments consume news. So your comments about radio are certainly hitting home with me, and I’m sure with others. So Daniel, you’ve got some experience with this in Brazil, and I believe we got you up in the middle of the night to share it with us. So why don’t you tell us how it went for you and some lessons learned there.
Daniel Bramatti: Well, thanks for having me. I am the editor of Estadão Verifica, the fact-checking unit of the O Estado de São Paulo newspaper, and also a member of the advisory board of the IFCN, the International Fact-Checking Network. I’m going to talk about the largest, most successful and most durable collaborative project involving journalism in the history of the Brazilian press, which of course is the Comprova project. At the beginning of 2018, the Brazilian Investigative Journalism Association, Abraji, was invited to organize and coordinate a coalition of 24 media outlets to combat misinformation and disinformation in that year’s presidential elections. I was then president of Abraji. The invitation came from the researcher Claire Wardle, then head of First Draft and author of the famous 2017 report Information Disorder. Google was one of the sponsors of the project. I have to say that at first, not all media outlets showed enthusiasm for the project. Of course, the news market is very competitive in Brazil, and there was no culture of collaboration between different companies here. But gradually, resistance was broken down, mainly because there was great concern about the impact of disinformation campaigns during the presidential race. Everyone knew that the challenge of containing disinformation was too great to be faced in isolation. All decisions related to the project were made through consensus building, without imposing directions or rules. Even the name of the project was chosen by the participants themselves. In Portuguese, comprovar means to verify or to check, and it also sounds like the words com prova, which mean with proof. So there is a wordplay there. An important decision we made was to limit our verification work to content generated by social media users. We didn’t check the candidates’ speeches or statements.
As one of the candidates lied a lot more than the others, it was probable that he would be the most contradicted, and many media outlets were concerned about the possibility of conveying the idea that they were against this candidate or that they wanted to benefit his opponents. The vast majority of the media outlets invited to take part in Comprova did not have fact-checking units in their newsrooms, so dozens of journalists had to be trained using the methodology provided by First Draft. These professionals were from TV stations, radio stations, newspapers, magazines, and digital native media: organizations of different sizes that reached different audiences in different parts of Brazil. In essence, Comprova put journalists from different companies to work together to debunk misleading content. The final result was only published after a cross-checking process, meaning that at least three media outlets not involved in the original fact-check had to give their approval to the work done by their colleagues. In addition to working together, another important aspect was the amplifying power of the media outlets involved. The fact-checks were almost always published by all 24 participants in the project. After our first face-to-face meeting in May 2018, Comprova was officially launched in June, during the Congress of Abraji, and in August we started publishing our first fact-checks. The election campaign, which ended in October, confirmed our worst fears. There was a huge circulation of misleading content, and this content generated enormous engagement with a public that wasn’t prepared to deal with the problem. We had a lot of work, but also a lot of enthusiasm. All the work was done remotely, I’m talking about two years before the pandemic here, and to coordinate our activities we used a WhatsApp group. The amount of messages exchanged in this group was immense.
In six months, around 50 journalists exchanged more than 18,000 messages in the group. I did a word count on these messages and found that more than 315,000 words were written. For comparison, that’s more text than any book of the Harry Potter saga. So we learned some lessons. Number one, a shared purpose motivates journalists much more than competition. Number two, horizontal collaboration works best if there is central coordination. Number three, the role of the central coordinator, the project editor, is not to give orders like a boss, but to act as a diplomat who seeks to build consensus and break down resistance when needed. And we learned, fundamentally, that fact-checking is hard, very hard. Sometimes it took us days to get the information we needed to disprove a piece of content that had clearly been created in minutes. We managed to publish around 12 fact-checks per week, or 147 in total. Organizers and participants were very satisfied with this experience, and as a result, Comprova did not end in 2018 as originally planned. The consortium remains active to this day, with the mission of fact-checking rumors related to public policies, health, climate change, and other topics. We also worked together during the pandemic, fact-checking false rumors about vaccines and the virus, and during the electoral campaigns of 2020, 2022 and 2024. The number of participants has grown over the years. Our work in 2022 was especially important because in that year there was a wave of attacks on the integrity of the Brazilian electoral process. There was a lot of content citing false vulnerabilities in the electronic voting machines and suggesting that there was fraud to benefit one of the candidates. We didn’t know it at the time, but many of these rumors were created and spread by state actors, by intelligence agents from the Brazilian government, with the aim of destabilizing our democracy.
Recent investigations by the Brazilian Federal Police have revealed that we almost suffered a coup d’état that year, and that the disinformation campaigns were part of the plan. We still have democracy in Brazil, and I don’t want to exaggerate our role, but I think I can say without fear of being wrong that journalism contributed to this result. Thank you very much.
Jim Prendergast: Great, Daniel, thank you. So, unlike many of the sessions you may have been to at the IGF so far, we have this room, both the physical room and the Zoom room, until 11 a.m., so that’s 45 minutes we have set aside for comments, questions and discussion. As I told you at the outset, we wanted this to be as interactive and engaging as possible. I actually see we have a question online already, but while I get my act together on that, I’m going to throw one out to the group so folks in the room can think about it. One of the things I was struck by, and Daniel, you talked about this, is pivoting from an election to other things, like the pandemic, where you’re doing fact-checking. I guess for everybody: how do you keep the momentum going? You know, I’m biased. I just came out of a national and congressional election in the U.S., where we were bombarded nonstop with election ads and all sorts of stuff, and frankly, we’re tired. I can’t imagine how journalists feel coming off a cycle like that. How do you keep the momentum going, both from elections and for other issues, whether state or local elections or other events like a pandemic? So whoever wants to take that first, go ahead.
Daniel Bramatti: I can go first. In the Comprova case, the media outlets that participate in the project are the same, but not the journalists. We rotate the team so that more people can take part and learn from the others. So basically, we have a fresh team working together every year. This is not a problem for us.
Jim Prendergast: Mevan or David, any comments on that one? Go ahead, Mevan.
Mevan Babakar: Sure. From my experience of being in election coalitions myself before Google, and being a journalist for a long time, I think it’s really important to look after yourself in those situations and to look after the team, and to step away when it becomes too much. The emotional burden that a lot of people take on in these situations is quite high. When we talk about elections, that’s one experience, but a lot of people are fact-checking during conflict, in war zones, doing work where you end up seeing things that are quite harmful yourself. And I think the wellbeing of the team and the people is the thing that must be preserved and looked after beyond anything else. So I wanted to recommend a handbook that was written about vicarious trauma and how to look after people in a newsroom specifically. I’m going to put a link to it in the chat, but it has really great recommendations for how to look after journalists, newsrooms, and campaigners so that they don’t experience vicarious trauma through the work that they do. I think it’s a really great resource for answering that question.
Jim Prendergast: Great, thank you. David, I saw you flash your camera on. You wanna weigh in?
David Ajikobi: Yeah, so for us, we had a very interesting case in 2019. In 2019, we had an election coalition that was sort of midwifed by a foreign organization. And because the reasoning, the thinking behind it, was not so clear to some media partners, what happened was that they expected to be paid. And I can tell you that when they got paid and the money dried up, people left the conversation. Only Africa Check, Dubawa, and FactCheckHub, three IFCN members, stuck to that goal. But what we did with the 2023 elections in Nigeria was to say, look, we wanted people who understood the role of the media in a democracy like Nigeria. For example, my country has had decades of military rule. So having elections done properly, and the outcome of the election respected, is very important in a country like this, and the media has a role to play in that. By doing so, we had partners who were committed to that. In the election coalition in Nigeria in 2023, there was no single external funding. In fact, there was no funding at all. What we did was to collapse our individual election work into the coalition work, so that we were all equal stakeholders, right? And also, just to speak to that point, Nigeria has off-season elections. We’ve had state elections in about four or five states in 2023, and we’ve come together again to set up the coalitions. Africa Check set up the one in Lagos. Dubawa and FactCheckHub set up the one in Abuja, for example. We’re seeing across the continent that there’s a lot of funding and sponsorship and support coming from the Google News Initiative and other funders for the coalitions. But we think that if we have journalists and fact-checkers and media partners who have a common understanding of the role of the media in democracies, it wouldn’t be a problem.
It hasn’t been a problem so far in the election coalitions that we have coordinated in Africa. So I think that’s one of the key successes we’ve had.
Jim Prendergast: Thanks a lot, David. So we do have an online question, which I’ll read out. It is from Hazel Bitanya, I hope I got that right. Do you have any experience or thoughts on involving children or young people who are non-voters but would like to contribute to the discourse, either as fact-checkers, as part of disinformation campaigns, or as the target audience of these campaigns? Who would like to take that?
Mevan Babakar: I can jump in really quickly. I haven’t seen any young people being included in an election coalition specifically, but maybe David and Daniel will know more than I do on this. I do know that there have been media literacy efforts that include young people, for sure. One that comes to mind is the Teen Fact-Checking Network that MediaWise runs out of Poynter in the US; they would actually go and work with teenagers and teach them what it looks like to fact-check, what a fact-check is, and how you can check something that you see on social media. And a while ago now, Chequeado in Argentina used to run a really big schools network of fact-checking and fact-checkers, and they had a series of videos that actually had nothing to do with politics. It was a lot about: you have seen a post on the internet about your friend, and someone is pretending that your friend has done something that they haven’t. They set it up almost like a series where you were a detective trying to figure out what had happened, and you could use a reverse image search and run a couple of searches that would help you get contextual information. It really helped the young people who took part not just learn those skills, but also learn to ask the right questions. And I think that’s a really important part of it. It’s not necessarily about learning fact-checking through the lens of politics; it’s about being critical in your day-to-day when you see something, and I think that questioning part is the most important. I’ll link to those two projects as well in the chat so you can see them.
Jim Prendergast: Thanks. David, did you want to add something?
David Ajikobi: Yes, I want to add quickly that specifically for election coalition work, we involved the Nigerian Union of Campus Journalists. These are students who are campus journalists, based on campuses, pretty much like press clubs. We invited them to the situation rooms when we were doing the elections, which would typically last for a week or two, to see how we were operating every day and how the fact-checking process works in newsrooms and in that context. Then two, we also had student volunteers, students who would come out and say, can we join you guys? So that was very important. Beyond the coalition work, at Africa Check we were also trying to do what we call the Finnish model, where we were trying to reach school learners who were below voting age. We had a project sponsored by a UN agency where we went to schools to teach them basic fact-checking, with very simple exercises. And we incorporated games, because we think it’s easier to catch young people’s attention with games and things like that. We’re seeing that the feedback has been fantastic, particularly because these school learners will turn 18 and will be the next batch of voters in 2027. And we thought that raising their critical thinking skills now would help them navigate the murky waters of election information when the next wave of elections comes up in Africa. Thank you.
Jim Prendergast: Great, thank you. We’ve got a couple of questions in the room. So I’m going to pass this microphone to the woman to… Hopefully, it’s still working.
Lena Slachmuijlder: Yeah. Hi, my name is Lena Slachmuijlder. I’m with Search for Common Ground and the Council on Tech and Social Cohesion. I just want to congratulate all of you, because this is exemplary work. It also aligns with what Google has signed on to, the Voluntary Election Integrity Guidelines for tech companies, which IFES worked on with you and many other industry partners, and where there’s hopefully momentum to try and put these kinds of things into practice. So I just want to really acknowledge that. It’s hard work. It’s good work. But I have four questions that I want to raise, and I’m very curious to hear what you think. The first is that there is a lot of evidence about Google’s ad policy monetizing mis- and disinformation. So while the fact-checking work is critical, you actually have an upstream driver of misinformation that doesn’t seem to be discussed in these kinds of conversations. Secondly, we see how generative AI is very quickly taking over search, and that includes your own AI summaries, plus all of the competition between the AI companies, which could seriously disrupt what Google has done so well over the years in terms of upranking higher-quality information. It’s been a big point of credibility, but this risks disintegrating. Number three: these country examples, in Indonesia and other places, are excellent. But we had, what, 60 to 80 elections? And we work in places that are struggling, that are conflict-affected. These kinds of coalitions don’t happen in places like Chad. They don’t happen in places where the distrust between civil society and government is so incredibly deep that it’s difficult. So the question is, how does Google act when in fact there isn’t a coalition? Do you try and take the initiative?
And the last question is similar, in the sense that I believe Google was part of the pre-bunking effort in Europe to try to tackle misinformation, and if I’m not mistaken, it was an initiative that you took. But you haven’t taken that initiative in all the other places, notably in the Global South, where we have similar issues. And sometimes the consequences of misinformation in these conflict-affected societies are deadly.
Jim Prendergast: Thank you very much. There’s a lot to digest there. Mevan, do you want to take the first shot at some of it, what you can? I guess one of the things I would ask is, and it was one of the questions I actually had for David, because he used the term first: pre-bunking. Up until this week, I’d never heard that word before. So explain what that means as part of your answer.
Mevan Babakar: Sure. Let me first say, I think those are all very important questions, and I’m grateful that they have a forum to be asked in. Pre-bunking applies when there is a narrative that is trending in a country, or a series of claims that add up to a narrative, which might be seen at the sharp end by a news outlet, et cetera. Instead of dealing with it after it’s been published and is actually trending and viral, pre-bunking deals with it beforehand. So for example, in the UK, I know from my years of fact-checking that every single election there’s going to be a claim, around day one of voting or the day before, that says if you use a pencil to mark your X on your piece of paper, then your vote is invalid. That’s a claim that comes up every single year. It’s not true, but it comes up, and it’s sometimes used to disenfranchise people. Another claim that comes up will say, if you’re voting for this party, you vote on this day; if you’re voting for this other party, you vote on this day. It might feel innocuous, but these are things that we know are going to come up, and sometimes they can cause harm and might disenfranchise specific populations. So instead of dealing with that piece of misinformation after it’s actually gone viral, a pre-bunk will warn people that this is going to happen, maybe weeks or months in advance. It will say that one of the tactics we’ve seen is these kinds of claims being used to disenfranchise people, or it will teach people about straw-man arguments, or the kinds of tactics and manipulations that take place, so that people are inoculated, vaccinated, against the misinformation when they see it. So that’s a pre-bunk. I think a really important part of a pre-bunk is that it’s not Google trying to push this out there.
It’s actually the community organizations that have the relationships. And I think this goes to some of the questions. In a lot of these cases, and this is why we do the work on election coalitions, it’s really important that it’s not just one organization pushing out a narrative. It’s communities identifying misinformation that affects them, and then those same communities being empowered to combat that misinformation themselves. Because it’s one thing for somebody in that community to fix it; it’s another thing for an external party to come in and say this is how it should be. And we both know which one is going to engender more trust. I think that’s a really important part of this puzzle. In the case of pre-bunking, it’s still a relatively new effort, and it’s one that’s led by Jigsaw. Just last week, or at the beginning of December, they graduated pre-bunking into the real world and handed it over to a series of community organizations. The idea is that those community organizations will be the ones that further it and grow it, and that includes people from across the globe. So it shouldn’t be just an EU-centric effort; it should be something that exists around the world. But it is resource-intensive, and it requires infrastructure. I think part of the election coalitions’ work is building the infrastructure for things like pre-bunking to jump off of, because having that layer of community organizations and journalists working together is the scaffolding we need for things like pre-bunking to actually take effect. Your other questions were about how Google acts when there isn’t an election coalition, or when that kind of infrastructure isn’t already in place, like you mentioned with Chad, for example. I think that’s a really important question.
I also think there’s an element of that which is about understanding the prerequisites for an election coalition. In some cases, yes, it does require community organizations to already exist; it requires services; it requires a certain number of media organizations to be present. In the cases where those things aren’t present, we have societal conversations, societal challenges, that we need to tackle. And I think that’s not something Google should do in isolation. It’s something we all need to talk about together, and Google plays a part in it: figuring out how to create those structures in a completely different environment. And then finally, on Gen AI and ads. On Gen AI, I think it’s a really important question, and actually a lot of my work at Google these days is about building tools to support fact-checkers working with Gen AI. It’s important to say that a lot of fact-checkers are really excited about using Gen AI and AI tools, and I think that’s sometimes missed as part of this conversation. The scale of the misinformation that already exists is quite high, and I think we’re all aware that manual efforts alone are not enough to fix it. So there’s an opportunity to use AI to help in the battle. That’s not to replace anybody, but to support the efforts. David already mentioned the Full Fact AI tools that are used by over 50 organizations around the world now. They help fact-checkers spot repeat instances of misinformation and do some primary checking. It doesn’t take the fact-checker or journalist out of the equation, but it supercharges them to do more and more at scale. And I think that’s really important.
And then, finally, on Gen AI: the EU election coalition that I mentioned, Elections24Check, we actually funded them to do some research with the 3,000 fact-checks they did this time around, to tag the ones that involved Gen AI. And the number of instances of Gen AI causing harm in an election cycle was surprisingly low. I’m not saying that it won’t cause harm; obviously, it has the potential to. But I also think it’s interesting to consider that at this moment in time, it’s not doing that. So how can we find instances where pre-bunking might actually help with Gen AI? In Taiwan, for example, one of the ministers put out fake deepfakes of themselves ahead of an election to inoculate the population against them. And I think that’s a really interesting case study. But having said that, the potential harm of Gen AI is still quite high, and there are a lot of efforts at the moment across Google to combat potential harm from AI election information especially. There’s something called SynthID. This is watermarking, where we add a little signature into any image that’s generated by Google, so we would be able to flag in any of our tools whether it’s an AI-generated image or not. We’re also part of something called the C2PA coalition, an industry standard for assessing where information has come from, the provenance of information, and those signals are being added into our tools right now. So if you see an image, you’d be able to say where it came from. And beyond provenance, we’re also working on a series of tools that are about giving people more context. So when you see a piece of misinformation, or when you see that it’s AI-generated, you can also go to things like About this image or About this page on Search, which tell you: how old is this image? Where has it come from? Who first published it?
Are there any fact checks about it? Encouraging people to do lateral reading around it is a really important part of this. So that’s the user intervention side of it. And then finally, there is a whole host of policies that remove thousands and thousands of ads every single day. Whether those thresholds are exactly in the right places is a conversation that’s constantly being had, and it changes in each country and with different laws and regulations. But I think it’s an important challenge. The thing I’d like to leave you all with is that this isn’t a case where there’s just one answer. It’s different in every single country; the thresholds differ; the context keeps changing; the tools keep changing. And it’s something where the see-saw of it is hopefully going in the right direction, but we do keep see-sawing, if that makes sense. I’ve spoken a lot, but I’ll leave it there. Thanks.
Jim Prendergast: Yeah, you’re entitled to another sip of tea there. A great exchange. Lena knows the routine because she’s asked questions before. So when you do ask a question, and we have a couple in the room, please identify yourself and your affiliation. I’ll turn it over to Milton.
Milton Mueller: Thanks, Jim. I’m Milton Mueller. I’m at the Internet Governance Project at Georgia Tech. I want to begin by challenging the term misinformation. I’m at a sort of computer-science, algorithmically driven university, and the term tends to encourage the idea that misinformation is something that has a signature you can just recognize and somehow kick out of the bit stream. I think the Google speaker was very perceptive in pointing out that it’s really narratives, it’s interpretations. And I don’t know why we don’t just say false or misleading information, because that makes it clear that when you interfere in these discourses, you are essentially setting yourself up as an arbiter of truth. I love this idea of coalitions of journalists coalescing to do fact-checking, because that is fully in line with the liberal democratic idea of the role of the press in a free society. You are not forcing anybody to do anything. You are simply responding to bad speech with correct speech, or good speech. But there’s an elephant in the room here that I hope to see addressed and that I want to ask you about, and I’m sure the Google people are very aware of this. There’s a high degree of concentration of communication and discourse around platforms, and as a result, contestation over what those platforms suppress and what they promote raises the stakes very high. In particular, when governments get involved in trying to influence those decisions, you get problems. You also get problems with perceptions of bias from the platforms, which are well known as being situated in liberal California and Silicon Valley, not exactly in red-state territory.
And perhaps the Hunter Biden laptop story is a perfect example of where you think you’re suppressing misinformation, but you’re actually responding to political pressure from people who think that a certain amount of information, which might actually be true, is going to harm the chances of their favored candidate in the election. So I’m concerned about how you set the threshold for where you actually intervene in these false or misleading narratives; I don’t want to use the word misinformation. And I’m particularly concerned about how you handle the role of government. We have a series of court cases, like Murthy v. Missouri, which went all the way to the Supreme Court. We have state legislation in Florida that is trying to regulate the way you make these decisions and impose common carrier obligations. I mean, it’s great to have these journalistic coalitions, but legally and economically, this issue is a lot more fraught than you’ve made it out to be here, and I’d like to know how you handle those situations, particularly when the government is an interested actor in the outcome of an election, obviously. So what happens when you get pressure from governments to suppress information that may be damaging to them, or that may be an extension of their policy?
Alex Walden: I just don’t know if you wanted to pass the mic to somebody else and take more than one at a time.
Jim Prendergast: Sorry yeah we’ll take a couple of questions from the room and then we can sort of bounce them around.
Claes de Vreese: I think there’s sound now; that’s a good start. So good morning, everybody. My name is Claes de Vreese. I’m from the University of Amsterdam and also on the executive board of the European Digital Media Observatory. I have three quick questions. I really appreciate the eye for the local context in which these coalitions are built, but I wonder if you could speak a little more to how you choose which partners are in or outside of these coalitions, since there are so many new and relatively unknown actors when building these coalitions in different election contexts. So that’s the first: how to build, and which partners are in or outside of the scope. The second question would be: what in your Google playbook are the best pieces of advice that you give the coalitions for dealing with critiques that they are trying either to stifle free speech or to intervene in the elections, which is a common critique coming from different vantage points in different kinds of elections? And the third question would be: how is Google trying to be proactive in building a coalition that would also have multiple big tech platforms at the table, so that you would see a coalition driven by an industry interest rather than by one or two individual companies? So those would be my three questions. Thank you.
Jim Prendergast: Okay, great. Moving on. I hope you’ve refreshed your palate, because I think a lot of these are directed at you, but not all of them. So do you want to start us off? Then we’ll work Daniel and David into the conversation.
Mevan Babakar: Sure. I’m actually out of tea, unfortunately; I feel like I need a top-up. But based on that last question, I just want to make it really clear that these election coalitions are not Google-run election coalitions. These are communities of journalists, fact-checkers, and social organizations that have come together, created the coalition, and then gone looking for funding, and Google just happens to be one of the people that have funded it. And I think that’s a really important part of this: these are interested, important people in their own communities and their own countries coming together to build something that actually serves the voters of those countries. The only thing Google has really done is support them, either with a resource or with funding, and run this research project to collect some of the learnings from them, so that if another group of organizations comes to us with an election coalition, we can say: hey, here are the lessons of the other coalitions that have come before you, and you can learn from them. So I really want to make that very clear. We don’t choose who the partners in an election coalition are. It’s not us picking and choosing; it’s the organizations themselves coming up with their own ecosystem, their own collaboration, their own policies, their own membership models, their own capacity-building programs. And that’s a really important part. You also had a question about how they deal with the charge that they are censoring free speech. I’m going to take off my Google hat for one second and put on my fact-checker hat: when I used to work at Full Fact, the UK’s fact-checking charity, we used to get a lot of those “fact-checks are censoring speech” conversations. Our response at the time, and still, was that fact-checking is the free speech response to misinformation. As fact-checkers we’re not actually taking anything down; we’re adding more context, we’re giving you more information so that you can make up your own mind. That’s how we used to deal with it as fact-checkers, but Daniel and David can probably give you a much better answer for where things are these days.
Jim Prendergast: Milton, we’ll get a response from Daniel and David first, then we’ll cycle back to you.
Daniel Bramatti: I think that Mevan’s response is perfect. I really contest this idea that fact-checking is censorship. You have to have content moderation on platforms: content moderation regarding violence, regarding pornography, regarding other things. And you also have to have content moderation regarding the flow of bad information, information that contributes to polluting our media ecosystem, our information ecosystem. There were so many questions; I just want to briefly mention how we decided who enters the coalition. In Comprova, at the beginning, our goal was to reach a large percentage of the Brazilian public, so we invited to the table all the big players in the media here. And we also tried to balance different media organizations according to their editorial orientation, more to the left, more to the right, and also to include local players. So it was a very diverse group, in my opinion. And since then, since we decided to keep Comprova going, all the new participants have asked to enter the coalition. We decide collaboratively, and everybody has veto power over whether we give the OK to that applicant. And to this day, to my knowledge, we have never closed the door to anybody.
Jim Prendergast: So coming back to you, Mevan, on Milton’s question about thresholds and how you determine when you take action and when you don’t. Did I capture one of them?
Alex Walden: I can jump in quickly on the one about the challenges of how we engage with governments, and government pressure and how that does or does not impact us. And then I’ll kick it over to you, Mevan, to talk more about the definition of misinformation and the challenges around that. Although I will say on that piece, having worked on this from the beginning when “fake news” was the term, and then we all decided fake news was not the right term, there are still many conversations happening all over about mis- and disinformation. I think for us as a company, we sort of have to just land on something and figure out how to operationalize it while we’re managing these harms. And we will also continue to be part of conversations around what’s the appropriate lexicon for describing what is just abuse or exploitation of the services that we’re providing. But when it comes to the challenges of government: on the one hand, obviously, we are deeply committed to partnering with governments across a lot of the work that we do, and increasingly so. But I’m also agreeing with you that governments and parties are interested parties in the outcomes of elections. So we have to be mindful of the role that we have in engaging neutrally. And really, that gets back to the importance of us having clear and robust policies in place to make sure that we are consistently addressing any of these issues as they come to us. So on the one hand, that’s about having clear product policies. How do we define election misinformation or misrepresentation, or the variety of other things that might come up? How do we ensure that that’s clear? And then we really do have to enforce that consistently across all of our policies in every country. And then also being clear about when, for legal reasons, we might need to remove something under a national law.
And so it’s perfectly legitimate for any government to say: this content violates our local law, here’s an order, and you need to remove it. In that case, we would evaluate it under our standards. If, under our analysis, the order is consistent with the local law and we’ve received the appropriate process from the proper authority, we may remove that content. That would then be something we put in our transparency report, making clear to everyone that we have complied with a national legal requirement. So those are the things we rely on to make sure that we can be consistent everywhere we’re operating. It’s true that we will get pressure from governments, but we have that framework to fall back on.
Jim Prendergast: Thank you, Alex. Mevan?
Mevan Babakar: I think that was a great summary. The only thing I would add on terms is that, like Alex, I’ve seen a lot of work done on the different terms in this space, and I think that different terms are sometimes helpful for different things. But the way I like to think about it personally, the way that makes it very real and reminds me and others of the importance of this work, is through the lens of harms. There is some really excellent work being done at the moment by a professor called Peter Cunliffe-Jones at the University of Westminster to develop a harms framework specifically for mis- and disinformation. And the European Fact-Checking Standards Network, well, a couple of organizations in Europe, are looking to start using that in a meaningful way, to highlight the harms of misinformation in a completely different way. Because it’s one thing to say there’s misinformation, and it’s another thing to say there has been 5% election interference in this country, right, or vaccine misinformation that has led to this harm. That is the level of granularity we need to get to. At that point we kind of bypass some of the issues with the words, and we get to those claims and those narratives and those harms. And it’s only at that level of detail that we can start to understand what interventions are meaningful and who should take them. Should that intervention come from a government? From a platform? From the community or the people affected? Because I think, actually, we need interventions at all of those levels. Thanks.
Jim Prendergast: Alex, did you have anything to add? No? Okay. I’ve just been shown the “we have five minutes to wrap up” sign from our helpful tech support in the room. So what I’m going to do is ask each of our panelists: if you had one piece of advice that you could offer people who might be interested in either participating in or starting their own election coalition, what would that piece of advice be? And as a call to action coming out of today, what would you like to see happen? We’ll start with David, please.
David Ajikobi: I’ll say the first thing is to ask how you see the election and the playing field, and whether you are all seeing things from the same perspective; that’s an important place to start. I also have to tell you that Google does not call the shots on any election coverage that we have in Africa. What we have seen help us in Africa is that we were able to develop expertise very early in the day. And many times the coalitions are coordinated by AFP members, with transparency of funding and transparency of election policies around the things we check. What I’m trying to say is that you can’t check everything, so we have a methodology for what we’re looking at, because we want all the parties to be very transparent around the elections. I think that’s where we should be coming from, and that’s really the challenge that we have. Thank you.
Jim Prendergast: Great. Thank you, David. Daniel, turn to you.
Daniel Bramatti: Yes. Sorry, my camera was off. My advice is: choose wisely the organization that is going to coordinate your coalition. As Mevan said, it has to be an organization that is, if not neutral, then as neutral as possible: equidistant in terms of political stance, independent from parties, from government, and from private sector pressures. I know that sometimes it’s difficult to find an organization with those characteristics, but it is essential to gaining trust and leading the work.
Jim Prendergast: Great, thank you, Daniel. Mevan.
Mevan Babakar: I’d say that relationships are everything in election coalitions. So, similar to Daniel’s point, it’s really important not to underestimate the amount of work it will take to build those relationships across those media organizations and across those people. When you get a group of people like that coming together who trust each other, that’s when something special can happen, but that takes time. And because relationships take time to build, maybe it’s not the first election that’s the best one; maybe it’s the second one or the third one. I think Comprova is a very good example of that. And as those relationships build, so does the opportunity and the scale of those coalitions. Yeah, thank you.
Jim Prendergast: Great, thank you. And then finally here in the room, Alex. No? Okay. So I want to thank everybody. It was a great presentation, some very good questions, some very pointed questions, frankly, and a good discussion. For those in the room who want a copy of the slides, just come see me; I’ve already sent them to somebody online who wanted them. I do want to thank our panelists, who all got up at varying degrees of the middle of the night to join us, and Alex for joining us here in person. Thanks to everybody for showing up in person and online. More to come. And for those in the room: Mevan has been dropping links to various information into the chat, so when the recording is posted on the IGF website, be sure to go back; that information will be waiting for you. So thank you very much, everyone, and enjoy the rest of your day.
Daniel Bramatti: Thank you so much. Bye.
Mevan Babakar
Speech speed
155 words per minute
Speech length
5698 words
Speech time
2200 seconds
Coalitions allow journalists to collaborate and scale impact
Explanation
Election coalitions enable journalists and fact-checkers to work together and share resources. This collaboration allows them to have a greater impact in combating misinformation during elections.
Evidence
Examples of successful coalitions like Electionland in 2016 and Comprova in Brazil were provided.
Major Discussion Point
The importance and effectiveness of election coalitions
Agreed with
Daniel Bramatti
David Ajikobi
Alex Walden
Agreed on
Importance of election coalitions
Pre-bunking can inoculate against expected false narratives
Explanation
Pre-bunking involves warning people about potential misinformation before it spreads. This strategy can help inoculate the public against false narratives that are likely to emerge during elections.
Evidence
Example of pre-bunking claims about invalid votes marked with pencils in UK elections.
Major Discussion Point
Strategies for combating misinformation
Focus on specific harmful narratives rather than all misinformation
Explanation
Instead of trying to address all misinformation, coalitions should focus on the most harmful and widespread narratives. This targeted approach can be more effective in mitigating the impact of disinformation.
Major Discussion Point
Evaluating the impact of misinformation and coalitions
Agreed with
David Ajikobi
Agreed on
Focus on specific harmful narratives
Differed with
Milton Mueller
Differed on
Approach to addressing misinformation
Daniel Bramatti
Speech speed
110 words per minute
Speech length
1439 words
Speech time
778 seconds
Coalitions build trust across media organizations
Explanation
Election coalitions help build trust and collaboration between competing media organizations. This trust is crucial for effective fact-checking and information sharing during elections.
Evidence
Experience with the Comprova project in Brazil, which brought together 24 media outlets.
Major Discussion Point
The importance and effectiveness of election coalitions
Agreed with
Mevan Babakar
David Ajikobi
Alex Walden
Agreed on
Importance of election coalitions
Choosing neutral leadership is essential for coalition credibility
Explanation
The organization coordinating an election coalition should be as neutral as possible. This neutrality is crucial for maintaining credibility and trust among participants and the public.
Major Discussion Point
Challenges and considerations in forming election coalitions
Fact-checking adds context rather than censoring speech
Explanation
Fact-checking is not censorship but rather a way to provide additional context and information. This approach allows people to make informed decisions without restricting free speech.
Major Discussion Point
Strategies for combating misinformation
David Ajikobi
Speech speed
162 words per minute
Speech length
1770 words
Speech time
652 seconds
Coalitions help combat disinformation in African elections
Explanation
Election coalitions have been effective in combating disinformation during elections in various African countries. These coalitions bring together fact-checkers, media organizations, and civil society groups to address misinformation.
Evidence
Examples of successful coalitions in Ghana, Senegal, and Nigeria were provided.
Major Discussion Point
The importance and effectiveness of election coalitions
Agreed with
Mevan Babakar
Daniel Bramatti
Alex Walden
Agreed on
Importance of election coalitions
Media literacy efforts can engage youth
Explanation
Engaging young people through media literacy programs can help prepare future voters to navigate misinformation. These efforts can include teaching basic fact-checking skills and critical thinking.
Evidence
Example of a project sponsored by the UN agency to teach fact-checking in schools.
Major Discussion Point
Strategies for combating misinformation
Measure concrete harms like election interference percentages
Explanation
To evaluate the impact of misinformation, it’s important to measure concrete harms such as the percentage of election interference. This approach provides a more tangible understanding of the effects of disinformation.
Major Discussion Point
Evaluating the impact of misinformation and coalitions
Agreed with
Mevan Babakar
Agreed on
Focus on specific harmful narratives
Alex Walden
Speech speed
176 words per minute
Speech length
891 words
Speech time
302 seconds
Coalitions are crucial for delivering credible information to users
Explanation
Election coalitions play a vital role in ensuring that credible information reaches users during elections. These partnerships help address the challenges of misinformation that individual organizations may struggle to tackle alone.
Major Discussion Point
The importance and effectiveness of election coalitions
Agreed with
Mevan Babakar
Daniel Bramatti
David Ajikobi
Agreed on
Importance of election coalitions
Coalitions must navigate government pressure and legal challenges
Explanation
Election coalitions face challenges in dealing with government pressure and legal issues. It’s important to have clear policies and consistent enforcement to maintain neutrality and credibility.
Major Discussion Point
Challenges and considerations in forming election coalitions
AI tools can help scale fact-checking efforts
Explanation
Artificial intelligence tools can assist in scaling up fact-checking efforts. These tools can help identify repeat instances of misinformation and support primary checking, allowing fact-checkers to work more efficiently.
Major Discussion Point
Strategies for combating misinformation
Platforms must balance content moderation and free speech
Explanation
Tech platforms face the challenge of balancing content moderation with preserving free speech. Clear policies and consistent enforcement are crucial in addressing this challenge.
Major Discussion Point
The role of tech platforms in election integrity
Differed with
Milton Mueller
Differed on
Role of tech platforms in content moderation
Transparency in platform policies and government requests is key
Explanation
Transparency in platform policies and government content removal requests is essential. This transparency helps maintain trust and accountability in content moderation processes.
Evidence
Mention of transparency reports that disclose compliance with national laws.
Major Discussion Point
The role of tech platforms in election integrity
Milton Mueller
Speech speed
136 words per minute
Speech length
524 words
Speech time
229 seconds
Concentration of discourse on platforms raises stakes of moderation
Explanation
The high concentration of communication on a few major platforms increases the importance of content moderation decisions. This concentration raises concerns about the power of platforms to influence public discourse.
Major Discussion Point
The role of tech platforms in election integrity
Differed with
Alex Walden
Differed on
Role of tech platforms in content moderation
Balance addressing misinformation with preserving free speech
Explanation
There is a need to balance efforts to combat misinformation with protecting free speech. Interventions in public discourse raise concerns about platforms becoming arbiters of truth.
Evidence
Example of the Hunter Biden laptop story and potential political pressure in content moderation decisions.
Major Discussion Point
Evaluating the impact of misinformation and coalitions
Differed with
Mevan Babakar
Differed on
Approach to addressing misinformation
Claes de Vreese
Speech speed
163 words per minute
Speech length
242 words
Speech time
88 seconds
Platforms should collaborate on industry-wide coalitions
Explanation
Tech platforms should work together to form industry-wide coalitions to address election integrity issues. This collaboration could lead to more consistent and effective approaches to combating misinformation.
Major Discussion Point
The role of tech platforms in election integrity
Agreements
Agreement Points
Importance of election coalitions
Mevan Babakar
Daniel Bramatti
David Ajikobi
Alex Walden
Coalitions allow journalists to collaborate and scale impact
Coalitions build trust across media organizations
Coalitions help combat disinformation in African elections
Coalitions are crucial for delivering credible information to users
All speakers emphasized the significance of election coalitions in combating misinformation and enhancing election integrity through collaboration and resource sharing.
Focus on specific harmful narratives
Mevan Babakar
David Ajikobi
Focus on specific harmful narratives rather than all misinformation
Measure concrete harms like election interference percentages
Both speakers advocated for targeting specific harmful narratives and measuring concrete impacts rather than addressing all misinformation.
Similar Viewpoints
Both speakers emphasized the importance of balancing fact-checking and content moderation with preserving free speech, viewing fact-checking as a way to add context rather than censor.
Daniel Bramatti
Alex Walden
Fact-checking adds context rather than censoring speech
Platforms must balance content moderation and free speech
Both speakers discussed innovative strategies to combat misinformation, including pre-bunking and AI tools, highlighting the need for proactive and scalable approaches.
Mevan Babakar
Alex Walden
Pre-bunking can inoculate against expected false narratives
AI tools can help scale fact-checking efforts
Unexpected Consensus
Importance of neutrality in coalition leadership
Daniel Bramatti
David Ajikobi
Choosing neutral leadership is essential for coalition credibility
Coalitions help combat disinformation in African elections
Despite coming from different regions, both speakers emphasized the importance of neutral leadership in election coalitions, suggesting a shared understanding of coalition dynamics across diverse contexts.
Overall Assessment
Summary
The main areas of agreement included the importance of election coalitions, the need to focus on specific harmful narratives, the balance between fact-checking and free speech, and the potential of innovative strategies like pre-bunking and AI tools.
Consensus level
There was a high level of consensus among the speakers on the core principles and strategies for combating misinformation in elections. This consensus suggests a growing understanding of effective practices in election integrity efforts across different global contexts. However, there were nuanced differences in approaches and emphases, reflecting the diverse challenges faced in different regions and the need for context-specific solutions.
Differences
Different Viewpoints
Role of tech platforms in content moderation
Milton Mueller
Alex Walden
Concentration of discourse on platforms raises stakes of moderation
Platforms must balance content moderation and free speech
Milton Mueller expressed concerns about the concentration of communication on platforms and their power to influence public discourse, while Alex Walden emphasized the need for clear policies and consistent enforcement to balance content moderation with free speech.
Approach to addressing misinformation
Mevan Babakar
Milton Mueller
Focus on specific harmful narratives rather than all misinformation
Balance addressing misinformation with preserving free speech
Mevan Babakar advocated for focusing on the most harmful and widespread narratives, while Milton Mueller emphasized the need to balance efforts to combat misinformation with protecting free speech.
Unexpected Differences
Terminology and framing of misinformation
Milton Mueller
Mevan Babakar
Balance addressing misinformation with preserving free speech
Focus on specific harmful narratives rather than all misinformation
Milton Mueller unexpectedly challenged the term ‘misinformation,’ suggesting it encourages algorithmic solutions, while Mevan Babakar proposed focusing on specific harmful narratives. This difference in framing the issue highlights the complexity of addressing misinformation.
Overall Assessment
Summary
The main areas of disagreement centered around the role of tech platforms in content moderation, approaches to addressing misinformation, and the balance between combating false information and preserving free speech.
Difference level
The level of disagreement was moderate. While speakers generally agreed on the importance of election coalitions and fact-checking, they differed on specific strategies and the role of various actors. These differences reflect the complex nature of addressing misinformation in elections and highlight the need for continued dialogue and collaboration among stakeholders.
Partial Agreements
All speakers agreed on the importance of election coalitions, but they emphasized different aspects: Mevan focused on scaling impact, Daniel on building trust, and David on combating disinformation in specific contexts.
Mevan Babakar
Daniel Bramatti
David Ajikobi
Coalitions allow journalists to collaborate and scale impact
Coalitions build trust across media organizations
Coalitions help combat disinformation in African elections
Both speakers agreed on the need to scale fact-checking efforts, but they proposed different approaches: Alex emphasized AI tools, while Mevan suggested focusing on specific harmful narratives.
Alex Walden
Mevan Babakar
AI tools can help scale fact-checking efforts
Focus on specific harmful narratives rather than all misinformation
Takeaways
Key Takeaways
Election coalitions are an effective way for journalists and fact-checkers to collaborate and scale their impact in combating misinformation
Building trust and relationships between coalition members is crucial but takes time
Pre-bunking and media literacy efforts can help inoculate against expected false narratives
Tech platforms play an important but complex role in election integrity, balancing content moderation with free speech concerns
Measuring concrete harms from misinformation is more useful than focusing on all potential misinformation
Resolutions and Action Items
Google to continue supporting election coalitions through funding and resources
Fact-checkers to focus on measuring and highlighting specific harms from misinformation
Coalition members encouraged to build long-term relationships beyond single election cycles
Tech platforms to increase transparency around content moderation policies and government removal requests
Unresolved Issues
How to form effective coalitions in countries with limited civil society or media infrastructure
Balancing the need for content moderation with concerns about censorship and free speech
How to handle government pressure on platforms to remove content during elections
Determining appropriate thresholds for platform intervention on misleading content
Suggested Compromises
Using a harm-based framework to determine when intervention on misinformation is warranted, rather than blanket policies
Platforms collaborating on industry-wide coalitions to address election integrity, rather than individual company efforts
Balancing fact-checking with adding context, rather than removing content outright
Thought Provoking Comments
Pre-bunking is when there is a narrative that is trending in a country, or there’s like a series of claims that add up to a narrative that might be seen at the sharp end of a news outlet, et cetera. And instead of dealing with it after it’s been published and after it’s actually trending and viral, pre-bunking deals with it beforehand.
speaker
Mevan Babakar
reason
This introduces the concept of pre-bunking, which is a proactive approach to combating misinformation that many participants were unfamiliar with.
impact
It sparked further discussion about proactive strategies for addressing misinformation and led to examples being shared, such as the Taiwanese minister deliberately creating deepfakes to inoculate the population.
I want to begin by challenging the term, misinformation. I’m in a sort of a computer science, algorithmically-driven university. And the term tends to encourage the idea that misinformation is something that has a signature that you can just recognize and somehow kick out of the bit stream.
speaker
Milton Mueller
reason
This comment challenges a fundamental assumption underlying much of the discussion and pushes for more precise language.
impact
It shifted the conversation to consider the complexities of defining and identifying misinformation, leading to discussions about narratives and interpretations rather than just false information.
There’s a high degree of concentration of communication and discourse around platforms. And as a result of that, with contestation over what those platforms suppress and what they promote, the stakes are raised very high. And in particular, when governments get involved in trying to influence those decisions, you get problems.
speaker
Milton Mueller
reason
This comment highlights the broader context and potential risks of platform-based approaches to combating misinformation.
impact
It led to a discussion about the role of governments, potential biases, and the challenges of setting thresholds for intervention.
I think that sometimes different terms are helpful for different types of things. But the way that I like to think about it personally, that makes it very real and reminds me and others of the importance of this work, is thinking about it through the lens of harms.
speaker
Mevan Babakar
reason
This reframes the discussion of misinformation in terms of concrete harms rather than abstract definitions.
impact
It shifted the focus towards more tangible impacts and metrics, suggesting new ways to approach and measure the effects of misinformation.
Overall Assessment
These key comments shaped the discussion by challenging assumptions, introducing new concepts, and reframing the issue of misinformation. They moved the conversation from a focus on technical solutions and coalitions to a more nuanced consideration of the complexities involved in defining, identifying, and addressing misinformation. The discussion evolved to consider broader societal impacts, the role of various stakeholders including governments and platforms, and the importance of focusing on concrete harms rather than abstract definitions.
Follow-up Questions
How can election coalitions be formed in conflict-affected countries where cooperation between civil society and government is difficult?
speaker
Lena Slachmuijlder
explanation
This is important to understand how to implement election integrity efforts in challenging political environments.
How does Google act when there isn’t an election coalition in a country?
speaker
Lena Slachmuijlder
explanation
This explores Google’s role and responsibilities in countries lacking established election integrity infrastructure.
Why hasn’t Google taken the initiative to implement pre-bunking efforts in the Global South, similar to what was done in Europe?
speaker
Lena Slachmuijlder
explanation
This addresses potential disparities in misinformation prevention efforts between different regions.
How do platforms set the threshold for when to intervene in false or misleading narratives?
speaker
Milton Mueller
explanation
This is crucial for understanding how platforms balance free speech concerns with misinformation prevention.
How do platforms handle pressure from governments to suppress information that may be damaging to them or that may be an extension of their policy?
speaker
Milton Mueller
explanation
This explores the complex relationship between platforms, governments, and information control during elections.
How do election coalitions choose which partners to include or exclude?
speaker
Claes de Vreese
explanation
This is important for understanding how coalitions maintain credibility and effectiveness.
What are the best practices for election coalitions to deal with critiques that they are trying to stifle free speech or intervene in elections?
speaker
Claes de Vreese
explanation
This addresses a common challenge faced by fact-checking and anti-misinformation efforts.
How is Google trying to be proactive in building a coalition that would include multiple big tech platforms?
speaker
Claes de Vreese
explanation
This explores the potential for broader industry cooperation in addressing election misinformation.
Disclaimer: This is not an official session record. DiploAI generates these resources from audiovisual recordings, and they are presented as-is, including potential errors. Due to logistical challenges, such as discrepancies in audio/video or transcripts, names may be misspelled. We strive for accuracy to the best of our ability.
Related event
Internet Governance Forum 2024
15 Dec 2024 06:30h - 19 Dec 2024 13:30h
Riyadh, Saudi Arabia and online