Main Session 3: Internet Governance and elections: maximising potential for trust and addressing risks
Session at a Glance
Summary
This discussion focused on Internet governance and elections, particularly addressing the challenges of maintaining information integrity and trust in the democratic process in the digital age. Panelists from various sectors and regions shared insights on the experiences of the 2024 “super election year” and discussed strategies to protect election integrity.
Key issues highlighted included the spread of misinformation and disinformation, the impact of artificial intelligence and deep fakes, and the need for better regulation of digital platforms. Panelists emphasized the importance of media literacy, fact-checking, and collaboration between stakeholders to combat these challenges. The discussion also touched on the specific difficulties faced by the Global South, including digital inequality and limited access to information.
Several initiatives were discussed, such as partnerships between tech companies and fact-checkers, training programs for journalists, and the development of AI detection tools. The role of civil society and NGOs in promoting digital literacy and resilience was stressed. Panelists agreed on the need for a multi-stakeholder approach to address these complex issues.
The discussion explored governance principles and mechanisms to protect electoral processes while upholding human rights. Suggestions included improving transparency in political advertising, strengthening data protection laws, and developing global standards for content moderation. The importance of balancing innovation with integrity was emphasized.
Participants highlighted the potential of the Internet Governance Forum (IGF) to facilitate global dialogue and cooperation on these issues. They called for a more coordinated approach between regional and global IGFs to maximize impact. The discussion concluded with a recognition of the ongoing nature of these challenges and the need for sustained efforts beyond election periods to safeguard democratic processes in the digital age.
Keypoints
Major discussion points:
– The challenges of misinformation, disinformation and foreign interference in elections in the digital age
– The need for multi-stakeholder collaboration and governance frameworks to protect election integrity
– The importance of media literacy, journalist safety, and access to reliable information
– The role of social media platforms and technology companies in addressing online harms
– The potential of the Internet Governance Forum to facilitate global cooperation on these issues
The overall purpose of the discussion was to examine the challenges to election integrity in the digital age and explore potential governance principles, tools and mechanisms to protect democratic processes while upholding human rights.
The tone of the discussion was largely serious and concerned about the threats to democracy, but also constructive in proposing solutions. There was a sense of urgency about addressing these issues, balanced with cautious optimism about the potential for multi-stakeholder cooperation. The tone became more action-oriented towards the end as participants offered final recommendations.
Speakers
– Pearse O’Donohue: Moderator
– Tawfik Jelassi: Director from UNESCO
– Lina Viltrakiene: Representative from the Lithuanian government
– William Bird: From Media Monitoring Africa
– Rosemary Sinclair: Chief Executive Officer of auDA (the administrator of Australia’s .au domain)
– Daniel Molokele: Member of Parliament from Zimbabwe
– Sezen Yesil: Director of Public Policy at Meta
– Elizabeth Orembo: Researcher in International Stakeholder Relations at ICT Africa
Additional speakers:
– Giacomo Mazzone: Member of EDMO (European Digital Media Observatory)
– Bruna Martins dos Santos: Organizer of the session
– Maha Abdel Nasser: From the Egyptian parliament
– Alexander Savnin: From Primorsky University in Russia
Full session report
Expanded Summary of Discussion on Internet Governance and Elections
Introduction:
This discussion, moderated by Pearse O’Donohue, brought together in-person and online panelists from diverse sectors and regions to explore the critical intersection of internet governance and election integrity in the digital age. The panel examined challenges, successful initiatives, and potential governance mechanisms to protect democratic processes while upholding human rights.
Key Challenges to Election Integrity:
1. Misinformation and Disinformation:
Multiple speakers, including Tawfik Jelassi, William Bird, Sezen Yesil, and Lina Viltrakiene, identified the spread of misinformation and disinformation as a significant threat to election integrity. This includes coordinated inauthentic behaviour on social platforms and the use of AI and deepfakes to create misleading content.
2. Attacks on Electoral Bodies and Journalists:
William Bird and Tawfik Jelassi highlighted the serious issue of attacks and intimidation against journalists and electoral management bodies, recognising it as a significant threat to press freedom and election integrity. Jelassi specifically noted the increased violence against women journalists.
3. Digital Inequality:
Elizabeth Orembo raised concerns about digital inequality limiting access to reliable information, particularly in the Global South. She also highlighted challenges related to data sharing and the need for proactive information from election management bodies.
4. Emerging Technologies:
Lina Viltrakiene and Sezen Yesil emphasised the threat posed by AI and deepfakes in creating misleading content. Yesil acknowledged these risks and discussed measures taken by platforms to address them.
5. Untrained Influencers:
Daniel Molokele pointed out the rise of influential but untrained social media personalities and podcasters affecting election integrity in Africa, highlighting the lack of regulation for these new media actors.
Successful Initiatives and Best Practices:
1. Multi-stakeholder Collaboration:
Several speakers emphasised the importance of collaboration between various stakeholders, including tech platforms, fact-checkers, authorities, and civil society.
2. Media Literacy and Digital Skills Education:
Tawfik Jelassi highlighted the importance of media literacy and digital skills education programmes in combating misinformation, mentioning UNESCO’s role in training journalists on election coverage and AI’s impact on elections.
3. Technical Measures:
Sezen Yesil discussed technical measures implemented by Meta, including detecting manipulated media, removing inauthentic accounts, and providing transparency in political advertising.
4. Public Reporting Platforms:
William Bird mentioned the development of public reporting platforms for online harms and suggested more nuanced labels to understand different types of misinformation.
5. Consolidated Monitoring Systems:
Lina Viltrakiene described Lithuania’s initiatives, including a consolidated monitoring system and collaboration between business and academia to address digital threats to elections.
6. European Digital Media Observatory (EDMO):
Giacomo Mazzone highlighted EDMO’s role in monitoring European elections and coordinating fact-checking efforts across the continent.
Governance Principles and Mechanisms:
1. Balancing Innovation and Integrity:
Rosemary Sinclair stressed the need to balance innovation with integrity and human rights protections in the digital sphere, emphasising the technical community’s role in maintaining DNS availability during elections.
2. Global Cooperation:
Lina Viltrakiene called for increased global cooperation and information sharing between democracies to address digital threats to elections.
3. Standardisation of Information Quality:
Daniel Molokele suggested the standardisation of quality information and news across regions, particularly in Africa.
4. Platform Accountability:
Lina Viltrakiene advocated for establishing clear legal responsibilities and potential penalties for digital platforms, while Sezen Yesil emphasised voluntary collaboration between platforms and authorities.
5. Information as a Public Good:
Tawfik Jelassi proposed treating information as a public good rather than a public hazard.
6. Ongoing Efforts:
William Bird stressed the importance of continuous efforts to combat misinformation outside of election periods.
Role of the Internet Governance Forum (IGF):
Rosemary Sinclair emphasised the potential of the IGF to facilitate global dialogue and cooperation on election integrity issues. She called for clarifying and strengthening the IGF’s role in addressing information integrity issues globally, developing more coordinated efforts between national, regional, and global IGFs, and potentially contributing to a global governance architecture.
Unresolved Issues and Future Directions:
1. Regulation of Influential Social Media Personalities:
The discussion highlighted the need for effective regulation of influential social media personalities and content creators, particularly in regions like Africa.
2. Addressing the Digital Divide:
Participants recognised the ongoing challenge of addressing the digital divide that limits access to reliable information in some regions.
3. Balancing Free Speech and Combating Misinformation:
The discussion touched on the complex issue of balancing free speech protections with the need to combat harmful misinformation.
4. Global Platform Accountability:
Questions remained about how to hold global platforms accountable across different national jurisdictions.
5. Standardised Definitions:
The need for developing common definitions and standards for identifying misinformation/disinformation was identified as an area for future work.
6. Internet Voting Systems:
An audience member raised concerns about the use of internet voting systems in some countries and the potential risks associated with them.
Conclusion:
The discussion underscored the complex and evolving nature of protecting election integrity in the digital age. While there was broad consensus on the challenges faced, the panelists emphasised the need for continued multi-stakeholder collaboration, enhanced digital literacy efforts, and the development of nuanced governance frameworks to address these critical issues. The role of the IGF in facilitating ongoing global dialogue and cooperation on these matters was highlighted as a key avenue for future progress. The moderator’s final remarks emphasised the importance of the multi-stakeholder process in addressing these challenges effectively.
Session Transcript
Pearse O’Donohue: Good afternoon. Welcome to this open main session on Internet Governance and Elections. We want to focus on the issues around elections, maximising the potential for trust in the democratic process and addressing any risks that exist. Already on Sunday morning, on day zero of this Internet Governance Forum here in Saudi Arabia, we had a session on misinformation and on the role of stakeholders in protecting election integrity and the right to information. This session will therefore be a discussion on the role of stakeholders in protecting information and election integrity, on the right to information, and on protecting citizen participation while mitigating the risks to electoral integrity. So for that, I would like to introduce our great panel of speakers. I would like to start by saying hello to Ms. Sezen Yesil, who is Director of Public Policy at Meta. We also have Mr. William Bird, who is from Media Monitoring Africa, and Mr. Tawfik Jelassi from UNESCO. You’re welcome, Tawfik. Then we have online Ms. Rosemary Sinclair, who is the outgoing Chief Executive of auDA. You are both welcome online and it’s great to see you; we can see you on stage here. I will move to the seat. I beg your pardon. I am so sorry, Your Excellency. This is the problem of not having paper in front of me; I’m still not adapted. So we have a representative from Zimbabwe, a Member of Parliament, the Honourable Mr. 
Daniel Molokele. So the way we’re going to proceed with this panel is that I’m going to allow each of the panel members to make a brief opening statement in relation to a question which I will now ask. They’ll have three minutes to respond, and in the good tradition of the IGF we will then immediately allow for input from you, the audience, both here and online, on those questions, before I then go back to the panellists with some more detailed questions, for which we have chosen specific subjects. That’s how we’d like to proceed, so, as I say, get ready. We would really like to encourage your participation, so that the output of this session will be something from which we can take well-informed, actionable measures, particularly in the context of the IGF, where we know there is so much that the multi-stakeholder platform it represents can do on such an important issue. So to get us going, I’m going to ask the following question to all of our panel members. With more than 65 countries going to the polls, 2024 was marked by the biggest number of simultaneous elections in history, and some have called it the year of democracy. But looking now in retrospect, at the end of the year, how do you think it has gone? What worked and what didn’t work? So perhaps I can turn to you first, please.
Sezen Yesil: Thank you so much. Hello everyone, and thanks a lot for hosting Meta on this panel. Internally at Meta we call it the year of elections too, so we knew it was coming and we prepared well. Before each election we make a risk assessment specific to that election, and this assessment informs our election integrity work at Meta. In 2024 we ran a number of election operations centers to monitor issues on our platforms continuously and to take action swiftly as needed. I can share a few observations from this year’s elections. First of all, in our actions we try to strike a balance between protecting voice and keeping people safe, and I must admit that it is one of the hardest jobs in the world. We have many policies, or rules, on what is and is not allowed on Meta platforms, and we remove content which violates those rules. Throughout this year we decided to update some of our policies. For example, we updated our penalty system per the feedback of the Oversight Board, to treat people more fairly and to give them more free expression. Secondly, we updated our policy on violence: people of course have every right to speculate on election-related corruption, but when such content is combined with a signal of violence, we remove it. I can say that those updates worked very well during this year’s elections. The second observation is about prevention of foreign interference. In this year alone we removed about 20 CIB networks, that is, coordinated inauthentic behaviour networks. Those networks consist of hundreds of Facebook and Instagram accounts and pages, and they work to mislead people and to spread disinformation, unfortunately. We observed that some of the networks we disrupted moved to other platforms with fewer safeguards than ours. The last observation is about the impact of artificial intelligence. 
So at the beginning of this year, many people were very concerned about the potential negative impact of generative AI content on elections, such as deepfakes or AI-generated disinformation campaigns. To address these risks we took a lot of technical measures, and we also signed an AI elections accord with other major tech companies to cooperate in combating threats coming from the use of AI in elections. In the end, we observed that the risks did not materialise in a significant way; the impact was modest and very limited in scope. For example, less than 1% of the fact-checked misinformation was AI-generated.
Pearse O’Donohue: Time now, thank you. Sorry, Sezen, but you’re the first to suffer from the fact that we will hopefully have a good discussion, so I’ll keep the speaking time short. Now I’ll go to the other end of our list of speakers here, to Tawfik Jelassi, Director from UNESCO. We’ll be very happy to hear your views on that question: really, how did the year go, and what worked and what didn’t work?
Tawfik Jelassi: Thank you very much, Mr. Chair. So you reminded us that this is the super election year, with some 75 elections being held, involving half of the population of the world, and obviously this is a major test for democratic systems around the globe. What has worked well, to answer your question: I think there were some global efforts to protect election integrity from a process point of view. The second thing that worked well is the involvement of the youth and first-time voters in elections around the world, especially in countries where half of the population, sometimes even 60% of the population, is under the age of 25. We saw this major engagement, and that’s good. What has not worked well is the exponential spread of disinformation and hate speech derailing the integrity of electoral processes, and perhaps casting doubt on election outcomes and democratic institutions. Another thing that did not work well, which is a major challenge, is the safety of journalists covering elections. Many attacks happened against them, and there is a relatively high impunity rate for violence or crimes committed against journalists. The third thing that did not work well is the huge digital inequality that still exists, especially for marginalised groups, including women and persons with disabilities, who face major barriers to participating in public spaces. That’s why we need to change the path forward. I think we need stronger regulatory frameworks to address harmful online content while protecting freedom of speech; when I say regulation, I’m not referring to censorship, and that’s why I say while safeguarding free speech online. 
Second, we perhaps need to expand media and information literacy in the digital age, especially among youngsters and citizens. Finally, I would say that UNESCO is contributing to this global effort on media and information literacy in the digital age, but also through the UNESCO guidelines for the governance of digital platforms, published a year ago.
Pearse O’Donohue: Thank you very much. Some of those subjects that you’ve raised we will come back to in our detailed questions, but it’s a very clear view as to the main points that we must address, including, of course, intimidation and violence against journalists, and the digital gaps which themselves have an impact on the derailment of these elections. So, thank you. If I could now ask the same question to the first of our online panellists, Ms Elizabeth Orembo, who is a researcher in International Stakeholder Relations at ICT Africa. I’d like to hear your views on that same question about how things went and what worked and what didn’t work. Please, Elizabeth.
Elizabeth Orembo: Thank you for the floor, and thank you for inviting me to this very important discussion. In my reflection, I would say that there are things that went well and things that didn’t go well, as far as I’m concerned. And you might hear some chicken sounds behind me. So one thing that did go well is that stakeholders, even locally, even in Africa (because I work in the context of Africa), knew that this was coming. With the rapid changes of technologies, they were aware that they needed to come together and tackle some of these risks, and some of those risks were tackled. But the challenges of the free flow of information itself remained a problem, and with that I also talk about data. When the free flow of information is not there, with challenges of policy, of infrastructure, and also of media, then people don’t access information the same way, and it breeds a very fertile ground for misinformation and inequality. There is also not that culture of data sharing, especially in the context of elections, and this brings an unevenness of access to information itself, and also misinformation. That problem continued. It also meant that trust in election management bodies went down, because people are yearning for truthful information, and at the same time they’re getting mixed information. At the same time, the media is not equipped, as it is also a struggling industry, to get important information to people. So that breeds another fertile ground for misinformation. So data and information flow, I would say, was a major problem to me. And as much as the stakeholders came together to tackle misinformation, there was also a bit of a challenge in bringing all stakeholders together. 
Because with data becoming more available, we also need more capacity to crunch data to get it to people. And those capacities were different as well, and sometimes challenging. So there was sometimes data availability, but challenges in making use of that data. Another one persistent challenge is, and especially us in the global south, reaching the tech companies. And with that, we also experienced regulatory challenges when it comes to crisis during election that can sometimes lead to internet shutdown. I will stop there for fear of being time-limited.
Pearse O’Donohue: Well, thank you. A very interesting perspective, not least that last point with regard to the particular issues of the Global South. Hopefully we can come back to some of those questions as well. But now if I could turn to the next of our speakers here, William Bird from Media Monitoring Africa. Please, William. Thank you.
William Bird: It’s been a big year but I want to just ask if people genuinely feel better about democracy having had 65, 70, 75 elections. Because the sense that I get from speaking to people is that despite it being, it should be a year of celebrating democracy, we don’t feel good about democracy and I think that speaks to some fundamental changes. The first is the rise of fascism and this is a very real problem for us in terms of the fact that I think it’s deepening polarization. It’s framing people that believe and support human rights as left-wing extremists, just because you are talking about fundamental equality and dignity for all. And there’s something that’s happened I think that we also need to accept as a point of departure about power structures. We’re no longer in a place where you can have power determined and messaging and narratives framed by one or a few central entities. There’s now this wonderful possibility that almost anyone can have a view and then as much as that’s a good thing, we mustn’t throw away, throw the baby out with the bathwater as the expression goes, right? Because we do need to make sure that there’s certain things that are common that we can at least agree on. So I think in terms of things that worked well, I was thinking about it last night and I came up with MECA, which stands for Media, it seems appropriate, Media, Electoral Management Bodies, Civil Society, Collaboration and Adaptability. Some colleagues have touched on that sense of adaptability of organizations of entities adapting to the emerging challenges. I think for media we saw them facing huge problems across the continent, particularly in Southern Africa, but we also developed some mechanisms to start to assess how they perform and how they contribute. 
Electoral Management Bodies in countries where there were big shifts, like in South Africa and in Botswana for example, of political power, we saw that where you’ve got a stronger, more credible Electoral Management Body, they’re able to still contribute and function despite being subjected to significant attacks. Civil Society I think worked really well certainly in our experience in South Africa. They came up with research projects, they worked with universities. There was a reporting mechanism, Real 411, that’s a public complaints platform, and they worked together, which is the next point, collaboration, can I finish, collaboration, which is that we worked with the social media platforms, Google, Meta, and TikTok, and the electoral management body, and that did something really positive.
Pearse O’Donohue: Okay, thank you, William, for a new acronym, but at least a way of analysing the different issues. We will come back to that also. And now, I’m certainly not going to forget him this time, our next speaker is the Honourable Daniel Molokele, who is a Member of Parliament from Zimbabwe, please.
Daniel Molokele: Thank you so much. I will speak more from the African point of view. 2024 was also a very big election year for Africa. I would say, as we end the year as a continent, we are generally happy with the election processes across Africa. We had largely peaceful and successful elections in countries such as South Africa, Madagascar, Botswana, and very recently Ghana. We also managed to benefit from innovation around media and technologies, especially in harnessing the youth population into elections. Generally, young people in Africa are very averse to elections; there is apathy. But I think this year we saw a higher participation of young people as voters. We still need to see more young people as candidates or as elected representatives. We also saw the use of social media in a much more progressive way to mobilise people to voter registration and, more importantly, to turn out as voters, including platforms such as TikTok, WhatsApp, Facebook, and X. So Africa is harnessing media technologies to improve access to elections by average citizens. But we also end the year on a very difficult note in countries such as Mozambique, where there is no peace at the moment. The post-electoral violence continues to escalate with no solution in sight. Last time I checked, over 100 civilians had died, mostly at the hands of security officials like the police and army in Mozambique. The election remains disputed and we need a solution to that. Interestingly enough, there has been a huge use of media technology, or innovative approaches to the use of media: the opposition leader is actually not in Mozambique at the moment, but he is able to provide leadership in Mozambique every day, and people are using access to media technologies to respond. It can be a bad thing, it can also be a good thing, but that is the situation at the moment in Mozambique. Thank you.
Pearse O’Donohue: Thank you very much. So between what Daniel Molokele has said and William before him, we are faced with a number of issues where we need to consider the role of international online data and communications: issues such as the rise of extremists as a result of the elections, which William mentioned, or, in the case of Daniel, actual violence as part of the elections, leading terribly even to the deaths of citizens, and to what extent digital or online information and platforms, misinformation aside, are contributing to those serious issues. So the next speaker is here with us: Ms. Lina Viltrakienė from the Lithuanian government. Please.
Lina Viltrakiene: Thank you very much and good afternoon everybody. Indeed, I would like to say that Lithuanians significantly contributed to this year of democracy by participating in three elections this year: we had presidential elections, elections to the European Parliament, and the national parliamentary elections. So from the government perspective it was a challenge, and a lot of governmental institutions, including Lithuania’s Central Electoral Commission and a number of other institutions, worked hard and consolidated their efforts to make these elections go smoothly and make them reliable. Particular attention was paid to ensuring that only legal sources of funding are used for electoral campaigns, that transparency is maintained with regard to the real expenditure of political parties and individuals on media, that effective communications channels with the media are maintained, and that appropriate channels to detect disinformation and a comprehensive system to mitigate risks are established, to mention just a few. Some of these and other important requirements are covered by Lithuanian and European legal acts, like the election code, the criminal code, the political party law, and the law on provision of information to the public, to mention some of them. Thus, a solid legal environment is the first thing I would like to mention in the list of what worked. Another action which I would include in the same list is the established collaboration of responsible state institutions with the media, including with social platforms, which no doubt enlarged the public space and reinvigorated public debate during the election campaign. But on the other hand, all around the world we faced unprecedented scales of lies and disinformation, and deepfake statements of top politicians appearing especially on social platforms. 
And this increased the threat of influencing people’s choices, seeding distrust in society, and eroding trust in democratic institutions. You may know that in the EU, the Romanian and Bulgarian elections experienced significant interference by foreign actors via social media platforms, especially TikTok and Telegram. This shows us that we need to work further on continuous collaboration of platforms with state institutions, and regulatory frameworks perhaps should be improved; as a model, I would like to refer to the EU’s Digital Services Act, which could really encourage that thinking. Thank you.
Pearse O’Donohue: Thank you very much, Lina. And if I could just add, working for the European Commission, we had also put in place a number of measures for monitoring the health of the European Parliament elections. We are still doing that assessment, but it is clear that some problems were avoided. But you did mention, among a number of other problems, and for the first time, the appearance of deepfakes, which can be very influential and turn people against an individual, a tendency or a party, and be very damaging even if they are very quickly identified as fake, because sometimes the initial damage is done. Thank you. So now our last online speaker, and thank you very much for your patience, is Ms. Rosemary Sinclair, who is the Chief Executive Officer of auDA, the administrator of Australia’s .au domain. Rosemary, the floor is yours.
Rosemary Sinclair: Thank you, Pearse, and many thanks for the opportunity to bring a technical community perspective to the panel. I’d like to start with just a technical reminder, really, about the internet. It is, of course, a network of networks, 70,000 in total. It operates on open standards and common protocols to enable global interoperability. It is made useful by the unique identifiers, the names and numbers, which are coordinated by ICANN, which is itself an independent technical community that uses a multi-stakeholder approach. So I’m part of that technical community, and I’m responsible for auDA, which is the small company that administers .au, the country code for Australia. We focus on technical operations and performance and our domain name licensing rules, and we are very strong supporters of the multi-stakeholder model of internet governance. When I think about 2024 and what worked and what didn’t work in that year of so many elections, the first point I want to make is that technically the internet worked. In Australia, we delivered 100% availability to users during the year: every time a user wanted to access the domain name system, they could. Why was that important? Because the internet worked to share information, to provide communication and commerce, and of course to grow economies and standards of living. But there are a number of harms, and many of those have been mentioned just now: misinformation and disinformation, fraud and others, and they are key challenges, particularly in such an election year. So the harms, of course, need policy work, and that is what we are here to talk about. The tensions, as we see them, are between open information, secure identity and privacy for individuals, and the question really is how to balance those things. So practically speaking, during elections we sometimes see at auDA increased requests from people to take down the websites of their political opponents. 
And those requests are often made with claims of misinformation or disinformation. Those claims must be assessed by others who are authorised by law and skilled to make those judgements. Our response can only be based on our .au licensing rules and not on the political nature of the content or the requester. We’ve not yet seen the impact of AI on elections in Australia, but we’re expecting to have a national election next year, and we think that AI will be something that we need to watch during that process. So the policy work that we all have to contribute to is really a work in progress, and we see the Internet Governance Forum as the place for those discussions to take place across all the different perspectives, including our own technical perspective. Thank you.
Pearse O’Donohue: Thank you, Rosemary. And indeed, thank you for giving us the views of the technical community, and in particular referring to ICANN, but also the importance of the DNS in relation to the issues that we’re talking about, and again, of course, the need for independent verification and moderation with regard to any attempt to take down websites. It’s a two-edged sword. So thank you. So now, thank you to all of the panellists for that first round, and as I said, we are now going to see if anybody from the audience here in the conference room or, for that matter, online, would wish to make any inputs. I will ask that they are short, and to do so in time-honoured fashion, if that is the case, you need to come up to the front and use one of the microphones. So if anyone wants to do so, could you please identify yourself and the organisation you represent and please keep your input very short, two minutes as a very maximum. Thank you.
Giacomo Mazzone: Thank you very much. Giacomo Mazzone. I am a member of EDMO, the European Digital Media Observatory, which you know very well. I’m here reporting what we discussed in the workshop on day zero, organised by EDMO, about the task force that worked on safeguarding the integrity of the European election last year, compared with what happened in the US election and the South African election. The contribution that we can give you is that the assessment of what happened during the European election was very good, because there was a successful example of cooperation with the platforms, made in a multi-stakeholder way, in the sense that in a unique place, that is EDMO, you have academia, you have fact-checkers, you have institutions working together. Through the code of practice that the European Commission signed with a certain number of platforms, disinformation is brought to the attention of the platforms, and the platforms immediately react and behave. So we have been successful in removing things without enforcement, based on goodwill and cooperation. Unfortunately, what was reported by our U.S. friends was not exactly the same. They said that the level of cooperation in the U.S. was not the same, and also that they lived through a very worrying experience, and this is important for our UNESCO people here, of pressure and intimidation against fact-checkers, trying to silence them and push them out of the public discourse. And in South Africa…
Pearse O’Donohue: Sorry, I’m going to have to ask you just to wrap up, please.
Giacomo Mazzone: Yes, the last point, to be complete, is about South Africa’s experience. They reported that any intervention by legislation is seen as censorship, which shows that contexts differ: you need to find a different way to act in different cultural contexts, according to the situation. Thank you very much.
Pearse O’Donohue: Thank you, and thank you for those insights, and indeed as well the very useful workshop that took place on Sunday. We have another speaker, please. Again, your name and organization. Thank you.
Audience: Hello, Alexander Savnin, Primorsky University from Russia. I would like to point out that, beyond the spread of misinformation, the Internet may already be used by some governments for voting itself. In Russia this year there were two sets of elections, one of which was actually the presidential election for Mr. Putin, and systems implementing Internet voting were used in these elections. And without the possibility of multi-stakeholder discussion of the implementation of this system, without the possibility to verify its trustworthiness, this system actually undermines the results of the elections as a whole. Unfortunately, the implementation of these systems and the results of the elections are not very well observed or seen by the global community, but this brings another dimension to undermining trust and increasing the risk to fair elections. Thank you very much.
Pearse O’Donohue: Thank you, indeed. I’m just looking to see whether we have any online inputs, anybody who’d like to take the floor or make a comment. And that gives me the chance to hand the spotlight to Bruna, who has done all the organization for this session.
Bruna Santos: I would just echo a comment from Mokabedi, so I will read it out loud. “Hi everyone. I’m Mokabedi from the Iranian academic community. Some cross-border digital platforms refuse to cooperate with the competent authorities of independent countries in immediately dealing with disinformation that meaningfully affects election results and harms public trust during elections, citing reasons and excuses including political reasons and sanctions. They even refuse to establish legal representation. My question to the panel is: what can be the legal and political solutions to this challenge and the double standards of digital platforms? Should maintaining the health and safety of online elections in different countries have a different degree of importance?” That’s the one we have here. Thanks.
Pearse O’Donohue: Thank you, Bruna. And I will ask the panelists if there’s anything from what we’ve heard so far, particularly that last question, if you want to incorporate that in the responses when we come back to you for a discussion. Now we have a final participant from the floor, please. Thank you.
Audience: Thank you very much. My name is Maha Abdel Nasser. I’m from the Egyptian parliament. Actually, the problem is not just during elections, but it gets worse during elections. We find those, what they call them, the electronic flies and so on; they attack anything we post, they put out a lot of disinformation, and they try to bring us down by all means, whether those people are with the regime, with opponents, or with anyone else. And even when we report, it takes a very long time for any action to be taken, if it is taken at all. So my question is, is there a possibility to have a platform, or anything between all those parties, to report such attacks or such harassment, especially against politicians, and women politicians of course, so that action can be taken in a rapid way and we can get rid of these things, or not? Thank you.
Pearse O’Donohue: Thank you. Again, I hope that that question can be addressed. I will allow myself just to give a very brief, partial answer: in the European Union, particularly now with the introduction of the Digital Services Act, we do have a requirement for the individual very large online platforms to provide a facility for reporting such activities, as well as centralized databases monitoring these issues. And by the way, verbal and online violence against women, and particularly female politicians, is something that we are particularly concerned about, as it is insidious and has long-term effects, as well of course as the effects on the individual. So these are issues which we must address. In the case of the European Union, we see the ability to report such incidents, and hopefully to see quick action, as a necessity. But I’m sure that there are other experiences from around the world, and we’re always willing to learn. So for that, thank you for your participation. We will have another, slightly longer section at the end, and I hope that we have more participation here in the room and online, but we’re going to move on now to the second set of questions. Here we’ve broken them down between our expert panellists, and I’m going to start with William Bird and Liz Orembo. You’ve got the hardest job, because I’m going to ask both of you two questions and give you five minutes each to answer. We’ve put them on screen, and I hope that you can see them. For William: what evidence has come to light of information integrity being weakened through human rights or tech harms, and how should the weakening of election integrity through these and other risks be identified? And then Liz, when we come to you, the question I’d like to ask you is: what are the implications or consequences of such risks to information integrity in elections? But we’ll come back to you, Liz, in a moment.
First of all, I’d like to hear William on the first question and you have five minutes, please. Thank you.
William Bird: So I love the point from one of the other speakers that a lot of these things occur outside of elections. What we see is these things occurring at a heightened level in an election period, but attacks against women online, for example, don’t stop just because it’s not an election period. So I think there are three areas where we saw information integrity being weakened in South Africa specifically. Firstly, attacks against the electoral management body. These were multi-pronged and straight out of a disinformation playbook: they targeted the entity and its decisions, they spread rumors and mis- and disinformation, then they targeted individuals within it, and then they laced these various campaigns with pseudo-legal challenges. And then they relied on a willing platform partner to scale the dirty work. In that instance, most of this we saw in South Africa on one platform, X, which, unsurprisingly, was not part of our collaboration. The second issue is attacks against journalists, human rights defenders and those bodies. As an example, on X, over a two-week period, we saw over a thousand attacks against journalists, most of them actually against one journalist in particular. So clearly organized network behavior, including issues linked to incitement.
William Bird: And then thirdly, the bigger impact of the decimation of the media, as we’ve seen them being systematically undermined as trusted institutions. That feeds into that idea of media polarization, that sense of people not knowing what’s actually going on, and then being unable to actually operate. So how should these harms be identified? You spoke about what’s happening in the EU. In South Africa, we’ve got a platform, Mars, through which people can report attacks against journalists so that there’s a public archive of them, and we’ve also got the same thing for other online harms: mis- and disinformation, hate speech, and threats. And that’s also, again, a public platform that operates independently of the state, so that the public begin to have faith in it. And critically, it applies the same standard, because what we found problematic is that what’s okay on one platform isn’t okay on another. And so that leaves the public thinking, well, what do I do here? If I want to report on X, nothing happens. If I report on Meta, it’s this process. If I report on this platform, it’s another whole process. So we’ve got a system that allows people to report content on any platform, and then action can be taken.
Pearse O’Donohue: Thank you very much. And of course, consistency in application, and the individual’s confidence that whatever the platform, they will have the ability to seek redress or at least to have the issue examined, is very important. Thank you. Now, turning to you, Liz, just to repeat, the question is: what are the implications or consequences of these risks to information integrity in elections, including the risks to civil and political rights, or interference by foreign actors, and so on? Please.
Elizabeth Orembo: Thank you. Well, I’d begin by first looking at the media environment. Much of the information in the media now comes from the online environment, and vice versa. And when there’s no information integrity on online platforms, it means that the media has to respond to a lot of it in the public interest, identifying the information out there that is misleading and demystifying some of this misinformation for the public. There are also competing narratives: with so much information coming online, the media has to go through all of it and spotlight what the public should focus on, because people can get overwhelmed by information coming from different sources. But then again, we see the capacity problem of the media, because of the shift of advertising revenue to online spaces. So the media is also challenged there. What this means for human rights and civic rights is that people don’t vote from an informed position, because they miss a lot of information that can be detrimental, or useful for making the right choice of candidate. It also impacts development issues, and development is a right that enables people to enjoy first-generation rights like freedom of expression, so that’s a problem there. The other one is incitement. Of course, when there’s no information integrity, there’s a lot of polarization happening online and offline, which also contributes to marginalization. People who are already marginalized, and you mentioned women and girls: women who have been active change-makers at the grassroots level, when they try getting into the spaces of governance, face a lot of violence online and offline, and this really discourages them from pursuing government or electoral office. That means that we are widening the inequalities there.
I would also like to point out that the African continent faces very different challenges and very different contexts. We are at different levels of development and different levels of democratic progress, and that means that the policies set by the big platforms cannot just be applied in a blanket fashion, because some will not apply in some countries, given their particular tech development context and democratic context. Sometimes we see that there’s not much investment, or tech companies get overwhelmed when asked to give special attention to particular contexts. This year, what we’ve seen, especially with Mozambique as the situation continues, is not really that tech platforms are not engaging there, but that there is no structure of engagement for responding faster to the situation on the ground. Those are the challenges that we are seeing in most African countries: even when there’s attention, there’s not that specialized attention on the ground, because most of these tech companies are not domiciled there. The other thing is that when we talk about information integrity and trust in electoral management bodies, sometimes the focus is on electoral management bodies maintaining their reputation. But for them to earn trust from the public, there also needs to be an environment with proactive information coming from the election management bodies themselves, especially about how they manage elections. Now, because of different levels of media access in Africa, either connectivity is uneven or access to media, even traditional media, is uneven. That means that even when they try to communicate on whatever platform, it doesn’t really reach people. That unevenness in information access also creates fertile ground for misinformation. Like I said, this also touches on what William Bird mentioned. On this, I’d also like to touch on what we try to do at RIA.
Pearse O’Donohue: Just as quick as you can, please. Thank you.
Elizabeth Orembo: Yes. We are working on Mozambique, Ghana, and Tanzania, which is having elections next year. Our focus is on media coalitions and also on access to data for research. Another thing that we are seeing right now are the dilemmas around data sharing and data sovereignty, and whether to host elections data in the country or outside the country. I think I will stop there.
Pearse O’Donohue: Okay, I’m sorry that I had to interrupt you, but that was a very interesting analysis, and you have identified quite a number of issues that need to be addressed, describing the consequences in some detail and obviously some lived experiences of what happens. With that in mind, we’re going to move on to the next set of panellists. This time the format is slightly different. We have one question, and I’m going to ask that question of three panellists, and hopefully you can feed off one another. So I will start with Daniel Molokele. And the question is: what initiatives have successfully responded to challenges posed to information integrity in elections? And how is this success measured? And are such initiatives specific to a given time or place, or could they be used more widely around the world? Mr. Molokele, please.
Daniel Molokele: Thank you so much. Yeah. There are several initiatives, most of them just starting. But I wanted to highlight a continental one which occurred in September, when we met in Senegal as Africans at the Forum on Internet Freedom in Africa. One of the key pillars of this conference, with hundreds of delegates from across the continent, was access to information from the perspective of elections, especially knowing that in some instances in Africa we have seen governments using strategies such as internet shutdowns, where they create a complete blackout during the campaign period to force an advantage over the opposition. We’ve also seen instances where social media platforms like WhatsApp are restricted in their operation to make it difficult for people to access information. There is also the over-reliance on state media at the expense of independent media, and the shutting down of alternative media platforms, especially media houses that are seen to be sympathetic to the opposition. So we have started an annual meeting in which we will be able to get presentations, research and assessments of electoral processes and access to information. And also related to that, there is a parallel process around challenging policy frameworks and legislative frameworks that make it harder for people to access information, especially civil society, political parties that are not the ruling party, and journalists who are covering elections. Access to information laws in Africa are there, but some of them are designed in such a way that they create a more bureaucratic process. Ostensibly, they are supposed to increase access to information, but at the same time they make it harder for someone to access information. We also have such laws in Zimbabwe, where I come from, like the Official Secrets Act.
The Official Secrets Act can also be used to make it difficult to access specific information if its release doesn’t create an advantage for the ruling party. So there is a lot happening, and we are seeing not just civil society coming into the space, but also research coming from universities, from schools that teach journalism and media studies, and that also helps us to have a more robust view of access to information and electoral integrity. Some of the ideas that are coming out are mostly unique to Africa, because Africa is also in a situation where there is a great digital divide with the rest of the world. The majority of people in Africa have no easy access to the internet and no easy access to mainstream media, so at the end of the day they are subjected to misinformation and disinformation, and a lot of state-funded propaganda. And at the end of the day, it’s such a huge disadvantage that it makes it difficult for election systems to be free and fair, because without being properly informed, you cannot make informed choices as a voter, and in most instances that favours the ruling elite on the continent. Thank you so much.
Pearse O’Donohue: Thank you. So, in suggesting some of the solutions, you’ve also identified one or two further problems that need to be addressed, some arising from your experience. So now I’d like to ask the same question to Lina. I will read it out very briefly, or abridge it: what initiatives have successfully responded to the challenges, and are such initiatives specific to a given time or place, or could they be used more widely? Lina, please.
Lina Viltrakiene: Well, thank you very much. Indeed, measuring the impact of counter-disinformation initiatives is a really challenging task, but I would gladly share with you several good practices which we developed in Lithuania and which could be replicated worldwide. I will refer to three of them. First, in Lithuania we created a truly consolidated system for monitoring and neutralizing disinformation. We take a comprehensive, whole-of-society approach to monitoring, analyzing and countering disinformation, involving not only state institutions but the whole vibrant ecosystem of non-governmental organizations, media and business, which really helps to create societal resilience and also trust. In this context I would like to particularly stress the importance of NGOs in analyzing and countering disinformation, and particularly in promoting digital and media literacy, including among journalists writing for audiences of national minorities, and in developing learning programs and different tools for vulnerable groups. We have the NGO Civil Resilience Initiative, which has worked a lot on that. We have an important non-governmental organization, debunk.org, which researches disinformation and runs educational media literacy campaigns. So indeed, developing media literacy and critical thinking is key to resilience against foreign information manipulation and interference. Another important element I would like to mention is the collaboration between business and academia to develop technical solutions. In Lithuania, we have a lot of such collaboration, and we have technologies such as AI-driven tools that can detect manipulated media, bots, and coordinated inauthentic behaviour, and here the collaboration between science, academia and business is really, really important.
We also have collaborations around reporting platforms and so on, so in Lithuania we really have a lot of people, a lot of society members, participating in countering this disinformation. We have a very nice initiative, the Lithuanian Elves Initiative, where many volunteers participate in countering disinformation, and this really works very well. The second practice which I wanted to share with you, and which is very much related to the first one, is a cross-sectoral approach to foreign information manipulation and disinformation, with really close cooperation at the national level. For this reason, we have a team of experts working under the framework of the National Crisis Management Centre, which indeed helps with quick detection of and rapid response to disinformation and information incidents that could have a big influence. This National Crisis Management Centre coordinates strategic communications and also provides guidelines for possible responses to different information incidents. And our experts from this centre are really willing to share, and are sharing, their experience of this effectively functioning cross-institutional framework with other countries. And finally, that brings me to my third point: sharing experiences among democratic states is really, really important. One such initiative we have in Lithuania is the Information Integrity Hub, operated by Lithuania and the OECD, which provides training for officials worldwide. This is a training program offering OECD and non-OECD public officials the opportunity to learn from peers and strengthen their capacities to detect, suppress and prevent foreign influence and disinformation.
And indeed, that is very effective: when experts gather together and share the cases of disinformation they face, that could also form a kind of inventory of bad practices, which would then be easier to recognize when experts are discussing and sharing them together. Thank you.
Pearse O’Donohue: Thank you very much, Lina. So, now, the same question to Sezen Yesil. You’ve been waiting a long time since you last spoke, so again, what initiatives have successfully responded, and can they be used elsewhere?
Sezen Yesil: Thank you so much. I hope that my answer will also address the questions from the audience, the one from the online participant and the one from my sister from Egypt. I know that women politicians are especially vulnerable, unfortunately, and we have special protections in place; if she kindly stays after this session and meets me, I would like to explain in more detail. But I can say that we, as Meta, have a very well-established playbook on election integrity, and we keep improving it according to the lessons learned after major elections. Our measures are globally applicable, but we make a risk assessment for each election, specific to that country, and adjust our measures if needed. So the online participant said that we don’t have local representation, et cetera; that doesn’t matter, because all our measures are globally applicable. We have about 40,000 employees working on safety and security, and we have invested more than $20 billion in this area since 2016. There are five pillars in our election integrity work. The first is that we do not allow fake accounts. Our automatic detection tools block billions of accounts, often within a few minutes of creation. Second, we disrupt bad actors. We have taken down more than 200 coordinated inauthentic behavior networks since 2017, and, as you know, those networks are used to mislead people, especially during election times. We work in collaboration with law enforcement and security agencies, and with academia, researchers, and so on, to identify those actors. Third, we fight misinformation. It is a really tough issue, because nobody agrees on the definition of misinformation. For example, let’s say a politician says they have the best economy in the world. What if the indicators do not agree with him? Are we going to remove that content and label it as misinformation? That would not be appropriate. So we have a three-part strategy: remove, reduce, and inform.
Under remove, we do not allow misrepresentation of voting dates, voting locations and times. We do not allow misrepresentation of who can vote, who can participate in elections, what documents are required, et cetera. Under reduce, we work with more than 90 third-party fact-checkers around the world, covering 60 languages, to identify and rate viral misinformation. When content is rated, it is not recommended in our systems and its distribution is reduced. And under inform, we put labels like “false information” on content rated by the third-party fact-checkers, and we provide more context to users if they want more information on why it was rated as misinformation. Under the fourth pillar, we increase transparency. Especially for political ads, we have an obligatory authorization process. Advertisers, political parties for example, have to prove who they are and where they are located. They can only target audiences in the country where they are based. And we put a paid-for-by disclaimer on the ad, so that people can understand who is funding that political advertisement, to give more transparency. Also, political ads are kept in our ad library for seven years. Researchers use it a lot; it is publicly available and free, and you can see information like the amount spent on ads, who is funding them, et cetera. If an ad is created with AI tools, the advertisers have to disclose it to us. They have to say it. And we put a label on the content, like “digitally created”, so that people understand it is a photorealistic video or photo or something. And the fifth and last pillar is about partnerships. We work with local trusted partners to receive timely insights on the ground. So, okay, final comments. User education is also very important. We do campaigns with third-party fact-checkers and academia to raise awareness on how to fight disinformation and misinformation. Thanks so much.
Pearse O’Donohue: Thank you very much for that. So we heard, particularly in the answers from Daniel and from Lina, references to civil society, to NGOs, to the stakeholders and the multi-stakeholder process as having an important role with regard to what successful responses could look like and how we learn to share initiatives across countries and regions. That will be one element of the next question, which I’m going to pose to our final two panellists. Again, thank you for your patience. And that question, again on screen, is: what are the governance principles, tools and mechanisms that could be applied to help protect the integrity of electoral processes and information in the digital age, while upholding human rights and democratic principles? And then, are there specific roles for particular stakeholders that need to be highlighted? So I’m going to put that question, first of all, to Tawfik Jelassi, please.
Tawfik Jelassi: Thank you very much, Mr. Moderator. I think we all agree that ensuring that information is trustworthy and accurate is a very critical challenge today, maybe more than ever before, especially during elections. And here I would like to quote Maria Ressa, the 2021 Nobel Peace Prize winner, who said: without facts, there is no truth. Without truth, there is no trust. And without trust, there is no shared reality. I think this is a very powerful quote that reminds us that fact-checked information is the basis not only for democracy, but for society and for communities to live together. So it’s a major challenge. Then a second and final quote, this time from the journalist Carl Bernstein, who said: what we do as real journalists is to give our readers the best obtainable version of the truth. It’s a simple concept, but it’s very difficult to achieve, and especially elusive in the age of social media. We know the power of digital influencers, who today can have 50-plus million followers each. Our recent study shows that more than half of the content they post online is not fact-checked, is not verified. This is a new challenge that we need to deal with. So the dilemma is there, and the pursuit of truth is especially challenging in this digital age, where false information spreads rapidly, far faster than objective information. A recent MIT study shows that false information travels ten times faster than fact-checked information. So it’s a real challenge, and as I said, this is at the heart of preserving democratic processes. So the question is, what can we do about this? And here, let me say that at UNESCO we are deeply committed to advancing our mission of protecting the integrity of information.
And here, I must say that we are honored at UNESCO to have been asked last month by the G20 Summit to become the secretariat for a global initiative on information integrity, and to administer the global fund allocated to it by the G20, the 20 most important economies of the world. So I think information integrity is at the heart of what we are discussing, especially also when it comes to climate change: how can we combat climate disinformation when we try to resolve the environmental crisis? So this is part of our mission. Now the next question is how we go about it, and our approach has all along been anchored in international human rights standards. We developed the guidelines I mentioned a few minutes ago, the guidelines for the governance of digital platforms, again based on human rights, but also promoting transparency, accountability, and inclusivity. I would also mention our work to protect women journalists from online harassment: one third of them have had to quit because of online harassment and, as I said, sometimes physical violence as well. So this is what we have been doing to protect women journalists. You didn’t ask about this; you asked about women politicians, and our panelist has addressed that. One final note, maybe, is that we believe that true empowerment starts with education. Education is at the heart of the matter, and some of the panelists mentioned media and information literacy in the digital age. Literacy, again, in reference to education. Our program on that is a cornerstone of our strategy. We want not only to have guidelines for digital platforms and for regulatory authorities, which address the supply side of information, but we also have to work on the demand side of information and its usage. Our aim, through our educational program, is to make the users of digital platforms media and information literate, by developing a critical mindset so they can hopefully distinguish between fact-checked, objective information and falsehood.
This is something that we believe is very important. We want them to ask a few questions: Who created this information I came across online? Why was it shared? And what evidence supports it? Because otherwise, the users of online information become themselves amplifiers of misinformation; they like and share that information. And finally, to say it’s a collective effort. I mentioned what UNESCO is trying to do, but of course, it’s a collective effort. We need governments to create policies that protect human rights, safeguard freedom of expression, and put in place the right regulation, maybe, for digital platforms. We want tech companies to adhere to full transparency and accountability and to proper content moderation and curation, and educators and civil society to empower citizens, in the way I mentioned, to discern facts from fiction. Let me just conclude, because I think my time is up, by saying not only that we at UNESCO remain steadfast in our commitment to this cause, but that we believe that together we can build a digital age that does not divide, but unites; that does not harm, but heals; and that does not undermine democracy, but strengthens it.
Pearse O’Donohue: Thank you very much. A lot to think about there. So finally, waiting patiently, we would like to hear from Rosemary Sinclair on her views on this same question. Please, Rosemary, the floor is yours.
Rosemary Sinclair: Thanks, Pearse, and it’s a very big question, as I know you know, so just a few thoughts from me. We’ve been focusing in this panel session on elections, misinformation, and disinformation, but I think we’re really talking more broadly about information, and that means we’re really talking about trust and confidence in an online world. And we’re having this discussion right at the point where we have the possibility to secure amazing innovation, which can benefit individual people, their communities, and their economies. So this is a conversation really worth having. For a long time, we’ve been focused on practical connectivity, and there’s a way to go, I know, particularly in the Global South. More recently, we’ve started to think about cultural connectivity, so efforts focused on digital inclusion through language. But I really want to stress that our focus must be on building, or in some cases, rebuilding confidence online. In Australia, we at .au do annual research into the digital lives of Australians. And for the first time this year, that research told us that Australians are starting to think about doing less online because of the harms that they are experiencing. Right at a time where for productivity, efficiency, and innovation reasons, our policy makers and others are wanting them to do more online. So I think we’ve got to get back to a point where technology is seen as a tool and not as something that is somehow beyond policy. And when we’re thinking about policy, we’ve got to balance innovation and integrity. And I think we need some very big thinking, and we’ve done some of that at .au. We forced ourselves to do it using a scenario process. And if the scenarios are of interest to anybody, they’re available for free use on our website. But there are two scenarios in there that are pertinent to this discussion. And I’m going to summarize them in about six words. One of them says, government is in charge of information.
And the other of them says, private sector is in charge of information. And when we dug into those scenarios, what we found were really some shared issues about the rights of individuals to privacy and to choice, the importance of integrity and impartiality around information. There’s a whole set of issues around the importance of the security of people’s identity. We explored the role of the internet, open, free, secure, and globally interoperable. And we really thought about integrity and the assurance processes that would need to be put in place to assure people of integrity. So in answer to the question, we need governance principles, tools, and mechanisms in all of those areas. Getting back to our topic today, which is elections, I wanted to make the point that really democracy now is a team sport. And more than that, it’s actually a global team sport. And who we need on the playing field with the voters and the politicians, we need civil society, we need the technical community, we need the private sector, media, technology companies, the platforms, we need government, public service officials, we need the combination of judiciary and regulators to actually implement and enforce policies, laws, regulations, the people who are accountable for election oversight and the like. In addition, I want to be bold enough to suggest that we might need some philosophers on the playing field as well, to think about the limits of markets as Michael Sandel has done, to think about big questions around values and ethics and culture. My final point, in fact, I’ve got two final points, but the first one is I’m finding it very interesting that organizations that have usually been concentrating on economic policies and competition and the like are becoming very interested in these issues too. 
And if I just give you one little quote from the OECD’s report, Facts Not Fakes: Tackling Disinformation, Strengthening Information Integrity, that report says: informed individuals are the foundation of democratic debates and society. And the report also goes on to comment that a multi-stakeholder approach is required to address the complex global challenges of information integrity. More locally in Australia, our ACCC, which is our competition authority, has been conducting an inquiry into digital platforms. And in its final report, it says: this inquiry has highlighted the intersection of privacy, competition, and consumer protection considerations. Privacy and data protection laws can build trust in online markets. So, the fact that these bodies are thinking about these issues for the purpose of economic and societal outcomes is, I think, very interesting. Sorry, Rosemary, I’m going to have to ask you to wrap up now, please. And my final point, please, is just that we need to have a global governance architecture. I think the Internet Governance Forum has a role to play, and I’m really hoping that through the processes next year, the role of the IGF is made clear and permanent so that it has the certainty to help do this work.
Pearse O’Donohue: Thank you. Thank you very much, Rosemary, and thank you for that very clear enumeration and explanation of the principles that we need to revisit in the work we’re doing on the Internet as a whole, and then specifically on election integrity. And, of course, to Tawfik for his analysis, and again the worrying facts about violence against journalists, particularly female journalists; there is a direct and very thick line between that and election integrity. If journalists, if the free press, are intimidated into silence, then we are already losing the battle for electoral integrity. So, that is something we must think of, along with the digital dimensions of it. So now, as I’ve said, we want to open up the floor again to questions, but particularly statements, because on this occasion we’re going to make you work a little bit harder. This is to participants here in the room, but also, of course, to online participants. We actually have a couple of questions for you, so anyone is welcome to answer or address those questions, or address points made by our panellists in their very rich responses to the set of questions that we put to them. So, it’s simply this: how do you think the broader Internet governance debate intersects with electoral information integrity discussions, and how could the IGF discussions and the multi-stakeholder approach contribute to improving and strengthening information integrity in elections and in the election space? So, do we have anyone who’d like to take the floor on this, or make comments on what has been heard? If so, please come to the microphones, one or other, at the head of the room. And I’m also looking at Bruna, if there is anybody online. Okay, well, we’ll keep going, because we have been very disciplined, I have to say.
I’ve been nudging one or two of you, but I would like to thank all the panellists for being so disciplined on time while giving us such rich responses. But now we have the opportunity, perhaps, to open the debate to you: to everything that your co-panellists have said, to the questions we asked about what the problems are, what evidence has come to light, what initiatives have worked and what hasn’t, and then what principles we need to apply. So now I’m giving the floor to you, but I would also like to put to you that question I just posed, and you can tackle any or all of them as you see fit: how can the broader Internet governance discussion and debate intersect with this issue of electoral integrity, and how can the multi-stakeholder approach contribute to improving and strengthening the situation? So now, the floor is open. Who’d like to take the floor? Please, Tawfik.
Tawfik Jelassi: Thank you. You remind us that the focus is elections, of course, and reporting on elections in a fact-checked, objective way requires proper training of journalists covering elections. UNESCO has been doing this in many countries recently, to provide the training needed by journalists, because, of course, the information they bring to the fore is so important, especially in this era of misinformation. Second, the impact of emerging technologies on elections, such as the impact of AI on elections. This is another training that we developed. It’s an online course on the impact of AI and generative artificial intelligence on election processes. So this is part of awareness creation, awareness raising, advocacy, because we need to have in place an enabling environment for elections to take place in a fair, free, and democratic way.
Pearse O’Donohue: Very good. Please, William, you were next.
William Bird: So what struck me is, despite us all coming from radically different perspectives, just how similar the issues we’re facing are, and in fact how similar the approaches to dealing with them are, which suggests that these things are, as I said at the beginning, part of a bigger question of how we deal with this new information chaos environment, where power dynamics have shifted so dramatically. And that seems to be a common question that all of us are grappling with to varying degrees. The second thing is the critical importance of digital literacy. This is mentioned at every single one of these events that I go to. What is consequently still missing, in massive amounts, are effective and properly resourced plans to actually implement these things. So we can come here and say all these good things, but there’s no real meaningful action. And then, how do we deal with the outliers? Elon Musk being one of those outliers. X’s power is diminishing, but just because it’s diminishing, the harm that it’s causing in very real terms is still significant. And it seems we don’t really have an answer to that, you know. We’ve just seen one of the major world superpowers buddy up to this man who openly used his platform to spread misinformation and, in the case of South Africa, happily allowed it to spread attacks against journalists and incite violence and hate speech. And we need an answer to that. IGF and all of us, we need to be able to say how we’re going to deal with this. Thanks.
Pearse O’Donohue: Thank you. Daniel Molokele, please.
Daniel Molokele: Thank you so much. I wanted to speak on something that I feel we have not addressed that affects electoral integrity from an information point of view: the issue of the need for standardization and professionalism. We are seeing a rise of new social media platforms and media technologies that are highly influential in shaping political opinions, especially for the electorate. Some people have blogs, some have podcasts, some of them go live. Some have X or Twitter pages, they have Facebook pages, and they can go live at any time while millions of potential voters tune in. In those live broadcasts, untested claims about the elections are made, or even allegations, for example, of rigging or cheating in elections. Because the audience trusts the person behind the podcast or the show, it then affects everything in terms of the integrity of the entire election process. Yet most of the people who conduct these live sessions are not trained journalists. They do not practice any form of ethics, and they have no qualification or certification. And at the end of the day, there is no emphasis on professional research or standardization of content. What also drives them is the fact that at the end of each month they get a paycheck, based on the amount of interaction with that blog or podcast. So the more people are emotionally tuned in, the more viewership, the more interactions, and the more currency at the end of the month. The net effect is that a single person, or two people, can actually shape the narrative, depending on the side they are on. And it then allows people with money, maybe business people, for example, who have an interest, maybe in public tender systems, to fund these people unofficially and influence the electoral process.
Because at the end of the day, they would want the government in power after the elections to be aligned with their business interests. So it’s a major concern. And the mainstream media houses, the professional institutions that practice journalism to standards, are normally overrun by these kinds of live transmissions. Then I also wanted to zero in on artificial intelligence. For us as Africans, we are coming from a position of being left behind. It is still very difficult for the average person, especially in Zimbabwe, where I come from, to distinguish a story that is AI-generated from one that is real, because the videos and the pictures look so real. And if you can come up with content that is misleading, misinforming, or disinforming, an average voter will take it seriously. By the time clarifications and follow-ups are done, it’s too late, and it then affects the credibility of the election system. Thank you so much.
Pearse O’Donohue: Thank you very much, indeed, I fully agree, I see the same. Now, I just wanted to check, I don’t know if we have a hands-up function, but I want to make sure that if either Rosemary or Liz wanted to come in on what we’ve heard in the panel discussion, and also, of course, this question about the broader internet governance debate and the IGF, would either of you like to come in on this?
Elizabeth Orembo: I can come in and make a few short remarks on how the elections discussion can integrate with internet governance. From my view, internet governance is about the governance of infrastructure and the governance of content. But when it comes to elections and information integrity, I don’t think any society has really agreed on what disinformation or misinformation is, what good information should be encouraged online, and what should be discouraged. What we are also seeing is people saying there should be plurality of information, but some of those plural sources oppose each other, and in shaping our narratives we try to label the other information as misinformation. Some misinformation is outright misinformation, put out there to push an unfair narrative that is harmful to society. But we are also talking about a plurality of information that sometimes causes tensions in society without any intention of harming the citizenry, yet it still harms our democratic process. I think as a society we need to reflect on such dangerous misinformation, but if it becomes a regulatory matter, then a certain group will feel aggrieved by it. On the internet governance discussion, and this is my last point: I think we need to go broader, to appreciate what is really happening within the elections environment, because it also touches on the wider issues of development. When the wrong people are put in place in governance, it really affects how a society develops democratically. In some elections, violence also erupts.
It also means that economic consequences follow, because in cases where countries face election aftermaths, even three years after an election, with contested elections, court cases, and even passivity in applying policies put in place by a government seen as illegitimate, it means there is slow economic growth in those countries, and people are not able to prosper. So I think we need to be wider in how we think about internet governance and democracy, going beyond just content moderation and infrastructure governance to the underlying issues that, as other panelists have said, play out in between elections but actually rise when it comes to the elections themselves. Thank you.
Pearse O’Donohue: Thank you. And Rosemary, I saw you wanted to come in.
Rosemary Sinclair: Sorry, Pearse, I’m having trouble with my mute button. Yes, I was wondering if we could pursue the idea that I think Lina put on the table earlier. And I just wondered if we could hear a little bit more about that work in Lithuania.
Pearse O’Donohue: Okay, well, Lina, I think that’s an invitation to you. Please.
Lina Viltrakiene: Well, of course, as I already presented, we have a quite comprehensive system established in Lithuania for countering disinformation. But perhaps, as we are now moving to the end of the discussion, I wanted to react very briefly to your question about how IGF discussions could indeed lead us to something more specific. So, we are now in the process of defining the responsibility of social media platforms, and perhaps finding some legal tools to enforce that. We really think that we could establish clear responsibilities, legal obligations, and sometimes even penalties for platforms that fail to prevent the spread of organized disinformation campaigns. And perhaps, beyond that, mechanisms and architectures for the digital market and information distribution. These are approaches which we could develop by discussing them in this multi-stakeholder format, where all views are heard and all of us are on board. Now is the time to have these kinds of discussions.
Pearse O’Donohue: I am going to come to you now. I see unfinished business, but also a great opportunity to take this forward, because I’m going to ask all of the panelists now to give their final views. I would like to start by saying thank you to all of you for being here. I know that you are all very committed, and you know that I’m tough, but you have been very disciplined. The purpose here is that I’d really love if you could each come up with a recommendation, a request, or a best practice that we could take as a takeaway from this discussion, and, as I said, there can be another one from Lina or from Rosemary. I think that the way in which a person, a user, can raise an issue and hope to have action taken is one of those questions, and how we can adapt that; I’m sure there are different models, but if it were applied throughout the system, it would be great. But anyway, I’ll ask you to do that now. Two minutes maximum, and I will cut you off so that everyone gets their last word. And I will start with you, Sezen, please.
Sezen Yesil: Okay, thanks so much. So, the problems we have discussed today, like disinformation and misinformation, are probably as old as the history of democracy. But of course, the use of the internet takes them to another level; we all accept that. Also, the problems are not specific to one country or to one platform only. Bad actors, for example, can work from country X to target people in country Y, and they use all the platforms they can. So, I believe that, like all other global problems, election integrity problems can be tackled best in collaboration with all the stakeholders: the private sector, the public sector, academia, and civil society. The beauty of the IGF is that it brings us all together, and we hear each other. That’s great. So, I really appreciate all the esteemed panelists’ views and comments on the matter. I’m taking this as my homework; I will feed it as input to our election integrity work back at Meta. At Meta, we understand our responsibility, and we try to improve ourselves. We are already leading and participating in many collaborative efforts, and we will be more than happy to expand them to other stakeholders, including governments, UNESCO, etc. Thanks so much for this opportunity. Thank you very much.
Pearse O’Donohue: And now I’ll turn to you, William. Your two minutes. Yes. Don’t worry. I’m being random for a purpose.
William Bird: Okay. So, my points would be, first, to call for resources outside of election periods, because what we see is that elections approach and suddenly everyone’s excited, and then elections go, and we all say, yes, these were bad, and then suddenly there’s no work. We need to see these as ongoing societal challenges. The second point is that the intersection of online harms needs to be dealt with comprehensively, and then we need to see some action. It’s not enough for us to just say, oh yes, the attacks against women online are very bad and we really must do something. Let’s do something. Let’s hold some of these people accountable. We can’t take Elon Musk to court in South Africa because they don’t have anything there, but why aren’t gender-based violence groups taking him to court in the United States, where he’s domiciled? We need to be holding accountable the people who continue these things. We can’t leave it as is any longer. And the third thing, I think, is that the terms mis- and disinformation are thrown around, and you’re right, everyone said there’s no common definition. For us, we reference public harm as one of the elements of mis- and disinformation, and I think one of the things that we could and should be looking at are more nuanced labels for understanding mis- and disinformation, because it isn’t all the same thing, and we already accept some things as problematic. Thank you.
Pearse O’Donohue: Thank you very much. I’ll now turn to you, Tawfik, please, for your two minutes’ worth. Thank you.
Tawfik Jelassi: Thank you very much. Not to repeat myself, but the title here is how to maximise potential for trust and address the risks. For me, and I repeat myself, the number one risk is dis- and misinformation. It’s not by chance that the Davos World Economic Forum this year put disinformation as the number one global risk for 2024 and 2025. This was the super election year, and it will continue in 2025. Disinformation for me is at the heart of the battle, and if we can address it, if we can minimise it, even if we cannot totally eliminate it, I think we will be able to maximise trust and address the risk that it represents.
Pearse O’Donohue: Thank you very much. Liz, are you still with us? We see a very interesting photo of you. If not perhaps then I will turn for the moment to Rosemary please. What would be your last comments? Now we see you Liz but we’ll give you a moment in a moment. Thank you. Rosemary, go ahead.
Rosemary Sinclair: Yes, thank you. I’d like to make two comments, really. One is to re-emphasise what I said very briefly: I would like to see the role of the IGF clarified and made permanent, so that we have a forum for large multi-stakeholder discussions about matters of importance. The second thing, practically, is I would like to see the tapestry of internet governance forums knitted together much more closely. So in Australia we have the local IGF, AUIGF, and then we have the Asia-Pacific regional IGF, and then we come to the global IGF. If we could imagine a world where all of that effort was focused on a topic area, perhaps with a centralised clearinghouse or reporting, perhaps on how to deal with platforms; if we could somehow maximise that effort and bring that work to the global IGF for consideration and discussion by the multi-stakeholder community, then I can see a possibility of progress. Thank you.
Pearse O’Donohue: Thank you very much. Now, Liz, we’d like to hear from you. No, we don’t hear you. Keep trying to unmute. Yes, now I succeeded in unmuting, thank you.
Elizabeth Orembo: I think my one point is that, positively, we have seen a lot of efforts on strengthening information integrity around elections. This year came as a golden opportunity because of the many elections happening, and in Africa we have seen many different kinds of partnerships, stakeholders coming together to fight disinformation, to map disinformation risks and fight them proactively. But this has happened in different parts of the partnership, not really in silos, yet still leaving out important stakeholders. As one panelist said, it’s a big team thing, and we should expand more. The other thing is that we should not leave the work here. We should make these different efforts connect to each other to get a full picture of what really happened this year and what to anticipate in the elections next year and in the years after. That means connecting the dots, from the partnerships in civil society, with the tech people, to even the data enthusiasts; there are people working with data, and we should ask what really happened there and how we can connect the dots. Thank you.
Pearse O’Donohue: Thank you so much. Now I’d like to turn to Daniel, please.
Daniel Molokele: Thank you so much. I think in 2024 we saw democracy continue to grow and take root in Africa. The elections we had all across Africa gave us a significant opportunity as Africa to showcase ourselves and rebuild our reputation as a continent. To that end, access to information is, from the perspective of electoral integrity, very important to Africans. And as a parliamentarian, I think one of the issues we need to focus on is making sure that there is standardization in the quality of information and news across Africa, especially during election campaigns and results announcements. And we must make sure that the policy and legislative frameworks across Africa are modelled and standardised so that they benefit democracy in Africa. Because information has the potential to build our democracy and to make our electoral systems accepted and credible; but at the same time, it has the potential to undermine our electoral integrity. So it’s important that we create model laws from a continental point of view that will enhance access to quality information and promote electoral integrity. Thank you.
Pearse O’Donohue: Thank you. And now, Lina, please.
Lina Viltrakiene: Well, from my side, I would like to leave you with the message that elections are the test of democracy. And indeed, democracy is not something that we can take for granted. So if we want to live in a democratic society, upholding liberty, human rights, the rule of law, and other democratic values, we all need to work together to strengthen democracy. And in this task, I firmly believe that all of our societies need to be on board; that is the only way to build trust and to take comprehensive action on strengthening resilience and critical thinking. And again, I believe that critical thinking and resilience are key, indeed, in all our efforts to ensure smooth, reliable, democratic elections free from malign interference. And perhaps, in addition, just one more thing I wanted to mention: how important it is to coordinate among ourselves, among democracies. That is crucial, indeed, to ensure that we respond appropriately and effectively to foreign information manipulation and interference, and that we prevent hostile actors from manipulating and hijacking the information space. Thank you.
Pearse O’Donohue: Thank you, Tawfik. You were very, very short in your-
Tawfik Jelassi: 30 seconds.
Pearse O’Donohue: So you have 30 seconds that you didn’t use. Go ahead, please.
Tawfik Jelassi: The reason I asked for the floor again is that your question had two parts, and I did not answer the first part in my earlier intervention: what can the IGF do about it? Twenty years ago, the IGF did not foresee the rise of digital platforms, nor the harmful online content that we suffer from today. I believe that, going forward, the IGF has to ensure that information is a public good, not a public hazard, not a public harm.
Pearse O’Donohue: Thank you. So, I want to draw the meeting to a close. I’m not going to draw formal conclusions; that would not be appropriate, and it would be very subjective and impressionistic, and I’ll tell you in a moment about what we’re going to do. But first of all, I think we should show our appreciation for the fantastic insights and analysis by our panel, physically and online. Please, a round of applause. And they have made my job very easy, one, by really focusing on the questions, but also by allowing the discussion to continue by limiting their time. I know it’s very frustrating. The only censorship that is allowed during the multi-stakeholder process is your speaking time; everything else is just not allowed. So what I would like to say is that we have a clear set of well-informed views which show that, yes, experience tells us the threats are real, that the challenges have been felt across a number of countries and regions, and we would expect them to get worse unless action is taken. While disasters were largely avoided in 2024, some very stark examples have been given to us of where serious problems arose. In almost all domains and countries there has been a level of disinformation, certainly misinformation, going all the way to the use of deepfakes, as well as the suppression of opposing views. So we are united in diversity, and it might not always be the case that I am happy with the result of an election. My side didn’t win. That’s not the point; that’s democracy. The question is: did the side that won do so on the basis of the democratic process, which we all welcome, or did they do so because they used digital technologies to misinform, to disinform, or to actively prevent another voice from being heard? That is the line we must follow with regard to information and with regard to election integrity.
I think we’ve had some great insights. As I said at the start, what we hope will come next is, in listening to the stakeholders, to share our experiences, find inspiration, make suggestions, and offer actionable insights to guide stakeholders in the actions they can take, including and particularly in the IGF. We have the IGF coming to us next June, and I think this work can continue. For that, we have a rapporteur from this session who will help us, and I’d like to thank him, Jordan Carter, for his contribution in organising the session as well. But as well as thanking our panellists, I must single out in particular Bruna Martins dos Santos, who has been the driving force in organising this event. I saw Bruna in action at NetMundial, and we’re very grateful for all the work she is doing on the MAG. By the way, Jordan is also on the MAG. We really think this is an issue we will continue to need to focus on, and one where the IGF and the multi-stakeholder process it represents is the only forum in which we can find consensual responses to the challenges of digital while, and I think this was also what Tawfik wanted to say, embracing all the good that digital technologies can bring to societies across the world. Thank you for your presence, thank you to those online, thank you again to the speakers, and I wish you a great continuation of the IGF.
Tawfik Jelassi
Speech speed
137 words per minute
Speech length
1363 words
Speech time
593 seconds
Spread of misinformation and disinformation online
Explanation
Disinformation and misinformation are major challenges to election integrity in the digital age. They spread rapidly online and can significantly impact public trust and democratic processes.
Evidence
MIT study shows that false information travels 10 times faster than fact-checked information.
Major Discussion Point
Challenges to election integrity in the digital age
Agreed with
William Bird
Sezen Yesil
Lina Viltrakiene
Agreed on
Misinformation and disinformation as major threats
Violence and intimidation against journalists, especially women
Explanation
Journalists, particularly female journalists, face violence and intimidation when covering elections. This poses a serious threat to press freedom and election integrity.
Evidence
One third of women journalists have quit due to online harassment and physical violence.
Major Discussion Point
Challenges to election integrity in the digital age
Media literacy and digital skills education programs
Explanation
UNESCO is focusing on education programs to improve media and information literacy in the digital age. These programs aim to help users develop critical thinking skills to distinguish between fact-checked information and falsehoods.
Evidence
UNESCO’s program on media and information literacy is a cornerstone of their strategy.
Major Discussion Point
Successful initiatives and best practices
Training journalists on election coverage and emerging technologies
Explanation
UNESCO provides training for journalists on covering elections and the impact of emerging technologies like AI. This helps ensure more accurate and responsible reporting during election periods.
Evidence
UNESCO has developed an online course on the impact of AI and generative artificial intelligence on election processes.
Major Discussion Point
Successful initiatives and best practices
Treating information as a public good, not a public hazard
Explanation
The IGF should focus on ensuring that information is treated as a public good rather than a public hazard. This approach is crucial for addressing the challenges of harmful online content and protecting democratic processes.
Major Discussion Point
Governance principles and mechanisms needed
William Bird
Speech speed
155 words per minute
Speech length
1485 words
Speech time
572 seconds
Attacks on electoral management bodies and journalists
Explanation
There have been multi-pronged attacks on electoral management bodies and journalists, following a disinformation playbook. These attacks target the entities, their decisions, and individuals within them, often using pseudo-legal challenges.
Evidence
Over a two-week period, there were over a thousand attacks against journalists on X, with most targeting one journalist in particular.
Major Discussion Point
Challenges to election integrity in the digital age
Agreed with
Tawfik Jelassi
Sezen Yesil
Lina Viltrakiene
Agreed on
Misinformation and disinformation as major threats
Public reporting platforms for online harms
Explanation
South Africa has implemented public platforms for reporting attacks against journalists and other online harms. These platforms operate independently of the state and apply consistent standards across different social media platforms.
Evidence
South Africa has platforms called Mars and Real 411 for reporting attacks against journalists and other online harms like misinformation and hate speech.
Major Discussion Point
Successful initiatives and best practices
Daniel Molokele
Speech speed
138 words per minute
Speech length
1200 words
Speech time
519 seconds
Lack of regulation for influential social media personalities
Explanation
There is a rise of influential social media personalities who can shape political narratives without proper journalistic training or ethics. This lack of regulation and standardization can significantly impact election integrity.
Evidence
Examples of podcasts, blogs, and live broadcasts that can reach millions of potential voters with untested facts or allegations about elections.
Major Discussion Point
Challenges to election integrity in the digital age
Standardization of quality information and news across regions
Explanation
There is a need for standardization in the quality of information and news across Africa, especially during elections. This includes developing model laws and policy frameworks to enhance access to quality information and promote electoral integrity.
Major Discussion Point
Governance principles and mechanisms needed
Elizabeth Orembo
Speech speed
124 words per minute
Speech length
1719 words
Speech time
831 seconds
Digital inequality limiting access to reliable information
Explanation
Digital inequality in Africa leads to uneven access to information, creating fertile ground for misinformation. This inequality affects people’s ability to make informed choices during elections.
Evidence
Challenges in policy, infrastructure, and media access in African countries.
Major Discussion Point
Challenges to election integrity in the digital age
Sezen Yesil
Speech speed
138 words per minute
Speech length
1647 words
Speech time
711 seconds
Coordinated inauthentic behavior on social platforms
Explanation
Meta has identified and removed numerous networks engaged in coordinated inauthentic behavior. These networks spread disinformation and mislead people, particularly during election periods.
Evidence
Meta removed about 20 coordinated inauthentic behavior networks in 2024 alone.
Major Discussion Point
Challenges to election integrity in the digital age
Agreed with
Tawfik Jelassi
William Bird
Lina Viltrakiene
Agreed on
Misinformation and disinformation as major threats
Collaboration between platforms, fact-checkers and authorities
Explanation
Meta collaborates with third-party fact-checkers, local trusted partners, and authorities to combat misinformation. This multi-stakeholder approach helps in receiving timely insights and taking appropriate actions.
Evidence
Meta works with more than 90 third-party fact-checkers around the world, covering 60 languages.
Major Discussion Point
Successful initiatives and best practices
Agreed with
Lina Viltrakiene
Rosemary Sinclair
Agreed on
Need for multi-stakeholder collaboration
Differed with
Lina Viltrakiene
Differed on
Approach to regulating digital platforms
Technical measures to detect manipulated media and inauthentic accounts
Explanation
Meta employs various technical measures to detect and remove fake accounts and manipulated media. These measures help maintain the integrity of the platform during elections.
Evidence
Meta’s automatic detection tools block billions of fake accounts, often within minutes of creation.
Major Discussion Point
Successful initiatives and best practices
Lina Viltrakiene
Speech speed
121 words per minute
Speech length
1451 words
Speech time
719 seconds
Use of AI and deepfakes to create misleading content
Explanation
The use of AI and deepfakes to create misleading content, such as fake statements from top politicians, poses a significant threat to election integrity. This technology can influence people’s choices and erode trust in democratic institutions.
Evidence
Experiences from Romanian and Bulgarian elections where significant interference by foreign actors via social media platforms was observed.
Major Discussion Point
Challenges to election integrity in the digital age
Agreed with
Tawfik Jelassi
William Bird
Sezen Yesil
Agreed on
Misinformation and disinformation as major threats
Multi-stakeholder approach to monitoring and countering disinformation
Explanation
Lithuania has implemented a consolidated system for monitoring and neutralizing disinformation. This system involves various stakeholders including state institutions, NGOs, media, and businesses to create societal resilience against disinformation.
Evidence
Lithuania’s Civic Resilience Initiative and Debunk.org work on analyzing and countering disinformation, as well as promoting digital and media literacy.
Major Discussion Point
Successful initiatives and best practices
Agreed with
Sezen Yesil
Rosemary Sinclair
Agreed on
Need for multi-stakeholder collaboration
Clear responsibilities and accountability for digital platforms
Explanation
There is a need to establish clear responsibilities, legal obligations, and potential penalties for digital platforms that fail to prevent the spread of organized disinformation campaigns. This approach aims to improve the governance of digital platforms during elections.
Major Discussion Point
Governance principles and mechanisms needed
Differed with
Sezen Yesil
Differed on
Approach to regulating digital platforms
Global cooperation and information sharing between democracies
Explanation
Coordination among democracies is crucial to effectively respond to foreign information manipulation and interference. This cooperation can help prevent hostile actors from manipulating the information space during elections.
Major Discussion Point
Governance principles and mechanisms needed
Rosemary Sinclair
Speech speed
119 words per minute
Speech length
1509 words
Speech time
756 seconds
Balancing innovation with integrity and human rights protections
Explanation
There is a need to balance innovation in the digital space with integrity and human rights protections. This involves developing governance principles that address issues of privacy, security, and trust in the online world.
Evidence
Research in Australia shows that people are starting to do less online due to the harms they are experiencing.
Major Discussion Point
Governance principles and mechanisms needed
Strengthening the role of IGF in addressing information integrity
Explanation
The role of the Internet Governance Forum (IGF) should be clarified and made permanent to provide a forum for multi-stakeholder discussions on important issues like information integrity. This could help in developing more effective global governance mechanisms.
Major Discussion Point
Governance principles and mechanisms needed
Agreed with
Sezen Yesil
Lina Viltrakiene
Agreed on
Need for multi-stakeholder collaboration
Agreements
Agreement Points
Misinformation and disinformation as major threats
Tawfik Jelassi
William Bird
Sezen Yesil
Lina Viltrakiene
Spread of misinformation and disinformation online
Attacks on electoral management bodies and journalists
Coordinated inauthentic behavior on social platforms
Use of AI and deepfakes to create misleading content
Multiple speakers identified the spread of misinformation and disinformation as a significant threat to election integrity, highlighting various forms and channels through which this occurs.
Need for multi-stakeholder collaboration
Sezen Yesil
Lina Viltrakiene
Rosemary Sinclair
Collaboration between platforms, fact-checkers and authorities
Multi-stakeholder approach to monitoring and countering disinformation
Strengthening the role of IGF in addressing information integrity
Several speakers emphasized the importance of collaboration between various stakeholders, including tech platforms, fact-checkers, authorities, and civil society, to effectively address election integrity issues.
Similar Viewpoints
Both speakers highlighted the serious issue of attacks and intimidation against journalists, recognizing it as a significant threat to press freedom and election integrity.
Tawfik Jelassi
William Bird
Violence and intimidation against journalists, especially women
Attacks on electoral management bodies and journalists
Both speakers addressed issues related to information quality and access in Africa, emphasizing the need for better regulation and infrastructure to ensure reliable information during elections.
Daniel Molokele
Elizabeth Orembo
Lack of regulation for influential social media personalities
Digital inequality limiting access to reliable information
Unexpected Consensus
Importance of digital literacy and education
Tawfik Jelassi
William Bird
Lina Viltrakiene
Media literacy and digital skills education programs
Public reporting platforms for online harms
Multi-stakeholder approach to monitoring and countering disinformation
Despite coming from different backgrounds (UNESCO, civil society, and government), these speakers all emphasized the importance of digital literacy and education in combating misinformation and protecting election integrity.
Overall Assessment
Summary
The main areas of agreement included recognizing misinformation and disinformation as major threats to election integrity, the need for multi-stakeholder collaboration, the importance of protecting journalists, and the value of digital literacy and education programs.
Consensus level
There was a moderate to high level of consensus among the speakers on the key challenges facing election integrity in the digital age. This consensus suggests a shared understanding of the problems, which could facilitate more coordinated and effective responses to these challenges. However, there were some differences in the specific solutions or approaches proposed, indicating that while there is agreement on the problems, there may be diverse views on how best to address them.
Differences
Different Viewpoints
Approach to regulating digital platforms
Lina Viltrakiene
Sezen Yesil
Clear responsibilities and accountability for digital platforms
Collaboration between platforms, fact-checkers and authorities
Lina Viltrakiene advocates for establishing clear legal responsibilities and potential penalties for digital platforms, while Sezen Yesil emphasizes voluntary collaboration between platforms, fact-checkers, and authorities.
Unexpected Differences
Focus on AI and deepfakes
Lina Viltrakiene
Sezen Yesil
Use of AI and deepfakes to create misleading content
Technical measures to detect manipulated media and inauthentic accounts
While Lina Viltrakiene emphasizes the threat of AI and deepfakes in creating misleading content, Sezen Yesil surprisingly downplays this concern, stating that the risks did not materialize significantly in recent elections. This unexpected difference highlights varying perceptions of the immediate threat posed by AI in election integrity.
Overall Assessment
Summary
The main areas of disagreement revolve around the approach to regulating digital platforms, the focus on AI and deepfakes as immediate threats, and the most effective methods for combating misinformation and improving information quality.
Difference level
The level of disagreement among speakers is moderate. While there is a general consensus on the importance of addressing misinformation and protecting election integrity, speakers differ on the specific strategies and priorities. These differences reflect the complex nature of the issue and the need for a multi-faceted approach, potentially complicating efforts to develop unified global strategies for protecting election integrity in the digital age.
Partial Agreements
All speakers agree on the need to improve information quality and combat misinformation, but propose different approaches: Tawfik Jelassi focuses on education programs, William Bird on public reporting platforms, and Daniel Molokele on standardization of news quality.
Tawfik Jelassi
William Bird
Daniel Molokele
Media literacy and digital skills education programs
Public reporting platforms for online harms
Standardization of quality information and news across regions
Takeaways
Key Takeaways
The integrity of elections is facing significant challenges in the digital age, including misinformation, disinformation, and attacks on electoral bodies and journalists
Successful initiatives to protect election integrity include multi-stakeholder collaboration, media literacy programs, and technical measures by platforms
Governance principles needed include balancing innovation with integrity, global cooperation between democracies, and treating information as a public good
The Internet Governance Forum (IGF) has an important role to play in addressing information integrity issues globally
Resolutions and Action Items
Continue discussions on election integrity at future IGF meetings
Clarify and strengthen the role of the IGF in addressing information integrity issues
Develop more coordinated efforts between national, regional and global IGFs on key topics like election integrity
Expand collaborative efforts between platforms, governments, civil society and other stakeholders
Unresolved Issues
How to effectively regulate influential social media personalities and content creators
Addressing the digital divide that limits access to reliable information in some regions
Balancing free speech protections with the need to combat harmful misinformation
How to hold global platforms accountable across different national jurisdictions
Developing common definitions and standards for identifying misinformation/disinformation
Suggested Compromises
Balancing innovation in digital technologies with the need for integrity and human rights protections
Finding a middle ground between government regulation of platforms and industry self-regulation
Developing nuanced labels and categories for different types of problematic content, rather than broad definitions
Thought Provoking Comments
Already on Sunday morning, on day zero of this Internet Governance Forum here in Saudi Arabia, we had a session on misinformation. And in that session, we also discussed the role of stakeholders in protecting election integrity and the right to information.
speaker
Pearse O’Donohue
reason
This comment set the stage for the entire discussion by framing it within the broader context of the IGF and highlighting the key themes of misinformation and stakeholder roles in protecting election integrity.
impact
It focused the discussion on the intersection of internet governance and election integrity, prompting panelists to address these specific issues throughout their remarks.
Throughout this year we decided to update some of our policies. For example we updated our penalty system per feedback of the oversight board to treat people more fairly and to give them more free expression and secondly we updated our policy on violence.
speaker
Sezen Yesil
reason
This comment provided concrete examples of how a major tech platform is adapting its policies to balance free expression with preventing harmful content, particularly in the context of elections.
impact
It sparked discussion about the role of tech platforms in moderating content and the challenges of balancing different rights and interests.
What has not worked well is the exponential spread of disinformation and hate speech derailing the integrity of electoral processes, and maybe casting some doubt or trust in election outcomes and democratic institutions.
speaker
Tawfik Jelassi
reason
This comment highlighted a major challenge facing election integrity in the digital age, pointing to the broader implications for democratic institutions.
impact
It shifted the conversation to focus more on the negative impacts of disinformation and hate speech, prompting other panelists to address these issues in their remarks.
Because with data becoming more available, we also need more capacity to crunch data to get it to people. And those capacities were different as well, and sometimes challenging.
speaker
Elizabeth Orembo
reason
This comment introduced the important issue of data literacy and capacity, particularly in the context of the Global South.
impact
It broadened the discussion to include considerations of digital inequality and the need for capacity building in data analysis and interpretation.
It’s been a big year but I want to just ask if people genuinely feel better about democracy having had 65, 70, 75 elections. Because the sense that I get from speaking to people is that despite it being, it should be a year of celebrating democracy, we don’t feel good about democracy.
speaker
William Bird
reason
This comment challenged the assumption that more elections necessarily lead to stronger democracy, introducing a more nuanced perspective on the state of global democracy.
impact
It prompted a deeper reflection on the quality of democracy beyond just the quantity of elections, influencing subsequent comments on the challenges facing democratic processes.
We still need to see more young people as candidates or as elected representatives. We also saw the use of social media in a much more progressive way to mobilize people to voter registration and more importantly to turn out as voters, including media platforms such as TikTok, WhatsApp, Facebook, and X
speaker
Daniel Molokele
reason
This comment highlighted the positive potential of social media in engaging young voters, while also pointing out the need for greater youth representation in politics.
impact
It shifted the discussion to consider the role of social media in political engagement and the importance of youth participation in democratic processes.
Thus, this shows us that we need to work further on continuous collaboration of platforms with state institutions. And while regulatory frameworks perhaps should be improved, and as a model, I would like to refer to the EU’s Digital Services Act, which could really encourage the thinking.
speaker
Lina Viltrakiene
reason
This comment introduced the idea of regulatory frameworks as a potential solution to challenges in digital election integrity, specifically referencing the EU’s Digital Services Act.
impact
It prompted discussion about the role of regulation in addressing digital challenges to election integrity and the potential for international cooperation in this area.
So practically speaking, during elections, we sometimes see at auDA increased requests from people to take down the websites of their political opponents. And those requests are often made with claims of misinformation or disinformation. Those claims must be assessed by others who are authorised by law and skilled to make those judgements.
speaker
Rosemary Sinclair
reason
This comment provided a concrete example of the challenges faced by technical operators during elections, highlighting the complexity of content moderation decisions.
impact
It grounded the discussion in practical realities and emphasized the need for clear guidelines and authorized bodies to make content moderation decisions during elections.
Overall Assessment
These key comments shaped the discussion by highlighting the multifaceted challenges facing election integrity in the digital age, from disinformation and hate speech to digital inequality and youth engagement. They prompted a nuanced exploration of the roles and responsibilities of various stakeholders, including tech platforms, governments, civil society, and international bodies. The discussion evolved from identifying problems to considering potential solutions, including policy updates, capacity building, regulatory frameworks, and multi-stakeholder collaboration. Throughout, there was a tension between the potential of digital technologies to enhance democratic participation and the risks they pose to election integrity, reflecting the complex nature of internet governance in relation to democratic processes.
Follow-up Questions
How can we develop more nuanced labels and definitions for misinformation and disinformation?
speaker
William Bird
explanation
Current definitions are too broad and don’t account for different types and levels of harm. More precise categorization could help in addressing these issues more effectively.
How can we create a centralized platform for reporting online harassment and attacks, especially against politicians and journalists?
speaker
Maha Abdel Nasser
explanation
A unified reporting system could help address online harassment more quickly and effectively, particularly during election periods.
What legal and political solutions can address the challenge of digital platforms refusing to cooperate with authorities in independent countries?
speaker
Mokabedi (online participant)
explanation
This is important to ensure consistent enforcement of policies across different countries and platforms.
How can we standardize the quality of information and news across Africa, especially during elections?
speaker
Daniel Molokele
explanation
Standardization could help improve the integrity of electoral information and strengthen democracy across the continent.
How can we better connect and synthesize the work of different partnerships and stakeholders working on election integrity?
speaker
Elizabeth Orembo
explanation
Connecting these efforts could provide a more comprehensive understanding of election integrity issues and more effective solutions.
How can we ensure consistent implementation of content moderation policies across different social media platforms?
speaker
William Bird
explanation
Consistency across platforms is crucial for effective management of online harms and misinformation.
How can we better address the challenges posed by non-professional content creators (e.g., podcasters, bloggers) in spreading election-related misinformation?
speaker
Daniel Molokele
explanation
These new media sources have significant influence but often lack professional standards or oversight, potentially impacting election integrity.
How can we improve digital literacy efforts, particularly in the Global South, to help users distinguish between AI-generated and real content?
speaker
Daniel Molokele
explanation
As AI-generated content becomes more prevalent, the ability to identify it is crucial for maintaining election integrity.
How can the role of the Internet Governance Forum be clarified and made permanent to address ongoing issues of online information integrity?
speaker
Rosemary Sinclair
explanation
A clearer, permanent role for the IGF could provide a consistent forum for addressing these evolving challenges.
How can we better integrate local, regional, and global Internet Governance Forums to address issues like election integrity more effectively?
speaker
Rosemary Sinclair
explanation
Better integration could lead to more coordinated and effective responses to global challenges in online information integrity.
Disclaimer: This is not an official session record. DiploAI generates these resources from audiovisual recordings, and they are presented as-is, including potential errors. Due to logistical challenges, such as discrepancies in audio/video or transcripts, names may be misspelled. We strive for accuracy to the best of our ability.
Related event
Internet Governance Forum 2024
15 Dec 2024 06:30h - 19 Dec 2024 13:30h
Riyadh, Saudi Arabia and online