Viewing Disinformation from a Global Governance Perspective | IGF 2023 WS #209
Disclaimer: It should be noted that the reporting, analysis and chatbot answers are generated automatically by DiploGPT from the official UN transcripts and, in case of just-in-time reporting, the audiovisual recordings on UN Web TV. The accuracy and completeness of the resources and results can therefore not be guaranteed.
Full session report
Corway Wu
In the realm of disinformation, politicians, not just news media and social media, are actively involved in creating and spreading false information. This broadens the range of actors responsible for misleading the public. The negative sentiment towards politicians indicates a lack of trust in their intentions.
Timing is a crucial aspect in the dissemination of disinformation. The example of Brexit voting behavior is used to demonstrate this. Voters may be influenced by inaccurate information without realizing it until it is too late. This implies that the impact of disinformation can have lasting effects, shaping important decisions.
However, an opposing viewpoint is presented that disagrees with Jeanette's argument, holding that her failure to consider timing weakens her case. The negative sentiment expressed in this disagreement suggests a potential blind spot in understanding the issue.
Overall, this analysis illustrates the multifaceted nature of disinformation and its wide-ranging consequences. Politicians, news media, and social media platforms are all complicit in perpetuating false information. The timing of disinformation is highlighted as a crucial factor, as it can significantly influence its impact on individuals and societies. The disagreement regarding the importance of timing further emphasizes the complexity of this subject.
Audience
The analysis delved into the multifaceted nature of misinformation and disinformation. One of the speakers put forth the argument that these actions have the potential to incite individuals to act against democratic institutions. To support this claim, they highlighted the example of the January 6th Capitol riots, which they believed were inspired by misinformation and disinformation. The speaker’s sentiment was negative, suggesting concern about the impact of these actions on democracy.
However, another speaker expressed a more neutral stance, highlighting the challenge of settling on a clear definition of disinformation. They pointed out that assessing the longitudinal impact of disinformation is challenging. This sentiment indicates a level of uncertainty regarding the extent to which misinformation and disinformation can influence actions and outcomes.
A disagreement emerged regarding the possibility of completely eliminating disinformation. One speaker argued that efforts should be directed towards reducing its spread and minimising the damage it causes, rather than striving for complete elimination, reflecting a more pragmatic outlook on the issue.
In the specific context of Switzerland, it was suggested that disinformation does not possess enough influence to significantly sway elections. The speaker based this claim on the observation that Switzerland has a stable multi-party system with relatively consistent voting patterns over the past 30 years. This sentiment reflects a more neutral perspective on the impact of disinformation in the Swiss political landscape.
The analysis also examined the potential effects of disinformation on internet infrastructure and connectivity. Evidence was presented that disinformation governance can affect internet infrastructure, the example cited being Europe's IP blocking of Russian websites spreading disinformation. The negative sentiment here reflects the belief that the weaponization of disinformation through online platforms, and the responses to it, have had widespread consequences.
The audience raised concerns regarding the potential threat disinformation poses to the fundamental right of freedom of expression. Historical examples, such as the information war of the Cold War and the use of radio for propaganda during Nazi Germany, were provided to illustrate this point. This sentiment highlights the importance of protecting freedom of expression in the face of disinformation.
Notably, the analysis explored the effectiveness of the Christchurch Call initiative in response to live-streamed terrorist attacks in New Zealand. The sentiment here was negative, as it was argued that rushed solutions to govern and regulate disinformation can cause unintended harm. The speaker stressed the need for a nuanced approach to address disinformation, referencing the impact of tackling disinformation in G7 declarations.
The audience member supporting the Christchurch Call initiative expressed a positive sentiment, believing in its effectiveness. They emphasized the significance of trust-building and multi-stakeholder involvement in addressing terrorism facilitated by online platforms. This aligns with the overall positive sentiment of utilizing a multi-stakeholder model and engaging governments, tech firms, and civil society in combating disinformation.
In conclusion, the analysis highlighted the complex nature of misinformation and disinformation. The arguments presented ranged from the potential dangers of these actions in undermining democratic institutions to the challenges in defining and assessing their impacts. The disagreement regarding the elimination of disinformation reflected a difference in perspectives, with one side advocating for reducing its spread. The analysis also shed light on the specific impacts of disinformation on internet infrastructure, the threat it poses to freedom of expression, and the potential effectiveness of initiatives such as the Christchurch Call in preventing terrorism. Overall, the analysis underscores the need for nuanced approaches and multi-stakeholder involvement to address disinformation and its various repercussions.
Anriette Esterhuysen
The analysis provides a comprehensive overview of perspectives on disinformation in South Africa, covering various viewpoints and arguments. One viewpoint suggests that disinformation is not a major problem in the country, with concern focused more on trust in the government. The public tends to rely on the media, which is regarded as effectively self-regulated and proficient at dealing with disinformation. Fact-checking is also a common practice in South Africa, swiftly and efficiently debunking false information.
Another argument highlights the successful media regulation in South Africa, which ensures accuracy across the ideological spectrum. It is noted that a commitment to accuracy exists among right-wing, center, and left-wing media outlets in the country. Fact-checking is a prevalent practice, further enhancing the reliability and trustworthiness of the media. This observation supports the notion that media regulation in South Africa effectively maintains accuracy and minimises the spread of disinformation.
The analysis also emphasises the need for careful consideration in the international regulation of disinformation. It is crucial to explore the implications of such regulation on access to information and freedom of expression. While national initiatives regarding disinformation regulation are controversial, existing international instruments may serve as a baseline for effectively governing disinformation.
Furthermore, the analysis highlights the distinct dynamics of weaponising disinformation in online platforms compared to traditional broadcasting platforms. Unlike traditional platforms, online platforms allow for the widespread distribution of disinformation without requiring significant political or economic power. This observation emphasises the need for tailored approaches in combating disinformation across different digital platforms.
A noteworthy observation from the analysis is the advocacy for considering bottom-up approaches and self-regulation measures alongside governmental regulations. Anriette Esterhuysen argues that jumping to governmental regulations without exploring more bottom-up ways may be premature. While a regulatory response might be necessary, Esterhuysen highlights the importance of not dismissing self-regulatory and bottom-up approaches to tackle disinformation. This perspective demonstrates a concern that solely relying on governmental regulations might overlook effective alternatives.
Overall, the analysis offers valuable insights into the various dimensions of disinformation in South Africa. The perspectives presented shed light on the strengths of the country’s media regulation, the challenges faced in international regulation, the dynamics of online platforms, and the importance of considering diverse approaches to combat disinformation.
Remy Milan
Misinformation poses a significant threat to the stability of state institutions, as it undermines citizens' confidence in these establishments. This erosion of trust has detrimental effects on democracy and should not be underestimated. Remy Milan regards misinformation as a high-level danger to state institutions: the spread of false or misleading information confuses and disenchants citizens, weakening the democratic fabric by eroding trust between the governing and the governed.
This issue is especially relevant to SDG 16: Peace, Justice and Strong Institutions, which aims to ensure inclusive governance and access to justice for all. Misinformation disrupts this goal by sowing doubt and creating divisions within society, hindering efforts to achieve peace and justice. It is worth noting that advances in technology, particularly social media platforms, have facilitated the spread of false information, making it easier for malicious actors to manipulate public opinion.
Addressing this issue requires a multi-faceted approach, including education, media literacy, regulation, and responsible platform governance. Efforts must be made to promote media literacy, counter false information, and foster trust and critical thinking to uphold the integrity of state institutions and democratic values.
Jeanette Hofmann
This discussion explores the impact of disinformation on people’s minds and voting behaviour. One participant criticises the limited knowledge surrounding this issue due to the lack of empirical evidence. They argue that it is essential to conduct research to better understand how disinformation affects individuals and their decision-making processes.
Another participant highlights the strategic intent of disinformation, stating that it is often used as a tool to manipulate people’s behaviour and influence their worldviews. Disinformation is seen as a deliberate tactic that focuses on achieving specific objectives.
The conversation also emphasises the need to expand research on disinformation beyond platforms and consider the wider media landscape. It is noted that context plays a crucial role, and solely examining platforms and algorithms is insufficient. The impact of disinformation should be studied within the broader media environment to gain a comprehensive understanding of its effects.
Furthermore, it is observed that individuals sharing disinformation may not necessarily believe the information themselves. Instead, they may be using it as a means to signal their belonging or loyalty to a certain group or ideology. This highlights the complex motivations behind the sharing of disinformation and the need to consider social and psychological factors in analysing its influence.
The conversation also touches upon the rising disregard for truth and the detrimental impact it has on public discourse and democracy. This trend of increasing tribal attitudes and a lack of concern for distinguishing truth from falsity has severe consequences for the functioning of society and democratic processes.
Regarding the governance of the internet, there is a recognition that infrastructure standards need global agreement to ensure a cohesive global network. However, content regulation should not be undertaken at a global level, as it may impinge upon freedom of speech and local autonomy.
The Digital Services Act, proposed by the European Commission, is viewed as an interesting development. It extends the scope of human rights not only to govern the relationship between people and governments but also to guide platform behaviours. This recognition that the private sector's influence on the exercise of human rights should be guided by human rights principles is seen as positive.
The Act’s provision for data access related to systemic risks caused by platforms is supported. This data access allows for a better, evidence-based understanding of the impact of disinformation. However, the concept of needing to mobilise systemic risk to gain access to data is criticised, highlighting the need for more efficient mechanisms.
The discussion concludes with the suggestion that the Internet Governance Forum (IGF) could serve as a platform to discuss and implement best practices derived from Article 40 of the Digital Services Act. This highlights the potential for international collaboration and knowledge-sharing in addressing disinformation and its consequences.
Overall, this discussion emphasises the urgent need for comprehensive research, consideration of wider media environments, and the recognition of the complex motivations behind the sharing of disinformation. It also addresses the importance of upholding human rights principles and the challenges of content regulation in a global and interconnected digital landscape.
Bill Drake
The impact of disinformation on democracy is a complex issue that is influenced by various factors and is context-dependent. Different perspectives exist on the extent to which disinformation can affect democratic processes. Some argue that disinformation can have a significant negative impact on democracy, while others caution against oversimplifying the issue and relying on false dichotomies.
It has been observed that a considerable amount of disinformation originates in broadcast media and is then amplified through social media platforms. This highlights the interconnectedness of different forms of media in the spread of disinformation. Several studies have documented this pattern, emphasizing the importance of understanding the role played by different media channels in the dissemination of disinformation.
One key aspect that complicates the issue of disinformation is the lack of a standardised definition. Leading bodies such as the European Union and the UN Special Rapporteur have adopted differing definitions of disinformation, which can give rise to confusion and inconsistencies in tackling the problem. Establishing a common understanding is crucial to addressing disinformation effectively.
Tribal loyalty is identified as a significant factor that can lead individuals to believe in disinformation. In cases like the United States, where tribal affiliations and identity politics play a prominent role, people may align with certain narratives or disinformation due to their loyalty to a particular group. This highlights how social and political factors can impact an individual’s susceptibility to disinformation.
Identity politics further compounds the issue, distorting the perception of truth. Some individuals develop their identities around opposing certain groups or ideologies, leading them to embrace disinformation that aligns with their pre-existing biases. This phenomenon highlights the role of emotions and personal beliefs in shaping the acceptance of disinformation.
Efforts to regulate disinformation on a global level have been proposed, but doubts remain about their effectiveness. Discussions in the United Nations have seen various proposals related to disinformation, such as China’s suggestion to criminalise its spread and UNESCO’s guidelines for digital platforms. However, the complexities and geopolitical divisions inherent in regulating disinformation make it challenging to achieve meaningful global regulation. As a result, long-term engagement is advocated, focusing on building infrastructure to challenge disinformation effectively.
The proposal for a code of conduct on information integrity for digital platforms is seen as an attempt at global internet governance. This proposal aims to govern the information that flows through digital networks, aligning with the definition of internet governance. It raises questions about the extent to which such regulations should be implemented and their potential impact on freedom of expression and privacy.
The primary responsibility to counter disinformation lies with states, according to the UN General Assembly’s resolution. While platforms such as social media play a role, governments bear the primary responsibility to address the issue effectively. Simply pressuring platforms to act does not address the root causes of disinformation.
It is important to recognise that disinformation can originate from various sources, including the dark web. This highlights the need for a comprehensive approach that looks beyond platforms alone. Strategies should encompass multiple sources and channels through which disinformation can be generated and disseminated.
Civil society participation is crucial in the discussion on countering disinformation. While there have been limited discussions on proposals like the UN Secretary General’s Global Digital Compact, greater involvement of civil society in such initiatives can ensure diverse perspectives and balanced decision-making.
In conclusion, addressing the issue of disinformation requires a multifaceted approach that involves governments, platforms, and civil society. The complex nature of disinformation and its impact on democracy necessitate a nuanced understanding, taking into account various factors such as media channels, definitions, tribal loyalty, and identity politics. Efforts to regulate disinformation at a global level should be complemented with long-term engagement and infrastructure-building, recognising the challenges and limitations faced in achieving effective global regulation.
Nighat Dad
Disinformation, which can impact democratic processes, is a topic of concern; however, solid evidence is needed to support this claim. Caution must be exercised in interpreting the complex and contextual definitions of misinformation and disinformation. Disinformation has the potential to harm marginalized groups, and a UN report highlights its negative effects on gender equality.
Global governance instruments exist, but their application needs improvement, as regulations and laws often suppress freedom of expression. State actors and companies have a shared obligation to provide accurate information and prevent the spread of misinformation. Synergy between existing systems is crucial, and the performance of oversight boards and governance mechanisms must be reviewed.
Concerns are raised about governments misusing guidelines and the lack of accountability. Regulatory mechanisms are needed to hold state actors accountable, and user rights should not be forgotten in regions with restrictions. The local context is vital, and more global oversight boards are necessary to hold companies accountable. Transparency reports play a key role in holding platforms accountable.
Clara Iglesias Keller
Disinformation has the potential to undermine democracy, although its impact varies depending on the context. While there is currently no solid empirical evidence to suggest that disinformation directly changes voters’ behaviors or affects election results, there is a consensus that further research is necessary to fully understand its implications.
The existing research on the impact of disinformation is primarily focused on the United States and Europe, highlighting a need for expanding studies to include other regions such as Latin America, Africa, and Asia. It is important to understand how disinformation strategies can influence political transformations in different contexts.
Disinformation is considered a communication practice and an online harm, along with misinformation, propaganda, and fake news. Its intent holds significant legal relevance, further emphasizing the need to address the issue.
In some instances, disinformation serves as a form of political intervention. For example, in Brazil, it has been used to express dissatisfaction or directly attack democratic institutions, including the electoral system and high courts. This highlights the destructive potential of disinformation as a tool in political disputes.
However, the concept of disinformation poses a challenge within statutory regulation, as there is no clear space for its definition and regulation.
Global governance solutions, although important, may not be sufficient to address the impact of misinformation and disinformation on political disputes. It is necessary to confront the ultimate convertibility of economic power into political power, particularly within the media landscape. This is evident in countries like Brazil, where traditionally concentrated and unregulated media landscapes contribute to the spread of disinformation.
Additionally, global solutions often rely on consensus-based governance structures, which may lack the power needed to modify digital business models and data usage effectively.
More empirical evidence is needed, especially outside of the global north. In countries like Brazil, internet usage is strongly associated with platforms such as WhatsApp, Facebook, and Instagram, facilitated by zero rating policies. Understanding the impact of disinformation in these regions is crucial for developing effective countermeasures.
In conclusion, addressing the challenges posed by disinformation requires not only further research but also more institutional innovation. This innovation should create an apparatus that allows diverse stakeholders and civil society to engage in the disputation of truth and content. By confronting the convertibility of economic power into political power and exploring alternative governance structures, we can work towards mitigating the harmful effects of disinformation and safeguarding democratic institutions.
David Kaye
Disinformation is a complex issue that involves the dissemination of false or misleading information. It can have various impacts and is spread through different platforms, including legacy media and social platforms. Understanding the nuances of disinformation is crucial, as there is no one-size-fits-all solution to address it.
David Kaye, an advocate for human rights, highlights the importance of clear definitions and understanding in addressing disinformation through legal regulation and governance. However, he expresses concern about the lack of shared definitions, which may impede the process of regulation. Kaye also raises concerns about emerging regulations in Europe and the UK that rely on platforms to define disinformation, as this may affect transparency and risk assessment.
While global regulation of disinformation may seem desirable, Kaye argues that it is not achievable. Instead, he suggests the development of a common set of guiding principles based on human rights. These principles should be the foundation for addressing disinformation, providing a framework that ensures legality, necessity, proportionality, and legitimacy of objectives.
In shaping policies and strategies to combat disinformation, Kaye believes that civil society should play an active role. They should be included in the drafting and adoption process to ensure a more inclusive approach. Additionally, Kaye argues that governments should be held responsible for their behavior and should support public service media, as excluding them would undermine the effectiveness of addressing disinformation.
Over-reliance on platforms for handling disinformation is a matter of concern. Relying solely on platforms may create challenges in terms of transparency, accountability, and bias. Therefore, it is necessary to explore alternative approaches and strategies to combat disinformation effectively.
The leadership of New Zealand in promoting multi-stakeholder approaches and prioritising human rights in times of trauma, such as after the Christchurch attack, is commended by Kaye. He recognises the importance of keeping human rights at the core of global governance. In this regard, Kaye highlights the Internet Governance Forum (IGF) as a platform where human rights and access to information should be given priority.
However, Kaye also warns against adopting ideas that disregard human rights in response to traumatic events, such as the Israel-Palestine conflict. While people may have natural responses to such events, it is crucial to ensure that any responses or measures taken are rooted in human rights principles.
In conclusion, addressing disinformation requires nuanced understanding and a combination of approaches. Clear definitions, shared principles based on human rights, civil society participation, government responsibility, and cautious reliance on platforms are all essential factors in effectively combating disinformation. New Zealand’s leadership and the IGF’s emphasis on human rights in global governance are notable examples of positive progress. However, it is crucial to avoid compromising human rights in times of trauma or conflict.
John Mahob
A recent discussion highlighted the detrimental effect of disinformation on democracy in the Philippines, where the current president, Marcos, is the son of the former dictator. One of the key arguments made was that disinformation played a significant role in influencing the outcome of the recent elections.
Disinformation in the political landscape is seen as a serious threat to the country’s democratic processes. It is suggested that the spread of false information and manipulation of facts can lead to citizens making ill-informed decisions, thus undermining the democratic values of transparency and accountability.
John Mahob, a representative of the Foundation for Media Alternatives in the Philippines, voiced this concern over the impact of disinformation on the country's democracy. He stressed the need to address and counter disinformation, as it has the potential to distort public opinion and undermine the credibility of democratic institutions.
The speakers argued that the negative consequences of disinformation are far-reaching. By spreading false narratives and distorting facts, disinformation can erode trust in institutions and create divisions among citizens. It is seen as a tool that can be used by those in power to manipulate public sentiment and secure their own interests.
The evidence presented raises important questions about the state of democracy in the Philippines. The influence of disinformation on the recent elections serves as a warning sign that steps need to be taken to protect the integrity of democratic processes. Efforts to combat disinformation and promote media literacy are crucial in order to safeguard the principles of democracy, uphold freedom of expression, and ensure that citizens are equipped to make informed decisions.
In conclusion, the discussion reveals a shared concern about the negative impact of disinformation on democracy in the Philippines. Speakers such as John Mahob emphasize the urgent need to address this issue and prevent disinformation from undermining democratic values. It is hoped that by raising awareness and taking appropriate measures, the Philippines can work towards creating a more informed and resilient democratic society.
Aaron Maniam
The analysis explores several key aspects of global governance, regulations, misinformation, and digital regulation. One of the main challenges in defining global governance arises from the presence of different models and guidelines, which leads to variations in the level of guidelines and enforcement, resulting in a lack of consensus on the precise meaning of global governance.
Concerning global governance regulations, it is crucial to distinguish between basic standards and additional issues. Examples such as the Digital Services Act (DSA) in the European Union and online safety regulations in Singapore and the UK emphasize the significance of addressing both fundamental standards and more complex issues in regulating global governance. These regulations play a significant role in achieving Goal 16 of the Sustainable Development Goals (SDGs): Peace, Justice, and Strong Institutions.
Governments can have contradictory impacts on global governance efforts. On one hand, they can be a source of misinformation, hindering progress towards effective global governance. However, governments also possess the authority and skills necessary to continuously update legislation to keep pace with rapidly evolving technology. This ability is essential for achieving Goal 16 of the SDGs.
Interoperability, the ability for different systems to communicate seamlessly, is a vital aspect of digital regulation. Aaron Maniam highlights the importance of interoperability among different countries, as it enables coherent communication and collaboration. This is linked to Goal 9 of the SDGs: Industry, Innovation, and Infrastructure.
A polycentric approach is crucial in combating disinformation. Governments should move away from solely having an authority-based role and embrace a convening function. By engaging in consultation and deliberation, governments can prioritize and address issues related to disinformation in a bottom-up fashion. Additionally, community building, space building, and urban planning should be part of the government’s role in fighting disinformation.
Education and literacy play a pivotal role in tackling disinformation. In Singapore, various organizations, including the National Library Board, the Information and Media Development Authority, and the Cybersecurity Agency, collaborate to operationalize strategies. Education that starts at home and in schools and libraries is highlighted as a key factor in enhancing literacy among citizens. This aligns with Goal 4 of the SDGs: Quality Education.
In summary, the analysis underscores the complexities and challenges of global governance and the importance of clear regulations. It also highlights the dual role of governments as potential sources of misinformation and as crucial actors in updating legislation. Interoperability is crucial for effective digital regulation, and a polycentric approach is essential in combating disinformation. Lastly, education and literacy are vital components in mitigating the impact of disinformation.
Greta
Greta strongly believes that disinformation is significantly weakening democratic systems, a position associated with a negative sentiment. However, no specific supporting facts or arguments were provided for the claim that disinformation undermines democracy.
Disinformation, the deliberate spread of false or misleading information, poses a serious threat to the democratic process. It can manipulate public opinion, deceive voters, and erode trust in democratic institutions. Greta’s agreement with this viewpoint suggests that she recognizes the detrimental effects that disinformation can have on the functioning of democracies.
Although no specific supporting facts or arguments were presented, it is worth considering the widespread impact of disinformation in recent years. The rise of social media platforms has enabled the rapid spread of false information, often disguised as legitimate news or opinions. This has the potential to sway public opinion and distort democratic discourse.
Furthermore, disinformation campaigns have been known to target elections by spreading false information about candidates or manipulating public sentiment. Such tactics can undermine the integrity of electoral processes and compromise the ability of citizens to make informed choices.
The conclusion drawn from Greta’s strong agreement is that urgent actions are needed to address the problem of disinformation. Safeguarding the democratic process involves countering disinformation through fact-checking, promoting media literacy, and strengthening regulations on social media platforms. It is essential to restore trust and ensure that accurate and reliable information prevails in democratic societies.
In summary, Greta strongly believes that disinformation is undermining democracy. While specific arguments and supporting facts were not provided, the existence of disinformation poses a clear threat to democratic systems. Addressing this issue requires collective efforts to counter disinformation, promote media literacy, and protect the integrity of democratic processes.
Session transcript
Nighat Dad:
missed your question.
Anriette Esterhuysen:
I just wanted to do an audio check to see if we can hear you clearly. Now you can hear me. And we can. Thanks a lot. OK, we'll start in a few minutes. We'll start at 1 o'clock on the dot, Japan time. Thank you. Good afternoon, everyone. Welcome to the Disinformation from an Internet Governance Perspective workshop. My name is Anriette Esterhuysen, and I'll be moderating. I'm a senior advisor on internet governance at the Association for Progressive Communications. I'll introduce you to our very diverse and very interesting panel as we do the workshop. But at this point, because we're a small group of people, I want to ask you all to stand up. Because we're going to start with an interactive exercise, professor. That includes you. So just stand up and line up. I'm going to make a statement, and then I'm going to ask you to position yourself along this corridor and that corridor, more or less. On that side of the room if you strongly agree with the statement, towards this side if you strongly disagree, and then somewhere in the middle, according to how strongly you agree or disagree. So the statement is, and absolutely you, otherwise you're not going to get a chance to speak: disinformation is undermining democracy. If you agree, go towards that end. Disagree, somewhere around here. And somewhere in the middle. And the idea is, as other people speak, I'm going to ask people why they stand where they stand. Then think about it, reflect, and decide whether you want to move along this imaginary spectrum. Bill Drake, you have to. Then you have to stand in the middle. Disinformation undermines democracy. So I'm going to start over there with somebody who's right on the other side. Anybody willing to, why? Why do you agree so strongly with this? And just introduce yourself and then say why.
Greta:
Hi, I’m Greta. I think it’s hard to explain, but the feeling that there is information outside that can hurt how our, yeah, how democratic institutions work and function, or, yeah, really.
Anriette Esterhuysen:
So you feel that misinformation can actually undermine the institutions of democracy. Anyone else here who feels strongly?
Remy Milan:
Hello, my name is Remy Milan. The reason I would say undermining is that mis- or disinformation undermines citizens’ confidence in the institutions of the state. And that’s probably what I view as the sort of highest level danger.
Anriette Esterhuysen:
Similar, so remember, you’re supposed to move if he’s moved you. So let me move to that side of the room because I haven’t seen anyone move yet. Jeanette, why don’t you tell us, you’re one of our speakers, so introduce yourself. Why are you standing at this side? Why do you disagree?
Jeanette Hofmann:
But then I sort of, I anticipate what I'm going to say. All I wanted to say, one of the main things I want to say, is that while we have a lot of research on the generation and circulation of disinformation, we know little, if not nothing, about how it actually affects people's minds and people's voting behavior. A lot of what we see discussed here is sort of based on assumptions, but not on empirical evidence. Just introduce yourself. Oh, I'm one of the speakers, Jeanette Hofmann. I'm a political scientist from Germany.
Anriette Esterhuysen:
Anyone, you want to react to that? Good, you’re allowed to. You disagree with Jeanette.
Audience:
I would say, since I live in the United States, I would say January 6th was actually a good example of events inspired by mis- and disinformation designed specifically to undermine the democratic transfer of power between two governments. That, I think, I watched on TV. I actually believe that I saw something. So I think there’s empirical evidence that people can be driven to act contrary to democratic institutions by mis- and disinformation.
Anriette Esterhuysen:
Any of our online participants, including the speakers, who strongly agrees that disinformation undermines democracy? Nighat, Clara, David, Aaron? Nighat, go ahead and just introduce yourself, and then tell me whether you agree or disagree.
Nighat Dad:
Yeah, no, thank you so much. My name is Nighat Dad, and I run the Digital Rights Foundation. I'm speaking from Lahore, Pakistan right now. And I'm like in between agree and in the center. So I was very much agreeing, and when Jeanette spoke, I'm like, okay, I want to move a little bit to Jeanette's side, because we have been saying this over and over again that yes, disinformation, misinformation impacts our democratic processes, but what kind of evidence do we have to actually support that, and very solid evidence in terms of how it basically changes the behavior of people during electoral processes? And however, we see a lot of impact and effect of disinformation on several institutions and also on voters. But I think we still need to have solid evidence supporting that yes, it is actually deteriorating democracy in our countries. I feel like there are several other aspects that sort of contribute to this destruction of our democratic processes.
Anriette Esterhuysen:
Thanks, Nighat. Before Professor Corway, before you, you strongly disagree as well. Are you willing to share why?
Audience:
I wouldn’t say I strongly disagree. Like I’m not over there. That’s true. I’m sort of slightly to the left of the middle, or right of the middle, I guess. I think it’s, I’m from a think tank, we do some work in this area. I think my background is also legal and I frequently find it quite difficult to settle on a definition of disinformation. And then I think there’s a whole separate question of how we actually apply that definition. And I think a lot of people are thinking of different things when they say they’re thinking about disinformation. I also think the sort of longitudinal impact is something that’s very hard to assess. And I think when we think about the causative impact of disinformation, it’s very hard to extract substantive grievance from it. So is it a manifestation of people’s dissatisfaction with the way society is that we can now observe and measure in ways that we couldn’t in the past? Or is it the disinformation that is sort of causing that dissatisfaction? I think that’s a hard thing to unpick.
Anriette Esterhuysen:
Okay, so the unknown. So those of you that came late, I made a statement. I said disinformation is undermining democracy. And the idea is that people who disagree are kind of over here and people who agree are over there. Bill Drake, you’re one of our speakers. Please introduce yourself. And what is your position? Why are you in the middle? That’s not actually very common for you. I’ve known you for many years and you very, very rarely sit on any fence.
Bill Drake:
Bill Drake, Columbia University. Hi, because as I’ve said before, when we had a session about this at the Taiwan IGF last week, beware of false binaries. I think when you put these things into simple either or choices, it becomes almost meaningless. The extent to which it impacts democracy and the way it might impact is highly dependent on context, et cetera, et cetera. So, and I would add to Jeanette that social science ability to do effective polling in here may not be the only possible measure. We have tens of millions of Americans who have demonstrated over and over that they believe in disinformation, so.
Anriette Esterhuysen:
You see, Bill has made me move a little bit more to the center. Professor Corway, you wanted to say something. Introduce yourself.
Corway Wu:
Oh, this is Corway Wu. I'm the chair of the TWI from Taiwan. And just like Bill said, I have three points, very simple. First of all, disinformation. Politicians also create and spread disinformation, not only the news media or the social media, so don't forget that. Politicians do it, too. Second of all, I don't fully agree with what Jeanette is saying. The reason is because she didn't put time in as a coordinate. Because when the disinformation spreads around, don't forget, the voting is just a second. So in that second, you might get moved by the disinformation and change your voting behavior. It's like Brexit. And then maybe one minute after, you regret it, but too late, it's done. So that's the reason I'm standing in the middle.
Anriette Esterhuysen:
Like the Brexit referendum, so many people regret it. Okay, before we sit down like normal IGF participants with an interesting, we hope, panel on that side and the rest of us on this side, so before we assume the normal divisions of power, anyone else in this side of the room who agrees strongly that disinformation is undermining democracy?
John Mahob:
My name is John Mahob from the Foundation for Media Alternatives from the Philippines. And if you are aware who was our previous president, Duterte, and who is our current president, Marcos, the son of the former dictator, I think it would be easier to appreciate why civil society, at least, feels that disinformation plays a huge part as to why our democratic institutions right now, and democracy in general, is very much in peril, if not already gone, so to speak. So I can understand, though, the search for empirical data. I’m also a sociologist and a lawyer, but we live behind the studies. We live through the realities every day, and we’ve gone through our elections last year, and we saw how much that disinformation actually impacted people across, not just those supposedly in the marginalized sectors, even those who are actually, you would expect to, are learned individuals. They, too, were very much caught into that web, yeah.
Anriette Esterhuysen:
It’s true, but you also did have decades of Marcos dictatorship prior to online disinformation. So, okay, the last word will be from you.
Audience:
Okay, thank you for giving me some floor. Tim, I'm from Russia, and I work for a think tank which is responsible for fighting and countering disinformation and fakes in Russia, so I have some practical experience with that. So, like, some bad news. Disinformation is, like, inevitable, because the ability to make a mistake is a natural part of humankind and the human brain, actually. As far as it goes, disinformation and fake news are an extremely effective weapon, and nowadays it's widely used as a weapon, and you can see that, especially in the context of the Russian-Ukrainian war. We have an informational war going along, and we have lots of usage of this weapon, like, against us. And last but not least, actually, when you fight disinformation, it's never possible to, like, ground-zero all the disinformation and fakes and myths and anything else like this. But what you actually do is fight the consequences and damage of disinformation and the spread of disinformation; you cannot fight the disinformation itself. Thank you.
Anriette Esterhuysen:
Thanks very much. Thanks very much, everyone. Let’s take a seat. Come and sit in the front.
David Kaye:
Anriette, while people are taking a seat, can I answer that?
Anriette Esterhuysen:
Yes, David, please go ahead.
David Kaye:
I wish I was there. It sounds fun, and it’s six in the morning where I am, so I could use the stretch. So, I’m gonna be a little bit of a diplomat of my-
Anriette Esterhuysen:
David, just introduce yourself to the room.
David Kaye:
So, I’m David Kaye. I teach law at the University of California at Irvine. I was the UN Special Rapporteur on Freedom of Expression from 2014 to 2020, and I’m the independent chair of the board of the Global Network Initiative. And the diplomatic thing that I was gonna say is that I agree with what everybody said, but the thing, I think there are at least three different issues here that, from an IGF global governance perspective, make it very difficult to talk about disinformation. One is, it really matters, and this came through in the comments already, I think it really matters who’s doing the disinformation. Is it the state? Is it the president? Is it, like, who’s doing it? And also, where is it happening? So, as far as our study and understanding of the impact, I think those things matter. A second thing is, what impact do we actually care about? I mean, do we care about whether disinformation has an impact, as Jeanette was suggesting, on voting behavior? Do we care about the impact that disinformation has on people’s action in the street, as in January 6th? You know, what is the thing that we’re actually studying? And then third, based on all of those different contextual features, I think it’s very, very difficult to say that there’s one particular solution that works at a global governance level that would address disinformation in every instance or every context, whether it’s in legacy, broadcast, media, print, or on social, or search. There’s just such a variety. So I think that it’s just a vast topic that requires nuance and may not be really amenable to a kind of one-size-fits-all response.
Anriette Esterhuysen:
Thanks very much, David. Clara, do you want to reflect on this opening statement, and then also please introduce yourself? Tell us where you are.
Clara Iglesias Keller:
Thank you, Anriette. I'm Clara. I'm a legal scholar from Brazil, but I am right now in Berlin, also 6 a.m. here, so keeping strong. Thank you so much for having me. So I'm going to jump right in and say that I think I would be, alongside David, kind of in the middle, a little bit more on the diplomatic side, and I'll tell you why. I think disinformation can undermine democracy, depending on context. I prepared myself to talk a little bit about that, right? So depending on sources, depending on political, social context, what disinformation strategies are being pursued, I think they can undermine democracy, but not in the ways that we would instinctively think of. So I agree with Jeanette that we don't have solid empirical evidence that disinformation changed voters' behaviors, for instance. We don't have solid empirical evidence that it changes the results of elections, but I think we do have enough reason to worry about how exactly it weighs in recent political transformations that we have seen in different contexts. And I think this shows that we need more research on these blind spots that are not exactly unfolded yet. But I will say one thing about the empirical evidence, just to close my first statement, which is that the empirical evidence we have nowadays is very much focused on the U.S. and Europe, so I would be very happy to see more empirical evidence of how disinformation unfolds in other political contexts, what we call the global South, but I'm talking Latin America and Africa and Asia. So that would be it for my first input.
Anriette Esterhuysen:
Thank you very much, Clara. We do not yet have our final speaker, Aaron, who was with the Singaporean government recently. He’s now moved to academia. I don’t think he’s with us. I do want to introduce our remote moderator. She’s in Nairobi. Risper, can you just reveal yourself on screen? Put your camera on and say something. I’ll see if I can unmute you. Can you unmute Risper Arose, please? She was there, and she might have just dropped off, but she’s trying to get hold of Aaron. But let me turn to Jeanette and Bill here, and David, you can add on this as well, and Nigat. Our opening question to you was not that dissimilar from the question that I asked the room, and that is, do you think you can define it? What is disinformation? Is it serious? And if so, why?
Jeanette Hofmann:
I think there is less argument about what disinformation actually is. Usually we say it's a distinction we make between misinformation and disinformation, and disinformation focuses on strategic intent. And that usually is to manipulate people in their behavior and in their worldviews. So that is what disinformation is. But now I'd like to come back to what I said earlier, that we really lack empirical evidence, and I'd like to elaborate a bit on that. We have right now a strong focus in academia on platforms. And while that makes sense, because we get at least some data about the circulation of disinformation, there is also a little problem with that, because it cuts off long-standing research traditions that already focused on the question of propaganda, of manipulation and its effects on people's minds. At that time it was less about platforms and it was more about legacy mass media, but also political propaganda. And I have to say that there was never agreement on the question of whether it had a long-term effect on people. There is a strong opinion in academia saying the effect is amplifying what people already believe. If you have a tendency to believe in conspiracy theories, you might be open, susceptible to disinformation. But if you are immune to that, it might not have much of an effect on you. So there is the hypothesis called preaching to the choir: disinformation reaches people who are vulnerable. That's point one. And the second one I would like to make is that a lot of research focuses on the individual. But what really matters when it comes to disinformation is the media environment. Countries with a strong tradition of public media, of high-quality media, are able to counter disinformation much better than countries where the media landscape is a mess. So if we want to learn more about whether or not disinformation works, we need to look beyond platforms and take into account the wider media landscapes. Context matters, and there is no point in only looking at platforms and their algorithms.
Anriette Esterhuysen:
It's also interesting, I mean, it just kind of struck me, the remark about the U.S. and then the comment from the Philippines. And maybe there are cultures, media cultures, and populist, you know, political moments which might also be similar. So there could also be other contextual issues. And Bill, do you want to elaborate on your earlier remarks?
Bill Drake:
Okay, well, first of all, on the U.S. case, I think that there have been a lot of studies that have indicated that actually, further to Jeanette's point, a lot of the disinformation is not originally coming from social media. It's coming from broadcast media and then gets picked up and amplified by social media. So when we talk about the strength of the traditional media environment being a buttress against the spread of disinformation, you have to recognize in a case like the United States, which has a fairly strong and vibrant media system and has for a long time had multiple voices, well-funded, et cetera, et cetera, that we have three networks that get tens of millions of viewers that are completely all in on disinformation. I mean, who are major proponents of disinformation and who attack anybody who questions disinformation as being somehow trying to suppress freedom of speech and so on. So we have a certain problem in the U.S., but I mean, I guess the point I would make is, you know, between traditional media as the savior and platforms as the source of all problems, as, again, I said at the outset, it all depends on the context of what we're talking about. On the definitional issue, you know, Jeanette said that we all kind of agree on this. It's actually interesting. I was looking at some of the different definitions that have been put forward by different leading organizations, and it's amazing how variable they are in their details. Maybe this is because I spent a lot of time in the Internet governance space working on the definition of Internet governance 20 years ago in the UN Working Group and so on, but I mean, I tend to look at these things and think, why are they doing it this way? For example, the European Union describes disinformation as the creation, presentation, and dissemination of verifiably false or misleading information for the purpose of economic gain or intentionally deceiving the public. Why economic gain or intentionally deceiving the public? There's different ways of formulating the strategic objective there, right? The UN Special Rapporteur in 2022 says false information that is disseminated intentionally to cause serious social harm. So are you saying that it's only disinformation if it is intended to cause serious social harm, and what constitutes serious social harm? So, you know, you can play with the definitions. In fact, they could be quite messy. Obviously, we want a definition and an understanding that captures the notion of intentionality and of verifiable falsity, and we want to capture that there's a strategic dimension in the production and the original creation and dissemination. But then again, you have people who recirculate disinformation all the time not knowing it's disinformation, so then when they do it, it's misinformation, I suppose you would say, right? Because they're not seeking necessarily to cause serious social harm. Their weird uncle sent them this picture saying that, you know, Hillary Clinton is the devil and eats babies, and they think maybe that's true, and they send it to their friend. So, I mean, you know, this is the kind of crazy stuff we have. But I will, back to Jeanette's point, and I'll stop on the U.S. thing, though. It is true that social scientists, just to amplify what I said before, social scientists always have trouble demonstrating impact. We have 60-something years of media studies before the internet where people tried to look at the impacts of media, media effects, and how strong or weak they were.
Did they cause violence, sexual predation, whatever, etc. It’s hard to get at that through polling data and so on, but when tens of millions of people vote saying that they do so informed by strong disinformation, this would seem to be relevant information to me.
Anriette Esterhuysen:
Thanks, Bill. Nighat, do you want to add anything to your earlier opening remark on the concept and the issue?
Nighat Dad:
Yeah. So, based on my work wearing different hats, you know, someone working in Pakistan, looking at the context, but then someone sitting on Meta's Oversight Board, I feel that content that is related to mis- and disinformation is very complex to define as mis- or disinfo, and even more complicated to identify. And I think definitions are very, very contextual. Some of the definitions Bill mentioned here, and I'm like, okay, but some, you know, like actors, civil society actors, states, companies, sort of define all these things keeping their interest in mind as well, or the work that they are doing, or the context that they belong to. But for instance, disinformation for us, we kind of see it as, you know, false content which is being shared with the intent of causing harm. But then we cannot assume all untrue information is harmful, and we should be very cautious of, you know, defining it in a way where, especially not the civil society, but the powerful actors, when they define it, that means that they are going somewhere to regulate it, right? And that's where I see the problem. So, I don't know what else to add here, but I feel that it's very contextual. I don't know how many of you have seen the UNSR's report on gender disinformation, which was just released a couple of days ago, and it has context from all the regions. And it actually, it was mind-boggling to see how disinformation causes harm to women and marginalized groups in South Asia, and how it does it in Latin America. So, I feel like it's such a good document that people should read. It's very recent, but at the same time, I feel it's contextual. But we should be very, very cautious when we are defining it, and not leave a space where, you know, like, we should basically give space to people to interpret it the way they want to, according to their context and how they see their political situation.
Anriette Esterhuysen:
Thanks, Nighat. Just a little plug for my organization: we used the IGF, and this shows how the IGF can be really valuable, to hold a consultation last year in Addis Ababa with the Special Rapporteur on gender disinformation, and she used the input from that consultation in her report. David, any additional points from you?
David Kaye:
I think the only thing I’d add is to put a sort of legal gloss on all this, which is just to emphasize why definition is so important. If we’re talking about, ultimately, legal regulation, which I think is what we’re really talking about when we’re talking about governance, then we have to be clear about what the issue is. And that’s not just some abstract issue, it’s also a fundamental component of legality. We need precise definition, and then precision in what it is that we’re actually restricting. And one of my big fears is that even though I share the view that there is a problem called disinformation, and there is a wide variety of impacts of disinformation, we don’t really have a clear definition. And we see that, I think, even in the emerging regulations from places in Europe and the UK and elsewhere, because what we see there is this move for transparency and risk assessment that assumes that, in those cases, platforms will define the issue as they’re doing that work. Maybe that’s okay, and maybe that will be great evidence for social scientists and for legal scholars, but I’m afraid we’re not at a point where we have shared definitions that are clear enough for legal regulation.
Anriette Esterhuysen:
Thanks, David. Jeanette, do you want to react?
Jeanette Hofmann:
Yeah, I wanted to address the question of motives, because many people, of course, for good reasons, refer to distinct episodes like the attack on the Capitol after the last election in the US. Empirical research shows that many people who act on disinformation and share disinformation do not necessarily believe that it is the truth. One reason why people share disinformation is to signal belonging, to express their loyalty to a certain way of thinking and acting. So many people, say Republicans who now share information about the last election being stolen, might not necessarily believe it. What they are expressing is their loyalty, their belief in Trump and this crowd of Republicans. And we even have evidence that when people are asked, do you believe what you are sharing, they might not tell you the truth. And this brings me to an aspect that I find actually alarming. It is less the amount of disinformation than the growing number of people in various countries who no longer care about the distinction between truth and falsity. For them, political belonging, let’s call it tribal attitudes, has become so strong that it is more important than whether or not there is a truth to strive for. And that is what I think undermines public discourse and therefore democracy.
Anriette Esterhuysen:
Thanks, Jeanette. Clara, I can see you also want to react. Go ahead.
Clara Iglesias Keller:
Yes, thank you so much, Anriette. Just a sec, my kid’s waking up.
Anriette Esterhuysen:
Tell them to make you coffee.
Clara Iglesias Keller:
Yeah, unfortunately not possible yet. But yes, I agree there is this conceptual inconsistency. We have a lot of definitions, more in the media and communications disciplines, of disinformation as one of the many communicational phenomena that come alongside misinformation, propaganda, fake news. I actually like distinguishing things by intention; it makes all the difference for the law. We often run into social science or communication scholars who say, oh, but you can’t tell much about intent, when in fact intent carries a big chunk of legal relevance, right? But what I do want to say is that beyond being a communication practice, or being a sort of online harm, which is another way in which we often refer to disinformation, I think we need more definitions, or more efforts to conceptualize disinformation within political theory, within political practice. It functions a lot as a form of political intervention that takes shape in different contexts, as Nighat was saying as well. I come from Brazil, for instance, and this type of intervention clearly serves there as a means to show dissatisfaction, as somebody pointed out in the beginning of our panel, or to directly attack democratic institutions, in particular the electoral system and high courts. So I think we do have this varied, but not complete enough, conceptual framework. And I’m just going to say one very quick last thing: I don’t think there’s space in statutory regulation for a concept of disinformation.
Anriette Esterhuysen:
I’m about to move on; I’m going to ask you about regulation next. So, Bill, you want to react? Is it about the opening segment, or do you want to actually talk about regulation?
Bill Drake:
I just want to surprise Jeanette by agreeing with her strongly, because we’ve been arguing about this for a while. No, I just want to emphasize again, and this is why I said at the outset that I was kind of in the middle, that it depends on context and so on. There are indeed lots of people who will say they believe in disinformation because of tribal loyalty. There’s no question about that in the American case. And indeed, one of the things that’s really happened, and this goes also to your point about people not believing there’s a boundary between truth and fiction, is that a lot of people have just built their identity around giving the finger to the other side. When they interview people at Trump rallies, and they say they believe this stuff, and somebody asks them, but you saw this, they go, yeah, well, whatever, screw that. The libs don’t want us to think this. So it’s all about owning the libs, owning the other side, giving the finger to the people you don’t identify with, and so there’s a pretense in a way. It does mean that it’s harder to disentangle. It doesn’t mean that it’s not a vibrant, important part of the mix. It just means that whether it’s directly causal is a little bit more complex.
Anriette Esterhuysen:
Thanks. I want to ask everyone in the room: is there anyone here who lives in a context where disinformation is not particularly prominent or influential? Anyone? Just come to the mic and tell us where you’re from and why you think that’s the case.
Audience:
Well. Hello, everyone. I’m from Switzerland. I wouldn’t say that it’s not a problem whatsoever, but I also wouldn’t say that it has the ability to sway entire elections. We have a multi-party system, for example, and people have tended to vote for the same parties for the last 30 years. So not much has happened in that respect. Of course, we had the same challenges with COVID as everyone else. But I think it would be exaggerating massively to say that this is the topic for us to focus on.
Anriette Esterhuysen:
Well, it’s really good to get a global north perspective. I’m from South Africa, and I can say that disinformation is not a major problem in South Africa. Believing the government is a major problem. There is often engagement in the media about whether information is false or accurate, but we’ve got very well self-regulated media that deals with disinformation, and the public tends to participate in that. So fact-checking is something that happens on a daily basis, very quickly. A politician will make a statement on television one night; the next day, the media will fact-check it. And the public does tend to believe the media. What we find quite unique, or interesting, is that whether it’s right-wing, center, or left-wing media, there is a common commitment across that spectrum to accuracy, which we think is because our media’s self-regulation actually works. So for us, it’s also not a major concern. I want to move on to the next question, which is in a way at the heart of why we convened this workshop. By the way, the three of us argued a lot in the course of this workshop’s planning and design, so for us, too, it is not a clear issue. But really the heart of this is that it’s at the IGF, and it’s about governance. Do you think we can regulate disinformation effectively internationally? We know there are a lot of national initiatives being put in place, and they are quite controversial. Do we have a strong and clear baseline that existing international instruments can provide for the governance of disinformation? And if we move in this direction of international governance of disinformation, what are the implications for access to information and freedom of expression? David, can I ask you to start us on that? I know this is something you have applied your thought, your expertise, and your experience to.
David Kaye:
Yeah, thanks, Anriette. So my view is that global regulation of disinformation, if we think of that as a concrete set of rules that will guide decision makers in every context and will have some kind of global oversight, is a chimera. We’re not going to achieve that, and it’s not worth, in my view, even trying to achieve it. What I do think is that governments and platforms and media outlets need a common set of principles to guide how they think about this and how they react to it. And to my mind, and this will be no surprise to people who know my work, those principles are rooted in human rights law. There’s very, very good reason for that, because we’re talking about information, the sharing of information, the possibility of making it harder for individuals to find accurate information. We need standards that are based in Article 19, on principles of legality, necessity, proportionality, and legitimacy of the objective. And just to end here, I think there is a way in which resourcing human rights mechanisms in particular, and by that I don’t mean the Human Rights Council, but rather the Human Rights Committee, regional human rights courts, and others, ensuring that they have the tools to answer questions when individuals feel that their government is interfering with their right of access to information, by either disinforming them or permitting disinformation in one way or another, can be a mechanism for global governance, though perhaps not in the IGF sense of what internet governance looks like.
Anriette Esterhuysen:
Thanks, David. Clara.
Clara Iglesias Keller:
Yeah, so I really enjoyed thinking about this question; it was really provoking to me. I am not sure about the extent to which global governance solutions can help us, and I’ll try to summarize why very briefly in two points. First, because when I look at disinformation’s role in political disputes, it becomes clear to me that countering or mitigating it is not only about a communication practice in itself. If it is being used by certain political and economic interests that have been shaping our societies for so long, then mitigating disinformation is also about confronting ourselves with these broader issues. To stick with the Brazilian case, I think about the ultimate convertibility of economic power into political power, and that includes a traditionally concentrated and unregulated media landscape, especially when compared with other Western democracies. So I think that certainly needs to be part of the conversation. But even in the interest of getting more granular, it’s fair to say we need some action that targets disinformation specifically as well. And here, I’m afraid I’m also skeptical, because global solutions mostly presume consensus-based governance structures, or, as David put very well, global governance in the IGF sense of things, which does not include authority and enforcement, right? And even though it offers us very interesting guidelines, human rights standards, I’m afraid that confronting current digital business models and data usages, to mention a few things that should be on the regulatory agenda, will need more than that.
Anriette Esterhuysen:
Thanks. And Nighat, do you feel we’ve got the instruments needed? Or how do you feel about the idea of global governance of disinformation?
Nighat Dad:
Yeah, I mean, I agree with everything that has been said by David and Clara. I think we already have global governance instruments with us, in terms of several principles, conversations that have taken place, resolutions, and all of that. But we also need to see what actors have learned from those instruments and tools that have been developed globally. I don’t think powerful actors have learned much. If you look at national policies, regulations, and laws that are being drafted and introduced, not only in the global majority but also in the global north, those policies have this appetite for suppressing freedom of expression, right? I don’t think they are able to identify how certain content can actually cause real-world harm. And as David basically said, state actors, not only companies, have an obligation under human rights standards to provide accurate information and prevent misinformation. We have the Rabat Plan of Action; we have so many instruments out there. I think we really need to see how the different components that are already out there can complement each other and not work in silos. You have already mentioned certain governance mechanisms, like the Oversight Board and other things that are already out there, and how those are performing. Are we really looking into their performance, into what they are adding to our existing ecosystem? So I think we already have so much. We just need to know how to use it.
Anriette Esterhuysen:
Thanks. Thanks for that, Nighat. Jeanette.
Jeanette Hofmann:
Yeah, thank you, Anriette. The whole question reminds me of the early days of internet governance, when it was always clear that we had to agree upon protocol standards for the infrastructure in order to have a global network, but that the upper layers, particularly content, should not be governed at a global level. Having said that, there is one aspect I’d like to mention that I find quite interesting in this context, and this is the Digital Services Act that the European Commission just agreed upon, and that will take effect early next year. The DSA has one aspect that, at least from a German perspective, is really interesting. It concerns the scope of human rights. At least in Germany, traditionally, human rights regulate the relationship between citizens, or say people, and the government. But the DSA mentions at several points that human rights should also guide the behavior of platforms, meaning the scope of human rights begins to integrate private sector action as well, because it affects our conditions and possibilities to exercise human rights to such an extent. So this is an interesting development, and we can think about extending it to other jurisdictions or world regions. And actually, I would like to know what our other panelists think about that.
Anriette Esterhuysen:
Thanks. Well, Bill, do you want to react to that before we go on? Aaron has now joined us, so we’ll hear from him next.
Bill Drake:
No, I don’t want to react to that. I want to say something else. So, the way the question is posed: can it be effective? Probably not. I think we have to engage with it over the long term anyway. We have to try to build up the operational and normative infrastructure to continually challenge disinformation. But obviously, to believe that it’s going to be effectively regulated at the global level is a little bit far-fetched. Still, we have to try. That said, I think it’s worth highlighting, because this is the Internet Governance Forum and we’re trying to talk about disinformation from an internet governance standpoint, some of the things that are happening internationally. I just want to flag a couple of quick things. One is the UN discussions around cybercrime and cybersecurity. In those contexts, you’ve seen a lot of proposals, a lot of language that pertains to disinformation, and it demonstrates the difficulty of trying to do this at a multilateral level in a geopolitically divided environment. For example, in the cybercrime treaty negotiations going on now, China proposed language saying that all governments should adopt laws making the spread of disinformation a criminal offense, and described disinformation as anything that makes available false information that could result in serious social disorder. Well, again, what could result in serious social disorder is obviously in the eye of the beholder. Secondly, we have the UNESCO guidelines for the regulation of digital platforms, which UNESCO hopes to finalize this year. That has some language in it that is germane to disinformation as well. And with the model of adopting or suggesting guidelines to countries, there’s the possibility that some countries will implement those guidelines in ways that are restrictive of freedom of expression and will claim international legitimacy in doing so. So the question of guidelines versus treaties is an issue. The last thing I would mention: the UN Secretary General is now proposing a code of conduct for information integrity on digital platforms. This is part of the global digital compact discussions, and he wants to have it discussed at the Summit of the Future. If you’ve seen the document, he proposes a global set of guidelines, drawing on the UNESCO experience, based on nine principles, many of which are focused on how platforms and stakeholders behave. It’s easy, I think, to say that platforms have to adopt rules about transparency, disgorging information, allowing scholars to access their data, and so on. It’s a lot harder to say that states should not disseminate this kind of information in the first place, or that all stakeholders must abide by good taste and common sense. Those things are a little bit hard to achieve, particularly through guidelines, but that’s what the Secretary General is doing. And he’s actually calling also for the establishment of a, quote, dedicated capacity in the UN Secretariat to monitor and advance all of this, which is an interesting thing. So there’s been a lot of discussion about whether new organizational structures will be built through the global digital compact, particularly in New York.
And this is one where I think they might get some political traction in saying there ought to be a centralized mechanism for at least tracking progress in addressing these issues, in adopting complementary kinds of guidelines, and so on. So we’ll see. There’s a lot going on at the international level, and I think it’s worth talking about.
Anriette Esterhuysen:
Thanks, thanks. Aaron, Aaron Maniam, you are with us now. Please introduce yourself and tell us what you think about this global governance response to disinformation.
Aaron Maniam:
Thanks so much, Anriette. Apologies for joining you a little bit late. I had some technical challenges, which I suppose is not without irony, given that this is a panel at the IGF. And I’m calling in from Oxford, you know, that capital of the world in internet connectivity. So apologies to everyone; I’m glad I got here now. I love this question on global governance, because I think it gets to the nub of many of the core issues.
Anriette Esterhuysen:
Aaron, just tell us a little bit more. I know you’re an academic now, but we were particularly interested in your views from your perspective within government.
Aaron Maniam:
Of course, sure. Until recently, I was a policymaker in Singapore, working on digital issues, covering both how governments can enable digital in the economy and in society, and also what sorts of international partnerships governments need to embark on and the kinds of regulation we need, both in the economic sphere and on content. So this is a real commingling of many of the challenges that any government faces. On the global front, I wanted to make four points, which I think are all germane to this discussion. The first is, we have to figure out what we even mean by global governance, right? There are so many models. There are de minimis models, where a very basic level of guidelines gets set out and very little else in terms of enforcement or monitoring capacity. But then we have maximal models as well, like what ICAO has managed to achieve, because we know that there are clear risks and safety issues involved. And at the moment, we see emerging examples like those Bill mentioned; the open-ended working group on cyber at the UN is trying to achieve a greater set of consensus on some of these issues. It’s really not clear where we’re going to land in terms of de minimis or more maximal models of global governance. As a result, and this is my second point, I think it’s really important to differentiate between the basic standards and the additional sets of issues that we want to cover in a set of global governance regulations. Jeanette referred to this, and I think examples like the DSA are really important. In Singapore, we have a set of online safety regulations; the UK, we know, has put that out recently as well. And I think it’s useful to ask, if we’re not yet able to reach consensus beyond the basics, whether the guidelines that Bill mentioned, or any further regulatory principles that are put in place, can at least be interoperable amongst different countries. They don’t need to be identical, but interoperability allows different systems to at least talk to each other in a much more coherent way. Two last points that I think are also important and that we haven’t quite named in this discussion. The first is that one major challenge in this work is going to be the fact that, in some cases, governments are the source of mis- or disinformation, rather than the entity that is going to regulate it. That makes it much more difficult for those systems to work with others where governments act in more rational, misinformation-minimising sorts of ways. We need to be able to differentiate the two and not let that first group of governments end up holding us back on the kinds of coordination we need to achieve. This will be hard, of course, because the tech itself is dynamic. In a sense, we’re trying to play whack-a-mole here; I don’t know if they call the game the same thing in every country, but a new problem comes up every few weeks, and you have to find a new way to keep hitting it down. And the regulation has to keep evolving. But that also means we need the skills within governments to keep that adaptation going, and we need the ability to continually update our legislation if we want to do this work well. That’s not impossible. It can be done.
But it means that parliamentary capacity is going to be stretched, not just executive capacity, because we’re going to constantly be going back to update and refine, making our laws more agile and adaptive. Not easy, but I think those are the kinds of challenges we’ll have to deal with to realise the sort of global governance that we would want.
Anriette Esterhuysen:
Thanks, Aaron. And maybe that interoperability takes us back to David, to whether human rights frameworks can provide it. Before I open to the audience, do any of the speakers want to react or comment on one another’s inputs? Sorry, who was that? Nighat, go for it.
Nighat Dad:
Anriette, before we move ahead, I think one part of this conversation is missing. We are talking about global governance, but for some reason, whenever I raise this in panels, it gets lost in the conversation. And that is: how do we hold accountable those governments who actually use these guidelines for their own interest and benefit, and make laws and regulations through which they can control content in their own sovereign jurisdictions, but are then not held accountable for their own wrongdoing? Regulatory mechanisms are good, but how do we hold those state actors accountable? I have had these conversations where many global north actors say, but those governments will do this anyway. Then where should we all go? Should we leave them behind? Should we leave those users behind? How will we take them along with us? This is the question that I always raise, and it frustrates me a lot.
Anriette Esterhuysen:
Thanks, Nighat. I share that. Bill, do you want to react quickly to this?
Bill Drake:
No, actually I want to ask the other panelists for their perspective. I’m looking at some of the people we have online; does anybody have an opinion on the Secretary General’s proposals? This is an important thing: the Secretary General of the United Nations is proposing a code of conduct on information integrity for digital platforms. That would seem to be an instance of an attempt at global internet governance; it’s governing information that goes over the network, which fits within the definition of internet governance. So I’d like to know how people view this initiative, what they think of its potential impact, how well it’s crafted, et cetera.
Anriette Esterhuysen:
Do you want to comment on that, any one of you? Just jump in if you do.
David Kaye:
I’ll say something just very briefly about it, because I do think it’s a very important topic. I’ll say two things. One, I think the process has been interesting, but I am not sure that civil society has played as active a role in helping shape the document as should be the case. So there’s a process issue at the outset. As we look forward to the actual negotiation of a text to be adopted, if civil society is not in the room, in the places where the actual drafting and adoption are taking place, I think the legitimacy of the outcome will be suspect. So that’s a process response. A substantive response is that I’m concerned that it focuses, not exclusively, but perhaps in an over-reliant way, on platforms. Platforms absolutely play a major role in the problem of disinformation, but any process that excludes a call for governments themselves to be better behaved, to be supporting and resourcing public service media and public broadcasting, as we heard from the situations in the Philippines or the United States, will not make the process useless at its core, but will really detract from its value.
Anriette Esterhuysen:
Thanks very much, David. I’m actually going to move to the audience now, and I’m going to open with a question that I’m also going to ask the panel to respond to, which is: if we are going to develop governance initiatives to respond to disinformation, how do we do it? David just pointed out the risks of a fairly top-down process, such as the one coming from the Secretary General’s office. So, how do you consult? How do you make decisions about governance responses to disinformation in a way that will be effective? I’m opening this question to people in the room, and also, if you have any questions for the panelists or contributions, please stand up, take the mic, and introduce yourself first. Farzaneh, you can go ahead.
Audience:
Oh, thank you. My name is Farzaneh Badii from Digital Medusa. First of all, congratulations on a nuanced and evidence-based discussion on disinformation. This has been lacking from the IGF this year, and I think you remedied that with this panel. Disinformation governance can indeed become an internet governance issue if we rush towards solutions that could affect internet infrastructure. By talking about disinformation and how harmful it is without evidence, and rushing to govern and regulate it, we are going to see that it affects connectivity. This has actually happened: when Russia invaded Ukraine, Europe decided to do IP blocking of Russian websites that were spreading disinformation. So it is indeed an internet governance issue, and at the IGF we need to have a much more nuanced approach to the discussion and not rush to any conclusion. And as Nighat said, we have also been active in coming up with solutions. So that’s one point, that it can be an internet governance issue and we need to monitor that. The other is that disinformation also comes up in declarations like the G7’s: they commit to an open global internet, but they also talk about tackling disinformation. Our solutions to fight disinformation should not affect connectivity and internet infrastructure. Thank you. Thanks, Farzaneh. Wolfgang. Thank you very much. My name is Wolfgang Kleinwächter. I’m a retired professor from the University of Aarhus, and I can only support what Farzaneh has just said. In this debate, we always risk undermining the fundamental right to freedom of expression, as laid down in Article 19 of the Human Rights Declaration. And, as a member of the old generation, I’m asking myself a little bit, what’s really new here? I went through the Cold War, and the Cold War was an information war. Go back 500 years, to when Gutenberg invented the printing press: the Catholic Church was excited, then somebody used it to write anti-Catholic pamphlets, and the Pope said this was a misuse of technology. And then the struggle started over who was right and who was wrong, the critics of the Catholic Church or the Pope, and the Church had its index of censorship. So if we continue the debate this way, we are very close to ending up with a censorship regime. And the tragedy is also, look at Germany in the 1930s. Mr. Goebbels, who was the Minister for Propaganda in the Third Reich, saw radio as a weapon. He said it’s like a machine gun, and people loved him; millions of Germans believed what this crowd of criminals said to the public. And the tragedy is, we know all this: simple answers to complex questions are given by either idiots or liars. But the problem is that a lot of people love idiots, and they love liars. So what can you do about this? It means you have to invest more in education, in creating awareness, to enable people to understand the context. So I think context is one way forward. If you have bad information, the best thing is to have more good information, so that you can rebalance it, not cancel it or censor it or something like that. I think Jeanette mentioned the Digital Services Act. That’s one step in the right direction.
We have the Facebook Oversight Board; it’s an effort, but I’m also very critical of it. So I think we have a problem, but I do not have a solution, and I think in 20 years we will still be discussing what the solution to this problem could be. We have to go in very small steps to identify where we can minimize the negative side effects of this disinformation. You will not be able to remove it. And one idea I have had for years: in the ICANN context, we have the dispute resolution mechanism for domain names. The UDRP system allows disputes to be handled on a case-by-case basis, taking into account the regional context and the individual constellation, and 80 to 90 percent of content-related conflicts on the internet are relevant to a region, a cultural context, and the parties they involve. So why not think about a distributed system where you go through certain issues on a case-by-case basis? This could create a body of understanding of what is bad and what is good, and then voluntary guidelines, best practices or something like that, could come out of it. So I fully agree with what David said: top-down regulation will never work. We have Article 19, we have Article 19.3, and we know how national security, public order, public health, and morals as possible restrictions are misused by governments, but this is not a governance issue for the IGF; it is a national issue for national policymaking. Insofar as that goes, we should debate this in the IGF, but we should not expect that we will find a solution. Thank you.
Anriette Esterhuysen:
Thanks, Wolfgang. I just want to make one point, though. I do think we have to recognize that the weaponizing of disinformation is very different now from what it was when you had to do it via radio or via broadcasting platforms. Then you needed to have some kind of political or economic power, whereas on online platforms that ability to weaponize disinformation is much more distributed, and I think that is a challenge we cannot ignore. So, yes, there are similarities, but there are also differences. And I see, Corway, you want to have the floor, but I wanted to know if there was anyone from the Christchurch Call in the room who might be willing to share how they approached consultation, because I think that is an important part of what we’re trying to address with this workshop: finding effective ways of getting to governance initiatives. So, please go ahead, Corway.
Audience:
I have two comments. The first is that I really question whether what the UN Secretary General is proposing can get consensus or any respect from the different states, because how can he achieve that and eventually get the states to agree? That is difficult. The second one is actually a question. We are talking about disinformation because we are living in the luxury of a democratic system, or some of us are. We have a democratic system that allows you to debate even disinformation. Let me ask a simple question. What if most of this disinformation, maybe 80 percent of it, is disseminated by the government? In that kind of environment, can you tell me how the people can react? Because the disinformation is not from the people; it’s from the government or from politicians. Then what are we going to do?
Anriette Esterhuysen:
Thanks, Corway. Anyone else from the audience before we go back to the Christchurch Call? Paul, do you want to say something about it? I know it’s a slightly different topic, but in terms of the consultation process, maybe you could just say a little bit about how you approached finding solutions through consultation. Apologies for putting you on the spot like this.
Audience:
It’s okay, I’m often put on the spot. I’m just used to taller microphones than this. Thank you. I’m just trying to think through the various different layers. The Christchurch Call has been through a bit of a journey. For those who don’t know much about it, it was started in 2019. And just introduce yourself, please. Sorry, I’ll go back. My name is Paul Ash. I’m the Prime Minister’s Special Representative on Cyber and Digital from New Zealand. I also lead the team in the New Zealand end of the New Zealand-French secretariat leading the Christchurch Call. For those who aren’t familiar with the Call, it was stood up after the terrorist attacks on two mosques in Christchurch in March 2019, when the murder of 51 worshippers was live streamed around the world, amplified algorithmically, and found its way, I’m sure, into many of your social media feeds and inboxes. Rather than let that stand, we took a deliberate approach to working with companies, governments, and civil society, seeing them all as stakeholders in the process of trying to build solutions to prevent that happening again. As we did that, we negotiated a set of 25 commitments. That was done rather in a hurry, over eight weeks, but those commitments have actually proved very durable as we’ve looked to their implementation. As we went through that eight-week process, we met with civil society groups. At that point, we were not in a space of a full multi-stakeholder construct, in part because we wanted to capture the moment and make sure that we fed that input in, put a placeholder in the Christchurch Call text to enable full participation thereafter, and get those commitments landed at a meeting in Paris in May 2019. Thereafter, we’ve worked on the process of implementing specific commitments around things like live streaming, around the ability to detect and deal with terrorist and violent extremist content, and, increasingly recently, focusing in on issues like algorithmic amplification, user journeys, and the pathway of radicalization to violence. And it’s probably in that area that there are some significant connections to the topic of dis- and misinformation and its distribution. Certainly there is a good body of evidence showing that the amplification of disinformation messages can lead to radicalization to violence. The approach we’ve taken around consultation has been, since 2019, to build up and establish a civil society advisory network that works with the Christchurch Call partners, and over time to broaden that out, to develop what we call the Christchurch Call community, in which all of those involved, whether they’re in the advisory network, a partner, a government, an online service provider, or a supporter, can contribute to a discussion on all of the different pieces of policy and problem solving that the Christchurch Call is working on. Over the course of a year, we work on those through specific working groups and work streams that the Christchurch Call leaders ask us to focus in on. And each year, we hold a Christchurch Call summit, where heads of government, heads of tech firms, and heads of civil society organizations consider the outputs from that and give direction on the work that we will take forward in the subsequent year. It’s a different subject matter set from dis- and misinformation; we’ve been very careful to distinguish the subject areas.
What I would say is keeping scope really tight has been one of the things that has enabled the Christchurch Call to make progress. It’s not a call about child sexual abuse material or body image or a range of other things. It’s focused in on one specific subject area. And I think having a secretariat that, at times, is able to build trust across participants to work quietly on issues that can be really contentious is also a very, very important lesson that we’ve taken from this. But the most important, I think, analog that could be drawn and brought over into the area of dis- and misinformation is the strength of a fully multi-stakeholder model. We’ve looked at many, many different multi-stakeholder configurations that might be governments and civil society, tech and civil society, tech and governments. Having that mix all in a room together is actually incredibly difficult. You get aspects of the three-body problem working at times, and that can be really complicated to deal with. But it does mean that there’s a process of building trust and becoming, as we put it, comfortable having uncomfortable discussions. And I think that’s one of the most important things to learn. Over time, we’ve had to systematize that a bit more. And as it’s grown, one of the hardest things to do is actually maintain that trust across the community. So there’s a useful lesson there for any other multi-stakeholder governance construct. I’ll stop there, because that’s probably more information than you needed, but thank you for the opportunity to speak.
Anriette Esterhuysen:
Thanks, thanks for that, Paul. I think it’s very useful, because it’s about the depth of the process: the fact that it takes time, that it takes time to establish a process that’s going to help you respond effectively to a problem, and then, I think, the focus. Maybe we talk about disinformation in general, but disinformation about sexual and reproductive health is very different from disinformation about elections. But now to go back to the panel. Bill, I’m going to start with you on this question of how to approach the consultation and decision-making process to respond to this from a governance perspective.
Bill Drake:
You keep asking me questions that I don’t wanna answer. I wanna do something else.
Anriette Esterhuysen:
Okay, then I’ll go to someone else. No, you can answer the question that you want to answer.
Bill Drake:
I want to reply to David and Corway on points they made, real quick. Corway was talking about governments being the source of things. This goes back to what I was saying about the Secretary General’s initiative. In December 2021, the General Assembly had a resolution on countering disinformation which recalled, reaffirmed, recognized, highlighted, expressed concern for, et cetera, et cetera, and asked the Secretary General to do a report. He did the report the next year, in August 2022, and one of its big conclusions was that states bear the primary responsibility to counter disinformation by respecting, protecting, and fulfilling the rights to freedom of opinion and expression, to privacy, and to public participation, and it called for a multifaceted approach anchored in the protection of and respect for human rights. And now, what is he doing in response to that? Guidelines for platforms. Because the reality is, it’s pretty hard for the United Nations to have any kind of really constructive process around governments being the source of much of the disinformation that matters, because governments reserve the right under the UN Charter to do whatever the hell they want, and they’re not going to be constrained. So instead, we focus on the platforms. I agree that platforms have some issues. I agree that it’d be useful to try to encourage greater transparency, et cetera, in the platforms. But telling Facebook that it has to do something because Russia used Facebook to disseminate disinformation does not address the fact that Russia is disseminating disinformation. And it’s not just Facebook and Twitter and YouTube. There are so many different sources of disinformation, on the dark web and so on. It’s an incredibly robust environment. It’s a commercial marketplace. People create this stuff. You can go on the dark web and hire people to generate disinformation using bots, and so on and so forth. So you can adopt rules for a few of those platforms; it’s not going to solve the thing. The other point I just wanted to make real quick is that David said we need greater civil society participation in the Secretary General’s process. This is why I brought it up. Here we are at the IGF, and we’re having almost no discussion around the various things that are being proposed through the Secretary General under the Global Digital Compact. I’ve been to many sessions; we’re not doing it. We’re not taking advantage of the opportunity to say: we, the stakeholders, not just civil society, but all stakeholders, demand the right to be heard and to weigh in on these processes. Nobody knows where this is being done and how it’s being done. It’s just outreach to a small set of players. Something has to change there.
Anriette Esterhuysen:
Now, I haven’t heard anyone use the concept of a disinformation panic yet, but I think there is a bit of a sense that multistakeholder and self-regulatory processes aren’t dealing with it, and therefore the parents have to come and step in and we need governmental regulation. Certainly in my view, and I’m not saying that there isn’t a need for a regulatory response, it does feel as if we are leaping to that response before we’ve actually explored more bottom-up ways. Jeanette, what is your view?
Jeanette Hofmann:
When it comes to what we should do, I’d like to go back to the Digital Services Act. Its Article 40 has a provision that gives regulators, and also, vitally for me, researchers, access to data from platforms in the area of what is called systemic risks caused by platforms. I’m not so happy about the restriction that we have to mobilize systemic risk as a concept to get access, but the fact that we will now get access to data will help us understand, in a better, more evidence-based way, what disinformation does, and also how many people it actually reaches; we don’t know much about that, right? We have only vague data from the US and Europe saying that somewhere between 0.4 and 5% of platform users actually get to see disinformation. There are better times ahead of us in terms of getting access, and I hope that the IGF could be a mechanism where we can provide best practices for how to actually implement this Article 40. I also hope that small jurisdictions will pick this up and say: here, you are giving people access in Europe, why don’t you give us access in other countries? The IGF could be a way to discuss this.
Anriette Esterhuysen:
Yes, it’s very important to note that this access to data is not available to researchers in the Philippines or in Brazil or in other places where there is a need to look at the impact of disinformation and how it operates. Aaron, what is your view on how we develop these responses? And particularly, I would like you to respond on the role of government, because I think we are in a moment where governments are taking this very seriously, sometimes with good intentions, not always with bad intentions.
Aaron Maniam:
No, thanks. Thanks so much for that. I broadly agree with the thrust of the discussion so far that this work needs to be polycentric. It’s not just about having multiple people involved; there needs to be leadership and decision capacity in multiple centers of power and influence. And that means that governments move from having just an authority-based role to having a convening function. It means governments bring together the right players, and it means governments have to acquire skills in facilitation, in interest aggregation, potentially even in conflict mediation, in order to allow an emerging consensus to actually distill itself. In Singapore, we’ve been doing a lot of this consultative, deliberative work, the kind of bottom-up democracy whose latest incarnation we’ve been calling Forward Singapore: how are we looking to build a future identity for the country? During COVID, there was a process of thinking about recovering and emerging stronger together as well. It’s something we’ve been experimenting with for a good part of the last 20 years, how we build this bottom-up set of priorities and issues. But as a side note to that, a very important side note, I think even where government is thinking about its role in relation to the private sector as well as civil society, it’s really important to harness the full range of roles, not just the regulatory and lawmaking functions that we have, but also the possibility of governments engaging in community building, in space building, in urban planning, because a lot of the solutions might actually lie there. One thing I was very struck by in my last job, for instance, is that the three agencies that worked together with the ministry in operationalizing strategy were the Infocomm Media Development Authority, the Cyber Security Agency, and the National Library Board. So we have a board that looks after all of our national libraries. And it’s really interesting, because when you think about it, we’ve talked a lot about regulation. We’ve talked a lot about the dark aspects of security and how we want to respond to that. But we haven’t talked about the literacy that our citizens need in as much detail. And if you ask me, I think schools and libraries are as important players in this whole process as their more regulatory counterparts in government. This isn’t always obvious to us, because this is where the patience comes in; it takes time to build up literacies. I look at my one-year-old nephew, who knows how to swipe on an iPad even before he can talk. His literacies are going to be very different from the literacies that all of us have had to develop over time. But I think we need to bring our libraries, our schools, and our family discussions into this a lot more, because that’s where some of those core literacies and moral sensibilities will start to get laid down.
Anriette Esterhuysen:
Thanks. Clara, we have three minutes left, so let’s move quickly. And then I’m still gonna come back to the panel for the takeaways. Clara, keep it brief, please.
Clara Iglesias Keller:
Okay, so I’m going to second Jeanette on a call for more empirical evidence, especially outside of the global north. I’ll stick to Brazil to tell you, for instance, that we all know Latin American countries have a different sort of relationship with messaging apps. In Brazil, this is aggravated by zero-rating policies, where for a huge majority of the population, the internet equals WhatsApp, Facebook, Instagram. So this just shows how we need more data in order to understand how all these political and social contingencies actually play into the way disinformation behaves. And very briefly, to your point, Anriette: I think we need more institutional innovation, in the sense that, among all of the things we need to fight disinformation, there’s a lot that is up to government, but there’s this one thing, disputing the truth, disputing content, that should definitely be with civil society. It should be with different agents, and I think we need the institutional apparatus to allow for that to happen.
Anriette Esterhuysen:
Thanks. So Clara has actually already told us what she wants to see and what she doesn’t want to see. Nighat, what about you?
Nighat Dad:
Yeah, Anriette, to go back to your point where you said that we sometimes want to jump quickly to new solutions: I feel that with the ones we already have, we really need to look into the nuance that those solutions have established. Like it or not, all solutions have pros and cons. But to your point about consultation, one thing that I have learned sitting on the board is that we talk about context a lot, but we really don’t know how to extract that context when we are deciding about something. At the board, what we have done is speak to civil society from the particular region we have selected a case from, and I think that’s the kind of process we need. We need more global oversight boards; we don’t need only one, because if regulation is doing something from the state perspective, we need these boards to hold the companies accountable as well and do their work. Our transparency reports are there; our decisions are there. I think those also give really good data points and data sets for researchers to hold these platforms more accountable.
Anriette Esterhuysen:
Thanks, Nighat. David.
David Kaye:
Yeah, I really appreciate that Paul Ash made an intervention, because I think his leadership, and New Zealand’s leadership, shows the role of multi-stakeholder approaches even in the wake of trauma. And let’s remember, the Christchurch Call could have gone the other direction. It was a moment of real trauma, and people have a kind of natural response, as we’re seeing right now in Israel and Palestine, a natural response to adopt ideas that are not rooted in human rights. The Christchurch Call, and Paul’s stewardship of it, had human rights at its core. And I think we need to find ways to ensure that that remains at the core of global governance. The IGF has not always been a place where human rights and access to information are front and center, but it can be. I think there are a lot of people in this room, or on this Zoom, who believe in that, and there are models that we can draw from as we move it forward.
Anriette Esterhuysen:
Thanks, David. So, Jeanette in Berlin, you’re going to have the last word. And I want you to say the one thing that you’d like to see taking this forward and the one thing you don’t want to see. And be brief.
Jeanette Hofmann:
Let’s start with the negative. The one thing I don’t want to see is governments using this global concern about disinformation as an excuse for regulating their people speaking up in public, during a time when, for the first time, they can actually speak up. It’s so important to support people in giving their opinions and exchanging their views, and not to think of this primarily in terms of regulating. That’s the negative. The positive: I’ve always been a researcher with a focus on the internet, and over the last months I have begun to grasp the importance of high-quality journalism. I want everybody to have the right to claim that the earth is flat, as long as there is good journalism that depicts the globe, that shows it and talks about it, so that everybody can disseminate crap and it doesn’t matter, because it doesn’t do harm to society as such. We need good journalism, and business models for it that are robust for the future that is coming, where the young generation doesn’t subscribe to newspapers anymore but uses social networks. We need to prepare for that time, when the young generation doesn’t pay anymore but still depends on high-quality journalism.
Anriette Esterhuysen:
Thanks, Jeanette, and thank you for saying that. Bill, you have the last word before me.
Bill Drake:
I would like to see the European Union fine Elon Musk. Substantially, they’ve got the capacity under the Digital Services Act and their guidelines. He has told them, screw you, I don’t care what you think, and dropped out. I mean, I think governments are the main focus, but we have to do some stuff on the platforms, and making it hurt financially, especially vis-a-vis the advertisers, is a good way to start.
Anriette Esterhuysen:
Thanks very much, and I’m sorry we went over time and didn’t have time to come back to our participants again. But thank you very much to all our panelists and our online participants, and to the tech team and to CRISPR. Enjoy the rest of your IGF, and let’s continue using the IGF to get to the nub of these problems and to unpack the misinformation about misinformation. Thanks everyone. Thank you.
Speakers
Aaron Maniam
Speech speed
214 words per minute
Speech length
1443 words
Speech time
404 secs
Arguments
We have yet to define what we mean by global governance, due to the various models that exist
Supporting facts:
- There are de minimis models and maximal models, with varying levels of guidelines and enforcement.
Topics: global governance, policies
It’s crucial to distinguish between basic standards and additional issues in global governance regulations
Supporting facts:
- Examples include the DSA and online safety regulations in Singapore and the UK.
Topics: global governance, regulations
Governments can be a source of misinformation, which could hinder global governance efforts
Topics: governments, misinformation
Governments need the ability and the skills to continually update legislation to keep up with dynamic technology
Topics: governments, legislation, technology
The work on combating disinformation needs to be polycentric, and governments should move from having exclusively an authority-based role towards having a convening function.
Supporting facts:
- In Singapore, there has been consultation and deliberation in how to set priorities and address issues in a ‘bottom-up’ fashion
- Role of government should include community building, space building, and urban planning
Topics: disinformation, government involvement, polycentric approach
Education and literacy are crucial for tackling disinformation.
Supporting facts:
- In Singapore, the National Library Board, the Information and Media Development Authority, and the Cybersecurity Agency work together to operationalize strategy
- Education starts at home and from schools and libraries, and should be included in discussions to enhance literacy among citizens
Topics: disinformation, education, literacy
Report
The analysis explores several key aspects of global governance, regulations, misinformation, and digital regulation. One of the main challenges in defining global governance arises from the coexistence of different models, from de minimis to maximal, with varying levels of guidelines and enforcement; this results in a lack of consensus on the precise meaning of global governance.
Concerning global governance regulations, it is crucial to distinguish between basic standards and additional issues. Examples such as the Digital Services Act (DSA) in the European Union and online safety regulations in Singapore and the UK emphasize the significance of addressing both fundamental standards and more complex issues in regulating global governance.
These regulations play a significant role in achieving Goal 16 of the Sustainable Development Goals (SDGs): Peace, Justice, and Strong Institutions. Governments can have contradictory impacts on global governance efforts. On the one hand, they can be a source of misinformation, hindering progress towards effective global governance.
On the other hand, governments possess the authority and skills necessary to continuously update legislation to keep pace with rapidly evolving technology, an ability that is essential for achieving Goal 16. Interoperability, the ability of different systems to communicate seamlessly, is a vital aspect of digital regulation.
Aaron Maniam highlights the importance of interoperability among different countries, as it enables coherent communication and collaboration. This is linked to Goal 9 of the SDGs: Industry, Innovation, and Infrastructure. A polycentric approach is crucial in combating disinformation. Governments should move away from solely having an authority-based role and embrace a convening function.
By engaging in consultation and deliberation, governments can prioritize and address issues related to disinformation in a bottom-up fashion. Additionally, community building, space building, and urban planning should be part of the government’s role in fighting disinformation. Education and literacy play a pivotal role in tackling disinformation.
In Singapore, various organizations, including the National Library Board, the Information and Media Development Authority, and the Cybersecurity Agency, collaborate to operationalize strategies. Education that starts at home and in schools and libraries is highlighted as a key factor in enhancing literacy among citizens.
This aligns with Goal 4 of the SDGs: Quality Education. In summary, the analysis underscores the complexities and challenges of global governance and the importance of clear regulations. It also highlights the dual role of governments as potential sources of misinformation and as crucial actors in updating legislation.
Interoperability is crucial for effective digital regulation, and a polycentric approach is essential in combating disinformation. Lastly, education and literacy are vital components in mitigating the impact of disinformation.
Anriette Esterhuysen
Speech speed
169 words per minute
Speech length
2835 words
Speech time
1009 secs
Arguments
Disinformation is not a major problem in South Africa, as there is more concern about believing the government.
Supporting facts:
- South Africa has well self-regulated media that deals with disinformation
- Fact-checking is a common practice done efficiently and quickly
- The public tends to believe the media
Topics: Disinformation, Government credibility
The media regulation in South Africa successfully maintains accuracy across different ideological spectrums.
Supporting facts:
- A common commitment to accuracy across right-wing, center and left-wing media
- Fact-checking is a common practice
Topics: Media regulation, Accuracy
The weaponizing of disinformation in online platforms is different than traditional broadcasting platforms.
Supporting facts:
- In online platforms, the ability to weaponize disinformation is much more distributed and does not require political or economic power unlike traditional broadcasting platforms.
Topics: Disinformation, Online platforms
Anriette Esterhuysen believes that governmental regulations on disinformation may be jumping the gun before considering bottom-up approaches
Supporting facts:
- She argues that we feel like jumping to this response before we’ve actually explored more bottom-up ways
Topics: Governmental Regulation, Disinformation, Self-Regulation, Bottom-up Approaches
Report
The analysis provides a comprehensive overview of perspectives on disinformation in South Africa, covering various viewpoints and arguments. One viewpoint suggests that disinformation is not a major problem in the country, with more concern placed on trusting the government. The public tends to rely on the media, which is regarded as well self-regulated and proficient in dealing with disinformation.
Fact-checking is also a common practice in South Africa, swiftly and efficiently debunking false information. Another argument highlights the successful media regulation in South Africa, which ensures accuracy across different ideological spectrums. It is noted that a commitment to accuracy exists among right-wing, center, and left-wing media outlets in the country.
Fact-checking is a prevalent practice, further enhancing the reliability and trustworthiness of the media. This observation supports the notion that media regulation in South Africa effectively maintains accuracy and minimises the spread of disinformation. The analysis also emphasises the need for careful consideration in the international regulation of disinformation.
It is crucial to explore the implications of such regulation on access to information and freedom of expression. While national initiatives regarding disinformation regulation are controversial, existing international instruments may serve as a baseline for effectively governing disinformation. Furthermore, the analysis highlights the distinct dynamics of weaponising disinformation in online platforms compared to traditional broadcasting platforms.
Unlike traditional platforms, online platforms allow for the widespread distribution of disinformation without requiring significant political or economic power. This observation emphasises the need for tailored approaches in combating disinformation across different digital platforms. A noteworthy observation from the analysis is the advocacy for considering bottom-up approaches and self-regulation measures alongside governmental regulations.
Anriette Esterhuysen argues that jumping to governmental regulations without exploring more bottom-up ways may be premature. While a regulatory response might be necessary, Esterhuysen highlights the importance of not dismissing self-regulatory and bottom-up approaches to tackle disinformation. This perspective demonstrates a concern that solely relying on governmental regulations might overlook effective alternatives.
Overall, the analysis offers valuable insights into the various dimensions of disinformation in South Africa. The perspectives presented shed light on the strengths of the country’s media regulation, the challenges faced in international regulation, the dynamics of online platforms, and the importance of considering diverse approaches to combat disinformation.
Audience
Speech speed
161 words per minute
Speech length
2646 words
Speech time
984 secs
Arguments
Mis- and disinformation can incite people to act against democratic institutions
Supporting facts:
- The speaker cited the January 6th Capitol riots as an instance where these actions were inspired by misinformation and disinformation
Topics: Misinformation, Disinformation, Democracy, January 6th Capitol Riots
Difficulty to settle on a definition of disinformation
Supporting facts:
- Audience belongs to Think Tank
- Background is legal
Topics: Disinformation, Legal background, Think tank
Challenging to assess the longitudinal impact of disinformation
Topics: Disinformation, Impact assessment
Hard to extract substantive grievance from disinformation
Topics: Grievance, Disinformation, Cause and effect
Disinformation is inevitable; it is a natural part of human error.
Topics: Fake News, Disinformation
Disinformation and fake news are an effective weapon, used in the context of wars.
Supporting facts:
- Usage of this weapon in Russian-Ukrainian war
Topics: Fake News, Disinformation, Russian-Ukrainian War
It’s impossible to completely eliminate disinformation
Topics: Disinformation, Fake News
Disinformation in Switzerland is not influential enough to significantly sway elections
Supporting facts:
- Switzerland has a multi-party system and people have been voting for the same parties over the last 30 years
- There hasn’t been major changes in voting patterns
Topics: Disinformation, Elections, Switzerland
Disinformation governance can affect internet infrastructure and connectivity
Supporting facts:
- Europe’s IP blocking of Russian websites spreading disinformation
Topics: Disinformation Governance, Internet Infrastructure
The threat of undermining the fundamental right to freedom of expression
Supporting facts:
- Historical examples: Information war of the Cold War, Use of radio for propaganda during Nazi Germany
Topics: Freedom of Expression, Censorship
Online platforms have made the weaponizing of disinformation more distributed.
Topics: Disinformation, Online platforms
In democratic systems, even disinformation can be up for debate.
Topics: Disinformation, Democratic System
The Christchurch Call initiative was established in response to the live-streamed terrorist attacks in Christchurch, New Zealand in 2019
Supporting facts:
- Stewarded by the Prime Minister's Special Representative on Cyber and Digital, Paul Ash
- The attack consisted of the murder of 51 worshippers in two mosques
Topics: Christchurch Call, Online Terrorism, Live Streaming
The Christchurch Call has developed 25 commitments for implementation with the aim of preventing such incidents in the future through a broad consultation process
Supporting facts:
- The commitments were negotiated in a hurried eight-week process
- Efforts have been made to involve civil society groups in the consultation process
Topics: Preventing Terrorism, Online Policies
Maintaining a tight scope and having a Secretariat that facilitates trust and enables uncomfortable discussions is key for the success of such an initiative
Supporting facts:
- Keeping the scope limited to one specific subject area has proven effective
- The Secretariat works quietly on contentious issues to build trust across participants
Topics: Trust-building, Policy Negotiation, Scope Management
The multi-stakeholder model that involves different entities like governments, tech firms, and civil society has been instrumental in the Christchurch Call initiative
Supporting facts:
- Different multi-stakeholder configurations have been considered and tested
- This has allowed for a process of building trust and having the necessary uncomfortable discussions
Topics: Multi-stakeholder Participation, Collective Decision Making
Report
The analysis delved into the multifaceted nature of misinformation and disinformation. One of the speakers put forth the argument that these actions have the potential to incite individuals to act against democratic institutions. To support this claim, they highlighted the example of the January 6th Capitol riots, which they believed were inspired by misinformation and disinformation.
The speaker’s sentiment was negative, suggesting concern about the impact of these actions on democracy. However, another speaker expressed a more neutral stance, highlighting the challenge of settling on a clear definition of disinformation. They pointed out that assessing the longitudinal impact of disinformation is challenging.
This sentiment indicates a level of uncertainty regarding the extent to which misinformation and disinformation can influence actions and outcomes. A disagreement emerged regarding the possibility of completely eliminating disinformation. One speaker argued that efforts should be directed towards reducing its spread and minimizing the damage caused, rather than striving for complete elimination.
This sentiment aligned with a more positive outlook on the issue. In the specific context of Switzerland, it was suggested that disinformation does not possess enough influence to significantly sway elections. The speaker based this claim on the observation that Switzerland has a stable multi-party system with relatively consistent voting patterns over the past 30 years.
This sentiment reflects a more neutral perspective on the impact of disinformation in the Swiss political landscape. The analysis also examined the potential effects of disinformation on internet infrastructure and connectivity. There was evidence suggesting that disinformation governance can impact internet infrastructure, with an example cited of Europe implementing IP blocking of Russian websites spreading disinformation.
This negative sentiment implies the belief that the weaponization of disinformation through online platforms has had widespread consequences. The audience raised concerns regarding the potential threat disinformation poses to the fundamental right of freedom of expression. Historical examples, such as the information war of the Cold War and the use of radio for propaganda during Nazi Germany, were provided to illustrate this point.
This sentiment highlights the importance of protecting freedom of expression in the face of disinformation. Notably, the analysis explored the effectiveness of the Christchurch Call initiative in response to live-streamed terrorist attacks in New Zealand. The sentiment here was negative, as it was argued that rushed solutions to govern and regulate disinformation can cause unintended harm.
The speaker stressed the need for a nuanced approach to address disinformation, referencing the impact of tackling disinformation in G7 declarations. The audience member supporting the Christchurch Call initiative expressed a positive sentiment, believing in its effectiveness. They emphasized the significance of trust-building and multi-stakeholder involvement in addressing terrorism facilitated by online platforms.
This aligns with the overall positive sentiment of utilizing a multi-stakeholder model and engaging governments, tech firms, and civil society in combating disinformation. In conclusion, the analysis highlighted the complex nature of misinformation and disinformation. The arguments presented ranged from the potential dangers of these actions in undermining democratic institutions to the challenges in defining and assessing their impacts.
The disagreement regarding the elimination of disinformation reflected a difference in perspectives, with one side advocating for reducing its spread. The analysis also shed light on the specific impacts of disinformation on internet infrastructure, the threat it poses to freedom of expression, and the potential effectiveness of initiatives such as the Christchurch Call in preventing terrorism.
Overall, the analysis underscores the need for nuanced approaches and multi-stakeholder involvement to address disinformation and its various repercussions.
Bili Drake
Speech speed
191 words per minute
Speech length
2554 words
Speech time
801 secs
Arguments
Disinformation’s impact on democracy is highly dependent on context
Supporting facts:
- Bili Drake refers to the session at the Taiwan IGF last week
Topics: Disinformation, Democracy, Context Dependence
A lot of the disinformation does not originally come from social media but from broadcast media which is then amplified by social media
Supporting facts:
- There have been multiple studies indicating this behavior
Topics: Disinformation, Social Media, Broadcast Media
The definitions of disinformation tend to vary significantly among leading organizations
Supporting facts:
- The European Union and UN Special Rapporteur definitions of disinformation differ significantly in their details
Topics: Disinformation, Definition
Believes that tribal loyalty leads to belief in disinformation
Supporting facts:
- There are indeed lots of people who will say they believe in disinformation because of tribal loyalty
- There’s no question about that in the American case
Topics: Disinformation, Tribal loyalty
Identity politics can distort the perception of truth
Supporting facts:
- There’s a lot of people who have just built their identity around giving the finger to the other side
- It’s all about owning the libs, owning the other side, giving the finger to the people you don’t identify with
Topics: Identity Politics, Perception, Truth
Regulating disinformation at a global level is probably not effective, but efforts to build infrastructure to challenge it should be continued.
Supporting facts:
- UN discussions around cybercrime and cybersecurity have seen many proposals related to disinformation
- The difficulty of managing disinformation is evident in geopolitically divided environments
- China proposed laws making the spread of disinformation a criminal offence
- UNESCO guidelines for digital platforms also touch upon disinformation
Topics: Disinformation, Global Regulation, Internet Governance
Efforts are being made internationally to handle disinformation, such as the UNESCO guidelines for digital platforms, China's proposal in the cybercrime treaty negotiations, and the UN Secretary General's proposal for a code of conduct.
Supporting facts:
- China proposed language in the cybercrime treaty saying that all governments should adopt laws making the spread of disinformation a criminal offense
- UNESCO is finalizing guidelines for the regulation of digital platforms
- UN Secretary General is proposing a code of conduct for information integrity on digital platforms.
Topics: Disinformation, Global Regulation, Internet Governance
Bili Drake wants to know the views of other panelists on the Secretary General’s proposal
Supporting facts:
- The Secretary General of the United Nations is proposing a code of conduct on information integrity for digital platforms.
Topics: Internet governance, Code of Conduct, Information integrity, Digital platforms
States bear the primary responsibility to counter disinformation.
Supporting facts:
- The UN General Assembly had a resolution on countering disinformation and asked the Secretary General to do a report.
- The report stated that states bear the primary responsibility to counter disinformation.
Topics: Disinformation, Secretary General’s initiative, Freedom of opinion and expression, Privacy
Focusing on platforms alone won’t solve the disinformation issue.
Supporting facts:
- There are various sources of disinformation, including the dark web.
- People can hire others to generate disinformation.
Topics: Disinformation, Social Media Platforms, Freedom of opinion and expression, Dark Web
There is a need for greater civil society participation in the Secretary General’s process.
Supporting facts:
- IGF sessions have not extensively discussed the proposals under the Secretary General’s Global Digital Compact.
Topics: Civil society participation, Secretary General’s initiative, Global Digital Compact
Bili Drake suggests that the European Union should fine Elon Musk under the Digital Services Act.
Supporting facts:
- Elon Musk has disregarded the guidelines set by the European Union
Topics: European Union, Digital Services Act, Elon Musk
Report
The impact of disinformation on democracy is a complex issue that is influenced by various factors and is context-dependent. Different perspectives exist on the extent to which disinformation can affect democratic processes. Some argue that disinformation can have a significant negative impact on democracy, while others caution against oversimplifying the issue and relying on false dichotomies.
It has been observed that a considerable amount of disinformation originates from broadcast media and is then amplified through social media platforms. This highlights the interconnectedness between different forms of media in the spread of disinformation. Several studies have indicated this behavior, emphasizing the importance of understanding the role played by different media channels in the dissemination of disinformation.
One key aspect that complicates the issue of disinformation is the lack of a standardised definition. Leading organisations like the European Union and the UN Special Rapporteur have differing definitions of disinformation, which can give rise to confusion and inconsistencies in tackling this problem.
It becomes crucial to establish a common understanding to effectively address disinformation. Tribal loyalty is identified as a significant factor that can lead individuals to believe in disinformation. In cases like the United States, where tribal affiliations and identity politics play a prominent role, people may align with certain narratives or disinformation due to their loyalty to a particular group.
This highlights how social and political factors can impact an individual’s susceptibility to disinformation. Identity politics further compounds the issue, distorting the perception of truth. Some individuals develop their identities around opposing certain groups or ideologies, leading them to embrace disinformation that aligns with their pre-existing biases.
This phenomenon highlights the role of emotions and personal beliefs in shaping the acceptance of disinformation. Efforts to regulate disinformation on a global level have been proposed, but doubts remain about their effectiveness. Discussions in the United Nations have seen various proposals related to disinformation, such as China’s suggestion to criminalise its spread and UNESCO’s guidelines for digital platforms.
However, the complexities and geopolitical divisions inherent in regulating disinformation make it challenging to achieve meaningful global regulation. As a result, long-term engagement is advocated, focusing on building infrastructure to challenge disinformation effectively. The proposal for a code of conduct on information integrity for digital platforms is seen as an attempt at global internet governance.
This proposal aims to govern the information that flows through digital networks, aligning with the definition of internet governance. It raises questions about the extent to which such regulations should be implemented and their potential impact on freedom of expression and privacy.
The primary responsibility to counter disinformation lies with states, according to the UN General Assembly’s resolution. While platforms such as social media play a role, governments bear the primary responsibility to address the issue effectively. Simply pressuring platforms to act does not address the root causes of disinformation.
It is important to recognise that disinformation can originate from various sources, including the dark web. This highlights the need for a comprehensive approach that looks beyond platforms alone. Strategies should encompass multiple sources and channels through which disinformation can be generated and disseminated.
Civil society participation is crucial in the discussion on countering disinformation. While there have been limited discussions on proposals like the UN Secretary General’s Global Digital Compact, greater involvement of civil society in such initiatives can ensure diverse perspectives and balanced decision-making.
In conclusion, addressing the issue of disinformation requires a multifaceted approach that involves governments, platforms, and civil society. The complex nature of disinformation and its impact on democracy necessitate a nuanced understanding, taking into account various factors such as media channels, definitions, tribal loyalty, and identity politics.
Efforts to regulate disinformation at a global level should be complemented with long-term engagement and infrastructure-building, recognising the challenges and limitations faced in achieving effective global regulation.
Clara Iglesias Keller
Speech speed
167 words per minute
Speech length
1095 words
Speech time
394 secs
Arguments
Disinformation can undermine democracy depending on context
Supporting facts:
- Disinformation affects different political contexts in different ways.
- The impact it has on elections and voter behavior is yet to be fully understood.
Topics: Disinformation, Democracy
Current empirical evidence on the impact of misinformation is focused mainly on the U.S. and Europe
Supporting facts:
- Research focus should expand to include contexts from Latin America, Africa, and Asia.
Topics: Disinformation, Geographical bias in research
Disinformation is a communication practice and an online harm
Supporting facts:
- Disinformation comes alongside misinformation, propaganda, fake news.
Topics: Disinformation, Communication
The intent behind disinformation holds a big chunk of legal relevance
Topics: Disinformation, Law
Disinformation functions as a form of political intervention
Supporting facts:
- In Brazil, this type of intervention serves as a means to show dissatisfaction or to directly attack democratic institutions, particularly the electoral system and high courts.
Topics: Disinformation, politics
Global governance solutions may not be sufficient to counter the role of misinformation and disinformation in political disputes
Supporting facts:
- Political and economic interests shaping societies might play a role in misinformation and disinformation
- Mitigating disinformation involves confronting broader issues than just communication practices
Topics: Global Governance, Misinformation and disinformation, Political disputes
Global solutions often rely on consensus-based governance structures which may not be powerful enough to modify digital business models and data usage
Supporting facts:
- Global governance does not include enforcement authority
- Current digital business models and data usage may require stronger regulatory measures
Topics: Global Solutions, Consensus-based Governance, Digital Business Models, Data Usage
Need for more empirical evidence outside of the global north
Supporting facts:
- In Latin American countries like Brazil, a majority of the population equates internet with WhatsApp, Facebook, and Instagram
- This is facilitated by zero rating policies
Topics: Internet Usage, Communication Apps
Report
Disinformation has the potential to undermine democracy, although its impact varies depending on the context. While there is currently no solid empirical evidence to suggest that disinformation directly changes voters’ behaviors or affects election results, there is a consensus that further research is necessary to fully understand its implications.
The existing research on the impact of disinformation is primarily focused on the United States and Europe, highlighting a need for expanding studies to include other regions such as Latin America, Africa, and Asia. It is important to understand how disinformation strategies can influence political transformations in different contexts.
Disinformation is considered a communication practice and an online harm, along with misinformation, propaganda, and fake news. Its intent holds significant legal relevance, further emphasizing the need to address the issue. In some instances, disinformation serves as a form of political intervention.
For example, in Brazil, it has been used to express dissatisfaction or directly attack democratic institutions, including the electoral system and high courts. This highlights the destructive potential of disinformation as a tool in political disputes. However, the concept of disinformation poses a challenge within statutory regulation, as there is no clear space for its definition and regulation.
Global governance solutions, although important, may not be sufficient to address the impact of misinformation and disinformation on political disputes. It is necessary to confront the ultimate convertibility of economic power into political power, particularly within the media landscape. This is evident in countries like Brazil, where traditionally concentrated and unregulated media landscapes contribute to the spread of disinformation.
Additionally, global solutions often rely on consensus-based governance structures, which may lack the power needed to modify digital business models and data usage effectively. More empirical evidence is needed, especially outside of the global north. In countries like Brazil, internet usage is strongly associated with platforms such as WhatsApp, Facebook, and Instagram, facilitated by zero rating policies.
Understanding the impact of disinformation in these regions is crucial for developing effective countermeasures. In conclusion, addressing the challenges posed by disinformation requires not only further research but also more institutional innovation. This innovation should create an apparatus that allows diverse stakeholders and civil society to engage in the disputation of truth and content.
By confronting the convertibility of economic power into political power and exploring alternative governance structures, we can work towards mitigating the harmful effects of disinformation and safeguarding democratic institutions.
Corway Wu
Speech speed
147 words per minute
Speech length
161 words
Speech time
66 secs
Arguments
Politicians also create and spread disinformation, not only news media or social media
Topics: Disinformation, Politics, Media
Timing is crucial in the spread of disinformation, as it can be too late by the time voters realise they have been moved by it
Supporting facts:
- Brexit voting behavior as an example
Topics: Disinformation, Time, Voting Behavior
Report
In the realm of disinformation, politicians are also actively involved in creating and spreading false information, not just news media and social media. This amplifies the scope of actors responsible for misleading the public. The negative sentiment towards politicians indicates a lack of trust in their intentions.
Timing is a crucial aspect in the dissemination of disinformation. The example of Brexit voting behavior is used to demonstrate this. Voters may be influenced by inaccurate information without realizing it until it is too late. This implies that the impact of disinformation can have lasting effects, shaping important decisions.
However, an opposing viewpoint is presented, disagreeing with Jeanette’s argument about the significance of timing when discussing disinformation. It is argued that Jeanette’s failure to consider timing weakens their argument. The negative sentiment expressed towards this disagreement suggests a potential blind spot in understanding the issue.
Overall, this analysis illustrates the multifaceted nature of disinformation and its wide-ranging consequences. Politicians, news media, and social media platforms are all complicit in perpetuating false information. The timing of disinformation is highlighted as a crucial factor, as it can significantly influence its impact on individuals and societies.
The disagreement regarding the importance of timing further emphasizes the complexity of this subject.
David Kaye
Speech speed
150 words per minute
Speech length
1355 words
Speech time
542 secs
Arguments
Disinformation is a complex issue that requires nuanced understanding and cannot be addressed with a one-size-fits-all solution
Supporting facts:
- The perpetrators and locations of disinformation are important
- Different types of disinformation can have different impacts
- The platform used for disinformation may vary from legacy media to social platforms
Topics: Disinformation, Global Governance, Contextual Analysis
David Kaye believes clarity in the definition of disinformation is critical for legal regulation and governance
Supporting facts:
- He suggests that a lack of shared definitions may impede the process of legal regulation.
- He expresses concern about emerging regulations in Europe and the UK that rely on platforms to define disinformation during their work of transparency and risk assessment.
Topics: Disinformation, Legal Regulation, Governance
Global regulation of disinformation is not achievable
Supporting facts:
- He considers the idea of a concrete set of rules that will guide decision makers in every context and will have global oversight as a chimera
Topics: Disinformation, Regulation, Global Governance
We need a common set of principles to guide authorities and platforms
Supporting facts:
- Governments, platforms and media outlets should have a common set of guiding principles
Topics: Disinformation, Principles, Platforms
Human rights law should be the base of these principles
Supporting facts:
- He suggests standards based on Article 19 on principles of legality, necessity and proportionality, and legitimacy of the objective
Topics: Human Rights, Law, Principles
Civil society should play a more active role in shaping documents and be included in the drafting and adoption process
Topics: Civil Society Participation, Policy Drafting
Excluding governments from the responsibility to behave better and to resource public service media will detract value from the process
Supporting facts:
- Examples from the Philippines and the United States
Topics: Government Responsibility, Public Service Media
David Kaye appreciates New Zealand’s leadership role in promoting multi-stakeholder approaches, even in times of trauma, such as after the Christchurch attack.
Supporting facts:
- Paul Ash’s stewardship of the Christchurch call had human rights at its core
Topics: New Zealand’s leadership, multi-stakeholder approaches, Christchurch attack
He highlights the need to keep human rights at the core of global governance and believes that the IGF can be a platform where human rights and access to information are prioritized.
Supporting facts:
- There are models that can be drawn from to move forward, keeping human rights at the core
Topics: global governance, human rights, Internet Governance Forum
Report
Disinformation is a complex issue that involves the dissemination of false or misleading information. It can have various impacts and is spread through different platforms, including legacy media and social platforms. Understanding the nuances of disinformation is crucial, as there is no one-size-fits-all solution to address it.
David Kaye, an advocate for human rights, highlights the importance of clear definitions and understanding in addressing disinformation through legal regulation and governance. However, he expresses concern about the lack of shared definitions, which may impede the process of regulation.
Kaye also raises concerns about emerging regulations in Europe and the UK that rely on platforms to define disinformation, as this may affect transparency and risk assessment. While global regulation of disinformation may seem desirable, Kaye argues that it is not achievable.
Instead, he suggests the development of a common set of guiding principles based on human rights. These principles should be the foundation for addressing disinformation, providing a framework that ensures legality, necessity, proportionality, and legitimacy of objectives. In shaping policies and strategies to combat disinformation, Kaye believes that civil society should play an active role.
They should be included in the drafting and adoption process to ensure a more inclusive approach. Additionally, Kaye argues that governments should be held responsible for their behavior and should support public service media, as excluding them would undermine the effectiveness of addressing disinformation.
Over-reliance on platforms for handling disinformation is a matter of concern. Relying solely on platforms may create challenges in terms of transparency, accountability, and bias. Therefore, it is necessary to explore alternative approaches and strategies to combat disinformation effectively. The leadership of New Zealand in promoting multi-stakeholder approaches and prioritising human rights in times of trauma, such as after the Christchurch attack, is commended by Kaye.
He recognises the importance of keeping human rights at the core of global governance. In this regard, Kaye highlights the Internet Governance Forum (IGF) as a platform where human rights and access to information should be given priority. However, Kaye also warns against adopting ideas that disregard human rights in response to traumatic events, such as the Israel-Palestine conflict.
While people may have natural responses to such events, it is crucial to ensure that any responses or measures taken are rooted in human rights principles. In conclusion, addressing disinformation requires nuanced understanding and a combination of approaches. Clear definitions, shared principles based on human rights, civil society participation, government responsibility, and cautious reliance on platforms are all essential factors in effectively combating disinformation.
New Zealand’s leadership and the IGF’s emphasis on human rights in global governance are notable examples of positive progress. However, it is crucial to avoid compromising human rights in times of trauma or conflict.
Greta
Speech speed
113 words per minute
Speech length
37 words
Speech time
20 secs
Arguments
Disinformation is undermining democracy
Topics: Disinformation, Democracy
Report
Greta strongly believes that disinformation is significantly weakening democratic systems. This issue is related to the topics of disinformation and democracy and is associated with a negative sentiment. However, no specific supporting facts or arguments were provided to support the claim that disinformation undermines democracy.
Disinformation, the deliberate spread of false or misleading information, poses a serious threat to the democratic process. It can manipulate public opinion, deceive voters, and erode trust in democratic institutions. Greta’s agreement with this viewpoint suggests that she recognizes the detrimental effects that disinformation can have on the functioning of democracies.
Although no specific supporting facts or arguments were presented, it is worth considering the widespread impact of disinformation in recent years. The rise of social media platforms has enabled the rapid spread of false information, often disguised as legitimate news or opinions.
This has the potential to sway public opinion and distort democratic discourse. Furthermore, disinformation campaigns have been known to target elections by spreading false information about candidates or manipulating public sentiment. Such tactics can undermine the integrity of electoral processes and compromise the ability of citizens to make informed choices.
The conclusion drawn from Greta’s strong agreement is that urgent actions are needed to address the problem of disinformation. Safeguarding the democratic process involves countering disinformation through fact-checking, promoting media literacy, and strengthening regulations on social media platforms. It is essential to restore trust and ensure that accurate and reliable information prevails in democratic societies.
In summary, Greta strongly believes that disinformation is undermining democracy. While specific arguments and supporting facts were not provided, the existence of disinformation poses a clear threat to democratic systems. Addressing this issue requires collective efforts to counter disinformation, promote media literacy, and protect the integrity of democratic processes.
Jeanette Hofmann
Speech speed
138 words per minute
Speech length
1457 words
Speech time
631 secs
Arguments
Little is known about how disinformation affects people’s minds and voting behavior
Topics: Disinformation, Voting behavior
Disinformation focuses on strategic intent
Supporting facts:
- Usually we say it’s a decision we make between misinformation and disinformation, disinformation focuses on strategic intent. And that usually is to manipulate people in their behavior and in their worldviews.
Topics: Disinformation, Propaganda, Manipulation
Empirical evidence on disinformation effects is lacking
Supporting facts:
- We really lack empirical evidence and I’d like to elaborate a bit on that.
Topics: Disinformation, Academia, Research
Research on disinformation needs to look beyond platforms and consider wider media environments
Supporting facts:
- If we want to learn more about whether or not disinformation works, we need to look beyond platforms and take into account the wider media landscapes. Context matters and there is no point in only looking at platforms and their algorithms.
Topics: Disinformation, Academia, Research, Media
Many people sharing disinformation are signaling their belonging or loyalty, not necessarily believing the information
Supporting facts:
- Evidence showing people may not believe what they’re sharing
- Republicans post-election behavior as example
Topics: Disinformation, Public Discourse
A growing number of people don't care about distinguishing truth from falsity
Supporting facts:
- Increasing tribal attitudes and disregard for truth
Topics: Truth and Falsity, Public Discourse
Protocol standards for the infrastructure had to be globally agreed upon for a global network, but content should not be regulated at a global level.
Topics: Internet Governance, Protocol Standards, Content Regulation
The Digital Service Act by the European Commission is interesting: it extends the scope of human rights so that they not only govern the relationship between people and governments but also guide platform behaviours.
Supporting facts:
- The Digital Service Act will take effect early next year.
- In Germany, traditionally, human rights regulate the relationship between people and the government.
Topics: Digital Service Act, Human Rights, European Commission, Private sector
Jeanette Hofmann supports the provision in the Digital Service Act that provides regulators and researchers access to data related to systemic risks caused by platforms.
Supporting facts:
- Paragraph 40 of the Digital Service Act has a provision that allows for access to such data.
- This data access allows for better, evidence-based understanding of the impact of disinformation
Topics: Digital Service Act, Data Access, Systemic Risks
Governments should not use global concern on disinformation as an excuse to regulate public speech
Topics: Regulation, Freedom of Speech, Government
Report
This discussion explores the impact of disinformation on people’s minds and voting behaviour. One participant criticises the limited knowledge surrounding this issue due to the lack of empirical evidence. They argue that it is essential to conduct research to better understand how disinformation affects individuals and their decision-making processes.
Another participant highlights the strategic intent of disinformation, stating that it is often used as a tool to manipulate people’s behaviour and influence their worldviews. Disinformation is seen as a deliberate tactic that focuses on achieving specific objectives. The conversation also emphasises the need to expand research on disinformation beyond platforms and consider the wider media landscape.
It is noted that context plays a crucial role, and solely examining platforms and algorithms is insufficient. The impact of disinformation should be studied within the broader media environment to gain a comprehensive understanding of its effects. Furthermore, it is observed that individuals sharing disinformation may not necessarily believe the information themselves.
Instead, they may be using it as a means to signal their belonging or loyalty to a certain group or ideology. This highlights the complex motivations behind the sharing of disinformation and the need to consider social and psychological factors in analysing its influence.
The conversation also touches upon the rising disregard for truth and the detrimental impact it has on public discourse and democracy. This trend of increasing tribal attitudes and a lack of concern for distinguishing truth from falsity has severe consequences for the functioning of society and democratic processes.
Regarding the governance of the internet, there is a recognition that infrastructure standards need global agreement to ensure a cohesive global network. However, content regulation should not be undertaken at a global level, as it may impinge upon freedom of speech and local autonomy.
The Digital Service Act, proposed by the European Commission, is viewed as an interesting development. It extends the scope of human rights to not only govern the relationship between people and governments but also guide platform behaviours. This recognition that the private sector’s influence on the exercise of human rights should be guided by human rights principles is seen as positive.
The Act’s provision for data access related to systemic risks caused by platforms is supported. This data access allows for a better, evidence-based understanding of the impact of disinformation. However, the concept of needing to mobilise systemic risk to gain access to data is criticised, highlighting the need for more efficient mechanisms.
The discussion concludes with the suggestion that the Internet Governance Forum (IGF) could serve as a platform to discuss and implement best practices derived from Article 40 of the Digital Service Act. This highlights the potential for international collaboration and knowledge-sharing in addressing disinformation and its consequences.
Overall, this discussion emphasises the urgent need for comprehensive research, consideration of wider media environments, and the recognition of the complex motivations behind the sharing of disinformation. It also addresses the importance of upholding human rights principles and the challenges of content regulation in a global and interconnected digital landscape.
John Mahob
Speech speed
127 words per minute
Speech length
184 words
Speech time
87 secs
Arguments
Disinformation is undermining democracy in the Philippines
Supporting facts:
- The current president is Marcos, the son of the former dictator
- Disinformation influenced the recent elections
Topics: Disinformation, Democracy, Philippines
Report
A recent discussion highlighted the detrimental effect of disinformation on democracy in the Philippines, where the current president is Marcos, the son of the former dictator. One of the key arguments made was that disinformation played a significant role in influencing the outcomes of the recent elections.
Disinformation in the political landscape is seen as a serious threat to the country’s democratic processes. It is suggested that the spread of false information and manipulation of facts can lead to citizens making ill-informed decisions, thus undermining the democratic values of transparency and accountability.
John Mahob, a representative from the Foundation for Media Alternatives in the Philippines, expressed concern over the impact of disinformation on the country's democracy. He stressed the need to address and counter disinformation, as it has the potential to distort public opinion and undermine the credibility of democratic institutions.
He argued that the negative consequences of disinformation are far-reaching. By spreading false narratives and distorting facts, disinformation can erode trust in institutions and create divisions among citizens. It can serve as a tool for those in power to manipulate public sentiment and secure their own interests.
The evidence presented raises important questions about the state of democracy in the Philippines. The influence of disinformation on the recent elections serves as a warning sign that steps need to be taken to protect the integrity of democratic processes.
Efforts to combat disinformation and promote media literacy are crucial in order to safeguard the principles of democracy, uphold freedom of expression, and ensure that citizens are adequately informed to make decisions. In conclusion, the discussion reveals deep concern about the negative impact of disinformation on democracy in the Philippines.
Mahob emphasizes the urgent need to address this issue and prevent disinformation from undermining democratic values. It is hoped that by raising awareness and taking appropriate measures, the Philippines can work towards creating a more informed and resilient democratic society.
Nighat Dad
Speech speed
169 words per minute
Speech length
1308 words
Speech time
464 secs
Arguments
Disinformation might impact democratic processes, but solid evidence is needed.
Supporting facts:
- Disinformation impacts several institutions and voters
Topics: Disinformation, Democracy, Democratic Processes
Definition of mis- and disinformation is complex and contextual
Supporting facts:
- Definitions are very contextual
- Content related to mis- and disinformation is complex to define and identify
Topics: Misinformation, Disinformation, Contextuality
Disinformation can cause harm, especially to marginalized groups
Supporting facts:
- False content being shared can have the intent of causing harm
- UNSR report on gender disinformation shows how it causes harm in various regions
Topics: Disinformation, Harm, Marginalized groups, Women
Global governance instruments exist but their application could be improved
Supporting facts:
- Several principles, conversations and resolutions have been established regarding global governance.
- There is a need for reflection on what actors have learned from these tools.
- Regulations and laws introduced both in the global majority and north often suppress freedom of expression.
Topics: Disinformation, Freedom of Expression, Global Governance
Current treaties and mechanisms should operate synergistically and not in isolation
Supporting facts:
- Existing components can complement each other.
- Necessity to understand how to use existing systems.
Topics: Disinformation, Global Governance
Concern about governments using guidelines for their own interest, hence lack of accountability
Topics: Governance, Human Rights, Jurisdiction, Accountability
Need for regulatory mechanisms to hold state actors accountable
Topics: Regulation, State Accountability
There is a tendency to quickly jump to other solutions without fully examining the established ones.
Topics: problem solving, analysis
There need to be more global oversight boards to hold companies accountable.
Topics: corporate accountability, oversight
Report
Disinformation, which can impact democratic processes, is a topic of concern. However, solid evidence is needed to support this claim. Caution must be exercised in interpreting the complex and contextual definitions of misinformation and disinformation. Disinformation has the potential to harm marginalized groups, and a UN report highlights its negative effects on gender equality.
Global governance instruments exist, but their application needs improvement as regulations and laws often suppress freedom of expression. State actors and companies have a shared obligation to provide accurate information and prevent the spread of misinformation. Synergy between existing systems is crucial, and the performance of oversight boards and governance mechanisms must be reviewed.
Concerns are raised about governments misusing guidelines and the lack of accountability. Regulatory mechanisms are needed to hold state actors accountable. User rights should not be forgotten in regions with restrictions. The local context is vital, and more global oversight boards are necessary to hold companies accountable.
Transparency reports play a key role in holding platforms accountable.
Remy Milan
Speech speed
158 words per minute
Speech length
45 words
Speech time
17 secs
Arguments
Misinformation undermines citizens’ confidence in institutions of the state.
Topics: Misinformation, Democracy
Report
Misinformation poses a significant threat to the stability of state institutions, as it undermines citizens’ confidence in these establishments. This erosion of trust has detrimental effects on democracy and should not be underestimated. Remy Milan also shares this view, considering misinformation to be a high-level danger to state institutions.
The spread of false or misleading information can have far-reaching consequences in a democracy. It confuses and disenchants citizens, weakening the democratic fabric by eroding trust between the governing and the governed. This issue is especially relevant to SDG 16: Peace, Justice and Strong Institutions, which aims to ensure inclusive governance and access to justice for all.
Misinformation disrupts this goal by sowing doubt and creating divisions within society, hindering efforts to achieve peace and justice. It is worth noting that advances in technology, particularly social media platforms, have facilitated the spread of false information, making it easier for malicious actors to manipulate public opinion.
Addressing this issue requires a multi-faceted approach, including education, media literacy, regulation, and responsible platform governance. Overall, the danger of misinformation to state institutions is significant, impacting citizens’ confidence and threatening democracy itself. Remy Milan emphasizes the importance of addressing this issue for achieving SDG 16 and ensuring peace, justice, and strong institutions.
Efforts must be made to promote media literacy, regulate false information, and foster trust and critical thinking to uphold the integrity of state institutions and democratic values.