Viewing Disinformation from a Global Governance Perspective | IGF 2023 WS #209
Event report
Speakers and Moderators
Speakers:
- Nighat Dad, Civil Society, Asia-Pacific Group
- Clara Iglesias Keller, Civil Society, Latin American and Caribbean Group (GRULAC)
- Aaron Maniam, Government, Asia-Pacific Group
- David Kaye, Civil Society, Western European and Others Group (WEOG)
- Jeanette Hofmann, Civil Society, Western European and Others Group (WEOG)
Moderators:
- Anriette Esterhuysen, Civil Society, African Group
Disclaimer: This is not an official record of the IGF session. The DiploAI system automatically generates these resources from the audiovisual recording. Resources are presented in their original format, as provided by the AI (e.g. including any spelling mistakes). The accuracy of these resources cannot be guaranteed. The official record of the session can be found on the IGF's official website.
Session report
Corway Wu
In the realm of disinformation, politicians are also actively involved in creating and spreading false information, not just news media and social media. This widens the range of actors responsible for misleading the public, and the negative sentiment expressed towards politicians indicates a lack of trust in their intentions.
Timing is a crucial aspect in the dissemination of disinformation. The example of Brexit voting behavior is used to demonstrate this. Voters may be influenced by inaccurate information without realizing it until it is too late. This implies that the impact of disinformation can have lasting effects, shaping important decisions.
However, an opposing viewpoint was presented, disputing Jeanette's argument about the significance of timing in discussions of disinformation; the critic held that this treatment of timing weakened her argument. The negative sentiment expressed in this disagreement suggests a potential blind spot in understanding the issue.
Overall, this analysis illustrates the multifaceted nature of disinformation and its wide-ranging consequences. Politicians, news media, and social media platforms are all complicit in perpetuating false information. The timing of disinformation is highlighted as a crucial factor, as it can significantly influence its impact on individuals and societies. The disagreement regarding the importance of timing further emphasizes the complexity of this subject.
Audience
The analysis delved into the multifaceted nature of misinformation and disinformation. One of the speakers put forth the argument that these actions have the potential to incite individuals to act against democratic institutions. To support this claim, they highlighted the example of the January 6th Capitol riots, which they believed were inspired by misinformation and disinformation. The speaker's sentiment was negative, suggesting concern about the impact of these actions on democracy.
However, another speaker expressed a more neutral stance, highlighting the challenge of settling on a clear definition of disinformation. They pointed out that assessing the longitudinal impact of disinformation is challenging. This sentiment indicates a level of uncertainty regarding the extent to which misinformation and disinformation can influence actions and outcomes.
A disagreement emerged regarding the possibility of completely eliminating disinformation. One speaker argued that efforts should be directed towards reducing its spread and minimizing the damage caused, rather than striving for complete elimination. This sentiment aligned with a more positive outlook on the issue.
In the specific context of Switzerland, it was suggested that disinformation does not possess enough influence to significantly sway elections. The speaker based this claim on the observation that Switzerland has a stable multi-party system with relatively consistent voting patterns over the past 30 years. This sentiment reflects a more neutral perspective on the impact of disinformation in the Swiss political landscape.
The analysis also examined the potential effects of disinformation on internet infrastructure and connectivity. There was evidence suggesting that disinformation governance can impact internet infrastructure, with an example cited of Europe implementing IP blocking of Russian websites spreading disinformation. This negative sentiment implies the belief that the weaponization of disinformation through online platforms has had widespread consequences.
The audience raised concerns regarding the potential threat disinformation poses to the fundamental right of freedom of expression. Historical examples, such as the information war of the Cold War and the use of radio for propaganda during Nazi Germany, were provided to illustrate this point. This sentiment highlights the importance of protecting freedom of expression in the face of disinformation.
Notably, the analysis explored the effectiveness of the Christchurch Call initiative, launched in response to the live-streamed terrorist attacks in New Zealand. The sentiment here was negative, as it was argued that rushed solutions to govern and regulate disinformation can cause unintended harm. The speaker stressed the need for a nuanced approach, referencing the treatment of disinformation in G7 declarations.
The audience member supporting the Christchurch Call initiative expressed a positive sentiment, believing in its effectiveness. They emphasized the significance of trust-building and multi-stakeholder involvement in addressing terrorism facilitated by online platforms. This aligns with the overall positive sentiment of utilizing a multi-stakeholder model and engaging governments, tech firms, and civil society in combating disinformation.
In conclusion, the analysis highlighted the complex nature of misinformation and disinformation. The arguments presented ranged from the potential dangers of these actions in undermining democratic institutions to the challenges in defining and assessing their impacts. The disagreement regarding the elimination of disinformation reflected a difference in perspectives, with one side advocating for reducing its spread. The analysis also shed light on the specific impacts of disinformation on internet infrastructure, the threat it poses to freedom of expression, and the potential effectiveness of initiatives such as the Christchurch Call in preventing terrorism. Overall, the analysis underscores the need for nuanced approaches and multi-stakeholder involvement to address disinformation and its various repercussions.
Anriette Esterhuysen
The analysis provides a comprehensive overview of perspectives on disinformation in South Africa, covering various viewpoints and arguments. One viewpoint suggests that disinformation is not a major problem in the country, with more concern placed on trusting the government. The public tends to rely on the media, which is regarded as effectively self-regulated and proficient in dealing with disinformation. Fact-checking is also a common practice in South Africa, swiftly and efficiently debunking false information.
Another argument highlights the successful media regulation in South Africa, which ensures accuracy across different ideological spectrums. It is noted that a commitment to accuracy exists among right-wing, center, and left-wing media outlets in the country. Fact-checking is a prevalent practice, further enhancing the reliability and trustworthiness of the media. This observation supports the notion that media regulation in South Africa effectively maintains accuracy and minimises the spread of disinformation.
The analysis also emphasises the need for careful consideration in the international regulation of disinformation. It is crucial to explore the implications of such regulation on access to information and freedom of expression. While national initiatives regarding disinformation regulation are controversial, existing international instruments may serve as a baseline for effectively governing disinformation.
Furthermore, the analysis highlights the distinct dynamics of weaponising disinformation in online platforms compared to traditional broadcasting platforms. Unlike traditional platforms, online platforms allow for the widespread distribution of disinformation without requiring significant political or economic power. This observation emphasises the need for tailored approaches in combating disinformation across different digital platforms.
A noteworthy observation from the analysis is the advocacy for considering bottom-up approaches and self-regulation measures alongside governmental regulations. Anriette Esterhuysen argues that jumping to governmental regulations without exploring more bottom-up ways may be premature. While a regulatory response might be necessary, Esterhuysen highlights the importance of not dismissing self-regulatory and bottom-up approaches to tackle disinformation. This perspective demonstrates a concern that solely relying on governmental regulations might overlook effective alternatives.
Overall, the analysis offers valuable insights into the various dimensions of disinformation in South Africa. The perspectives presented shed light on the strengths of the country's media regulation, the challenges faced in international regulation, the dynamics of online platforms, and the importance of considering diverse approaches to combat disinformation.
Remy Milan
Misinformation poses a significant threat to the stability of state institutions, as it undermines citizens' confidence in these establishments. Remy Milan considers misinformation a high-level danger on these grounds: the spread of false or misleading information confuses and disenchants citizens, weakening the democratic fabric by eroding trust between the governing and the governed.

This issue is especially relevant to SDG 16: Peace, Justice and Strong Institutions, which aims to ensure inclusive governance and access to justice for all. Misinformation disrupts this goal by sowing doubt and creating divisions within society, hindering efforts to achieve peace and justice. Advances in technology, particularly social media platforms, have facilitated the spread of false information, making it easier for malicious actors to manipulate public opinion.

Addressing the problem therefore requires a multi-faceted approach encompassing education, media literacy, regulation, and responsible platform governance. Milan emphasizes that countering misinformation is essential to achieving SDG 16 and to upholding the integrity of state institutions and democratic values, and calls for efforts to promote media literacy, regulate false information, and foster trust and critical thinking.
Jeanette Hofmann
This discussion explores the impact of disinformation on people's minds and voting behaviour. One participant criticises the limited knowledge surrounding this issue due to the lack of empirical evidence. They argue that it is essential to conduct research to better understand how disinformation affects individuals and their decision-making processes.
Another participant highlights the strategic intent of disinformation, stating that it is often used as a tool to manipulate people's behaviour and influence their worldviews. Disinformation is seen as a deliberate tactic that focuses on achieving specific objectives.
The conversation also emphasises the need to expand research on disinformation beyond platforms and consider the wider media landscape. It is noted that context plays a crucial role, and solely examining platforms and algorithms is insufficient. The impact of disinformation should be studied within the broader media environment to gain a comprehensive understanding of its effects.
Furthermore, it is observed that individuals sharing disinformation may not necessarily believe the information themselves. Instead, they may be using it as a means to signal their belonging or loyalty to a certain group or ideology. This highlights the complex motivations behind the sharing of disinformation and the need to consider social and psychological factors in analysing its influence.
The conversation also touches upon the rising disregard for truth and the detrimental impact it has on public discourse and democracy. This trend of increasing tribal attitudes and a lack of concern for distinguishing truth from falsity has severe consequences for the functioning of society and democratic processes.
Regarding the governance of the internet, there is a recognition that infrastructure standards need global agreement to ensure a cohesive global network. However, content regulation should not be undertaken at a global level, as it may impinge upon freedom of speech and local autonomy.
The Digital Services Act, proposed by the European Commission, is viewed as an interesting development. It extends the scope of human rights to not only govern the relationship between people and governments but also guide platform behaviours. This recognition that the private sector's influence on the exercise of human rights should be guided by human rights principles is seen as positive.
The Act's provision for data access related to systemic risks caused by platforms is supported. This data access allows for a better, evidence-based understanding of the impact of disinformation. However, the concept of needing to mobilise systemic risk to gain access to data is criticised, highlighting the need for more efficient mechanisms.
The discussion concludes with the suggestion that the Internet Governance Forum (IGF) could serve as a platform to discuss and implement best practices derived from Article 40 of the Digital Services Act. This highlights the potential for international collaboration and knowledge-sharing in addressing disinformation and its consequences.
Overall, this discussion emphasises the urgent need for comprehensive research, consideration of wider media environments, and the recognition of the complex motivations behind the sharing of disinformation. It also addresses the importance of upholding human rights principles and the challenges of content regulation in a global and interconnected digital landscape.
Bili Drake
The impact of disinformation on democracy is a complex issue that is influenced by various factors and is context-dependent. Different perspectives exist on the extent to which disinformation can affect democratic processes. Some argue that disinformation can have a significant negative impact on democracy, while others caution against oversimplifying the issue and relying on false dichotomies.
It has been observed that a considerable amount of disinformation originates from broadcast media and is then amplified through social media platforms. This highlights the interconnectedness between different forms of media in the spread of disinformation. Several studies have indicated this behavior, emphasizing the importance of understanding the role played by different media channels in the dissemination of disinformation.
One key aspect that complicates the issue of disinformation is the lack of a standardised definition. Leading organisations like the European Union and the UN Special Rapporteur have differing definitions of disinformation, which can give rise to confusion and inconsistencies in tackling this problem. It becomes crucial to establish a common understanding to effectively address disinformation.
Tribal loyalty is identified as a significant factor that can lead individuals to believe in disinformation. In cases like the United States, where tribal affiliations and identity politics play a prominent role, people may align with certain narratives or disinformation due to their loyalty to a particular group. This highlights how social and political factors can impact an individual's susceptibility to disinformation.
Identity politics further compounds the issue, distorting the perception of truth. Some individuals develop their identities around opposing certain groups or ideologies, leading them to embrace disinformation that aligns with their pre-existing biases. This phenomenon highlights the role of emotions and personal beliefs in shaping the acceptance of disinformation.
Efforts to regulate disinformation on a global level have been proposed, but doubts remain about their effectiveness. Discussions in the United Nations have seen various proposals related to disinformation, such as China's suggestion to criminalise its spread and UNESCO's guidelines for digital platforms. However, the complexities and geopolitical divisions inherent in regulating disinformation make it challenging to achieve meaningful global regulation. As a result, long-term engagement is advocated, focusing on building infrastructure to challenge disinformation effectively.
The proposal for a code of conduct on information integrity for digital platforms is seen as an attempt at global internet governance. This proposal aims to govern the information that flows through digital networks, aligning with the definition of internet governance. It raises questions about the extent to which such regulations should be implemented and their potential impact on freedom of expression and privacy.
The primary responsibility to counter disinformation lies with states, according to the UN General Assembly's resolution. While platforms such as social media play a role, governments bear the primary responsibility to address the issue effectively. Simply pressuring platforms to act does not address the root causes of disinformation.
It is important to recognise that disinformation can originate from various sources, including the dark web. This highlights the need for a comprehensive approach that looks beyond platforms alone. Strategies should encompass multiple sources and channels through which disinformation can be generated and disseminated.
Civil society participation is crucial in the discussion on countering disinformation. While there have been limited discussions on proposals like the UN Secretary General's Global Digital Compact, greater involvement of civil society in such initiatives can ensure diverse perspectives and balanced decision-making.
In conclusion, addressing the issue of disinformation requires a multifaceted approach that involves governments, platforms, and civil society. The complex nature of disinformation and its impact on democracy necessitate a nuanced understanding, taking into account various factors such as media channels, definitions, tribal loyalty, and identity politics. Efforts to regulate disinformation at a global level should be complemented with long-term engagement and infrastructure-building, recognising the challenges and limitations faced in achieving effective global regulation.
Nighat Dad
Disinformation, which can impact democratic processes, is a topic of concern, though solid evidence is needed to support this claim, and caution must be exercised in interpreting the complex and contextual definitions of misinformation and disinformation. Disinformation has particular potential to harm marginalized groups; a UN report highlights its negative effects on gender equality.

Global governance instruments exist, but their application needs improvement, as regulations and laws often suppress freedom of expression. State actors and companies have a shared obligation to provide accurate information and prevent the spread of misinformation. Synergy between existing systems is crucial, and the performance of oversight boards and governance mechanisms must be reviewed.

Concerns are raised about governments misusing guidelines and about the lack of accountability: regulatory mechanisms are needed to hold state actors accountable, and user rights should not be forgotten in regions with restrictions. The local context is vital, and more global oversight boards are necessary to hold companies accountable. Transparency reports play a key role in holding platforms accountable.
Clara Iglesias Keller
Disinformation has the potential to undermine democracy, although its impact varies depending on the context. While there is currently no solid empirical evidence to suggest that disinformation directly changes voters' behaviors or affects election results, there is a consensus that further research is necessary to fully understand its implications.
The existing research on the impact of disinformation is primarily focused on the United States and Europe, highlighting a need for expanding studies to include other regions such as Latin America, Africa, and Asia. It is important to understand how disinformation strategies can influence political transformations in different contexts.
Disinformation is considered a communication practice and an online harm, along with misinformation, propaganda, and fake news. Its intent holds significant legal relevance, further emphasizing the need to address the issue.
In some instances, disinformation serves as a form of political intervention. For example, in Brazil, it has been used to express dissatisfaction or directly attack democratic institutions, including the electoral system and high courts. This highlights the destructive potential of disinformation as a tool in political disputes.
However, the concept of disinformation poses a challenge within statutory regulation, as there is no clear space for its definition and regulation.
Global governance solutions, although important, may not be sufficient to address the impact of misinformation and disinformation on political disputes. It is necessary to confront the ultimate convertibility of economic power into political power, particularly within the media landscape. This is evident in countries like Brazil, where traditionally concentrated and unregulated media landscapes contribute to the spread of disinformation.
Additionally, global solutions often rely on consensus-based governance structures, which may lack the power needed to modify digital business models and data usage effectively.
More empirical evidence is needed, especially outside of the global north. In countries like Brazil, internet usage is strongly associated with platforms such as WhatsApp, Facebook, and Instagram, facilitated by zero rating policies. Understanding the impact of disinformation in these regions is crucial for developing effective countermeasures.
In conclusion, addressing the challenges posed by disinformation requires not only further research but also more institutional innovation. This innovation should create an apparatus that allows diverse stakeholders and civil society to engage in the disputation of truth and content. By confronting the convertibility of economic power into political power and exploring alternative governance structures, we can work towards mitigating the harmful effects of disinformation and safeguarding democratic institutions.
David Kaye
Disinformation is a complex issue that involves the dissemination of false or misleading information. It can have various impacts and is spread through different platforms, including legacy media and social platforms. Understanding the nuances of disinformation is crucial, as there is no one-size-fits-all solution to address it.
David Kaye, an advocate for human rights, highlights the importance of clear definitions and understanding in addressing disinformation through legal regulation and governance. However, he expresses concern about the lack of shared definitions, which may impede the process of regulation. Kaye also raises concerns about emerging regulations in Europe and the UK that rely on platforms to define disinformation, as this may affect transparency and risk assessment.
While global regulation of disinformation may seem desirable, Kaye argues that it is not achievable. Instead, he suggests the development of a common set of guiding principles based on human rights. These principles should be the foundation for addressing disinformation, providing a framework that ensures legality, necessity, proportionality, and legitimacy of objectives.
In shaping policies and strategies to combat disinformation, Kaye believes that civil society should play an active role. They should be included in the drafting and adoption process to ensure a more inclusive approach. Additionally, Kaye argues that governments should be held responsible for their behavior and should support public service media, as excluding them would undermine the effectiveness of addressing disinformation.
Over-reliance on platforms for handling disinformation is a matter of concern. Relying solely on platforms may create challenges in terms of transparency, accountability, and bias. Therefore, it is necessary to explore alternative approaches and strategies to combat disinformation effectively.
The leadership of New Zealand in promoting multi-stakeholder approaches and prioritising human rights in times of trauma, such as after the Christchurch attack, is commended by Kaye. He recognises the importance of keeping human rights at the core of global governance. In this regard, Kaye highlights the Internet Governance Forum (IGF) as a platform where human rights and access to information should be given priority.
However, Kaye also warns against adopting ideas that disregard human rights in response to traumatic events, such as the Israel-Palestine conflict. While people may have natural responses to such events, it is crucial to ensure that any responses or measures taken are rooted in human rights principles.
In conclusion, addressing disinformation requires nuanced understanding and a combination of approaches. Clear definitions, shared principles based on human rights, civil society participation, government responsibility, and cautious reliance on platforms are all essential factors in effectively combating disinformation. New Zealand's leadership and the IGF's emphasis on human rights in global governance are notable examples of positive progress. However, it is crucial to avoid compromising human rights in times of trauma or conflict.
John Mahob
A recent discussion highlighted the detrimental effect of disinformation on democracy in the Philippines. The concern centred on the election of the current president, Marcos, son of the former dictator, with one of the key arguments being that disinformation played a significant role in influencing the outcome of the recent elections.
Disinformation in the political landscape is seen as a serious threat to the country's democratic processes. It is suggested that the spread of false information and manipulation of facts can lead to citizens making ill-informed decisions, thus undermining the democratic values of transparency and accountability.
Supporting this viewpoint, John Mahob, a representative from the Foundation for Media Alternatives in the Philippines, also expressed concern over the impact of disinformation on the country's democracy. He stressed the need to address and counter disinformation, as it has the potential to distort public opinion and undermine the credibility of democratic institutions.
The speakers argued that the negative consequences of disinformation are far-reaching. By spreading false narratives and distorting facts, disinformation can erode trust in institutions and create divisions among citizens. It is seen as a tool that can be used by those in power to manipulate public sentiment and secure their own interests.
The evidence presented by both speakers raises important questions about the state of democracy in the Philippines. The influence of disinformation on the recent elections serves as a warning sign that steps need to be taken to protect the integrity of democratic processes. Efforts to combat disinformation and promote media literacy are crucial in order to safeguard the principles of democracy, uphold freedom of expression, and ensure that citizens are adequately informed to make informed decisions.
In conclusion, the discussion reveals a shared concern about the negative impact of disinformation on democracy in the Philippines. The speakers, notably John Mahob, emphasize the urgent need to address this issue and prevent disinformation from undermining democratic values. It is hoped that by raising awareness and taking appropriate measures, the Philippines can work towards creating a more informed and resilient democratic society.
Aaron Maniam
The analysis explores several key aspects of global governance, regulations, misinformation, and digital regulation. One of the main challenges in defining global governance arises from the coexistence of different models and guidelines; the resulting variation in the level of guidance and enforcement means there is no consensus on the precise meaning of the term.
Concerning global governance regulations, it is crucial to distinguish between basic standards and additional issues. Examples such as the Digital Services Act (DSA) in the European Union and online safety regulations in Singapore and the UK emphasize the significance of addressing both fundamental standards and more complex issues in regulating global governance. These regulations play a significant role in achieving Goal 16 of the Sustainable Development Goals (SDGs): Peace, Justice, and Strong Institutions.
Governments can have contradictory impacts on global governance efforts. On one hand, they can be a source of misinformation, hindering progress towards effective global governance. However, governments also possess the authority and skills necessary to continuously update legislation to keep pace with rapidly evolving technology. This ability is essential for achieving Goal 16 of the SDGs.
Interoperability, the ability for different systems to communicate seamlessly, is a vital aspect of digital regulation. Aaron Maniam highlights the importance of interoperability among different countries, as it enables coherent communication and collaboration. This is linked to Goal 9 of the SDGs: Industry, Innovation, and Infrastructure.
A polycentric approach is crucial in combating disinformation. Governments should move away from solely having an authority-based role and embrace a convening function. By engaging in consultation and deliberation, governments can prioritize and address issues related to disinformation in a bottom-up fashion. Additionally, community building, space building, and urban planning should be part of the government's role in fighting disinformation.
Education and literacy play a pivotal role in tackling disinformation. In Singapore, various organizations, including the National Library Board, the Information and Media Development Authority, and the Cybersecurity Agency, collaborate to operationalize strategies. Education that starts at home and in schools and libraries is highlighted as a key factor in enhancing literacy among citizens. This aligns with Goal 4 of the SDGs: Quality Education.
In summary, the analysis underscores the complexities and challenges of global governance and the importance of clear regulations. It also highlights the dual role of governments as potential sources of misinformation and as crucial actors in updating legislation. Interoperability is crucial for effective digital regulation, and a polycentric approach is essential in combating disinformation. Lastly, education and literacy are vital components in mitigating the impact of disinformation.
Greta
Greta strongly believes that disinformation is significantly weakening democratic systems, a claim expressed with a negative sentiment. However, no specific supporting facts or arguments were provided for it.
Disinformation, the deliberate spread of false or misleading information, poses a serious threat to the democratic process. It can manipulate public opinion, deceive voters, and erode trust in democratic institutions. Greta's agreement with this viewpoint suggests that she recognizes the detrimental effects that disinformation can have on the functioning of democracies.
Although no specific supporting facts or arguments were presented, it is worth considering the widespread impact of disinformation in recent years. The rise of social media platforms has enabled the rapid spread of false information, often disguised as legitimate news or opinions. This has the potential to sway public opinion and distort democratic discourse.
Furthermore, disinformation campaigns have been known to target elections by spreading false information about candidates or manipulating public sentiment. Such tactics can undermine the integrity of electoral processes and compromise the ability of citizens to make informed choices.
The conclusion drawn from Greta's strong agreement is that urgent actions are needed to address the problem of disinformation. Safeguarding the democratic process involves countering disinformation through fact-checking, promoting media literacy, and strengthening regulations on social media platforms. It is essential to restore trust and ensure that accurate and reliable information prevails in democratic societies.
In summary, Greta strongly believes that disinformation is undermining democracy. While specific arguments and supporting facts were not provided, the existence of disinformation poses a clear threat to democratic systems. Addressing this issue requires collective efforts to counter disinformation, promote media literacy, and protect the integrity of democratic processes.
Speakers
AM
Aaron Maniam
Speech speed
214 words per minute
Speech length
1443 words
Speech time
404 secs
Arguments
We have yet to define what we mean by global governance, given the various models that exist
Supporting facts:
- There are de minimis models and maximal models, with varying levels of guidelines and enforcement.
Topics: global governance, policies
It's crucial to distinguish between basic standards and additional issues in global governance regulations
Supporting facts:
- Examples include the DSA and online safety regulations in Singapore and the UK.
Topics: global governance, regulations
Governments can be a source of misinformation, which could hinder global governance efforts
Topics: governments, misinformation
Governments need the ability and the skills to continually update legislation to keep up with dynamic technology
Topics: governments, legislation, technology
The work on combating disinformation needs to be polycentric, and governments should move from having exclusively an authority-based role towards having a convening function.
Supporting facts:
- In Singapore, there has been consultation and deliberation in how to set priorities and address issues in a 'bottom-up' fashion
- Role of government should include community building, space building, and urban planning
Topics: disinformation, government involvement, polycentric approach
Education and literacy are crucial for tackling disinformation.
Supporting facts:
- In Singapore, the National Library Board, the Information and Media Development Authority, and the Cybersecurity Agency work together to operationalize strategy
- Education starts at home and in schools and libraries, and should be included in discussions to enhance literacy among citizens
Topics: disinformation, education, literacy
Report
The analysis explores several key aspects of global governance, regulations, misinformation, and digital regulation. One of the main challenges in defining global governance arises from the coexistence of multiple models with varying levels of guidelines and enforcement, which results in a lack of consensus on the precise meaning of the term.
Concerning global governance regulations, it is crucial to distinguish between basic standards and additional issues. Examples such as the Digital Services Act (DSA) in the European Union and online safety regulations in Singapore and the UK emphasize the significance of addressing both fundamental standards and more complex issues in regulating global governance.
These regulations play a significant role in achieving Goal 16 of the Sustainable Development Goals (SDGs): Peace, Justice, and Strong Institutions. Governments can have contradictory impacts on global governance efforts. On the one hand, they can be a source of misinformation, hindering progress towards effective global governance.
On the other hand, governments possess the authority and skills necessary to continuously update legislation to keep pace with rapidly evolving technology. This ability is essential for achieving Goal 16 of the SDGs. Interoperability, the ability of different systems to communicate seamlessly, is a vital aspect of digital regulation.
Aaron Maniam highlights the importance of interoperability among different countries, as it enables coherent communication and collaboration. This is linked to Goal 9 of the SDGs: Industry, Innovation, and Infrastructure. A polycentric approach is crucial in combating disinformation. Governments should move away from solely having an authority-based role and embrace a convening function.
By engaging in consultation and deliberation, governments can prioritize and address issues related to disinformation in a bottom-up fashion. Additionally, community building, space building, and urban planning should be part of the government's role in fighting disinformation. Education and literacy play a pivotal role in tackling disinformation.
In Singapore, various organizations, including the National Library Board, the Information and Media Development Authority, and the Cybersecurity Agency, collaborate to operationalize strategies. Education that starts at home and in schools and libraries is highlighted as a key factor in enhancing literacy among citizens.
This aligns with Goal 4 of the SDGs: Quality Education. In summary, the analysis underscores the complexities and challenges of global governance and the importance of clear regulations. It also highlights the dual role of governments as potential sources of misinformation and as crucial actors in updating legislation.
Interoperability is crucial for effective digital regulation, and a polycentric approach is essential in combating disinformation. Lastly, education and literacy are vital components in mitigating the impact of disinformation.
AE
Anriette Esterhuysen
Speech speed
169 words per minute
Speech length
2835 words
Speech time
1009 secs
Arguments
Disinformation is not a major problem in South Africa; the greater concern is whether the government can be believed.
Supporting facts:
- South Africa has well self-regulated media that deals with disinformation
- Fact-checking is a common practice done efficiently and quickly
- The public tends to believe the media
Topics: Disinformation, Government credibility
The media regulation in South Africa successfully maintains accuracy across different ideological spectrums.
Supporting facts:
- A common commitment to accuracy across right-wing, center, and left-wing media
- Fact-checking is a common practice
Topics: Media regulation, Accuracy
The weaponizing of disinformation in online platforms is different than traditional broadcasting platforms.
Supporting facts:
- In online platforms, the ability to weaponize disinformation is much more distributed and does not require political or economic power unlike traditional broadcasting platforms.
Topics: Disinformation, Online platforms
Anriette Esterhuysen believes that governmental regulations on disinformation may be jumping the gun before considering bottom-up approaches
Supporting facts:
- She argues that we feel like jumping to this response before we've actually explored more bottom-up ways
Topics: Governmental Regulation, Disinformation, Self-Regulation, Bottom-up Approaches
Report
The analysis provides a comprehensive overview of perspectives on disinformation in South Africa, covering various viewpoints and arguments. One viewpoint suggests that disinformation is not a major problem in the country, with more concern placed on trusting the government. The public tends to rely on the media, which is regarded as well self-regulated and proficient in dealing with disinformation.
Fact-checking is also a common practice in South Africa, swiftly and efficiently debunking false information. Another argument highlights the successful media regulation in South Africa, which ensures accuracy across different ideological spectrums. It is noted that a commitment to accuracy exists among right-wing, center, and left-wing media outlets in the country.
Fact-checking is a prevalent practice, further enhancing the reliability and trustworthiness of the media. This observation supports the notion that media regulation in South Africa effectively maintains accuracy and minimises the spread of disinformation. The analysis also emphasises the need for careful consideration in the international regulation of disinformation.
It is crucial to explore the implications of such regulation on access to information and freedom of expression. While national initiatives regarding disinformation regulation are controversial, existing international instruments may serve as a baseline for effectively governing disinformation. Furthermore, the analysis highlights the distinct dynamics of weaponising disinformation in online platforms compared to traditional broadcasting platforms.
Unlike traditional platforms, online platforms allow for the widespread distribution of disinformation without requiring significant political or economic power. This observation emphasises the need for tailored approaches in combating disinformation across different digital platforms. A noteworthy observation from the analysis is the advocacy for considering bottom-up approaches and self-regulation measures alongside governmental regulations.
Anriette Esterhuysen argues that jumping to governmental regulations without exploring more bottom-up ways may be premature. While a regulatory response might be necessary, Esterhuysen highlights the importance of not dismissing self-regulatory and bottom-up approaches to tackle disinformation. This perspective demonstrates a concern that solely relying on governmental regulations might overlook effective alternatives.
Overall, the analysis offers valuable insights into the various dimensions of disinformation in South Africa. The perspectives presented shed light on the strengths of the country's media regulation, the challenges faced in international regulation, the dynamics of online platforms, and the importance of considering diverse approaches to combat disinformation.
A
Audience
Speech speed
161 words per minute
Speech length
2646 words
Speech time
984 secs
Arguments
Mis- and disinformation can incite people to act against democratic institutions
Supporting facts:
- The speaker cited the January 6th Capitol riots as an instance where these actions were inspired by misinformation and disinformation
Topics: Misinformation, Disinformation, Democracy, January 6th Capitol Riots
Difficulty to settle on a definition of disinformation
Supporting facts:
- Audience belongs to Think Tank
- Background is legal
Topics: Disinformation, Legal background, Think tank
Challenging to assess the longitudinal impact of disinformation
Topics: Disinformation, Impact assessment
Hard to extract substantive grievance from disinformation
Topics: Grievance, Disinformation, Cause and effect
Disinformation is inevitable; it is a natural part of human error.
Topics: Fake News, Disinformation
Disinformation and fake news are an effective weapon, used in context of wars.
Supporting facts:
- Usage of this weapon in Russian-Ukrainian war
Topics: Fake News, Disinformation, Russian-Ukrainian War
It's impossible to completely eliminate disinformation
Topics: Disinformation, Fake News
Disinformation in Switzerland is not influential enough to significantly sway elections
Supporting facts:
- Switzerland has a multi-party system and people have been voting for the same parties over the last 30 years
- There haven't been major changes in voting patterns
Topics: Disinformation, Elections, Switzerland
Disinformation governance can affect internet infrastructure and connectivity
Supporting facts:
- Europe's IP blocking of Russian websites spreading disinformation
Topics: Disinformation Governance, Internet Infrastructure
The threat of undermining the fundamental right to freedom of expression
Supporting facts:
- Historical examples: Information war of the Cold War, Use of radio for propaganda during Nazi Germany
Topics: Freedom of Expression, Censorship
Online platforms have made the weaponizing of disinformation more distributed.
Topics: Disinformation, Online platforms
In democratic systems, even disinformation can be up for debate.
Topics: Disinformation, Democratic System
The Christchurch Call initiative was established in response to the live-streamed terrorist attacks in Christchurch, New Zealand in 2019
Supporting facts:
- Established by the Prime Minister's Special Representative on Cyber and Digital, Paul Ash
- The attack consisted of the murder of 51 worshippers in two mosques
Topics: Christchurch Call, Online Terrorism, Live Streaming
The Christchurch Call has developed 25 commitments for implementation with the aim of preventing such incidents in the future through a broad consultation process
Supporting facts:
- The commitments were negotiated in a hurried eight-week process
- Efforts have been made to involve civil society groups in the consultation process
Topics: Preventing Terrorism, Online Policies
Maintaining a tight scope and having a Secretariat that facilitates trust and enables uncomfortable discussions is key for the success of such an initiative
Supporting facts:
- Keeping the scope limited to one specific subject area has proven effective
- The Secretariat works quietly on contentious issues to build trust across participants
Topics: Trust-building, Policy Negotiation, Scope Management
The multi-stakeholder model that involves different entities like governments, tech firms, and civil society has been instrumental in the Christchurch Call initiative
Supporting facts:
- Different multi-stakeholder configurations have been considered and tested
- This has allowed for a process of building trust and having the necessary uncomfortable discussions
Topics: Multi-stakeholder Participation, Collective Decision Making
Report
The analysis delved into the multifaceted nature of misinformation and disinformation. One of the speakers put forth the argument that these actions have the potential to incite individuals to act against democratic institutions. To support this claim, they highlighted the example of the January 6th Capitol riots, which they believed were inspired by misinformation and disinformation.
The speaker's sentiment was negative, suggesting concern about the impact of these actions on democracy. However, another speaker expressed a more neutral stance, highlighting the challenge of settling on a clear definition of disinformation. They pointed out that assessing the longitudinal impact of disinformation is challenging.
This sentiment indicates a level of uncertainty regarding the extent to which misinformation and disinformation can influence actions and outcomes. A disagreement emerged regarding the possibility of completely eliminating disinformation. One speaker argued that efforts should be directed towards reducing its spread and minimizing the damage caused, rather than striving for complete elimination.
This sentiment aligned with a more positive outlook on the issue. In the specific context of Switzerland, it was suggested that disinformation does not possess enough influence to significantly sway elections. The speaker based this claim on the observation that Switzerland has a stable multi-party system with relatively consistent voting patterns over the past 30 years.
This sentiment reflects a more neutral perspective on the impact of disinformation in the Swiss political landscape. The analysis also examined the potential effects of disinformation on internet infrastructure and connectivity. There was evidence suggesting that disinformation governance can impact internet infrastructure, with an example cited of Europe implementing IP blocking of Russian websites spreading disinformation.
This negative sentiment implies the belief that the weaponization of disinformation through online platforms has had widespread consequences. The audience raised concerns regarding the potential threat disinformation poses to the fundamental right of freedom of expression. Historical examples, such as the information war of the Cold War and the use of radio for propaganda during Nazi Germany, were provided to illustrate this point.
This sentiment highlights the importance of protecting freedom of expression in the face of disinformation. Notably, the analysis explored the effectiveness of the Christchurch Call initiative in response to live-streamed terrorist attacks in New Zealand. The sentiment here was negative, as it was argued that rushed solutions to govern and regulate disinformation can cause unintended harm.
The speaker stressed the need for a nuanced approach to address disinformation, referencing the impact of tackling disinformation in G7 declarations. The audience member supporting the Christchurch Call initiative expressed a positive sentiment, believing in its effectiveness. They emphasized the significance of trust-building and multi-stakeholder involvement in addressing terrorism facilitated by online platforms.
This aligns with the overall positive sentiment of utilizing a multi-stakeholder model and engaging governments, tech firms, and civil society in combating disinformation. In conclusion, the analysis highlighted the complex nature of misinformation and disinformation. The arguments presented ranged from the potential dangers of these actions in undermining democratic institutions to the challenges in defining and assessing their impacts.
The disagreement regarding the elimination of disinformation reflected a difference in perspectives, with one side advocating for reducing its spread. The analysis also shed light on the specific impacts of disinformation on internet infrastructure, the threat it poses to freedom of expression, and the potential effectiveness of initiatives such as the Christchurch Call in preventing terrorism.
Overall, the analysis underscores the need for nuanced approaches and multi-stakeholder involvement to address disinformation and its various repercussions.
BD
Bili Drake
Speech speed
191 words per minute
Speech length
2554 words
Speech time
801 secs
Arguments
Disinformation's impact on democracy is highly dependent on context
Supporting facts:
- Bili Drake refers to the session at the Taiwan IGF last week
Topics: Disinformation, Democracy, Context Dependence
A lot of the disinformation does not originally come from social media but from broadcast media which is then amplified by social media
Supporting facts:
- There have been multiple studies indicating this behavior
Topics: Disinformation, Social Media, Broadcast Media
The definitions of disinformation tend to vary significantly among leading organizations
Supporting facts:
- The European Union and UN Special Rapporteur definitions of disinformation differ significantly in their details
Topics: Disinformation, Definition
Believes that tribal loyalty leads to belief in disinformation
Supporting facts:
- There are indeed lots of people who will say they believe in disinformation because of tribal loyalty
- There's no question about that in the American case
Topics: Disinformation, Tribal loyalty
Identity politics can distort the perception of truth
Supporting facts:
- There's a lot of people who have just built their identity around giving the finger to the other side
- It's all about owning the libs, owning the other side, giving the finger to the people you don't identify with
Topics: Identity Politics, Perception, Truth
Regulating disinformation at a global level is probably not effective, but efforts to build infrastructure to challenge it should be continued.
Supporting facts:
- UN discussions around cybercrime and cybersecurity have seen many proposals related to disinformation
- The difficulty of managing disinformation is evident in geopolitically divided environments
- China proposed laws calling the spread of disinformation a criminal offence
- UNESCO guidelines for digital platforms also touch upon disinformation
Topics: Disinformation, Global Regulation, Internet Governance
Efforts are made internationally to handle disinformation like the UNESCO guidelines for digital platforms, China's proposal in the cybercrime treaty negotiations, and UN Secretary General's proposal for a code of conduct.
Supporting facts:
- China proposed language in the cybercrime treaty saying that all governments should adopt laws making the spread of disinformation a criminal offense
- UNESCO is finalizing guidelines for the regulation of digital platforms
- UN Secretary General is proposing a code of conduct for information integrity on digital platforms.
Topics: Disinformation, Global Regulation, Internet Governance
Bili Drake wants to know the views of other panelists on the Secretary General's proposal
Supporting facts:
- The Secretary General of the United Nations is proposing a code of conduct on information integrity for digital platforms.
Topics: Internet governance, Code of Conduct, Information integrity, Digital platforms
States bear the primary responsibility to counter disinformation.
Supporting facts:
- The UN General Assembly had a resolution on countering disinformation and asked the Secretary General to do a report.
- The report stated that states bear the primary responsibility to counter disinformation.
Topics: Disinformation, Secretary General's initiative, Freedom of opinion and expression, Privacy
Focusing on platforms alone won't solve the disinformation issue.
Supporting facts:
- There are various sources of disinformation, including the dark web.
- People can hire others to generate disinformation.
Topics: Disinformation, Social Media Platforms, Freedom of opinion and expression, Dark Web
There is a need for greater civil society participation in the Secretary General's process.
Supporting facts:
- IGF sessions have not extensively discussed the proposals under the Secretary General's Global Digital Compact.
Topics: Civil society participation, Secretary General's initiative, Global Digital Compact
Bili Drake suggests that the European Union should fine Elon Musk under the Digital Services Act.
Supporting facts:
- Elon Musk has disregarded the guidelines set by the European Union
Topics: European Union, Digital Services Act, Elon Musk
Report
The impact of disinformation on democracy is a complex issue that is influenced by various factors and is context-dependent. Different perspectives exist on the extent to which disinformation can affect democratic processes. Some argue that disinformation can have a significant negative impact on democracy, while others caution against oversimplifying the issue and relying on false dichotomies.
It has been observed that a considerable amount of disinformation originates from broadcast media and is then amplified through social media platforms. This highlights the interconnectedness between different forms of media in the spread of disinformation. Several studies have indicated this behavior, emphasizing the importance of understanding the role played by different media channels in the dissemination of disinformation.
One key aspect that complicates the issue of disinformation is the lack of a standardised definition. Leading organisations like the European Union and the UN Special Rapporteur have differing definitions of disinformation, which can give rise to confusion and inconsistencies in tackling this problem.
It becomes crucial to establish a common understanding to effectively address disinformation. Tribal loyalty is identified as a significant factor that can lead individuals to believe in disinformation. In cases like the United States, where tribal affiliations and identity politics play a prominent role, people may align with certain narratives or disinformation due to their loyalty to a particular group.
This highlights how social and political factors can impact an individual's susceptibility to disinformation. Identity politics further compounds the issue, distorting the perception of truth. Some individuals develop their identities around opposing certain groups or ideologies, leading them to embrace disinformation that aligns with their pre-existing biases.
This phenomenon highlights the role of emotions and personal beliefs in shaping the acceptance of disinformation. Efforts to regulate disinformation on a global level have been proposed, but doubts remain about their effectiveness. Discussions in the United Nations have seen various proposals related to disinformation, such as China's suggestion to criminalise its spread and UNESCO's guidelines for digital platforms.
However, the complexities and geopolitical divisions inherent in regulating disinformation make it challenging to achieve meaningful global regulation. As a result, long-term engagement is advocated, focusing on building infrastructure to challenge disinformation effectively. The proposal for a code of conduct on information integrity for digital platforms is seen as an attempt at global internet governance.
This proposal aims to govern the information that flows through digital networks, aligning with the definition of internet governance. It raises questions about the extent to which such regulations should be implemented and their potential impact on freedom of expression and privacy.
The primary responsibility to counter disinformation lies with states, according to the UN General Assembly's resolution. While platforms such as social media play a role, governments bear the primary responsibility to address the issue effectively. Simply pressuring platforms to act does not address the root causes of disinformation.
It is important to recognise that disinformation can originate from various sources, including the dark web. This highlights the need for a comprehensive approach that looks beyond platforms alone. Strategies should encompass multiple sources and channels through which disinformation can be generated and disseminated.
Civil society participation is crucial in the discussion on countering disinformation. While there have been limited discussions on proposals like the UN Secretary General's Global Digital Compact, greater involvement of civil society in such initiatives can ensure diverse perspectives and balanced decision-making.
In conclusion, addressing the issue of disinformation requires a multifaceted approach that involves governments, platforms, and civil society. The complex nature of disinformation and its impact on democracy necessitate a nuanced understanding, taking into account various factors such as media channels, definitions, tribal loyalty, and identity politics.
Efforts to regulate disinformation at a global level should be complemented with long-term engagement and infrastructure-building, recognising the challenges and limitations faced in achieving effective global regulation.
CI
Clara Iglesias Keller
Speech speed
167 words per minute
Speech length
1095 words
Speech time
394 secs
Arguments
Disinformation can undermine democracy depending on context
Supporting facts:
- Disinformation affects different political contexts in different ways.
- The impact it has on elections and voter behavior is yet to be fully understood.
Topics: Disinformation, Democracy
Current empirical evidence on the impact of misinformation is focused mainly on the U.S. and Europe
Supporting facts:
- Research focus should expand to include contexts from Latin America, Africa, and Asia.
Topics: Disinformation, Geographical bias in research
Disinformation is a communication practice and an online harm
Supporting facts:
- Disinformation comes alongside misinformation, propaganda, fake news.
Topics: Disinformation, Communication
The intent behind disinformation holds a big chunk of legal relevance
Topics: Disinformation, Law
Disinformation functions as a form of political intervention
Supporting facts:
- In Brazil, this type of intervention serves as a means to show dissatisfaction or to directly attack democratic institutions, particular the electoral system and high courts.
Topics: Disinformation, politics
Global governance solutions may not be sufficient to countermeasure misinformation and disinformation roles in political disputes
Supporting facts:
- Political and economic interests shaping societies might play role in misinformation and disinformation
- Mitigating disinformation involves confronting broader issues than just communication practices
Topics: Global Governance, Misinformation and disinformation, Political disputes
Global solutions often rely on consensus-based governance structures which may not be powerful enough to modify digital business models and data usages
Supporting facts:
- Global governance does not include authority enforcement
- Current digital business models and data usages may require stronger regulatory measures
Topics: Global Solutions, Consensus-based Governance, Digital Business Models, Data Usage
Need for more empirical evidence outside of the global north
Supporting facts:
- In Latin American countries like Brazil, a majority of the population equates internet with WhatsApp, Facebook, and Instagram
- This is facilitated by zero rating policies
Topics: Internet Usage, Communication Apps
Report
Disinformation has the potential to undermine democracy, although its impact varies depending on the context. While there is currently no solid empirical evidence to suggest that disinformation directly changes voters' behaviors or affects election results, there is a consensus that further research is necessary to fully understand its implications.
The existing research on the impact of disinformation is primarily focused on the United States and Europe, highlighting a need for expanding studies to include other regions such as Latin America, Africa, and Asia. It is important to understand how disinformation strategies can influence political transformations in different contexts.
Disinformation is considered a communication practice and an online harm, along with misinformation, propaganda, and fake news. Its intent holds significant legal relevance, further emphasizing the need to address the issue. In some instances, disinformation serves as a form of political intervention.
For example, in Brazil, it has been used to express dissatisfaction or directly attack democratic institutions, including the electoral system and high courts. This highlights the destructive potential of disinformation as a tool in political disputes. However, the concept of disinformation poses a challenge within statutory regulation, as there is no clear space for its definition and regulation.
Global governance solutions, although important, may not be sufficient to address the impact of misinformation and disinformation on political disputes. It is necessary to confront the ultimate convertibility of economic power into political power, particularly within the media landscape. This is evident in countries like Brazil, where traditionally concentrated and unregulated media landscapes contribute to the spread of disinformation.
Additionally, global solutions often rely on consensus-based governance structures, which may lack the power needed to modify digital business models and data usage effectively. More empirical evidence is needed, especially outside of the global north. In countries like Brazil, internet usage is strongly associated with platforms such as WhatsApp, Facebook, and Instagram, facilitated by zero rating policies.
Understanding the impact of disinformation in these regions is crucial for developing effective countermeasures. In conclusion, addressing the challenges posed by disinformation requires not only further research but also more institutional innovation. This innovation should create an apparatus that allows diverse stakeholders and civil society to engage in the disputation of truth and content.
By confronting the convertibility of economic power into political power and exploring alternative governance structures, we can work towards mitigating the harmful effects of disinformation and safeguarding democratic institutions.
CW
Corway Wu
Speech speed
147 words per minute
Speech length
161 words
Speech time
66 secs
Arguments
Politicians also create and spread disinformation, not only news media or social media
Topics: Disinformation, Politics, Media
Timing is crucial in the spread of disinformation, as it can be too late by the time voters realise they have been moved by it
Supporting facts:
- Brexit voting behavior as an example
Topics: Disinformation, Time, Voting Behavior
Report
In the realm of disinformation, politicians are also actively involved in creating and spreading false information, not just news media and social media. This amplifies the scope of actors responsible for misleading the public. The negative sentiment towards politicians indicates a lack of trust in their intentions.
Timing is a crucial aspect in the dissemination of disinformation. The example of Brexit voting behavior is used to demonstrate this. Voters may be influenced by inaccurate information without realizing it until it is too late. This implies that the impact of disinformation can have lasting effects, shaping important decisions.
However, an opposing viewpoint is presented, disagreeing with Jeanette's position on disinformation. It is argued that Jeanette's failure to consider timing weakens her argument. The negative sentiment expressed in this disagreement suggests a potential blind spot in understanding the issue.
Overall, this analysis illustrates the multifaceted nature of disinformation and its wide-ranging consequences. Politicians, news media, and social media platforms are all complicit in perpetuating false information. The timing of disinformation is highlighted as a crucial factor, as it can significantly influence its impact on individuals and societies.
The disagreement regarding the importance of timing further emphasizes the complexity of this subject.
DK
David Kaye
Speech speed
150 words per minute
Speech length
1355 words
Speech time
542 secs
Arguments
Disinformation is a complex issue that requires nuanced understanding and cannot be addressed with a one-size-fits-all solution
Supporting facts:
- The perpetrators and locations of disinformation are important
- Different types of disinformation can have different impacts
- The platform used for disinformation may vary from legacy media to social platforms
Topics: Disinformation, Global Governance, Contextual Analysis
David Kaye believes clarity in the definition of disinformation is critical for legal regulation and governance
Supporting facts:
- He suggests that a lack of shared definitions may impede the process of legal regulation.
- He expresses concern about emerging regulations in Europe and the UK that rely on platforms to define disinformation during their work of transparency and risk assessment.
Topics: Disinformation, Legal Regulation, Governance
Global regulation of disinformation is not achievable
Supporting facts:
- He considers the idea of a concrete set of rules that will guide decision makers in every context and will have global oversight as a chimera
Topics: Disinformation, Regulation, Global Governance
We need a common set of principles to guide authorities and platforms
Supporting facts:
- Governments, platforms and media outlets should have a common set of guiding principles
Topics: Disinformation, Principles, Platforms
Human rights law should be the base of these principles
Supporting facts:
- He suggests standards based on Article 19 on principles of legality, necessity and proportionality, and legitimacy of the objective
Topics: Human Rights, Law, Principles
Civil society should play a more active role in shaping documents and be included in the drafting and adoption process
Supporting facts:
Topics: Civil Society Participation, Policy Drafting
Excluding governments from the responsibility of better behavior and resourcing public service media will detract value from the process
Supporting facts:
- Examples from the Philippines and the United States
Topics: Government Responsibility, Public Service Media
David Kaye appreciates New Zealand's leadership role in promoting multi-stakeholder approaches, even in times of trauma, such as after the Christchurch attack.
Supporting facts:
- Paul Ash's stewardship of the Christchurch call had human rights at its core
Topics: New Zealand's leadership, multi-stakeholder approaches, Christchurch attack
He highlights the need to keep human rights at the core of global governance and believes that the IGF can be a platform where human rights and access to information is prioritized.
Supporting facts:
- There are models that can be drawn from to move forward, keeping human rights at the core
Topics: global governance, human rights, Internet Governance Forum
Report
Disinformation is a complex issue that involves the dissemination of false or misleading information. It can have various impacts and is spread through different platforms, including legacy media and social platforms. Understanding the nuances of disinformation is crucial, as there is no one-size-fits-all solution to address it.
David Kaye, an advocate for human rights, highlights the importance of clear definitions and understanding in addressing disinformation through legal regulation and governance. However, he expresses concern about the lack of shared definitions, which may impede the process of regulation.
Kaye also raises concerns about emerging regulations in Europe and the UK that rely on platforms to define disinformation, as this may affect transparency and risk assessment. While global regulation of disinformation may seem desirable, Kaye argues that it is not achievable.
Instead, he suggests the development of a common set of guiding principles based on human rights. These principles should be the foundation for addressing disinformation, providing a framework that ensures legality, necessity, proportionality, and legitimacy of objectives. In shaping policies and strategies to combat disinformation, Kaye believes that civil society should play an active role.
They should be included in the drafting and adoption process to ensure a more inclusive approach. Additionally, Kaye argues that governments should be held responsible for their behavior and should support public service media, as excluding them would undermine the effectiveness of addressing disinformation.
Over-reliance on platforms for handling disinformation is a matter of concern. Relying solely on platforms may create challenges in terms of transparency, accountability, and bias. Therefore, it is necessary to explore alternative approaches and strategies to combat disinformation effectively. The leadership of New Zealand in promoting multi-stakeholder approaches and prioritising human rights in times of trauma, such as after the Christchurch attack, is commended by Kaye.
He recognises the importance of keeping human rights at the core of global governance. In this regard, Kaye highlights the Internet Governance Forum (IGF) as a platform where human rights and access to information should be given priority. However, Kaye also warns against adopting ideas that disregard human rights in response to traumatic events, such as the Israel-Palestine conflict.
While people may have natural responses to such events, it is crucial to ensure that any responses or measures taken are rooted in human rights principles. In conclusion, addressing disinformation requires nuanced understanding and a combination of approaches. Clear definitions, shared principles based on human rights, civil society participation, government responsibility, and cautious reliance on platforms are all essential factors in effectively combating disinformation.
New Zealand's leadership and the IGF's emphasis on human rights in global governance are notable examples of positive progress. However, it is crucial to avoid compromising human rights in times of trauma or conflict.
G
Greta
Speech speed
113 words per minute
Speech length
37 words
Speech time
20 secs
Arguments
Disinformation is undermining democracy
Topics: Disinformation, Democracy
Report
Greta strongly believes that disinformation is significantly weakening democratic systems. This issue is related to the topics of disinformation and democracy and is associated with a negative sentiment. However, no specific supporting facts or arguments were provided to support the claim that disinformation undermines democracy.
Disinformation, the deliberate spread of false or misleading information, poses a serious threat to the democratic process. It can manipulate public opinion, deceive voters, and erode trust in democratic institutions. Greta's agreement with this viewpoint suggests that she recognizes the detrimental effects that disinformation can have on the functioning of democracies.
Although no specific supporting facts or arguments were presented, it is worth considering the widespread impact of disinformation in recent years. The rise of social media platforms has enabled the rapid spread of false information, often disguised as legitimate news or opinions.
This has the potential to sway public opinion and distort democratic discourse. Furthermore, disinformation campaigns have been known to target elections by spreading false information about candidates or manipulating public sentiment. Such tactics can undermine the integrity of electoral processes and compromise the ability of citizens to make informed choices.
The conclusion drawn from Greta's strong agreement is that urgent actions are needed to address the problem of disinformation. Safeguarding the democratic process involves countering disinformation through fact-checking, promoting media literacy, and strengthening regulations on social media platforms. It is essential to restore trust and ensure that accurate and reliable information prevails in democratic societies.
In summary, Greta strongly believes that disinformation is undermining democracy. While specific arguments and supporting facts were not provided, the existence of disinformation poses a clear threat to democratic systems. Addressing this issue requires collective efforts to counter disinformation, promote media literacy, and protect the integrity of democratic processes.
JH
Jeanette Hofmann
Speech speed
138 words per minute
Speech length
1457 words
Speech time
631 secs
Arguments
Little is known about how disinformation affects people's minds and voting behavior
Topics: Disinformation, Voting behavior
Disinformation focuses on strategic intent
Supporting facts:
- Usually we say it's a decision we make between misinformation and disinformation, disinformation focuses on strategic intent. And that usually is to manipulate people in their behavior and in their worldviews.
Topics: Disinformation, Propaganda, Manipulation
Empirical evidence on disinformation effects is lacking
Supporting facts:
- We really lack empirical evidence and I'd like to elaborate a bit on that.
Topics: Disinformation, Academia, Research
Research on disinformation needs to look beyond platforms and consider wider media environments
Supporting facts:
- If we want to learn more about whether or not disinformation works, we need to look beyond platforms and take into account the wider media landscapes. Context matters and there is no point in only looking at platforms and their algorithms.
Topics: Disinformation, Academia, Research, Media
Many people sharing disinformation are signaling their belonging or loyalty, not necessarily believing the information
Supporting facts:
- Evidence showing people may not believe what they're sharing
- Republicans post-election behavior as example
Topics: Disinformation, Public Discourse
A growing number of people do not care about distinguishing truth from falsity
Supporting facts:
- Increasing tribal attitudes and disregard for truth
Topics: Truth and Falsity, Public Discourse
Protocol standards for the infrastructure had to be globally agreed upon for a global network, but content should not be regulated at a global level.
Topics: Internet Governance, Protocol Standards, Content Regulation
The Digital Services Act by the European Commission is interesting; it extends the scope of human rights to govern not just the relationship between people and governments, but also to guide platform behaviours.
Supporting facts:
- The Digital Services Act will take effect early next year.
- In Germany, traditionally, human rights regulate the relationship between people and the government.
Topics: Digital Services Act, Human Rights, European Commission, Private sector
Jeanette Hofmann supports the provision in the Digital Services Act that gives regulators and researchers access to data related to systemic risks caused by platforms.
Supporting facts:
- Article 40 of the Digital Services Act has a provision that allows for access to such data.
- This data access allows for a better, evidence-based understanding of the impact of disinformation
Topics: Digital Services Act, Data Access, Systemic Risks
Governments should not use global concern on disinformation as an excuse to regulate public speech
Topics: Regulation, Freedom of Speech, Government
Report
This discussion explores the impact of disinformation on people's minds and voting behaviour. One participant notes how little is known about this issue, owing to the lack of empirical evidence. They argue that it is essential to conduct research to better understand how disinformation affects individuals and their decision-making processes.
Another participant highlights the strategic intent of disinformation, stating that it is often used as a tool to manipulate people's behaviour and influence their worldviews. Disinformation is seen as a deliberate tactic that focuses on achieving specific objectives. The conversation also emphasises the need to expand research on disinformation beyond platforms and consider the wider media landscape.
It is noted that context plays a crucial role, and solely examining platforms and algorithms is insufficient. The impact of disinformation should be studied within the broader media environment to gain a comprehensive understanding of its effects. Furthermore, it is observed that individuals sharing disinformation may not necessarily believe the information themselves.
Instead, they may be using it as a means to signal their belonging or loyalty to a certain group or ideology. This highlights the complex motivations behind the sharing of disinformation and the need to consider social and psychological factors in analysing its influence.
The conversation also touches upon the rising disregard for truth and the detrimental impact it has on public discourse and democracy. This trend of increasing tribal attitudes and a lack of concern for distinguishing truth from falsity has severe consequences for the functioning of society and democratic processes.
Regarding the governance of the internet, there is a recognition that infrastructure standards need global agreement to ensure a cohesive global network. However, content regulation should not be undertaken at a global level, as it may impinge upon freedom of speech and local autonomy.
The Digital Services Act, proposed by the European Commission, is viewed as an interesting development. It extends the scope of human rights to not only govern the relationship between people and governments but also guide platform behaviours. This recognition that the private sector's influence on the exercise of human rights should be guided by human rights principles is seen as positive.
The Act's provision for data access related to systemic risks caused by platforms is supported. This data access allows for a better, evidence-based understanding of the impact of disinformation. However, the concept of needing to mobilise systemic risk to gain access to data is criticised, highlighting the need for more efficient mechanisms.
The discussion concludes with the suggestion that the Internet Governance Forum (IGF) could serve as a platform to discuss and implement best practices derived from Article 40 of the Digital Services Act. This highlights the potential for international collaboration and knowledge-sharing in addressing disinformation and its consequences.
Overall, this discussion emphasises the urgent need for comprehensive research, consideration of wider media environments, and the recognition of the complex motivations behind the sharing of disinformation. It also addresses the importance of upholding human rights principles and the challenges of content regulation in a global and interconnected digital landscape.
JM
John Mahob
Speech speed
127 words per minute
Speech length
184 words
Speech time
87 secs
Arguments
Disinformation is undermining democracy in the Philippines
Supporting facts:
- The current president is Marcos, the son of the former dictator
- Disinformation influenced the recent elections
Topics: Disinformation, Democracy, Philippines
Report
A recent discussion highlighted the detrimental effect of disinformation on democracy in the Philippines, where the current president is Marcos, the son of the former dictator. One of the key arguments made was that disinformation played a significant role in influencing the outcomes of the recent elections.
Disinformation in the political landscape is seen as a serious threat to the country's democratic processes. It is suggested that the spread of false information and manipulation of facts can lead to citizens making ill-informed decisions, thus undermining the democratic values of transparency and accountability.
Supporting this viewpoint, John Mahob, a representative from the Foundation for Media Alternatives in the Philippines, also expressed concern over the impact of disinformation on the country's democracy. He stressed the need to address and counter disinformation, as it has the potential to distort public opinion and undermine the credibility of democratic institutions.
The speakers argued that the negative consequences of disinformation are far-reaching. By spreading false narratives and distorting facts, disinformation can erode trust in institutions and create divisions among citizens. It is seen as a tool that can be used by those in power to manipulate public sentiment and secure their own interests.
The evidence presented by both speakers raises important questions about the state of democracy in the Philippines. The influence of disinformation on the recent elections serves as a warning sign that steps need to be taken to protect the integrity of democratic processes.
Efforts to combat disinformation and promote media literacy are crucial in order to safeguard the principles of democracy, uphold freedom of expression, and ensure that citizens are adequately informed to make informed decisions. In conclusion, the discussion reveals a shared concern about the negative impact of disinformation on democracy in the Philippines.
The speakers, including John Mahob, emphasise the urgent need to address this issue and prevent disinformation from undermining democratic values. It is hoped that by raising awareness and taking appropriate measures, the Philippines can work towards creating a more informed and resilient democratic society.
ND
Nighat Dad
Speech speed
169 words per minute
Speech length
1308 words
Speech time
464 secs
Arguments
Disinformation might impact democratic processes, but solid evidence is needed.
Supporting facts:
- Disinformation impacts several institutions and voters
Topics: Disinformation, Democracy, Democratic Processes
Definition of mis- and disinformation is complex and contextual
Supporting facts:
- Definitions are very contextual
- Content related to mis- and disinformation complex to define and identify
Topics: Misinformation, Disinformation, Contextuality
Disinformation can cause harm, especially to marginalized groups
Supporting facts:
- False content being shared can have intent of causing harm
- UNSR report on gender disinformation shows how it causes harm in various regions
Topics: Disinformation, Harm, Marginalized groups, Women
Global governance instruments exist but their application could be improved
Supporting facts:
- Several principles, conversations and resolutions have been established regarding global governance.
- There is need for reflection on what actors have learned from these tools.
- Regulations and laws introduced both in the global majority and north often suppress freedom of expression.
Topics: Disinformation, Freedom of Expression, Global Governance
Current treaties and mechanisms should operate synergistically and not in isolation
Supporting facts:
- Existing components can complement each other.
- Necessity to understand how to use existing systems.
Topics: Disinformation, Global Governance
Concern about governments using guidelines for their own interests, leading to a lack of accountability
Topics: Governance, Human Rights, Jurisdiction, Accountability
Need for regulatory mechanisms to hold state actors accountable
Topics: Regulation, State Accountability
There is a tendency to quickly jump to other solutions without fully examining the established ones.
Topics: problem solving, analysis
There needs to be more global oversight boards to hold companies accountable.
Topics: corporate accountability, oversight
Report
Disinformation, which can impact democratic processes, is a topic of concern. However, solid evidence is needed to support this claim. Caution must be exercised in interpreting the complex and contextual definitions of misinformation and disinformation. Disinformation has the potential to harm marginalized groups, and a UN report highlights its negative effects on gender equality.
Global governance instruments exist, but their application needs improvement as regulations and laws often suppress freedom of expression. State actors and companies have a shared obligation to provide accurate information and prevent the spread of misinformation. Synergy between existing systems is crucial, and the performance of oversight boards and governance mechanisms must be reviewed.
Concerns are raised about governments misusing guidelines and the lack of accountability. Regulatory mechanisms are needed to hold state actors accountable. User rights should not be forgotten in regions with restrictions. The local context is vital, and more global oversight boards are necessary to hold companies accountable.
Transparency reports play a key role in holding platforms accountable.
RM
Remy Milan
Speech speed
158 words per minute
Speech length
45 words
Speech time
17 secs
Arguments
Misinformation undermines citizens' confidence in institutions of the state.
Topics: Misinformation, Democracy
Report
Misinformation poses a significant threat to the stability of state institutions, as it undermines citizens' confidence in these establishments. This erosion of trust has detrimental effects on democracy and should not be underestimated. Remy Milan also shares this view, considering misinformation to be a high-level danger to state institutions.
The spread of false or misleading information can have far-reaching consequences in a democracy. It confuses and disenchants citizens, weakening the democratic fabric by eroding trust between the governing and the governed. This issue is especially relevant to SDG 16: Peace, Justice and Strong Institutions, which aims to ensure inclusive governance and access to justice for all.
Misinformation disrupts this goal by sowing doubt and creating divisions within society, hindering efforts to achieve peace and justice. It is worth noting that advances in technology, particularly social media platforms, have facilitated the spread of false information, making it easier for malicious actors to manipulate public opinion.
Addressing this issue requires a multi-faceted approach, including education, media literacy, regulation, and responsible platform governance. Overall, the danger of misinformation to state institutions is significant, impacting citizens' confidence and threatening democracy itself. Remy Milan emphasizes the importance of addressing this issue for achieving SDG 16 and ensuring peace, justice, and strong institutions.
Efforts must be made to promote media literacy, regulate false information, and foster trust and critical thinking to uphold the integrity of state institutions and democratic values.