
IGF Daily Summary #1



Tuesday, 29 November 2022

Dear reader, 

Welcome to the IGF Daily #1

As in the previous 8 years, we have a team of rapporteurs following most of the sessions and providing just-in-time reports from the Internet Governance Forum (IGF). Starting today, we are also publishing IGF Dailies, looking back at what happened the day before, but with a twist: Instead of summarising all the sessions, we are highlighting only what is new.

Think about it this way: Instead of saying that the discussions underscored the importance of bridging the digital divide once and for all, we will tell you if new solutions are proposed on how to get there. Our IGF dailies will also be enriched by data analyses and illustrations.


Do you like what you’re reading? Bookmark us at https://dig.watch/event/igf2022 and tweet us @genevagip.

Have you heard something new during the discussions, but we’ve missed it? Send us your suggestions at digitalwatch@diplomacy.edu.

Digital Watch team

IGF 2022 logo

Setting the tone

The 17th Internet Governance Forum (IGF) was officially opened on 29 November 2022 under the theme ‘Resilient internet for a shared sustainable and common future’. This overarching topic and its five themes, aligned with the UN Secretary-General’s envisioned Global Digital Compact, echoed throughout the opening speeches and the high-level panel that followed. 

While the speakers saw the internet and digital technologies as accelerators for digital transformation and a springboard for the SDGs, they underlined the need for a human-centred and human rights approach to the digital future. Resilient digital infrastructure, interoperability, harmonisation of regulations, connectivity, affordability, and relevant content were mentioned as priorities for building an inclusive digital future. Challenges remain in defining the boundaries of digital transformation and in putting it into practice.

The dark side of the internet – including deadly disinformation, online bullying, and challenges to freedom of expression, among others – needs to be tackled if we are to forge a digital future where access to fast, safe, inclusive, and affordable internet is a given, not a privilege. 

Speakers also reflected on the role of the IGF and its contribution to the Global Digital Compact. The IGF is seen as a convener and a connector, creating a level playing field for sharing policy solutions, best practices, and experiences, and for identifying emerging issues. Discussants underscored that the value of the IGF is in its multistakeholder model. The role of the IGF Leadership Panel was highlighted as well.

Some speakers, however, called upon the IGF to up its game and produce more than just reflections. The IGF, they say, should put forward tangible proposals, whether for shaping global norms and standards, informing national-level regulations, connecting citizens with their governments, contributing to the Global Digital Compact, or contributing to the UN Summit for the Future in September 2024. 

A High-Level Leaders Track discussion on digital trust and security drew parallels between digital transformation, security, and climate change. 

The main challenge was finding a baseline for trust among countries while respecting their sovereignty and fostering cooperation. A tough topic on its own, it comes at a time when digital security is under threat: Malicious actors are targeting the critical infrastructures of hospitals, airports, and power grids, with devastating human consequences. The panellists explored this issue from two perspectives: What are the existing barriers to digital trust and security, and what practices are in place that can foster a common understanding on underlying principles of trust and security?

The speakers pointed out the need for trust between different stakeholders – governments, law enforcement, civil society, service providers, and users – to support collaboration. A greater challenge lies in fostering trust between states, where the discussion turned to current avenues for exchanging opinions on security at the international level.

While not numerous, the existing processes – such as the Ad Hoc Committee on Cybercrime, UNESCO’s Recommendation on the Ethics of Artificial Intelligence, the Paris Call for Trust and Security in Cyberspace, the Oxford Process, and the Geneva Dialogue – have shown significant progress. According to panellists, this challenge predates the discussion on digital security: the quandary stems from trying to find common fundamental values between states, upon which trust and norms can be built.

Discussing fostering common understanding and meaningful, sustainable cooperation, the speakers agreed that dialogue and stakeholder engagement are the base, but new models of policy design must evolve. Using the examples of the Global Forum on Cyber Expertise (GFCE) and the NATO Centre of Excellence for Cybersecurity, the speakers reflected on inclusive participation in digital security discussions.

Turning to climate change and the role of digital technologies in supporting developing countries, the speakers highlighted the need to improve access to technologies, technology exchanges, and global value chains. The speakers underlined the importance of open data and open science, including UNESCO’s Recommendation on Open Science.

Logo of Global Digital Compact

Will UN Digital Compact re-energise internet governance?

The UN Digital Compact was mentioned 44 times during the first day of the IGF, indicating its high relevance. The compact could bring a breath of fresh air to the internet governance space. The IGF started losing steam during the pandemic years. At the same time, more and more issues require governance solutions. The positions of actors have shifted as governments gain confidence as providers of public goods and technology companies seek more stable and regulated operational spaces.

The compact can help revisit old IG narratives and introduce new framing of critical digital issues. It can be particularly useful in overcoming policy silos by promoting cross-cutting coverage of issues such as cybersecurity, AI, and data governance.

The IGF 2022 discussions in Addis Ababa will feed into the compact consultations. Once it is adopted, the compact could also reshape the role of the IGF. At a time when new multilateral agreements are difficult to achieve, existing policy spaces such as the IGF – with a mandate endorsed by the UN – take on even higher importance.

Thus, the future of digital governance, especially ahead of the decisive 2025, could be shaped at the nexus between the new dynamism triggered by the UN Digital Compact and the existing mandate of the IGF. 


AI in the spotlight


Two sessions on the first day of the IGF focussed on AI.

The session Affective computing: The Governance challenges introduced affective computing as a new term on the IGF agenda. The term usually refers to the use of AI to recognise, interpret, and simulate human emotions, such as anger, happiness, and excitement. Affective computing promises ‘machines knowing us better than we know ourselves’, as the power of tech platforms is often described.

Affective computing can be used in education, transportation, hiring, entertainment, and even digital love lives. In education, AI can track students’ moods and attention; in policing, it can be used to discern deception; and in job interviews, to gauge applicants’ feelings about a company.

Discussion at the IGF demonstrated that affective computing still lacks the capability to reliably identify human emotions. Current research shows that emotions are not universal. They are relational, depending on local, cultural, and personal contexts. Emotions are also highly complex, combining hundreds of signals, from facial expressions to movements, body postures, choice of words, different abilities, and particular needs. Technology cannot yet capture the intrinsic complexity of emotions.

Affective computing carries bias, as it is trained mainly on images and expressions of people from developed countries. Thus, these technologies carry mainly the emotions of the Global North. Using them in other regions can cause significant damage to populations.

As a way to deal with the problems and risks of AI and affective technologies, new approaches are being developed. Microsoft came up with its 4Cs Ethical Guidelines: communication, consent, calibration, and contingency for affective computing systems. Other soft laws and non-binding guidelines exist, but over the last year, it has become clear that strong regulation is also needed. 

AI governance and regulation was the focus of the session Realizing trustworthy AI through stakeholder collaboration, which examined how to apply the OECD’s AI Principles (2019) to the development of AI platforms. Many discussants argued for AI governance via experimentation and policy sandboxes. This approach can increase transparency, trust, and public support for AI platforms.

Standardisation is another indispensable approach to transform principles such as fairness and transparency into reality. 

Professional and policy silos are becoming more problematic in AI than in other digital realms. Even tech companies, developers, engineers, product managers, and data scientists address AI from their own angles. They have even fewer bridges to the policy community and the general public than they do to each other. The OECD principles try to unite the tech and policy sectors’ unique knowledge. Coding competitions and hackathons are events that join specialists from different coding and policy communities on common challenges.

Lost in translation
How can we reduce what gets ‘lost in translation’ between tech and policy professional cultures?

AI and affective computing, in particular, will require a lot of governance innovation to reduce risks and increase trust, transparency and overall inclusion in AI governance. 

A Fragmented Internet?

The internet is global in its technical infrastructure but local in its consequences for economies, cultures, and societies, which should be reflected in its governance. 

Submarine cables – critical digital infrastructure

If we deal properly with this global/local interplay, internet fragmentation can be avoided, or at least slowed down.

This session outlined a catalogue of policies and practices of governments and tech platforms that could lead toward internet fragmentation, including the vulnerability of submarine cables, tech platform policies, and government filtering.

A stronger push towards digital sovereignty as a part of national sovereignty is seen as an accelerator of fragmentation. After realising the importance of digital networks for national stability, especially during the pandemic, more and more countries are extending national sovereignty over digital networks and data.

Increasing fragmentation could spell the end of the unified and interoperable internet. The internet’s core infrastructure is very robust, having survived all challenges so far, including recent conflicts crossing national borders. Many discussants called for the development of standards to define hate speech, disinformation, objectionable online content, and other issues that could fragment content and data sharing on the internet. More focus on the bottom billion than the next billion could reduce the risks of social fragmentation and new divides triggered by internet policy dilemmas.

Respecting the principles outlined in the UN Charter will also help prevent internet fragmentation. The UN Global Digital Compact could help establish a new consensus on digital governance that would preserve the core technical infrastructure of the internet while providing space for other policies adjusted to regional, national, and cultural specificities.

Preventing fragmentation was one of the aspects covered by the session on the role of the community in achieving universal acceptance.

Universal acceptance fosters the use of web and email addresses in many languages and scripts. If the internet infrastructure can be used in different languages, it will reduce the risk of internet fragmentation. 

Universal acceptance is primarily a societal value that should facilitate the inclusion of all internet users.

The growing erosion of digital rights

We start our coverage of human rights-related sessions with a stark reminder that the notion of privacy is eroding among the younger generation. Younger people, who represent up to one-third of the internet population, are growing up with a diluted understanding of what the right to privacy means, and what safeguards they are entitled to.

In the Global South, privacy and data protection rules have been enacted only in recent years, signalling an even stronger need for youth to be educated about human rights from an early age using language they can understand. Behavioural advertising or profiling for targeted advertising shouldn’t treat young users in the same way as adults.

Contributing to this problem is that the development of products and services does not always follow the privacy by design approach. Users shouldn’t have to monitor their privacy settings every time they install a new app. Many legal remedies exist for users who have been victims of data breaches. Their effectiveness largely relies on enforcement and regulatory oversight, which in some countries needs significant improvement.

When it comes to apps, the take-it-or-leave-it approach to signing up for an app or a service in exchange for relinquishing rights to user data should be replaced by a fairer system that gives users the option to limit the type or amount of data the app gathers. Better still, regulators should prohibit companies from gathering more data than they need, even if users might agree to sharing it. Young users, in particular, seldom understand the implications of such a choice.

We’re also reminded of another stark fact: Almost 20,000 webpages containing coerced self-generated child sexual abuse imagery of kids aged 7–10 were discovered in the first six months of 2022, according to data from the Internet Watch Foundation (IWF) released in August 2022. That’s an increase of over 360% compared to the previous year. While coercion is clearly abusive and illegal, there is other content voluntarily generated by kids that may be unwise, and can be misused. While there is much educational material available, governments and service providers need to design and create new, more user-friendly material that has children and adolescents as clear target audiences.

Concerning gender rights online, digital technology has amplified abusive behaviour against women and girls, leading to a spiralling problem of online violence. NGOs, the private sector, and governments are taking on the fight against online abuse as far as their resources permit. Stronger enforcement, local solutions addressing local contexts, and more funding for civil society would make a greater difference.

Data flows: Fragmentation vs harmonisation?

The discussion on economic and legal issues started on the first day, full steam ahead, with cross-border data flows. We were reminded that there are different approaches to data flows around the world; in China, India, the USA, the EU, and elsewhere, each jurisdiction has its own priorities and interests it wants to protect (safeguarding privacy, advancing the local economy, protecting (national) security, etc.).

While this regulatory fragmentation comes with challenges to trade and the global digital economy, harmonisation attempts – leaving aside their likelihood of success – run the risk of eliminating national characteristics.

And yet, there seem to be a few areas of agreement among speakers. One IGF session suggested the cross-border flow of non-personal data must be facilitated; that there is a need for minimal global rules for data transfers; and that African countries need to come together to strengthen their position on cross-border data flows.



Fighting untruths, such as online misinformation and disinformation, was the main sociocultural concern across workshops on Day 1.

A pre-bunking approach to fighting misinformation was put forward: In the case of fighting misinformation epidemics, people can be exposed to weakened doses of misinformation or disinformation techniques to develop cognitive antibodies over time, through a process known as psychological inoculation. An underlying challenge, however, is adapting those interventions to different cultural contexts.

Other suggested approaches included promoting quality information that complies with journalistic good practices and the design and implementation of digital literacy programmes to fight disinformation. It was, however, noted that if the recipients of such programmes cannot read or write, digital media training seems like an unrealistic approach to tackle this issue. 

Participants also assessed governments’ role in internet governance, and noted that more policy innovations in internet governance are needed. This highlights that existing systems, such as WSIS, designed to foster the participation of governments in internet governance, remain insufficient. The UN Global Digital Compact should be a valuable avenue to address what should be the role of governments. 

The day in graphs

Diplo’s AI and Data Lab, our experimental space, has processed transcripts from all workshops and main sessions of Day 1, employing the following models:

  • BART Large model trained on the Multi-Genre Natural Language Inference (MultiNLI) dataset for the task of zero-shot text classification
  • RoBERTa Base model trained on an English-language corpus for the tasks of tagging, parsing, lemmatisation, and named-entity recognition
  • GPT-3 Da Vinci model for the task of text generation
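To illustrate the first of these, zero-shot classification with an NLI model recasts each candidate topic as an entailment hypothesis ('This text is about X.') and picks the label the text most strongly entails. The sketch below shows only this framing; the toy word-overlap scorer, sample labels, and sample text are hypothetical stand-ins for the actual model's entailment probabilities:

```python
import re

def make_hypothesis(label):
    # Each candidate label becomes an entailment hypothesis.
    return f"This text is about {label}."

def words(s):
    return set(re.findall(r"[a-z]+", s.lower()))

def toy_entailment_score(premise, hypothesis):
    # Stand-in for the NLI model: fraction of hypothesis words in the premise.
    return len(words(premise) & words(hypothesis)) / len(words(hypothesis))

def zero_shot_classify(text, labels):
    # Return the label whose hypothesis the text "entails" most strongly.
    return max(labels,
               key=lambda lab: toy_entailment_score(text, make_hypothesis(lab)))

labels = ["cybersecurity", "data governance", "capacity development"]
text = "Malicious actors threaten the cybersecurity of hospitals and power grids."
print(zero_shot_classify(text, labels))  # cybersecurity
```

The key property of this framing is that no label-specific training is needed: adding a new topic means writing a new hypothesis, not retraining the classifier.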

Caption: The data analysis in progress

The lab generated the following four graphs, visually summarising yesterday’s (Tuesday’s) discussions. It is no surprise that at a forum about the internet and digital issues, online, internet, and digital are the most frequent words in discussions.

This was surely the case for Day 1, as our graphs below illustrate. What is interesting to note is that data, know, and people also appeared quite often in the debates. This is undoubtedly an illustration of several calls we have been hearing at the IGF and elsewhere more and more often; we list here only three of them:

1. Data is the engine of the digital economy, but also an asset (both personal and economic) that needs adequate protection.

2. Knowledge (or understanding) of how technology works will help us shape better policies and regulations for an inclusive, safe, and secure digital future.

3. We need digital developments to be people-centric and embed adequate protections for human rights and fundamental freedoms.


Wordcloud of all Tuesday sessions, based on detected bigrams with Dunning likelihood collocation score greater than 30.


Most prominent verbs with adverbs, based on pattern matching conducted with the RoBERTa Base model

A bar graph shows the relative occurrence of critical internet resources (8.17); privacy and data protection (7.99); interdisciplinary approaches (7.57); data governance (5.71); legal and regulatory (5.6); artificial intelligence (4.76); access (4.76); cybersecurity (3.92); content policy (3.41); and capacity development (3.22).

Most prominent topics addressed during Tuesday’s sessions, based on zero-shot classification of 3-sentence paragraphs with the BART Large model

Bar graph shows the relative frequency of noun chunks digital (616), internet (529), people (472), online (388), information (324), question (310), countries (275), rights (269), example (244), and governance (221).

Most frequent noun chunks, detected by the RoBERTa Base model
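The counts behind this last graph can be approximated with plain token counting. The real pipeline extracted noun chunks with the RoBERTa-based parser, so the snippet below, with its own hypothetical sample text and stop list, is only an illustrative stand-in:

```python
import re
from collections import Counter

# Illustrative stand-in for the noun-chunk frequency graph: count word
# frequencies, skipping a small stop list. (The actual analysis used a
# RoBERTa-based parser to extract proper noun chunks.)
STOPWORDS = {"the", "a", "an", "and", "of", "to", "in", "is", "for", "but"}

def top_words(text, k):
    words = re.findall(r"[a-z]+", text.lower())
    return Counter(w for w in words if w not in STOPWORDS).most_common(k)

sample = ("The internet is global but digital governance is local. "
          "Digital rights and digital policy shape the internet.")
print(top_words(sample, 2))  # [('digital', 3), ('internet', 2)]
```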

Diplo at the IGF

Our speakers

Our executive director Dr Jovan Kurbalija spoke at two sessions today: WS #335 Fragmented reality. New horizons of digital distrust and WS #66 Reassessing government role in IG: How to embrace Leviathan. Read our reports from WS #335 and WS #66.

Our Head of Digital Commerce and Internet Policy Marilia Maciel spoke at OF #4 Digital self-determination: a pillar of digital democracy. Read our report from OF #4.

Visit our booth!


If you are attending the IGF in person, swing by the IGF village and visit our booth! Our brand new Geneva Digital Atlas and Stronger Digital Voices from Africa report are on display along with many other resources and goodies from Diplo and the Geneva Internet Platform.