Internet Governance Forum 2022
Addis Ababa, 28 November – 2 December 2022
If you are trying to discern the overall picture after hundreds of workshops and myriad discussions at last week’s IGF, you are in the right place. That’s exactly what we are doing. Diplo and the GIP started reporting from the IGF eight years ago to offer a composite, zoomed-out view of this complicated tapestry, one woven from our individual experiences, coloured by issues of interest and meetings with friends, and nuanced by corridor chats.
In addition to this panoramic view of the IGF, you can dive deeper into issues of your particular interest, following our layered reporting.
From the first layer – this text – you can navigate to the second layer, consisting of summaries of sessions and data analyses of the corpus text of the IGF 2022.
On the third layer, you will find detailed information on topics from AI to cybersecurity, as well as on main actors from the UN, the private sector, academia, and civil society. This holistic reporting provides you with comprehensive coverage of the key topics, actors, and trends beyond IGF 2022 as a single event.
This summary is based on our reports from over 100 sessions, as well as data analysis of 188 session transcripts with 1,851,317 words (approximately 3,702 pages). The data section of this report contains more detailed analyses of the text corpus of IGF 2022.
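The page estimate above follows from the common editorial convention of roughly 500 words per page (the convention is our assumption; the word and transcript counts come from the report itself):

```python
# Back-of-the-envelope check of the IGF 2022 corpus figures.
# Assumption: the common editorial convention of ~500 words per page.
WORDS_PER_PAGE = 500

total_words = 1_851_317   # words across the IGF 2022 session transcripts
transcripts = 188         # number of transcripts analysed

pages = total_words // WORDS_PER_PAGE
avg_words_per_transcript = total_words / transcripts

print(f"~{pages:,} pages")                              # ~3,702 pages, matching the report
print(f"~{avg_words_per_transcript:,.0f} words per transcript")
```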
10 Highlights from IGF 2022
IGF and Global Digital Compact: New dynamic interplays
During IGF 2022, the UN Tech Envoy presented the Global Digital Compact (GDC) to the IGF community. As the new kid on the block, the GDC garnered a lot of attention, being mentioned 265 times during the IGF sessions.
With a 2024 deadline for its adoption, the GDC gave new urgency to the internet governance debate. Uncertainty about how the IGF and the GDC relate to each other began to be cleared up in practical and useful ways: IGF deliberations will feed into the GDC.
Furthermore, better designed and more effective interplays between the IGF’s tradition and mandate and the intensity engendered by the GDC create a new dynamism in internet/digital governance.
The appointment of the new UN Secretary-General’s Envoy on Technology, Indian diplomat Amandeep Singh Gill, earlier this year created a new dynamism in the digital governance space. At the centre of this dynamism is the work on the GDC, which should be part of the Pact for the Future, to be adopted in autumn 2024.
The GDC is intended to address highly controversial digital issues in an extremely polarised world. Most of today’s pressing policy issues, from security to the economy and human rights, can be viewed through a digital lens.
As the GDC will be a complex exercise, its success will be judged on several criteria:
- Inclusivity of all actors that affect or are affected by digital developments
- Diversity of issues addressed and perspectives reflected in the GDC
- Concreteness of approaches proposed
Our hope is that the GDC will succeed in, at least, proposing a mechanism for answering the growing number of ‘calls’ from citizens, companies, and countries for solutions to problems ranging from cybercrime to dealing with misinformation and achieving a fair distribution of tax revenues in the digital economy. The list of more than 50 issues under discussion includes data protection and the regulation of AI.
Finding the ‘phone number’ to ask for help on digital problems is especially important for citizens and actors from small and developing countries who do not have institutional or individual capacities to navigate the current maze of internet governance with more than 1,000 institutions and processes. Most of them are looking for functional and straightforward solutions for the digital problems they face.
These solutions could be provided by international organisations, expert communities, tech platforms, and other actors. The search for practical policy solutions could resolve the false dichotomy between multilateral and multistakeholder approaches, which has consumed a lot of energy and time in internet governance debates.
Maturing hybrid format of the IGF with some hiccups
After the prolonged pandemic, IGF 2022 in Addis Ababa returned in full swing. The IGF tradition was empowered by the vitality of the African digital community. It was a hybrid event, with a total of 5,120 registered participants, in situ and online, attending over 300 sessions.
The hybrid format of the meetings is maturing and improving access, but open issues still need to be resolved to ensure an equitable experience for people attending online and in person.
Dive deeper: Research on future of meetings
Parliamentarians reclaim a seat at the multistakeholder table
Paradoxically or not, parliamentarians feel they have been left behind in multistakeholder discussions on internet governance and digital policy, despite being responsible for the laws governing our digital spaces. One of the reasons for the absence of parliamentarians has been their unique status of being part of national governance structures but not being part of governments’ representation.
Since IGF 2019, parliaments have been reclaiming a seat at the multistakeholder table. Some of the main goals of the IGF 2022 parliamentary track were to improve the ability of parliaments to deal with digital issues, get parliamentarians more involved in multistakeholder processes and discussions, and make sure that laws are passed by parliament and not through parliament.
Launched in 2019, the IGF parliamentary track gained new momentum this year. More focused discussions – this time on addressing cyber threats – and stronger messages characterised this year’s track, which saw particularly strong engagement from parliaments of developing countries.
When discussing their role in addressing cyber threats, parliamentarians acknowledged that they have a duty to ensure a proper balance between measures to enhance cybersecurity and tackle cybercrime, on the one hand, and the protection of internationally recognised human rights, on the other.
They also committed to encouraging effective cooperation – nationally, regionally, and internationally – between public and private actors in creating a safer and more secure cyberspace, and in building an environment of trust conducive to such cooperation.
Three strong calls reverberated throughout the discussions:
- Parliaments should contribute to strengthening national multistakeholder dialogue on internet and digital policy issues, and ensuring that national interests and priorities are reflected in international processes.
- More efforts are needed to build the capacity of parliamentarians to work on digital policy issues, including through training and skills building. This will help ensure that they engage in meaningful debates before passing legislation for the digital space.
- Parliaments should have their own seat at the table in regional and global processes dealing with digital issues.
The fact that the IGF has been paying increasing attention to parliamentarians in recent years has resulted in concrete outcomes. Earlier this year, an African Parliamentary Network on Internet Governance was launched, inspired by parliamentary activities at IGF 2021. The network had a strong presence in Addis, starting with a training session right before IGF, and continuing with its members’ active engagement throughout the entire meeting.
Youth participation: Rejuvenated IGF
This year, the IGF was ‘younger’ than usual, being held in Africa, the continent of young people. Even visually, one could notice many younger people at the sessions and in the corridors of the Addis venue. In addition to participation, youth was one of the thematic tracks. IGF 2022 provided an additional push for the IGF Secretariat’s Strategy on strengthening engagement of youth in internet governance.
A couple of years ago, the IGF Secretariat launched a Strategy on strengthening engagement of youth in internet governance, cementing the acknowledgement that young people should be empowered to be more actively engaged in internet governance processes. In line with this strategy, a Youth Track was part of the overall IGF 2022 process, and included a series of capacity building workshops in the run-up to Addis and a Youth Summit during the IGF meeting.
Throughout their discussions on the role of youth in digital transformation, participants in the summit stressed – once again – that ‘youth has to be recognised as a serious stakeholder in policy and regulatory development’. This one message stuck with us as quite powerful, being framed as a call – that hopefully many will answer – to truly support young people to be the architects of a safe, secure, and inclusive digital future.
And if we may take this one step further, we would add: In addition to youth, don’t forget to add a seat at the table for future generations! The digital space we shape today will be part of the legacy we leave for them.
Rise of digital diplomacy and foreign policy
At IGF 2022, there was a noticeable increase in the participation of diplomats and government officials. It reflected the growing relevance of digital issues for national diplomacies worldwide. Many countries are in the process of developing digital foreign policy and diplomacy approaches and institutions. Two sessions addressed the building of digital diplomacy and foreign policy in Africa.
Dive deeper: Digital diplomacy and Digital foreign policy | African digital diplomacy and governance | Report: Stronger digital voices from Africa: Building African digital foreign policy and diplomacy
Technology and infrastructure topics
Digital inclusion: Beyond cables
IGF 2022 showed clearly that digital inclusion is a priority and critical issue for African countries. As more and more optical cables are laid around the African continent, and new satellite technologies are employed for ‘last mile access’, discussion on digital inclusion evolved towards other aspects of exclusion: cost of access, language barriers, gender, skills, etc.
A holistic digital inclusion requires taking into consideration reflections on gender, youth, language, finance, education, and other critical factors that all play a role in the full realisation of the digital potential of citizens, communities, and countries worldwide.
While some of us take the internet for granted, the digital divide remains a reality. In Africa alone, over 800 million citizens still lack access to the internet, despite the commitment undertaken by world leaders to ‘significantly increase access to ICT and strive to provide universal and affordable access to the internet in LDCs by 2020’, and although the internet is widely recognised as an enabler of human rights.
There are efforts by various stakeholders – in Africa and beyond – determined to bring connectivity to those who do not yet have internet access. Locally-owned and operated networks (be they wired, wireless, or fibre) and innovative initiatives such as the Internet Backpack, for instance, are seen as solutions to fill connectivity gaps and provide access where traditional telecoms networks do not. Low Earth orbit (LEO) satellites also offer new opportunities to connect the unconnected, but they come with new governance and regulatory issues in areas such as spectrum allocation and space law.
Yet, it takes more than cables and satellites to make the internet accessible and inclusive.
For many experts, the path to meaningful and holistic internet access is through inclusion: closing the digital skills divide; adopting inclusive measures that embrace women and girls in ICT; developing more products and services for use by people with a disability, and more elderly-friendly devices, applications, and services; and teaching users about rights and responsibilities in language they can understand. The same holds for developing content in local languages: Users who don’t speak English – widely considered the internet’s lingua franca – won’t find much value in an internet which rarely speaks their language.
Fragmentation of the Internet: reality and risks
The red line that will make or break the internet is adherence to the same core protocols, in particular TCP/IP (Transmission Control Protocol/Internet Protocol). New risks will emerge with any shift away from core protocols such as TCP/IP and HTML towards, for example, protocols for metaverse platforms.
If countries and companies start using different internet protocols, the risk of fragmentation will increase. In the meantime, differences and distortions will also emerge from content filtering, companies’ exclusive spaces, walled gardens, and the wide diversity of policy and regulation.
With a dedicated sub-theme and a policy network of its own, fragmentation was a buzzword at this year’s IGF. And yet there is no one unique understanding of what internet fragmentation means.
At the technical/connectivity layer, a lack of interoperability between core standards and protocols is a risk to the global nature of the internet. On the application and content layers, policies of tech platforms and regulations imposed by governments (in particular content-related ones) can contribute to internet fragmentation, causing the user experience to be distorted. In addition, the filtering and blocking of certain content in some jurisdictions and different approaches to data sovereignty increase the risk of weakening the global internet on a policy and social level.
The growing geopolitical trend of imposing economic and cyber sanctions can also impact the availability of critical internet resources and online services in countries under sanctions. A stronger push towards digital sovereignty as a part of national sovereignty is further seen as an accelerator of fragmentation.
Trying to bring some clarity to current and future discussions on these issues, the Policy Network on Internet Fragmentation proposed a framework outlining three key dimensions of fragmentation:
- Fragmentation of the user experience
- Fragmentation of the internet’s technical layer
- Fragmentation of internet governance and coordination.
There are numerous solutions to avoid internet fragmentation:
- Building trust on the internet
- Adopting global protocols and standards such as IPv6 and IDNs
- Fostering industry-wide collaboration
- Assessing the potential impact of new laws and regulations on the architecture of the internet
- Promoting international regulatory collaboration and developing international standards around issues such as hate speech and disinformation
- Strengthening coordination between policy processes at ICANN, the ITU, standardisation organisations, and the IGF
A somewhat bold proposal was also put forward: UN member states would sign a declaration recognising the internet as a peaceful environment for the public good; this – it was said – could be a confidence-building measure to avoid internet fragmentation. A more direct, easier approach would be to ensure that the upcoming UN GDC helps establish a new consensus on digital governance that would preserve the core technical infrastructure of the internet while providing space for other policies to be adjusted to regional, national, and cultural specificities.
Coming soon: The word fragmentation has been applied to so many issues and concepts that it has become challenging to understand its true significance. Stay tuned for a Diplo blog post on taxonomy and meanings – our contribution to the debate that started at the IGF (and we trust will continue).
AI: Fewer ethics debates – more governance proposals
If in past years, there used to be much talk about the good and bad of AI, and about overarching values and principles to guide the development of AI, this year the discussions focused on AI governance and regulation: Where are we with AI regulation? What is missing? What is feasible, and how can we get there?
How can we regulate AI in a way that encourages its development and use for the good of people and society around the world? This question came up in several IGF 2022 discussions, but there is no single answer. While some jurisdictions are developing their own comprehensive regulatory frameworks for AI, some argue in favour of step-by-step approaches involving governance experimentation and policy sandboxes, as these are considered useful to increase transparency, trust, and public support for AI platforms.
Technical standards are another governance mechanism that translates principles such as fairness and transparency into concrete tech requirements and defines how a system should behave. Once guidelines and regulations are in place, ecosystems of assurance and certification are eventually needed to assess and communicate compliance with the rules.
At the international level, we have high-level principles such as those outlined by the OECD and UNESCO, as well as ongoing work on developing regional regulations (the proposal for an AI Act at the EU level) and international instruments (the Council of Europe’s work on a treaty on AI and human rights).
But the possibility of reaching a globally binding agreement to regulate AI is seen with scepticism. A semi-bottom-up approach might come to the rescue: This would entail different stages, where agreements at the regional level would be built first, and then different interfaces for cross-border cooperation (including terms of knowledge transfer) would be defined.
Whatever form of regulation we envision at the international level, it needs to be shaped in a way that reflects the views and values of stakeholders worldwide, including under-represented groups and actors from the Global South. Nowadays, many benefits of AI solutions are concentrated in the Global North. The data sets used to train the algorithms insufficiently reflect the diversity of the developing world, which tends to be used as a testing ground for future consumers. This needs to change; developing countries need to encourage the development of local AI solutions and demand the full participation of their stakeholders in global governance processes.
Increasing trust in the use of AI also requires bridging professional and policy silos. Tech companies, developers, engineers, product managers, and data scientists must participate in conversations with policymakers if we are to develop and enforce effective and efficient regulations. Approaches include creating more opportunities for regulators to get closer to the technical field and encouraging more public-private partnerships and initiatives such as innovation hubs and hackathons.
To decrease the widening gap between policy and innovation and enhance public trust in AI solutions, an open approach to governance is needed; corporations must embed ethical and culturally sensitive principles in the design of AI technologies and products; and a multistakeholder approach is required in the formulation, implementation, monitoring, and evaluation of regulation.
Civil society has a role to play, too: It should bring people’s voices and real-life experiences into discussions on the use and development of AI.
If these are things that we have heard at previous IGFs, new(er) issues were also emerging. Among them was a discussion on AI-based affective computing – in short, the use of AI to recognise, interpret, and simulate human emotions. As the technology is not sufficiently advanced to correctly identify human emotions, especially in different cultural and social contexts, relying on it to make decisions comes with considerable challenges (e.g. bias, discrimination, and even risk of physical or emotional harm). The message is clear: Do not over-rely on affective computing systems without fully understanding their shortcomings.
The metaverse is pretty much a work in progress, from a technical point of view, but discussions have already started on potential regulatory issues (e.g. security and crime, safety and data protection, applicable legislation and enforcement) and how to address them. There seems to be agreement on the need to have a set of common rules and codes of conduct for the metaverse(s). The extent and depth of such frameworks, however, seem to differ.
Regulating the metaverse poses similar challenges to policymakers as regulating and governing cyberspace and the internet, so lessons learnt from the latter can be applied to the former: regulation needs to address risks, but without unduly hindering innovation; ethical principles should be embedded as much as possible into both regulations and the development of the tech itself; and all relevant stakeholders have to be engaged in policy and regulatory processes.
Cybersecurity: many sessions – few new insights
Cybersecurity has always featured prominently on the IGF agenda. It was one of five main themes this year, with 24 sessions. Most debates revisited well-known themes without offering new ideas or major conceptual breakthroughs. Even the cyber aspects of current conflicts, such as the war in Ukraine, were mentioned only sporadically in the IGF debates.
The power of cyber diplomacy
We publish this report just as the UN OEWG continues its discussions on the norms of responsible behaviour in cyberspace. The OEWG itself is a continuation of efforts made by the international community to shape the norms of responsible behaviour in cyberspace at the UN, previously in the UN GGEs and then in the first iteration of the OEWG.
The implementation of the already agreed-upon framework has been described as long overdue. One way to achieve this is through the Cybersecurity Development Goals (CDGs), which aim to close the digital divide, increase resilience by fostering access to digital transformation, and give effect to international law and norms that curtail malicious cyber activities.
But broader questions of geopolitics have a very strong impact on the extent to which progress in cyber norms will be made, according to this IGF. Countries have moved from primarily wanting to protect their nations from cyberattacks to considering economic and trade issues as well. Yet, there is room for optimism – the ongoing work of the OEWG shows that diplomats of all interested countries still negotiate, which reinforces the power of diplomacy.
There are certain instruments a country has at hand to address a cyberattack. But it first must attribute the attack to a specific actor. Then it can apply cyber diplomacy instruments, such as information sharing, public naming and shaming of the perpetrator, diplomatic measures such as recalling ambassadors or even completely cutting diplomatic ties, using criminal indictments, and sanctions. The last option on the spectrum, rarely used, is military action.
Discussions touched on the role of parliaments in addressing cyberattacks and noted how parliamentarians could act as a link between high-level conversations with other stakeholders involved in addressing cyber threats. Civil society can collaborate with parliaments to ensure accountability and oversight. Civil society and the private sector were encouraged to see parliamentarians as a connection to make their voices heard.
What we often neglect when a cyberattack occurs is its societal harm and impact. There is an increasing need to develop a harm methodology with quantitative and qualitative indicators to document the harm of cyberattacks to people, communities, and societies. We need a taxonomy of cyber harm where all stakeholders can contribute to inform the next steps in developing effective legislation, push the private sector to increase security standards, and inform civil society how to help victims. Measuring harm needs to be part of a bigger process involving all parties, where silos are broken: Governments introduce new legislation, the private sector creates new security standards, and civil society supports victims and awareness raising.
The cybersecurity job market
The cyber threat landscape is increasingly complex, and good cyber defenders are needed. Cyber capacity development is now a priority on the international cooperation agenda. But on the national level, there is an overall lack of impetus by government institutions on cyber capacity building, a low number of cybersecurity courses at university levels (sometimes with outdated materials), and the inability of recent graduates to get cybersecurity jobs because they lack experience.
Some recommendations suggested that education and training should be less theoretical (more concrete and practical) and more diverse. Women and young people should be encouraged to join this sector, and greater collaboration between industry and education should be established. A capacity development approach connecting industries and educational institutions should ensure there is no supply-demand mismatch. Workforce development strategies should be country-specific, as the need for cybersecurity personnel varies depending on the country’s levels of industrialisation and digitalisation.
Data economy and the erosion of rights
This IGF served as a bleak reminder that younger users are growing up with a diluted understanding of what data protection and privacy mean. The private sector was particularly criticised, as the development of products and services does not always follow the privacy-by-design approach. Users shouldn’t have to monitor their privacy settings every time they install a new app. The take-it-or-leave-it approach to signing up for an app or a service in exchange for relinquishing rights to user data should be replaced by a fairer system that gives users the option to limit the type and amount of data the app gathers.
Governments need to play a stronger role in regulatory oversight and in enforcing legislation. Regulators should also prohibit companies from gathering more data than they need, even if users agree to share it. In developing their own e-government services, governments also need to keep in mind privacy and data protection aspects. In the Global South, privacy and data protection legislation is fairly new, so young people still need to learn about their rights and legal remedies.
The dark side of dark patterns
There are harmless advertising techniques meant to prompt a user to make a purchase, and then there are practices that cross the threshold of what is ethical and fair, also referred to as dark commercial patterns.
One of the main issues in dealing with dark patterns is identifying the moment when the threshold is reached. The techniques are constantly changing, so the way we defined them a few years ago might already be outdated today. Determining who’s responsible is another problem. Is it the online store that’s using dark patterns, the developer of such an interface – or both?
In order to tackle these practices, authorities may require access to the algorithms behind the advertising, which is an uphill battle considering that companies look at algorithms as trade secrets. Stronger consumer awareness could also go a long way. Although it won’t stop businesses from using persuasive techniques, it could help prevent consumers from falling into the trap.
Dive deeper: Consumer protection
Data governance: From ideological stances to practical solutions
Data governance is maturing. Many discussions moved beyond the generic notion of ‘data’ to understand the specificities of personal, corporate, and public data, each of which calls for different governance solutions. Data localisation is no longer ideologically dismissed as a danger to the current internet but is considered where it makes sense, such as for critical national data.
Global data governance, cross-border data flows and reconciling different regulatory regimes remain on the IGF agenda. The fragmented data governance landscape is further complicated by the gaps between data protection and privacy legislation, as well as in the implementation and enforcement of already existing rules.
The harvesting of raw data by developed and developing countries is also a concern. Many developing countries are apprehensive that they will become major providers of raw data to external platforms while having to rely on the foreign knowledge produced from that data.
Ironically, another impact of the disparate regulatory landscape is that it limits cross-border data flows, with consequences for the global digital economy, the protection of privacy, and the development of national economies. Developing countries therefore need to evaluate how to regulate digital spaces in a way that balances digital sovereignty with the harmonisation of regulatory approaches.
Despite the disagreements, there were a few things that everyone agreed on. These were the need for flexible regulatory systems that allow for technology development while protecting users, the need to make it easier for non-personal data to flow across borders, and the need for minimal global rules for data transfers.
In addition, a future global system of data governance must strike a balance between public and private value creation in the digital economy (the idea of a social contract for data that sets out a bundle of rights) and establish ex-ante requirements for transparency.
Access to data and security
Timely and efficient access to data for security and digital evidence remains a challenge. The traditional methods of accessing digital evidence through mutual legal assistance treaties are ineffective. New considerations related to facial recognition technology, AI, and the protection of human rights must be embedded in the mechanisms for access to data for security. Additionally, data that needs to be accessed for security and digital evidence is often in the hands of private companies.
Open-source intelligence (OSINT) practitioners also struggle with a legal question: to what extent should non-open-source data, such as data purchased from private companies, form part of OSINT tools?
To continue the work on common principles of trustworthy data flows, it is necessary to create an interoperable and efficient legal framework that protects the rights of individuals, such as the rights to privacy and due process, and establish transparency mechanisms and human rights impact assessments related to new technologies.
Online safety regulation
Another area that would greatly benefit from baseline principles in regulatory regimes is online safety and platform regulation. While the value of such baseline principles is not disputed, regulators struggle with the implementation and enforcement of existing rules, and businesses struggle to navigate the diverse regulatory landscape. New cooperation among regulators across jurisdictions, and embedding safety standards in the design of platforms and apps, may be the way forward.
Human rights topics
Charting a path towards a safer, rights-based internet
Part of the discussion on making the internet inclusive focuses on ensuring that the online space is safe and secure for everyone while simultaneously upholding and protecting people’s human rights.
Privacy and security are often pitted against each other. But that’s a false binary, experts warn. The two are mutually reinforcing, and one cannot meaningfully exist without the other. So, for instance, users who rely on encrypted communications to keep safe (not only online but also in the physical world) shouldn’t be put at risk through backdoor access. There are other ways of identifying perpetrators, preventing crime, and keeping people safe, and it’s through respect for human rights that the internet can become safer and more connected.
Gender-based violence: Online and offline impacts
Gender-based violence is particularly worrisome and is surging in some regions. While this is not a new problem, digital technology has amplified abusive behaviour – such as hate speech and other more violent conduct – against women, girls, and people of other gender identities. Online violence has an offline impact, and vice versa.
NGOs, the private sector, and governments are taking on the fight against online abuse as far as their resources permit. Stronger enforcement, local solutions that address local contexts, and more funding for civil society would make a more significant difference. More effort is also needed to identify and eliminate bias in the data and algorithms used for AI systems.
Children and technology: Limiting the risk
Protecting children and young people – who make up almost one-third of the internet population in many countries – from harm is among stakeholders’ top priorities. Two main concerns, data protection in online learning and sexual imagery, were tackled during this IGF.
At the outset of the COVID-19 pandemic, governments and educators rushed to introduce online platforms to keep children engaged in learning. Some of the platforms used data practices that were deemed harmful to children’s rights, in most cases without the consent and knowledge of their parents and guardians. If online educational platforms are to remain the norm, experts suggest that these platforms be audited to determine how children’s data is being gathered, processed, and stored.
With more children and young people spending time online, not least due to the pandemic, the amount of self-generated sexual imagery circulating online is also increasing. Although not all of it is the result of abuse and coercion, content voluntarily generated by kids can still be misused. Experts have therefore argued for more user-friendly material to explain to children and adolescents the repercussions of their risky behaviour.
Increasing connectivity in underserved regions
The ITU estimates that approximately 5.3 billion people used the internet in 2022, an increase of 24% compared to 2019.
Despite a considerable rise in internet penetration over the last four years, participants acknowledged that to make a real difference in people’s lives, internet access needs to comply with sufficient standards – including affordability, inclusivity, sustainability, and links to human capacity development. If policymakers focus only on improving the single metric of basic connectivity, efforts to improve internet access and use for all will fall short, and the digital divide will continue to widen.
A whole-of-society response to the lack of connectivity and other challenges of the digital age was highlighted.
Improvement in connectivity could be achieved through public and private partnerships, local access provision through community networks, using universal service/access funds in financing access, infrastructure sharing, and decentralised approaches to infrastructure development.
Proposed alternative ways of connecting the unconnected include Australia’s Stand programme, a government-funded disaster satellite service that strengthens telecommunications. Such combined efforts are needed especially in Africa, where expanded terrestrial and satellite internet coverage would support emergency alerts and communications.
It is paramount that policymakers recognise the value of small operators, such as community networks, and formulate timely policies to assist them. Relying on community networks as a backup for essential infrastructure was also highlighted, especially during crises and natural disasters.
The role of communities of practice was noted in another session, emphasising that they can ensure a stronger representation of African interests in global digital discussions. Substantial African diaspora communities, especially at universities worldwide, are seen as a great asset in strengthening African representation and promoting African interests.
Access has also been discussed in the context of internet shutdowns. A session dedicated to the growing number of internet shutdowns worldwide presented OPTIMA, an online library containing national internet shutdown needs assessment reports. Documentation of the consequences of shutdowns is a significant resource for raising awareness and enhancing capacity development, especially where technical training is lacking.
Ensuring equitable access to digital healthcare
Another issue prominent on the IGF’s agenda on Day 4 was telemedicine and fairer access to internet-based healthcare. The discussion built on two years of experience charting a way forward for the future of digital health.
A new research paper, Online health indicators in LAC: Access to safe and affordable health solutions using the internet, was introduced as a backdrop for the discussion, which focused on data collection around two axes: access to and quality of medicines, and digital health information. The study establishes a methodology for evaluating internet-based health solutions across Latin America and the Caribbean.
Although a growing number of countries have recently adopted laws to regulate telemedicine, it is still a grey area in many countries. Regulating the importation of medicines via the internet can also be crucial, since the availability of medicines can be higher and the prices lower online. There are numerous cases where the price of certain medicines is much lower in neighbouring countries. For instance, in some countries in Latin America the disparity of prices for the same medicine is estimated to be up to 171%.
With the spike in the number of digital healthcare providers and digital well-being apps, challenges abound because not all tools and services are of uniform quality, and they are rarely evaluated for effectiveness and trustworthiness. Thus, an effort is needed to institutionalise digital health in the existing health system, provide suitable cybersecurity measures to resolve safety and privacy concerns, and ensure special provisions to guarantee accessibility for people with disabilities. Finally, digital health literacy, which enables people to participate meaningfully, is still weak, and promoting it remains indispensable.
Regulating content online
Fighting untruths, such as online misinformation and disinformation, was one of the main sociocultural concerns brought up in discussions. Some of the approaches suggested were: pre-bunking misinformation, promoting quality information that complies with good journalistic practices, and designing and implementing digital literacy programmes to fight disinformation. It was, however, noted that where the intended recipients of such programmes cannot read or write, digital media training is an unrealistic way to tackle the issue.
One way of monitoring content is through platform regulation. Regulation of digital platforms should not be driven by particular interest groups, but should rather guarantee basic human rights. Legal frameworks that protect citizens should be built on mechanisms of control and accountability.
A core focus area for regulators is transparency. Achieving meaningful transparency and accountability in content moderation requires, inter alia, timely third-party audits and evaluations of platforms, advocacy and monitoring by civil society, the use of knowledge brokers to interpret technical information for regulators, and consumer awareness of digital rights and responsibilities.
In times of crisis, it is even more important to adhere to commonly agreed rules for managing content and platforms. A major contribution in this regard is the Declaration of principles for content and platform governance in times of crisis, launched by Access Now during IGF 2022. The presentation recognised the challenge of ad hoc responses when a crisis escalates or when there is ongoing public and political pressure on platforms to react.
Towards universal internet principles
What are the core principles of the internet that we should focus on preserving? Some that were listed during this IGF were the rule of law, fairness, and accountability (for both the public sector and companies); multistakeholder governance (including in policy making); openness and transparency in decision-making processes; a human-centric approach (i.e. prioritising the needs of users and serving individuals); the public interest; engaging young people in policy-making; and trustworthiness, reliability, and inclusivity.
One of the most recent initiatives outlining internet principles is the Declaration for the Future of the Internet (DFI), which outlines basic principles on how nation states should act in relation to the internet.
A debate was sparked between representatives of countries that have signed the declaration and those that have not. There are several reasons why countries might decide not to join; the most frequently cited was a reluctance to sign a document they did not negotiate.
While state-focused, the declaration still strongly supports multistakeholderism – it maintains that multistakeholder approaches are needed to translate the principles into concrete and enforceable actions. The declaration says that civil society, the private sector, the technical community, academia, and other interested parties have a role to play in encouraging more states to follow these principles and holding states accountable for them. However, some argue that we might need to refine the multistakeholder model to ensure a proportional representation of both small and underrepresented groups and larger and stronger actors.
Reassessing stakeholders’ roles in IG
Participants assessed governments’ role in internet governance and noted that more policy innovations are needed. The UN GDC should be a valuable avenue to address the role of governments.
There have also been calls to expand the scope of youth participation in internet governance. For instance, the session Global youth engagement in IG: Successes and opportunities addressed the manifold challenges youth encounter, such as limited space for participation in IG decision-making at the national level, gender stereotyping, and accessing content in languages other than English. Fostering young people’s sustained participation in IG will require decision-makers to remove these and other obstacles, and actively listen to unexpected ideas.
Moreover, it is essential to create permanent spaces where different stakeholders can meet. The IGF is a longstanding, successful example of an open, impartial, and bottom-up multistakeholder process. Participants stressed that awareness of the global IGF and national IGFs needs to be actively stimulated at the national level, starting at IGF 2022.
Start preparing for IGF 2023 in Japan by following Digital Watch coverage of governance topics, actors, and processes.