Non-regulatory approaches to the digital public debate | IGF 2023 Open Forum #139
Event report
Speakers and Moderators
Speakers:
- Pedro Vaca, Special Rapporteurship for Freedom of Expression of the IACHR (OAS), Intergovernmental Organization/ treaty-based international organizations, Americas/Latin America
- Anna Karin Eneström, Permanent Representative of Sweden to the United Nations and co-facilitator of the Global Digital Compact, Intergovernmental Organization/ treaty-based international organizations, Europe
- María Elósegui, judge at the European Court of Human Rights, Europe
Moderators:
- Jonathan Bock Ruíz, Director of the Foundation for the Freedom of the Press (FLIP), Civil Society, Latin America
- Agustina del Campo, Director at the Center for Studies on Freedom of Expression and Access to Information (CELE) at Universidad de Palermo, Civil society, Latin America
Disclaimer: It should be noted that the reporting, analysis and chatbot answers are generated automatically by DiploGPT from the official UN transcripts and, in case of just-in-time reporting, the audiovisual recordings on UN Web TV. The accuracy and completeness of the resources and results can therefore not be guaranteed.
Full session report
Juan Carlos Lara
The discussions revolve around the challenges posed by online violence, discrimination, and disinformation in the digital public debate. These harmful effects have far-reaching impacts, particularly against marginalised and vulnerable communities and groups. The failure of both private tech companies and states to fully comply with their human rights obligations has worsened these challenges.
Regulatory proposals have emerged globally in response to these issues in the digital public sphere. These proposals aim to address concerns such as competition, data protection, interoperability, transparency, and due diligence. Efforts by international organisations to provide guidelines and regional blocs reacting with their own concerns have contributed to this regulatory landscape.
While regulation is necessary, it is crucial that it does not infringe upon the principles of freedom of expression and privacy. The question of how to strike a balance between regulation and these fundamental rights remains a point of debate. It is important to consider the potential fragmentation of the internet and the lack of regulatory debates in many regions of the majority world.
Soft law principles, as well as the application of international human rights laws, play a crucial role in guiding the behaviour of companies in the digital sphere. They have provided valuable guidance for alternative frameworks. However, the effectiveness of these principles and laws is a matter of discussion.
In conclusion, the discussions highlight the urgent need to address the challenges posed by online violence, discrimination, and disinformation. While regulatory proposals have emerged globally, it is essential to ensure that the regulation strikes a balance between protecting human rights, such as freedom of expression and privacy, and addressing the harmful effects of the digital public sphere. Soft law principles and international human rights laws provide valuable guidance for company behaviour, but ongoing discussions are needed to determine their effectiveness. Overall, collaborative efforts between governments, tech companies, and civil society are essential to achieve a digital space that upholds human rights and promotes a more inclusive and equitable society.
Chantal Duris
Chantal Duris stressed the importance of adopting both regulatory and non-regulatory approaches to address challenges related to social media platforms. She expressed concern about legislation that primarily holds platforms accountable for user speech rather than addressing the underlying business models, highlighting the dangers such approaches pose to freedom of expression. She advocated for platforms to operate based on the UN Guiding Principles, regardless of regulatory status, emphasizing the need to respect human rights.

Duris also emphasized the importance of addressing the root causes of issues like disinformation and hate speech, both through regulating business models and by exploring solutions outside the digital space. She supported the decentralization of social media platforms to empower users and enhance freedom of expression. She expressed concern about the limitations of automated content moderation tools and suggested the need for more human reviewers with language expertise. She also discussed the trend of strategic litigation against platforms, noting that it could hold them accountable for failures to respect human rights.

Duris recognized the challenge of keeping pace with evolving technology and regulatory initiatives, but argued that both platforms and regulators should take responsibility for upholding human rights. She also noted the growing recognition of civil society's role in the digital space and the increasing consultations and engagements sought by platforms and regulators. Overall, Duris highlighted the need for a multi-faceted approach, incorporating regulatory measures, adherence to the UN Guiding Principles, addressing root causes, decentralization, improving content moderation, and recognizing the role of civil society, with platforms and regulators sharing responsibility for upholding human rights.
Ana Cristina Ruelas
Addressing harmful content online requires a multidimensional approach that takes into account linguistic nuances, cultural context, and the protection of freedom of expression. This is highlighted by the need to consider the complexities of different languages and crisis situations when moderating content. Companies must align their actions with the UN guiding principles to ensure their policies prioritise transparency, accountability, and human rights.
Education and community engagement play integral roles in tackling harmful content. Media and information literacy programmes empower users to navigate online spaces responsibly, while fostering a sense of shared responsibility in maintaining a safer online environment. Furthermore, a synergistic effort is necessary, combining policy advice, regulation, and the involvement of multiple stakeholders. This involves a multi-stakeholder process that includes the development, implementation, and evaluation of regulations.
Collaboration between regulators and civil society is vital to effective enforcement. Creating conversations between these groups can help reduce tensions and enhance the efficacy of regulations. Regulators should not feel abandoned after legislation is passed; ongoing enforcement and operation of laws must be a key focus.
To achieve a balanced and collective approach in dealing with companies, stakeholders from different regions are coming together. For example, the African Union is taking steps to address companies with a united front. This collective approach allows for better negotiation and more equitable outcomes.
It is important to emphasise a balanced, human rights-based approach when dealing with companies. Among the 40 countries analysed, some believe that this approach is the correct path forward. By prioritising the principles of human rights, such as freedom of expression and inclusive stakeholder participation, governments can create a regulatory framework that safeguards individuals while promoting peace, justice, and strong institutions.
In conclusion, tackling harmful content online requires a comprehensive and nuanced strategy. Such an approach considers linguistic nuances, cultural context, and the protection of freedom of expression. It involves aligning company actions with UN guiding principles, prioritising education and community engagement, and establishing effective regulatory processes that involve collaboration between regulators and civil society. With these measures in place, a safer online environment can be achieved without compromising individual rights and the pursuit of global goals.
Pedro Vaca
The current dynamics of freedom of expression on the internet are concerning, as there is a deterioration of public debate. This raises the need to ensure that processes, criteria, and mechanisms for internet content governance are compatible with democratic and human rights standards. Moreover, limited access to the internet, including connectivity and digital literacy, poses a challenge in enhancing civic skills online.
Recognising the importance of addressing these issues, digital media and information literacy programmes should be integrated into education efforts. By equipping individuals with the necessary skills to navigate the digital landscape, they can critically evaluate information, participate in online discussions, and make informed decisions.
State actors have a responsibility to avoid using public resources to finance content that spreads illicit and violent materials. They should instead promote human rights, fostering a safer and more inclusive online environment. In addition, internet intermediaries bear the responsibility of respecting the human rights of users. This entails ensuring the protection of user privacy, freedom of expression, and access to information.
Managing the challenges in digital public debate requires a multidimensional approach. Critical digital literacy is vital in empowering individuals to engage in meaningful discourse, while the promotion of journalism supports a free and informed press. Internet intermediaries must also play a role in upholding human rights standards and fostering a healthy online debate.
Upon further analysis, it is evident that there is a lack of capacity and knowledge among member states regarding internet regulation. This poses a significant challenge in effectively addressing issues related to content governance and user rights. Efforts should be made to enhance understanding and collaboration among countries to develop effective and inclusive policies.
Shifting the focus towards the role of public servants and political leaders presents an opportunity to reduce discrimination and inequality. Stronger standards can be applied to political leaders, whose freedom of expression is subject to greater limits than that of ordinary citizens. Adhering to inter-American and international standards can serve as a guideline for ensuring accountability and promoting a fair and inclusive public sphere.
Overall, this extended summary highlights the importance of protecting freedom of expression online, promoting digital literacy, and holding both state actors and internet intermediaries accountable. It also emphasizes the need for increased collaboration and knowledge-sharing among member states to effectively address the challenges in the digital realm.
Ramiro Alvarez Ugarte
The global discussion on the regulation of online platforms is gaining momentum, with diverse viewpoints and arguments emerging. The Digital Services Act (DSA) implemented in Europe is being viewed as a potential model for global regulation. Bills resembling the DSA have been presented in Latin American congresses. Additionally, several states in the US have passed legislation imposing obligations on platforms.
Legal challenges concerning companies’ compliance with human rights standards and the First Amendment are being debated. These challenges can have both positive and negative implications for holding companies accountable. For instance, companies have faced litigation in the US for alleged violations of the First Amendment.
In addition to regulatory measures, there is recognition of the potential of non-regulatory initiatives, such as counter-speech and literacy programs, in addressing the challenges posed by online platforms. These initiatives aim to empower individuals to discern between fake and real information and combat disinformation. Successful implementation of counter-speech initiatives has been observed during Latin American elections.
Nevertheless, concerns exist about the potential negative consequences of well-intentioned legislation on online platforms. It is argued that legislation, even if well-designed, may have unintended harmful effects in countries with insufficient institutional infrastructure.
The tension between decentralization and the need for regulatory controls is another point of contention. A fully decentralized internet, while offering freedom of choice, may facilitate the spread of discriminatory content. Balancing the desire for increased controls to prevent harmful speech with the concept of decentralization is a challenge.
Polarization further complicates the discussion on online platform regulation. Deep polarization hampers progress in implementing regulatory or non-regulatory measures. However, it also presents an opportunity to rebuild the public sphere and promote civic discourse, which is essential for overcoming polarization.
In conclusion, the global conversation on regulating online platforms is complex and multifaceted. The potential of the DSA as a global regulatory model, legal challenges against companies, non-regulatory measures like counter-speech and literacy programs, concerns about the unintended consequences of legislation, the tension between decentralization and regulatory controls, and the challenge of polarization all contribute to this ongoing discourse. Rebuilding the public sphere and fostering civic discourse are seen as positive steps towards addressing these challenges.
Session transcript
Juan Carlos Lara:
The mic is open, guys. What? The mic is open. It’s a hot mic. I think it is time to start. So it is now the moment in which we begin this panel, this session right here. Welcome everyone who is attending this in the final day of the IGF 2023. This is open forum number 139, non-regulatory approaches to the digital public debates. Are we going to speak Spanish? OK, cool. So welcome to this session. This is the final day of this year’s IGF. It is a pleasure to be with you all. First of all, I want to thank the organizers of this here event, representing the Office of the Special Rapporteur for Freedom of Expression of the Inter-American Commission on Human Rights of the Organization of American States. Thanks also to the representatives of Sweden and the European Court of Human Rights that have supported the proposal for this session. And also to the Foundation for the Freedom of the Press in Colombia and the Center for Freedom of Expression and Access to Information at the University of Palermo in Argentina. Second of all, I will introduce myself. My name is Juan Carlos Lara. I work for Derechos Digitales, a civil society organization working on the intersection of human rights and digital technologies in Latin America. I am coming from the city of Santiago in Chile, and my colleagues are scattered throughout the Latin American region. Our concern as an organization is how digital technologies can be used for the exercise of human rights, as well as they can be a threat to human rights when they are regulated or misused by actors both private and public. Finally, I’m going to briefly introduce the panelists. Just by name, they will be introducing themselves when it’s time for their own interventions. We are accompanied at this hour online by Mr. Pedro Vaca, the Special Rapporteur for Freedom of Expression of the Inter-American Commission of Human Rights in the Organization of American States. 
Here on site, we have Ana Cristina Ruelas, Senior Program Specialist at the Freedom of Expression and Safety of Journalists section in UNESCO, the United Nations Educational, Scientific, and Cultural Organization. We also have Chantal Duris, Legal Officer at Article 19, the international human rights organization working to protect and promote the right to freedom of expression. And Ramiro Alvarez-Ugarte, Deputy Director at the Center for Studies on Freedom of Expression and Access to Information at the University of Palermo, Argentina. Thank you all once again for attending this, and thank you to the panelists who will be speaking in turn in a few minutes. The rules of this panel are as follows. We will begin with a brief overview of the situation which has motivated this discussion here on what the digital public debate landscape and the challenges to human rights are with regards to online expression. After that, each speaker will have 10 minutes for their interventions. After that, if time allows, we will have a second round of reactions and, hopefully, audience interventions mediated by the moderators here on site and also online. The guiding question that will open this discussion is on the possibilities of non-regulatory approaches, whether they can succeed and the challenges they present. But to introduce the subject, a few words from the moderator here, because we understand that in the intricate terrain of the digital public debate, we have faced for a long time a series of challenges to human rights that have been compounded, that have been reinforced, that have been worsened, in some cases, by events around the world. And the failure of both private tech companies and states to fully comply with their human rights obligations has had profound consequences affecting democratic institutions, human rights, and the rule of law.
And with the backgrounds of global and local crises in terms of war, disease, authoritarian rule, and human rights abuses that happen both offline and online, we are faced with challenges to human rights that oftentimes are addressed or attempted to be addressed through regulatory response, but because of the presence and the importance of private actors. This always entails also an interaction with companies that often have more power or more resources than many states. Over time, we have witnessed the far-reaching impact of online violence, discrimination, and disinformation in the digital public debate, issues that have cast shadows over the virtual landscape, leading to harm, especially against marginalized and vulnerable communities and groups. What was once a platform promising diverse voices and perspective has seen troubling developments, hostile communicative environments, particularly for traditionally discriminated groups. Furthermore, the discourse has become polarized, distorting the conversations around essential matters and eroding trust in authoritative sources, such as academia, traditional media sometimes, and also public health authorities. To address these challenges, some regulatory proposals have come to the forefront at a global scale. We have seen that there are efforts by international organizations to provide guidelines, to provide guidance for regulatory response. We have seen that regional blocs have also reacted with their own concerns, but many of these intricate systems have aimed to tackle various diverse, different, but interconnected issues, including competition, data protection, interoperability, transparency, and due diligence in the digital public sphere. 
And while these efforts are critical for responsible behavior online and for protecting human rights, they also introduce complex questions and concerns that demand careful consideration about the balance of rights, about the roles of states, about jurisdictional issues, and the enforceability of the provisions that are created. One of the pivotal questions that emerges is related to the fragmentation of the internet. And while regulation is essential for safeguarding human rights, it is vital that these regulations do not inadvertently infringe upon the principles of freedom of expression, of privacy, and the rest of the human rights. So striking a delicate balance in the digital world is a formidable challenge. Notably, in many regions, regulatory debates have been in their infancy or have been completely absent, especially in many regions in the majority world. And in this context, soft law principles, the application of international human rights laws, have played a crucial role in guiding the behavior of companies that mediate online communications. These principles have provided valuable guidance for alternative frameworks, but their effectiveness is a matter of discussion and debate. So in response to this debate, we are going to speak this morning here about what these challenges are. Since we have seen the advance of a global trend to regulate platforms and the internet in general as a path to address the growing threats of human rights, what are the limitations of these proposals? If they have limited effects, in some cases can present these tensions with the balance of human rights. What other policies, what other institutional and legal frameworks have been implemented globally or can be implemented globally or regionally to propel freedom of expression online and its diverse, equal, fair, non-discriminatory, and democratic online public debates? The first word is going to be to Mr. 
Pedro Vaca, the Special Rapporteur for Freedom of Expression of the Inter-American Commission of Human Rights. So please, Pedro, go ahead. Thank you.
Pedro Vaca:
Good morning there. I hope you’re having a great IGF this year. Thank you very much. Firstly, I would like to highlight that in the Americas, we identified that the current dynamics of freedom of expression on the internet are characterized by at least three aspects. The first one is the deterioration of the public debate. The second is the need to make processes, criteria, and mechanisms for internet content governance compatible with democratic and human rights standards. And third, the lack of access, including connectivity and digital literacy, to enhance civic skills online. And this is closely related to dynamics of violence, disinformation, inequalities in the opportunities of participation in the public debate, and the viralization of extremist content. We understand at the rapporteurship that diverse and reliable information and free, independent, and diverse media are affected by disinformation, violence, and human rights violations, and that this requires multidimensional and multistakeholder responses that are well-grounded in the full range of human rights. As people worldwide increasingly rely on the internet to connect, learn, and consume news, it is imperative to develop connectivity, since access to the internet is an indispensable enabler of a broad range of human rights, including access to information. An interoperable, reliable, and secure internet for all, which facilitates individuals’ enjoyment of their rights, including freedom of expression, opinion, and peaceful assembly, is only possible if we have more people accessing and sharing information online. Additionally, in the informational scenario of media and digital communication, citizens and consumers should be given new tools to help them assess the origin and likely veracity of news stories they read online, since the potential to access and spread information in this environment is relatively easy, and malicious actors benefit from it to manipulate the public debate.
In this sense, critical digital literacy aims to empower users to consume content critically, as a prerequisite for online engagement by identifying issues of bias, prejudice, misrepresentation. Critical digital literacy, however, should also be about understanding the position of digital media technologies in society. This goes beyond understanding digital media content to include knowledge of the wider socioeconomic structures within which digital technologies are embedded. So here we have a few questions. How are social media platforms funded? Or for instance, what is the role of advertisement? To what extent is content free or regulated? Given the importance for the exercise of rights in the digital age, digital media and information literacy programs should be considered an integral part of education efforts. The promotion of digital media and information literacy must form part of a broader commitment by states to respect, protect, and fulfill human rights and by business entities. Likewise, initiatives to promote journalism are key in facing informational manipulation and distortion which requires states and private actors to promote the diversity of digital and non-digital media. On the other hand, the role of public officials in the public debate is highlighted. It is recalled that state actors must preserve the balance and conditions of the exercise of the right of access to information and freedom of expression. Therefore, such actors should not use public resources to finance content on sites, applications, or platforms that spread illicit and violent content and should not promote or encourage stigmatization and must observe obligations to promote human rights which includes promoting the protection of users against online violence. The state has a positive role in creating and enabling environment for freedom of expression and equality while recognizing that this brings potential for abuse. 
In this sense, in the Americas, we have a recent example in Colombia of a decision by the Constitutional Court that urged political parties to adopt guidelines in their codes of ethics to sanction acts or incitement of online violence. In this paradigmatic decision, the court recalled the obligation of the state to educate about the seriousness of online violence and online gender violence and to implement measures to prevent, investigate, punish, and repel it. The court also insisted that political actors, parties, and movements, due to their importance in the democratic regime, are obliged to promote, respect, and defend human rights as a duty that must be reflected in their actions and in their attitudes. Additionally, the court ruled that the state should adopt the necessary measures to establish a training plan for members and affiliates of political parties and movements on gender perspective and online violence against women. In response, and considering that unlawful and violent narratives are propelled by state actors on the internet through paid content, state actors should follow specific criteria in the ad market. Any paid contracting for content by state actors or candidates must report, through active transparency on the government or political party portals, the data regarding the value of the contract, the contracted company, the form of contracting, the content resource distribution mechanisms, the audience segmentation criteria, and the number of exhibitions. On the other hand, to make business activity compatible with human rights, the office of the special rapporteur reiterates that internet intermediaries are responsible for respecting the human rights of users. In this sense, they should first refrain from infringing human rights and address negative consequences on such rights in which they have some participation, which implies taking appropriate measures to prevent, mitigate, and where appropriate, remedy them.
Second, try to prevent or mitigate negative consequences on human rights directly related to operations, products, or services provided by their business relationship, even when they have not contributed to generating them. Third, to adopt a public commitment at the highest level regarding respect for human rights of users, and that is duly reflected in operational policies and procedures. And fourth, carry out due diligence activities that identify and explain the actual and potential impacts of their activities on human rights, which is called also impact assessments. In particular, by periodically carrying out analysis of the risk and effects of their operations. In conclusion, to wrap up, the challenges facing the digital public debate require a multidimensional approach. Soft law, as was stated before, education, self-regulation, and legal mechanisms can together create a framework to mitigate harms we face online. Let us strive for a digital space where freedom of expression and the protection of human rights are promoted, fostering a society that values inclusivity, diversity, and respect for all.
Juan Carlos Lara:
Thank you very much. Thank you very much, Mr. Pedro Vaca. Thank you for those remarks. And thank you for also starting this conversation addressing the need for a multidimensional approach. This is not necessarily a discussion of regulatory or non-regulatory measures, but apparently of different types of measures at the same time. And we will now listen to the rest of our panelists, beginning, of course, with our second onsite participant here, Ms. Ana Cristina Ruelas, Senior Program Specialist at the Freedom of Expression and Safety of Journalists section in UNESCO. Please, Ana Cristina, you have 10 minutes. Thank you.
Ana Cristina Ruelas:
Thank you very much. It’s an honor to share this panel with you, Pedro. Good to see you. So as Pedro said, from UNESCO, we have a holistic approach to try to understand and deal with this phenomenon. UNESCO tries to foster public debate through education measures that I will not speak a lot about, because this is not my area of expertise. But there’s a lot of work done with teachers, with educators, to target potentially harmful content and harmful content online. There’s specific work being done to develop resilience in different communities, primarily in four countries, Bosnia and Herzegovina, Indonesia, Colombia, and Kenya, through the Social Media for Peace project, which is funded by the European Union and aims to create media and information literacy measures, but also to develop a way of understanding how content moderation is happening in these different countries, and what are the different issues and context-related matters that allow this harmful content to spread. There’s another action happening that relates to capacity building for different stakeholders, duty bearers such as judges, parliamentarians, and regulators, in order to understand that when dealing with potentially harmful content, there’s a need to safeguard freedom of expression, access to information, and diverse cultural content. And there’s work done, also, through the cultural sector, in order to understand the impacts of harmful content on artistic freedoms and cultural expressions, such as indigenous expressions. And the last thing, which I think is also important, is that we also have another action that is related to policy advice, guiding member states in the process of acknowledging that governance of digital platforms requires, as Pedro has mentioned, to safeguard freedom
of expression, access to information, and diverse cultural content, while balancing and addressing the phenomena of disinformation, hate speech, conspiracy theories, and propaganda. So in this session, I will focus on two main and specific projects that UNESCO has been putting forward lately. And I will start with the Social Media for Peace project, which is one of the projects that, as I said, started in four different countries and allowed us to understand what is happening with content moderation and how it is affecting different communities, and also how a non-regulatory approach can be successful when it is holistic with other different types of solutions. So the first thing that we learned within the Social Media for Peace project is that context matters. This means that when it comes to content moderation, language cannot just be left aside. There are specific languages in different regions that are important to understand in order to address content moderation issues. And this is not happening in many countries, or mainly in the countries that we’re working on, which are specifically also countries that are in crisis, or that come from crisis. The second thing that we found is that despite acknowledging the crisis, despite the lack of knowledge of context and nuances that the platforms should understand, and the problems that hateful content can create in the offline world, there’s a problem of not considering these countries as a priority, and then not providing enough funding for the development of content moderation measures. So companies give specific priority to those countries that have a global impact, or that represent a market share that is important. And in those countries where this is not happening, they are not putting sufficient budget toward them. And then this is increasing and creating more problems.
The Social Media for Peace project also found that when dealing with these problems, the most important thing to bear in mind is the capacity for dialogue between the different stakeholders, acknowledging that in conflict zones, many issues happening in the offline world have to be considered in the online world as well. That’s why due diligence from the platforms is very important: understanding the context, being able to develop risk assessments and identify the specific mitigation measures they have to put in place to reduce the specific risks based on the context. While doing this work, we saw two main approaches. The first is faith in the companies: the hope that they will shift content moderation from their economic interest toward the public interest of informing people and reducing the impact of this content, which, as has already been mentioned, is often also monetized through advertising. So that’s the first question: are we keeping faith in changing or shifting the companies’ economic interest toward the public interest? Many people in these countries still believe this can be one of the approaches to push companies to increase their budgets in order to do better content moderation and thereby create a safer space. Then there’s the other approach, which mainly comes from the states, as Pedro has already commented: trying to reduce this phenomenon with bad regulation, regulation that does not safeguard freedom of expression, that criminalizes the user and does not touch the companies, and that considers the user solely responsible for harmful content.
And then UNESCO, after the work done through the Social Media for Peace project, started saying: acknowledging these two different approaches, what we also need is to start a debate that allows us to understand whether it is possible to balance freedom of expression, access to information and access to diverse cultural content while dealing with potentially harmful content such as disinformation, hate speech and conspiracy theories. As part of this debate, UNESCO started a consultation that led to more than 10,000 comments from people in around 134 countries. What we learned is that when governance systems are transparent, have checks and balances in place, align content moderation and curation with the UN Guiding Principles on Business and Human Rights, are accessible and inclusive of diverse expertise, and bear in mind the promotion of cultural content, they can be a game changer. That’s why UNESCO started developing its guidelines for the governance of digital platforms, which on the one hand recognize states’ responsibility for enabling an environment for freedom of expression, with, as Pedro has mentioned, specific requirements for governments to commit not only to freedom of expression online, but also to all of their duties in respecting and promoting freedom of expression offline. Second, UNESCO acknowledged that creating a governance system requires recognizing that any regulatory measure has to be coherent and consistent with the different kinds of regulatory arrangements, and should be developed through a multi-stakeholder approach.
This means that it is not only statutory regulation, which depends on states and companies; there should be active participation of other stakeholders throughout the regulatory process, meaning the development, implementation and evaluation of the regulation. The third thing the guidelines state is that companies have to comply with five key principles. The first is due diligence, which specifically states that companies have to carry out human rights risk assessments when they develop or expand operations, create new ownership structures, or develop new products; that they have to do so prior to an electoral cycle, which is very important considering, for instance, that 2024 is a super electoral year in which a large share of the world’s voting population will go to the polls; that they should carry out a human rights assessment in crises, emergencies, and armed conflicts; and that they have to understand the risks that content on their platforms poses to specific communities, such as journalists, environmental defenders, artists, and other vulnerable and marginalized communities. The second principle is transparency; I don’t have to go very deep into it. The third is accountability. The fourth is user empowerment, which means that within the governance system there should be specific programs developed for media and information literacy. And the fifth is the alignment of all actions with the UN Guiding Principles. So this is the work that has been done so far.
We firmly believe, as Pedro said, and we state that this is a holistic approach and that no single action should stand alone, because if measures do not come together with many other actions relating to education, to the creation of communities, to policy advice and regulation, then these phenomena will not be addressed. Thank you.
Juan Carlos Lara:
Thank you very much, Ana Cristina, for that extremely informative intervention covering all of the initiatives that UNESCO is carrying out, including providing guidance on regulation for governments through a process that has included many rounds of consultations and a broad discussion, as you mentioned, with thousands of comments from the world over, which of course also enriches the learning inside organizations like UNESCO itself on how to address many of these issues from the perspective of freedom of expression, access to information, and access to diverse cultural content, which I think is a key factor in all of this and sometimes not addressed explicitly. So thank you very much for that. Now, Chantal, can you please tell us your view on these subjects? Can you hear me? Okay. Thank you very much.
Chantal Duris:
I will try not to repeat too many of the points made by the first two speakers, which are obviously excellent and all extremely relevant. For example, that we need to look at the whole toolbox: we need both regulatory and non-regulatory approaches. Perhaps just very briefly, I think this discussion is very important because we do agree that there is a danger that many of the proposals we have seen, or legislation recently adopted seeking to regulate platforms, will do more harm than good. They talk a lot about holding platforms accountable, but very often what they do is not focus on the platforms’ business model, on data tracking, or on the advertising model; instead, they ask the platforms to exert more control over user speech. So the focus shifts from the platforms’ systems to the speech of users. And it is critical that any regulatory framework with such a strong impact on freedom of expression be seriously grounded in it, be evidence-based and, of course, grounded in the principles of legality, legitimacy, necessity and proportionality, as Article 19 of the ICCPR requires. Working more or less globally, what sort of solutions we think will be appropriate also depends on the jurisdiction. Although in principle we think sound regulatory frameworks should be in place, with many governments we would not advocate for passing legislation to control platforms, because we fear the result would not be a regulatory proposal respectful of freedom of expression, but one that gives the government more options to control online speech. ARTICLE 19 has also long advocated that it is extremely important to take the competition law angle as well, because there are very few dominant players in this field.
They are gatekeepers of these markets, and they are also really gatekeepers of our freedom of expression online. We strongly believe that decentralization can in itself have a positive effect on freedom of expression: healthier competition and more empowerment for users. For example, if a user thinks, I do not want to be on a certain platform because I do not think it respects privacy enough, and this is important to me, they should be able to leave that platform and still be connected, for example, to the contacts and family who wish to remain on it. As has been mentioned, the UN Guiding Principles can be a very important tool; indeed, they are an essential tool that we advocate platforms take into consideration all over the world. Whether we have good regulation in place, bad regulation in place, or no regulation at all, that should always be the basic benchmark against which platforms operate. A lot has been said about them, so I won’t go into detail. Also, since we are talking about the risks of the different approaches: if we take the approach that enabling responses are also at the center of this discussion, then we think the risks to freedom of expression are much more limited. And this is linked to another observation we make. Often the discussions seem to suggest that social media platforms are the cause of the problems, and we do not deny that they have exacerbated certain societal tensions and increased polarization. There is no question about it, and there is enough evidence that this is happening. At the same time, we think it is essential to look at the root causes of, for example, disinformation, hate speech, and online gender-based violence. This may, again, include certain regulation of the platforms’ business model, but it also requires looking at very different areas outside the specific digital space.
So, for example, ARTICLE 19 published, a couple of years ago, a toolkit on hate speech, in which we detail what those different approaches need to look like and where we again need to look at regulatory and non-regulatory responses, such as anti-discrimination legislation. Public officials, as Pedro mentioned, should not themselves engage in stigmatizing discourse, and should counter such discourse when they encounter it. Public officials should receive equality training, and there needs to be an independent and diverse media environment. All these aspects are obviously key to ensuring that we have, offline so to speak, an environment that is inclusive and will not translate into even more extreme speech online. And of course a strong civic space and strong civil society initiatives are also a key component of that. To follow up on what Ana Cristina said: ARTICLE 19 is a partner of UNESCO in the Social Media for Peace project, and there have been a number of research reports, as Ana Cristina alluded to, that have documented the failings of the platforms in sufficiently taking contextual elements into account. It starts with human rights teams not being in place for many countries, so civil society in many countries don’t have anyone to call at Meta, for example, if there is a video that needs to be taken down, or if they see an election coming or a crisis developing offline and online; there is not really anyone they can talk to who will be responsive. Obviously, a very important additional problematic element is the use of automated content moderation tools, because they exacerbate these problems, even as we recognize that content moderation obviously cannot happen only through human reviewers.
It is also true that many of these tools are not sophisticated enough, and might never be, to make a proper assessment of some very complex categories of speech. Even for a court, it can be very complex to judge: was there really hate speech? Was there the intention to incite hatred? Was there disinformation? Was there an intent to publish false information and disseminate it? Was there an intent to cause harm? So doing this moderation at scale can present very serious challenges, and we always call for more human reviewers who are native speakers of the languages they moderate. More local civil society organizations need to have direct, meaningful access to the platforms, because we also know that the trusted partner programs have not always been very satisfactory, to put it mildly, and civil society has often found them a waste of their time and resources, with limited impact. Because I know we are far advanced in time, I want to make a final reflection. An interesting trend we are seeing now, which is non-regulatory but also based on regulation, is the strategic litigation increasingly brought against online platforms. Very prominent recent examples are the US Supreme Court cases in which families of victims of terrorist attacks in Turkey and in France filed suits against Twitter and Google, arguing that their systems failed in a way that enabled terrorist content to spread online and, in effect, aided and abetted these terrorist organizations. There has also been litigation in Kenya over the violent content that was spread in Ethiopia and moderated from Kenya, and strategic litigation has been brought over the failings in Myanmar.
That in itself, from our perspective, has some challenges, because from a freedom of expression perspective, organizations have always said it is essential that platforms remain largely immune from liability for the content they host. But at the same time, of course, there needs to be platform accountability, and there need to be remedies if platforms infringe on the human rights of actors or affected communities in the respective countries. So here as well, much will depend on how this litigation is brought. We do not want to see a court saying that, after all, platforms must be held liable for hosting terrorist content because it led to a terrorist attack. At the same time, it can be very interesting if we start seeing more litigation that focuses on remedies for failures to conduct human rights impact assessments, to take human rights due diligence measures, and to implement mitigation measures properly. So I do think that is a trend, and one that gets a lot of publicity. There are significant reputational costs for the platforms, and that could also be a good pressure tool for them to essentially get their act together.
Juan Carlos Lara:
Thank you. Thank you very much, Chantal, also for offering so many different pathways toward what we expect to see but find so difficult to achieve: accountability from the platforms, which speaks to the role they have in exacerbating social problems even though, according to some discussions and views, they might not be creating them. So now, Ramiro, your turn. Tell us what policies and institutional and legal frameworks have been implemented, or can be implemented, beyond just the regulatory ones, to address the problems that we have with online speech.
Ramiro Alvarez Ugarte:
Thank you very much. Should I introduce myself? Yeah. I’m Ramiro Alvarez Ugarte. I’m the Deputy Director of CELE, a research center based in Buenos Aires. I don’t want to be too repetitive of things that have already been said, so let me just offer you the diagnosis that we have at CELE in terms of where we are, and also highlight a few tensions that I think underlie our discussion and have not yet been resolved. It seems like we’re in an interregnum: the old has not died yet and the new is not yet born. We are at that moment in which we are in between the old and the new, and those are always interesting times to be in; they are also challenging. I think we are clearly moving toward a regulatory moment. So, in a way, the question posed in this panel is more or less in tension with the trend of where the world is going. I agree with everything you just said, and I agree that regulatory and non-regulatory measures are important and should take place at the same time. But I think we are moving toward a regulatory moment. The DSA in Europe will most likely be a model that expands across the globe. We have already seen bills presented in congresses in Latin America. They haven’t been adopted yet, but legislators in other countries look at the DSA and copy language and some of its provisions, and that is a process in and of itself full of challenges. We have also seen calls to revisit Section 230 in the United States, although, because of Congress and its gridlock, it is difficult to imagine a comprehensive review of Section 230 happening anytime soon. But we have seen state-level legislation passed imposing obligations on platforms. We have already seen strategic litigation against companies, but not in the direction that you mentioned; rather, in the opposite direction.
Take, for instance, the jawboning cases, in which the plaintiffs basically argue that the kind of relationship the federal government has established with companies in the US violates the First Amendment. So, in a way, litigation cuts both ways: it can question companies for failing to live up to human rights standards, but it can also target, in the case of the United States, violations of the First Amendment. So I think that’s where we’re going, and it will be interesting to see how we get there. Now, in terms of alternatives, the Inter-American Commission has supported non-regulatory approaches for a long time. I was part of the 2019 process of discussing the guidelines to combat disinformation in the electoral context, and the main outcome of that was precisely to support non-regulatory measures. So I’m not going to repeat what you have just said, but literacy, of course, is incredibly important. I would like to highlight, though, that literacy initiatives are, in a way, a bet on an old principle that was very cherished in the human rights and freedom of expression field: that, to an extent, it is our responsibility as democratic citizens to figure out what is fake and what is not. The internet, of course, makes it more difficult to exercise that responsibility, but I would underscore that those kinds of initiatives are a bet on that old principle; we haven’t renounced it yet. And, of course, all kinds of measures to promote counter-speech are not threatening from a human rights point of view, they are fairly easy to implement and, apparently, they are quite successful. What I have seen most successfully deployed is counter-speech to combat disinformation in the context of elections in Latin America. But again, calls for regulation have been growing.
Observacom in Latin America has been strongly supporting the kind of regulation that on paper looks very good and respectful of human rights standards; the same goes for the UNESCO guidelines. Of course, the risk involved in these initiatives is something Chantal already mentioned: even legislation that is good on paper could do more harm than good. I think this has to do with the lack, in many countries, of the institutional infrastructure necessary to adopt these kinds of regulations. That is obviously a concern for activists, but, as I said before, I think we are moving in that direction, and we will have to deal with that as the time comes. I am pretty sure that in the next couple of years we will see legislation being passed outside of the European Union, and we will have challenges in that sense. Now I would like to highlight a couple of underlying tensions in order to close my remarks. For instance, we have been discussing the importance of decentralization. I would also agree with Chantal about the importance of antitrust legislation, which for practical reasons will of course advance where corporations are incorporated, or in places where they have an important market presence and the kind of institutional infrastructure necessary to move forward with this process. There is ongoing litigation in the United States against Google and, at the same time, investigations in the European Union. It is hard to imagine that, for instance, a Latin American country could move in that direction, but I think it is important. Now, it seems to me that this is in tension with the framing of the DSA, or the framing of the regulations being proposed, because to an extent those kinds of regulations depend on a few powerful intermediaries.
So if we were to break them all apart and have an internet that is extremely decentralized, as it was toward the end of the 1990s and the beginning of the 2000s, I don’t know how that would be compatible with increasing control, even control exercised in a way that is respectful of human rights. Because if we have a truly decentralized web in which people get to choose, a lot of people will choose hateful content; a lot of people will choose and engage in discriminatory content. If it is truly decentralized, there will be no way of controlling that. So I think that is an underlying tension that, to an extent, speaks to a really deep and profound disagreement in the field of human rights about what kind of future we imagine as desirable. This is something that I think is there, underlying, and that we don’t discuss as openly as we should. Are we willing to support freedom of expression in the form in which we affirmed it through the 20th century, when we informally relied on gatekeepers to keep a check on it? Or are we embracing the decentralized promise of the internet of the late 1990s? That means a lot of speech that is really problematic. I don’t know if it is harmful; I think there is still a lot to figure out in terms of evidence. For a lot of speech that is called harmful, we just don’t have enough evidence that it is actually that harmful. But I think that underlying tension is there, that we should keep it in mind, and that we should discuss it more openly. Thank you.
Juan Carlos Lara:
Thank you, Ramiro, for your sobering remarks, and also for highlighting one of the trends we see toward regulation, even though we can discuss other forms of addressing some of these challenges. I want to first check whether we have hands in the room that would like to pose any questions; otherwise, we will start to close this panel, since time is about to run out. But before we do that, I would like to pose a question myself. If I see no hands, it goes to the panel itself, beginning with Pedro. I don’t know if you are there, but it will be a rapid round: one challenge and one opportunity if there is a future in which regulation will come, and one challenge and one opportunity we may find in non-regulatory approaches that can be taken today, as soon as possible, among non-governmental actors, in order to provide for the internet that we all want and for the respect for human rights by platforms that we would expect. We will go in the same order in which this panel began, with up to two minutes each. Please, Pedro, you go first.
Pedro Vaca:
Thank you, Juan Carlos. And let me just thank the whole panel for this amazing conversation; it raises a lot of questions. The challenge we have faced is the lack of capacity in a lot of member states. We cover the Americas, we monitor 30 countries, and at this moment, October 2023, we do not have enough capacity, or even knowledge, among member states to be part of the conversation. So I think we have to develop contact points at the foreign affairs ministries in as many countries as possible, because if only powerful countries have the capacity, then we do not have enough representation to deal with the challenges. As for the opportunity, and this is why I highlighted the Constitutional Court of Colombia: we can put all our efforts into the user and the consequences for the user, or we can also prioritize the role of public servants and political leaders. If you have xenophobia or racism in a society, you have a problem; but if you have political leaders who incentivize xenophobia and discrimination, you have a bigger problem. That’s why I think that if we consider public servants as points of reference for society, democracies probably should, and could, frame in a better way what is allowed and what is not allowed at that level of representation. The scope of freedom of expression for people who want to govern, who want to participate in the political sphere, is more limited compared with that of ordinary citizens. And on that specific opportunity, we have a lot of Inter-American and international standards. So it is not even soft law: there are rulings of the Inter-American Court to support it.
Juan Carlos Lara:
Thank you, Pedro. I’ll ask the same of the rest of the panelists, first Ana Cristina and then Chantal, please. One challenge, one opportunity.
Ana Cristina Ruelas:
One challenge is that the discussions focus a lot on what legislation will look like and not on how the second stage of the process, implementation, will work. I’ve been saying this in the different panels I have participated in at the IGF: many regulators have said, you know, once legislation is passed, no one cares about it and they leave us alone. And as Ramiro mentioned, there are many regulatory authorities that do not know how to deal with this issue and are not used to talking with civil society. So we need to break that tension and be able to create conversation among them; that would be one opportunity. Another opportunity is that, since the companies are all based in the same countries, stakeholders in different countries and regions, for instance in Africa and the African Union, are coming together, because they say: companies don’t care about any one of our countries per se, they don’t have a specific interest in country X, but what they do care about is all of us together. So they are coming together, civil society, electoral management bodies, the African Union, all the different stakeholders, to go before the companies and say: this is what we need and this is how we want it. That creates a great opportunity, because among 40 countries you have countries that actually believe a human rights-based approach is the way to go and others that do not, but there is a balancing process, and that is, for me, a great opportunity.
Juan Carlos Lara:
Thank you very much. Chantal?
Chantal Duris:
In terms of challenges, I will mention one that applies generally: society tends to move slowly, regulators tend to move slowly, and technology doesn’t. We are seeing this trend again now, where regulators are trying to catch up. There are a lot of initiatives in the European Union alone, for example: the AI Act, the Digital Markets Act, the Digital Services Act, the Political Advertising Regulation. It is a challenge even for civil society organizations already active in this field to keep up with everything and cover everything. Not to mention that there are a lot of civil society actors who are very much impacted by what is happening in the digital space but are not necessarily experts in it. They are not experts in content moderation; they are experts in, for example, women’s rights. These are quite technical subjects, so it requires a lot of expertise. So I think that is one of the main challenges: the expertise and the capacity it requires. As for opportunities, we do feel there is more recognition from some of the platforms and some of the regulators that civil society are experts in many of the issues they are dealing with. There are more consultation processes; to what extent the opinions of civil society are taken into account is another point, but we do feel there is more appetite from platforms and regulators to have us engaged. At the same time, we don’t want this to happen in a way where they just outsource their own responsibility and say: we don’t need to deal with the human rights aspect, civil society will do the work for us.
Juan Carlos Lara:
Thank you very much, Chantal. Ramiro, you have the last word. Very quickly.
Ramiro Alvarez Ugarte:
I would say the following. I think one of the biggest challenges is that to move forward in regulation or non-regulatory measures, we have to do it generally in a context of deep polarization, and that is always very difficult. But at the same time, I think that context offers an opportunity, because I think that in most democracies around the world, there is a need to rebuild the public sphere and civic discourse. There is a need to start talking to each other in a way that is respectful. And even though that is difficult precisely because of polarization, that underlying need is still an opportunity, and we should take advantage of it. Thank you very much.
Juan Carlos Lara:
And with that, our time is up. Thank you very much to my fantastic panelists and everyone who has attended this session, and have a nice rest of your IGF. Take care, everyone. Bye-bye. Thank you.
Speakers
Chantal Duris
Speech speed
177 words per minute
Speech length
2064 words
Speech time
699 secs
Arguments
Chantal Duris stressed the importance of regulatory and non-regulatory approaches in addressing issues related to social media platforms
Supporting facts:
- In her speech, Chantal Duris expressed concern over legislation that focuses more on holding platforms accountable for user speech than on the platforms’ business model
- She highlighted the danger of such approaches, considering their implications for freedom of expression
- She discussed the importance of ensuring regulatory frameworks adhere to principles of legality, legitimacy, necessity and proportionality
Topics: Regulation, Social Media Platforms
Chantal Duris advocates for platforms to respect the UN Guiding Principles
Supporting facts:
- According to Duris, platforms should always operate based on the UN Guiding Principles, regardless of the regulatory status
Topics: UN Guiding Principles, Human Rights
Duris believes it’s necessary to also address the root causes of issues like disinformation and hate speech
Supporting facts:
- Duris stated that while social media platforms have exacerbated societal tensions and increased polarization, it’s important to look at the root causes of the issues
- She suggested that this may require regulation of platform’s business model, but also necessitates looking outside the digital space
Topics: Disinformation, Hate speech, Root causes
Keeping pace with rapidly evolving digital technology and regulatory initiatives is a challenge for the civil society
Supporting facts:
- Technological evolution is faster than societal and regulatory changes
- Numerous initiatives in the EU like the AI Act, Digital Markets Act etc.
- Requires expertise and capacity to understand technical aspects
Topics: Digital technology, Regulatory initiatives, Civil society challenges
Report
Chantal Duris stressed the importance of adopting both regulatory and non-regulatory approaches to address challenges related to social media platforms. She expressed concern about legislation that primarily holds platforms accountable for user speech rather than addressing the underlying business models.
Duris highlighted the potential dangers of such approaches, as they can impact freedom of expression. She advocated for platforms to operate based on the UN Guiding Principles, regardless of regulatory status, emphasizing the need to respect human rights. Duris also emphasized the importance of addressing the root causes of issues like disinformation and hate speech, both through regulating business models and exploring solutions outside the digital space.
She supported the decentralization of social media platforms to empower users and enhance freedom of expression. Duris expressed concern about the limitations of automated content moderation tools and suggested the need for more human reviewers with language expertise. She discussed the trend of strategic litigation against platforms, highlighting that it could hold them accountable for failures to respect human rights.
Duris recognized the challenge of keeping pace with evolving technology and regulatory initiatives, but argued that both platforms and regulators should take responsibility for upholding human rights. She also noted the growing recognition of civil society’s role in the digital space and the increasing consultations and engagements sought by platforms and regulators.
Overall, Duris highlighted the need for a multi-faceted approach, incorporating regulatory measures, adherence to UN Guiding Principles, addressing root causes, decentralization, improving content moderation, and recognizing the role of civil society, with platforms and regulators sharing responsibility for upholding human rights.
Ana Cristina Ruelas
Speech speed
150 words per minute
Speech length
2028 words
Speech time
809 secs
Arguments
A multidimensional approach is required to tackle online harmful content
Supporting facts:
- Context matters in content moderation, including considering nuances of different languages and crisis situations
- Companies treat countries differently based on global impact or market share when it comes to addressing harmful content
- The Social Media for Peace project highlights the importance of stakeholder dialogue in conflict zones
- There are differing approaches to dealing with harmful content, ranging from relying on companies to shift their interests toward the public interest, to poorly designed regulatory measures that criminalize users
- UNESCO has begun working on guidelines for digital platform governance that prioritize transparency, checks and balances, human rights alignment, and inclusive stakeholder participation
Topics: Online regulation, Content moderation, Freedom of expression
Discussions on regulatory legislation often focus solely on the law’s creation, not its ongoing enforcement or operation.
Supporting facts:
- Many regulators feel abandoned after legislation is passed
- Many regulatory authorities do not know how to handle some issues
Topics: Regulatory Legislation, Enforcement
Creating conversations between regulators and civil society can help reduce tension and make regulations more effective.
Supporting facts:
- There is a need to break the tension between regulators and civil society for effective enforcement
Topics: Regulators, Civil Society, Dialogue
Countries grouping together allows for a balanced, collective approach in dealing with companies.
Supporting facts:
- Stakeholders in different regions, such as the African Union, are coming together to deal with companies
Topics: Collective Approach, Companies, Negotiation
Report
Summary: Addressing harmful content online requires a multidimensional approach that takes into account linguistic nuances, cultural context, and the protection of freedom of expression. This is highlighted by the need to consider the complexities of different languages and crisis situations when moderating content.
Companies must align their actions with the UN guiding principles to ensure their policies prioritise transparency, accountability, and human rights. Education and community engagement play integral roles in tackling harmful content. Media and information literacy programmes empower users to navigate online spaces responsibly, while fostering a sense of shared responsibility in maintaining a safer online environment.
Furthermore, a synergistic effort is necessary, combining policy advice, regulation, and the involvement of multiple stakeholders. This involves a multi-stakeholder process that includes the development, implementation, and evaluation of regulations. Collaboration between regulators and civil society is vital to effective enforcement.
Creating conversations between these groups can help reduce tensions and enhance the efficacy of regulations. Regulators should not feel abandoned after legislation is passed; ongoing enforcement and operation of laws must be a key focus. To achieve a balanced and collective approach in dealing with companies, stakeholders from different regions are coming together.
For example, the African Union is taking steps to address companies with a united front. This collective approach allows for better negotiation and more equitable outcomes. It is important to emphasise a balanced, human rights-based approach when dealing with companies.
Among the 40 countries analysed, some believe that this approach is the correct path forward. By prioritising the principles of human rights, such as freedom of expression and inclusive stakeholder participation, governments can create a regulatory framework that safeguards individuals while promoting peace, justice, and strong institutions.
In conclusion, tackling harmful content online requires a comprehensive and nuanced strategy. Such an approach considers linguistic nuances, cultural context, and the protection of freedom of expression. It involves aligning company actions with UN guiding principles, prioritising education and community engagement, and establishing effective regulatory processes that involve collaboration between regulators and civil society.
With these measures in place, a safer online environment can be achieved without compromising individual rights and the pursuit of global goals.
Juan Carlos Lara
Speech speed
163 words per minute
Speech length
2061 words
Speech time
758 secs
Arguments
Challenges to human rights have been worsened by events around the world and the failure of both private tech companies and states to fully comply with their human rights obligations
Supporting facts:
- Far-reaching impact of online violence, discrimination, and disinformation in the digital public debate
- Harmful effects especially against marginalized and vulnerable communities and groups
Topics: Human rights, Digital technology, Regulation
Regulatory proposals aiming to tackle issues such as competition, data protection, interoperability, transparency, and due diligence in the digital public sphere have been emerging globally
Supporting facts:
- Efforts by international organizations to provide guidelines
- Regional blocs reacting with their own concerns
Topics: Regulation, Digital public sphere, Data Protection
Regulation should not infringe upon the principles of freedom of expression and privacy
Supporting facts:
- Question on the fragmentation of the internet
- Regulatory debates are in their infancy or absent in many regions of the majority world
Topics: Regulation, Freedom of Expression, Privacy
Soft law principles and the application of international human rights laws play a crucial role in guiding the behavior of companies
Supporting facts:
- Provided valuable guidance for alternative frameworks
- Effectiveness is a matter of discussion
Topics: Soft law principles, International human rights laws, Company behavior
Report
The discussions revolve around the challenges posed by online violence, discrimination, and disinformation in the digital public debate. These harmful effects have far-reaching impacts, particularly against marginalised and vulnerable communities and groups. The failure of both private tech companies and states to fully comply with their human rights obligations has worsened these challenges.
Regulatory proposals have emerged globally in response to these issues in the digital public sphere. These proposals aim to address concerns such as competition, data protection, interoperability, transparency, and due diligence. Efforts by international organisations to provide guidelines and regional blocs reacting with their own concerns have contributed to this regulatory landscape.
While regulation is necessary, it is crucial that it does not infringe upon the principles of freedom of expression and privacy. The question of how to strike a balance between regulation and these fundamental rights remains a point of debate.
It is important to consider the potential fragmentation of the internet and the lack of regulatory debates in many regions of the majority world. Soft law principles, as well as the application of international human rights laws, play a crucial role in guiding the behaviour of companies in the digital sphere.
They have provided valuable guidance for alternative frameworks. However, the effectiveness of these principles and laws is a matter of discussion. In conclusion, the discussions highlight the urgent need to address the challenges posed by online violence, discrimination, and disinformation.
While regulatory proposals have emerged globally, it is essential to ensure that the regulation strikes a balance between protecting human rights, such as freedom of expression and privacy, and addressing the harmful effects of the digital public sphere. Soft law principles and international human rights laws provide valuable guidance for company behaviour, but ongoing discussions are needed to determine their effectiveness.
Overall, collaborative efforts between governments, tech companies, and civil society are essential to achieve a digital space that upholds human rights and promotes a more inclusive and equitable society.
Pedro Vaca
Speech speed
136 words per minute
Speech length
1533 words
Speech time
675 secs
Arguments
The current dynamics of freedom of expression on the internet are characterized by the deterioration of public debate; the need to make the processes, criteria, and mechanisms for internet content governance compatible with democratic and human rights standards; and the lack of access, including connectivity and digital literacy, needed to build civic skills online.
Topics: Freedom of expression, Internet content governance, Digital literacy, Connectivity
Digital media and information literacy programs should be considered an integral part of education effort.
Topics: Digital media, Information literacy, Education
State actors should not use public resources to finance illicit and violent content and should instead promote human rights.
Topics: State actors, Public resources, illicit content, Violence, Human rights
Internet intermediaries are responsible for respecting the human rights of users.
Topics: Internet intermediaries, User rights, Human rights
The challenges facing the digital public debate require a multidimensional approach.
Topics: Digital public debate, Human rights, Multidimensional approach
Lack of capacity and knowledge among member states pose a significant challenge
Supporting facts:
- Monitors 30 countries in Americas
- Lack of contact points at foreign affairs ministries
Topics: Internet Regulation, Platform Responsibility
Opportunity lies in shifting focus towards the role of public servants and political leaders
Supporting facts:
- Highlighted the Constitutional Court of Colombia
- Cites problem of political leaders incentivizing discrimination
Topics: Leadership, Discrimination, Xenophobia
Report
The current dynamics of freedom of expression on the internet are concerning, as there is a deterioration of public debate. This raises the need to ensure that processes, criteria, and mechanisms for internet content governance are compatible with democratic and human rights standards.
Moreover, limited access to the internet, including connectivity and digital literacy, poses a challenge in enhancing civic skills online. Recognising the importance of addressing these issues, digital media and information literacy programmes should be integrated into education efforts. By equipping individuals with the necessary skills to navigate the digital landscape, they can critically evaluate information, participate in online discussions, and make informed decisions.
State actors have a responsibility to avoid using public resources to finance content that spreads illicit and violent materials. They should instead promote human rights, fostering a safer and more inclusive online environment. In addition, internet intermediaries bear the responsibility of respecting the human rights of users.
This entails ensuring the protection of user privacy, freedom of expression, and access to information. Managing the challenges in digital public debate requires a multidimensional approach. Critical digital literacy is vital in empowering individuals to engage in meaningful discourse, while the promotion of journalism supports a free and informed press.
Internet intermediaries must also play a role in upholding human rights standards and fostering a healthy online debate. Upon further analysis, it is evident that there is a lack of capacity and knowledge among member states regarding internet regulation. This poses a significant challenge in effectively addressing issues related to content governance and user rights.
Efforts should be made to enhance understanding and collaboration among countries to develop effective and inclusive policies. Shifting the focus towards the role of public servants and political leaders presents an opportunity to reduce discrimination and inequality. Stronger regulation of leaders’ speech is warranted because, under human rights standards, political leaders enjoy a more limited freedom of expression than ordinary citizens.
Adhering to inter-American and international standards can serve as a guideline for ensuring accountability and promoting a fair and inclusive public sphere. Overall, this extended summary highlights the importance of protecting freedom of expression online, promoting digital literacy, and holding both state actors and internet intermediaries accountable.
It also emphasizes the need for increased collaboration and knowledge-sharing among member states to effectively address the challenges in the digital realm.
Ramiro Alvarez Ugarte
Speech speed
156 words per minute
Speech length
1528 words
Speech time
588 secs
Arguments
We’re moving towards a regulatory moment with platforms
Supporting facts:
- The DSA in Europe is seen as a potential model for global regulation
- Bills presented in congresses in Latin America mirror the DSA’s provisions
- State-level legislation passed in the US imposing obligations on platforms
Topics: Digital Regulation, Online speech, DSA
Legal challenges against companies could work both ways
Supporting facts:
- Cases questioning companies for failing to uphold human rights standards
- Litigation against companies for allegedly violating the First Amendment in the US
Topics: Legal challenges, Online platforms, Section 230, First Amendment
Promoting counter-speech and literacy initiatives are promising non-regulatory measures
Supporting facts:
- These initiatives are a bet on the principle of democratic responsibility to discern fake from real information
- Counter-speech has been used successfully to combat disinformation in Latin American elections
Topics: Counter-speech, Literacy initiatives, Non-regulatory measures
Moving forward in regulation or non-regulatory measures is challenging due to deep polarization
Supporting facts:
- Moving forward is done in a context of deep polarization
Topics: Regulation, Polarization
Need to rebuild the public sphere and civic discourse as a way to overcome polarization
Supporting facts:
- In most democracies around the world, there is a need to rebuild the public sphere and civic discourse
- There is a need to start talking to each other in a way that is respectful
Topics: Public sphere, Civic discourse, Polarization
Report
The global discussion on the regulation of online platforms is gaining momentum, with diverse viewpoints and arguments emerging. The Digital Services Act (DSA) implemented in Europe is being viewed as a potential model for global regulation. Bills resembling the DSA have been presented in Latin American congresses.
Additionally, several states in the US have passed legislation imposing obligations on platforms. Legal challenges concerning companies’ compliance with human rights standards and the First Amendment are being debated. These challenges can have both positive and negative implications for holding companies accountable.
For instance, companies have faced litigation in the US for alleged violations of the First Amendment. In addition to regulatory measures, there is recognition of the potential of non-regulatory initiatives, such as counter-speech and literacy programs, in addressing the challenges posed by online platforms.
These initiatives aim to empower individuals to discern between fake and real information and combat disinformation. Successful implementation of counter-speech initiatives has been observed during Latin American elections. Nevertheless, concerns exist about the potential negative consequences of well-intentioned legislation on online platforms.
It is argued that legislation, even if well-designed, may have unintended harmful effects in countries with insufficient institutional infrastructure. The tension between decentralization and the need for regulatory controls is another point of contention. A fully decentralized internet, while offering freedom of choice, may facilitate the spread of discriminatory content.
Balancing the desire for increased controls to prevent harmful speech with the concept of decentralization is a challenge. Polarization further complicates the discussion on online platform regulation. Deep polarization hampers progress in implementing regulatory or non-regulatory measures. However, it also presents an opportunity to rebuild the public sphere and promote civic discourse, which is essential for overcoming polarization.
In conclusion, the global conversation on regulating online platforms is complex and multifaceted. The potential of the DSA as a global regulatory model, legal challenges against companies, non-regulatory measures like counter-speech and literacy programs, concerns about the unintended consequences of legislation, the tension between decentralization and regulatory controls, and the challenge of polarization all contribute to this ongoing discourse.
Rebuilding the public sphere and fostering civic discourse are seen as positive steps towards addressing these challenges.