Pre 6: Countering Disinformation and Harmful Content Online
12 May 2025 09:00h - 10:15h
Session at a glance
Summary
This discussion focused on strategies to counter disinformation and harmful content online while respecting freedom of expression. Participants from various European countries and organizations shared insights on the challenges and potential solutions.
The discussion highlighted the need for a multi-faceted approach, including fact-checking, platform design solutions, and user empowerment. Speakers emphasized the importance of quality journalism and public service broadcasting as foundations for combating disinformation. The role of regulators in addressing online harms was discussed, with examples from Ukraine, Moldova, and Bosnia and Herzegovina illustrating different approaches and challenges.
Participants raised concerns about the power of big tech platforms and the need for more effective regulation. The discussion touched on the challenges of balancing freedom of expression with the need to counter harmful content, particularly in the context of Russian disinformation campaigns targeting Eastern European countries.
The importance of media literacy and supporting independent journalism was stressed, along with the need for international cooperation to address the global nature of disinformation. Speakers also highlighted the challenges faced by journalists, especially women and minorities, who are often targets of online harassment and disinformation campaigns.
The discussion concluded with calls for continued efforts to combat disinformation, despite the complexities and challenges involved. Participants emphasized the need for creative, proactive approaches and the importance of defending human rights, democracy, and the rule of law in the digital age.
Keypoints
Major discussion points:
– The need for systemic solutions to strengthen the integrity of the information ecosystem, including fact-checking, platform design changes, and user empowerment
– The challenges faced by countries like Ukraine, Moldova and Belarus in combating Russian disinformation and propaganda
– The failure of self-regulation by tech platforms and the need for stronger government regulation and accountability
– The importance of supporting quality journalism and public service media as an antidote to disinformation
– The disproportionate targeting of women and minority journalists with online violence and harassment
Overall purpose/goal:
The discussion aimed to examine the current challenges in countering disinformation and harmful content online, share experiences from different countries, and explore potential solutions and policy approaches.
Tone:
The tone was largely serious and concerned, with many speakers expressing frustration at the scale of the problem and the lack of effective solutions so far. There was a sense of urgency in addressing these issues. Towards the end, some speakers tried to strike a more optimistic tone about continuing to fight disinformation despite the challenges, while acknowledging the gravity of the situation.
Speakers
– Valentyn Koval: First Deputy Chair of the National Council of Television and Radio Broadcasting of Ukraine
– Alina Koushyk: Director of Belsat TV, Belarusian media outlet in exile
– Alina Tatarenko: Head of the Division for Cooperation and Freedom of Expression at the Council of Europe
– Amela Odobašić: Head of Broadcasting at the Communication Regulatory Agency of Bosnia and Herzegovina
– Pavlo Pushkar: Head of the Division for Execution of Judgments of the European Court of Human Rights
– Julie Posetti: Professor and Global Director of Research at the International Center for Journalists, Professor of Journalism at City University of London
– Aneta Gonta: Deputy Chair of the Audiovisual Council of the Republic of Moldova, Member of the Council of Europe Committee on Media and Information Society
– Andrin Eichin: Senior Policy Advisor on Online Platforms, Algorithms and Digital Policy at the Swiss Federal Office of Communications
Additional speakers:
– Daniel Michos: Eurodig moderator
– Mykyta Poturaiev: Head of the Ukrainian Parliament Committee for Humanitarian and Informational Policies
– Jordan Ogg: Representative from Ofcom (UK’s independent communications regulator)
– Giovana Fleck: Representative from RNW Media
– Marilia Maciel: Director of Digital Trade and Economic Security at Diplo Foundation
– Luljeta Aliu: Member of the Independent Media Commission in Kosovo
– Oksana Prykhodko: Representative from European Media Platform (Ukraine)
– Giacomo Mazzone: Member of the European Digital Media Observatory (EDMO)
– Oleksandr Shevchuk: Institute of International Relations, Ukraine
Full session report
Expanded Summary of Discussion on Countering Disinformation and Harmful Content Online
This discussion brought together experts from various European countries and organisations to examine strategies for combating disinformation and harmful content online while respecting freedom of expression. The conversation highlighted the complex challenges faced by different nations and the need for multifaceted, creative approaches to address the issue.
Key Challenges in Combating Disinformation
Participants identified several significant obstacles in the fight against online disinformation:
1. Inadequate Legal Frameworks: Amela Odobašić highlighted the lack of comprehensive legal structures and enforcement mechanisms in many countries, hindering effective action against disinformation.
2. Failure of Self-Regulation: Julie Posetti emphasised that self-regulation by tech platforms has largely proven ineffective, necessitating more robust government intervention.
3. Resource Constraints: Valentyn Koval pointed out that many regulators lack sufficient resources and capacity to tackle the scale of the problem.
4. Algorithmic Bias: Alina Koushyk raised concerns about the algorithmic suppression of certain languages and content, particularly affecting minority languages like Belarusian.
5. Data Access: Giovana Fleck noted the difficulty in obtaining necessary data from platforms to effectively combat disinformation.
6. Threats to Journalists: Julie Posetti highlighted the disproportionate targeting of women and minority journalists with online violence and harassment.
7. Financial Motivations: Marilia Maciel drew attention to the growing disinformation industry driven by financial gain, emphasizing its global nature and the need for international cooperation to address it.
Proposed Solutions and Approaches
Speakers offered various strategies to address these challenges:
1. Media Literacy and Fact-Checking: Andrin Eichin advocated for developing media literacy initiatives and fact-checking programmes to empower users.
2. Platform Design Solutions: Eichin also suggested implementing ‘safety by design’ principles in platform architecture.
3. Public Service Broadcasting: Jordan Ogg and Julie Posetti emphasised the importance of strengthening public service media, particularly highlighting the role of the BBC World Service, as a reliable source of information.
4. Co-Regulation: Amela Odobašić proposed experimenting with co-regulation approaches involving multiple stakeholders, noting that Bosnia-Herzegovina is adopting this approach.
5. Financial Countermeasures: Marilia Maciel suggested focusing on cutting off financial resources to the disinformation industry.
6. Proactive Content Creation: Valentyn Koval argued for concentrating efforts on creating and disseminating truthful information rather than solely combating disinformation.
Council of Europe’s Guidance Note
Andrin Eichin presented the Council of Europe’s guidance note on countering disinformation, which offers a comprehensive framework for addressing the issue. The guidance emphasizes:
1. The importance of quality information as a long-term antidote to disinformation.
2. The need for a multi-stakeholder approach involving governments, civil society, and tech companies.
3. The role of human rights-based regulation in combating disinformation.
Challenges Faced by Specific Countries
1. Ukraine: Valentyn Koval highlighted Ukraine’s ongoing journey towards a stable democratic media environment, particularly in the face of intense Russian propaganda and disinformation campaigns.
2. Moldova: Aneta Gonta discussed the challenge of balancing national security concerns with media freedoms, noting that Moldova is disproportionately targeted by disinformation campaigns.
3. Belarus: Alina Koushyk detailed the unique challenges faced by Belarusian media in exile, including algorithmic suppression of content in the Belarusian language and the struggle to reach audiences inside Belarus.
Role of International Organizations and Initiatives
1. European Digital Media Observatory (EDMO): Giacomo Mazzone discussed EDMO’s activities in coordinating fact-checking efforts and research on disinformation across Europe.
2. Support for Exiled Media: Alina Koushyk highlighted the need for international support for exiled media and journalists who play a crucial role in countering state-sponsored disinformation.
3. Addressing Transnational Threats: Giovana Fleck pointed out the necessity of tackling the transnational nature of disinformation threats through international cooperation.
Emerging Discussions and Future Considerations
1. The moderator raised the question of potential hypocrisy in requesting free information while also advocating for content bans on platforms, highlighting the complex balance between freedom of expression and content moderation.
2. Oksana Prykhodko questioned the implications of reduced US support for counter-disinformation programs and the need for alternative funding sources.
3. Mykyta Poturaiev emphasized the potential political consequences of unchecked disinformation across Europe, calling for stronger legal frameworks to ensure platform accountability.
4. Julie Posetti stressed the need for more proactive and creative regulatory responses to address the evolving nature of online disinformation.
In conclusion, the discussion underscored the complex, multifaceted nature of the disinformation challenge and the need for creative, proactive, and internationally coordinated responses. While there was broad agreement on the urgency of addressing the issue, developing and implementing effective solutions remains a significant challenge requiring ongoing collaboration and innovation across various sectors and stakeholders.
Session transcript
Alina Tatarenko: Welcome to this pre-session of the Eurodig, which will be dedicated to the discussion on how to counter harmful content and disinformation online. My name is Alina Tatarenko, I’m the head of the Division for Cooperation and Freedom of Expression here at the Council of Europe. We are the division which helps our member states to implement the recommendations and the standards of the Council of Europe in the area of freedom of expression, which also includes helping our member states to find ways to counter disinformation. Before we begin and before I introduce our panel, I would like to give the floor to the Eurodig moderator, Daniel, who will quickly explain the rules of the session. My name is Daniel Michos and I’ll be remote moderating this session. More information about the session and speakers is available on the Eurodig wiki. We encourage you to raise your hand if you would like to present a question yourself, but if you would like me to ask your question for you, please write Q in front of it. These are the session rules. Please enter your full name. To ask a question, raise your hand using the Zoom function. You will be unmuted when the floor is given to you. When speaking, switch on the video, state your name and affiliation, and do not share links to the Zoom meeting, not even with your own colleagues. Thank you. Thank you very much. So I will introduce our panel. We will start with Andrin Eichin, who is the Senior Policy Advisor on Online Platforms, Algorithms and Digital Policy at the Swiss Federal Office of Communications and who is also part of the group of Council of Europe experts who developed the Council of Europe guidance note on countering disinformation. Then we will have three regulators present here with whom we work through our Council of Europe projects. One is the representative of the Ukrainian regulator, Valentyn Koval. Then we will have a representative of the Moldovan regulator, Aneta Gonta.
Then we will have a representative of the regulator from Bosnia and Herzegovina, Amela Odobašić. And then we also have the director of Belsat TV, which is a Belarusian media outlet in exile, Alina Koushyk. And then we will have Julie Posetti, who is a professor and the global director of research at the International Center for Journalists and also professor of journalism at the City University of London. They will make short statements, present their arguments, after which we will open the floor to questions and then we will circle back to our panelists for going into more detail, proposing solutions and making their conclusions. Thank you very much. And with that, we begin with Andrin, please. Could you tell us what are the latest Council of Europe standards and what is the latest guidance on countering disinformation online? Yes, thank you very much.
Andrin Eichin: Thank you for having me. I’ll try to keep this brief and give you the main elements of this Council of Europe guidance note on countering the spread of online mis- and disinformation through fact-checking and platform design solutions in a human rights compliant manner. Even though I was on the expert committee, I still have to read the whole title every time because it’s a bit of a mouthful. I had the pleasure to chair this expert committee that developed these guidelines; they were presented to the Steering Committee on Media and Information Society, the CDMSI, in December 2023 and have since been adopted. Now, what does the Guidance Note aim to do? The goal is to outline available strategies to address the challenges of dis- and misinformation online, all while complying with the Convention and in particular with the right to freedom of expression, which is enshrined in Article 10. Now, I would like to begin with a few words on the problem. We are all aware of the risks of mis- and disinformation, the risks they pose to democracy: they erode trust in public institutions, they distort public debate and they challenge the credibility of the media. Our citizens now operate in an information space that is fragmented, that is fast-moving, with content flowing through platforms often lacking any editorial oversight. And this sometimes leads to a perception that disinformation is omnipresent and that it exists at alarming levels. However, the expert committee was very clear, and it highlighted very specifically, that empirical data on the actual reach and impact of disinformation is still limited and the reality is often more nuanced. We do have clear evidence in certain areas, and I’m sure we will hear about it today, where disinformation is prevalent and being weaponised: Russian interference in the context of the full-scale invasion of Ukraine, misinformation with regard to COVID-19.
But the complexity of the situation means that people’s fear of misinformation often generates as much polarisation and distrust as the problematic content itself. Emerging technologies, especially generative AI, only exacerbate this problem, as they make false content more scalable and convincing. And this further blurs the line between legitimate, misleading and deliberately false content. The expert committee believes that countering this complexity demands policies that go beyond just identifying and removing bad actors or problematic content. We need systemic solutions that strengthen the resilience and integrity of the overall information ecosystem. And the Guidance Note tries to offer some recommendations, particularly in three areas, and I will just highlight some of the most important ones for you. The first area that we’re looking into is fact-checking. The expert committee highlights that fact-checking must be recognized as a key practice for information integrity, regardless of whether it is integrated into the journalistic process, so before the content is published, or whether it is executed in an independent professional capacity after the information has been made available. And the Guidance Note recommends that member states should create and support conditions for financial independence, transparent governance, and public service orientation of fact-checking organizations. Platforms should actively cooperate with fact-checking organizations to debunk and contextualize dis- and misinformation. And fact-checking operations themselves must operate free from state or commercial influence, maintain clear and high standards, as well as transparency in methods and funding. Fact-checkers serve as guardians of information integrity, both before and after its dissemination, and as such they serve not just to correct falsehoods, but to reinforce a culture of accuracy and trust in our systems.
The second element we were looking at is platform design solutions. Platform architecture and design play a vital role. The Guidance Note insists that platforms must adopt human rights by design and safety by design principles. This means they have to conduct and publish human rights impact assessments for new features and policies. They need to design systems that take into account the risk profile of specific content or audiences and adapt accordingly. They need to make moderation practices transparent and open to appeal. But importantly, the Guidance Note also focuses on recommendations for member states, especially when they consider regulatory frameworks. They should focus on process-based regulation rather than targeting individual content. They should apply proportionate regulation tailored to a platform’s size and risk profile. And they should treat content removal as the last possible resort, never the default. Instead, they should be aware that platforms can also use other, more friction-based mechanisms to reduce the reach and impact of content. The third pillar is user empowerment, and it is a concept that is often cited but very rarely put in place. The Guidance Note highlights that users need tools to be able to control what content is shown and recommended. They need tools that allow them to verify sources and that allow them to seek redress when they feel that their human rights have been limited by platform decisions. Currently, these tools are very sparse, their implementation is opaque, and often they depend on the goodwill of platform providers. The Guidance Note also emphasizes the need for comprehensive digital literacy efforts, again something that is very often cited. But, and this is important, these efforts need to be available for all age groups. Only this allows us to build critical thinking and resilience against dis- and misinformation across society. And perhaps most importantly, we must strengthen the foundations.
Invest in reliable independent journalism and build healthy media ecosystems. It is something that is easy to forget when being confronted with disinformation from the outside. But the most important task is not a new one. It’s something that has been at the forefront of the Council of Europe for years. We need to create the structural conditions to ensure that there is a steady and reliable supply of quality information by recognized trustworthy sources. Quality information is the most effective long-term antidote for disinformation. So what must happen now? Member states should integrate these recommendations into their national frameworks with consistent alignment to human rights obligations. Platforms must make meaningful steps to reform system design, not just rely on post-facto moderation. And policymakers and researchers should collaborate to evaluate the impact of these measures and adjust to new technological threats. As I mentioned before, we still need a lot of research and evidence in this area. Thank you.
Alina Tatarenko: Thank you very much, Andrin. Yes, just to summarize quickly, the three main parts of the Council of Europe guidance are about developing and emphasizing media literacy, about fact-checking, and about platform design solutions which are addressed directly to the platforms to ensure that safety is incorporated from the initial design into every algorithm. How do they comply with that? Maybe our regulators can let us know, and that’s why I want to give the floor now to Valentyn Koval, who is the First Deputy Chair of the National Council of Television and Radio Broadcasting of Ukraine. Please, Valentyn. Thanks so much. Hi to everybody.
Valentyn Koval: Democracies are, by their very nature, open societies; wary of censorship and bound by bureaucratic inertia, they are fertile ground for disinformation. While long-standing democracies in Europe rely on independent and fair traditional media, Ukraine lacks this institutional media heritage, and its journey towards a stable democratic media environment is still in progress. Ironically, Ukraine has been criticized for taking undemocratic steps, such as banning Russian and pro-Russian television channels. Yet these moves were essential for defending media pluralism. It is precisely the diverse media voices in Ukraine that twice helped resist Russian attempts to assert political control, leading eventually to Russia’s full-scale invasion as a last-ditch effort. A core challenge of countering disinformation is that most responses are reactive. First comes the fake news, and only later the fact-checking. When fake content is spread through unregulated spaces like so-called social media, it often reaches wide audiences before being addressed. Worse, attempts to debunk falsehoods can sometimes amplify them, especially when the debunking lacks credibility. Ukraine’s response includes significant legal and institutional changes. In March 2023, a new law gave the National Council of Television and Radio Broadcasting of Ukraine broad powers to regulate not only traditional broadcasters, but also online and print media, DVB network operators, and content platforms within the Ukrainian jurisdiction. However, the law lacks the enforcement strength found in EU regulations like the DSA, and Ukraine’s relatively small market limits its influence over global platforms. To address these challenges, the National Council focused on media literacy for media companies. Since disinformation spread by professional media outlets seems more credible, efforts have centered on improving journalistic standards.
In partnership with organizations like the Pylyp Orlyk Institute for Democracy, Internews, and Deutsche Welle Akademie, the Council conducted studies and trainings to identify and address vulnerabilities in Ukrainian media. Additionally, the Council produces weekly programming in English to debunk Russian disinformation narratives. This way, we help our colleagues abroad to understand more in depth how Russian propaganda works. So-called social networks pose a particular threat. Their core design incentivizes broad engagement, often from less critical users. Without reliable verification mechanisms, these platforms become breeding grounds for disinformation. Even worse, their moderation policies, driven by global community guidelines, often restrict war-related content under the guise of neutrality. This includes suppressing or deleting the documentation of war crimes and failing to block fake accounts or bot activities that push Russian narratives. One key strategy is preemption: flooding the information space with verified truthful content. The amount of information consumed by individuals is limited, not because of a lack of sources, but due to time limits and limited attention spans. This space is contested between professional media and unmoderated platforms. A 2024 study showed that social media platforms are structurally incapable of supporting truthful narratives during crises. These platforms often suppress war content, citing global rules, and fail to act against bots and AI-generated harmful content. The study, titled Guide for Risk Management in the Context of Emergencies, Armed Conflicts, and Crisis, was conducted by International Media Support and Internews Ukraine in partnership with UNESCO and with support from Japan, in deep cooperation with National Council members and staff. It analyzes the risk of spreading truthful content during conflicts and proposes recommendations for reducing platform-related threats.
The study is available here, it’s https.cat.us. And I have some copies of the printed materials, so for those who like to have paper materials, you can take one later and work with it.
Alina Tatarenko: Thank you. Thank you very much, Valentyn. And of course, countering the threats of Russian propaganda online is not just a problem for Ukraine, it is a problem everywhere in Europe and in the world. One country which is also struggling a lot with Russian propaganda is Moldova. And we have here a representative of the Moldovan regulator, Aneta Gonta, who is a Deputy Chair of the Audiovisual Council of the Republic of Moldova, and she is also a member of the Council of Europe Committee on Media and Information Society. Please, Aneta. Thank you very much.
Aneta Gonta: Good morning, everyone, and thank you for this opportunity to be here today and to try to speak about this very complex topic, the subject of disinformation and harmful propaganda. Not just any disinformation, but the Russian kind: Russian disinformation and harmful propaganda, which are probably the best in the world, and therefore the most difficult to combat. The Republic of Moldova held presidential elections and a referendum on EU accession in the fall of 2024. Moldova also has parliamentary elections on September 28 this year, which are very important for the European path introduced in the country’s constitution. In both cases, the electoral process has to take place under conditions of deep interference by the Russian Federation, against which my country does not have proportionate resources to fight. In April 2025, which means a couple of weeks ago, journalistic investigations revealed two extremely powerful Russian harmful propaganda networks named Matryoshka and Pravda, whose main target is the Republic of Moldova and its president, Maia Sandu. In the last two years, Russia has already invested more than 200 million euros, or more than 1% of Moldova’s GDP, in online disinformation and harmful propaganda campaigns to undermine trust in state institutions, hijack the European course and change the Kishinev government to a pro-Russian one. According to the disinformation observatory, Moldova is at this moment the most targeted country in the region by these campaigns, at more than 50 times the average level of harmful propaganda in Western Europe. The tentacles of Russian harmful propaganda and disinformation campaigns are many and varied: from influencers, for example, paid to comment on or to promote Kremlin narratives, to priests who are offered organized visits to Jerusalem and expressly asked to deliver pro-Russian votes this year, to teachers,
who are more recently invited to visit Moscow by the NGO of a Moldovan oligarch internationally sanctioned in recent years. And last but not least, the Moldovan vloggers who distribute and amplify the Kremlin’s narratives and who, curiously enough, ardently supported the previously unknown Călin Georgescu’s candidacy in Romania’s presidential elections last fall. Under these conditions, the Moldovan authorities, together with experts from civil society and the media, are now increasingly discussing the need to establish rules for online activity that would ensure fair conditions for all voices, but at the same time diminish the momentum of those who do not aim to inform society in a pluralistic manner but work in favor of foreign interests and against the national security of the Republic of Moldova. The biggest challenge in this context is, of course, to ensure freedom of expression under the conditions of Article 10 of the European Convention and for our state to demonstrate that it does not introduce censorship, as is already being claimed in Moldova by the pro-Russian camp. We believe, however, that it is important to emphasize the usefulness of existing European documents, including those of the Council of Europe which, for example, in Resolution 2567 from 2024 on propaganda and freedom of information in Europe, points out exactly what is happening now in Moldova, what happened in Romania, and how important legal, proportionate and necessary measures are in a democratic society to maintain this democracy. Moldova is now in the process of revising and aligning its legislation with the European one. But in the meantime, battles are being fought in which legislation, rules, values and standards, but also freedom of expression, are being packaged in a populist, reductionist and generalist language, which only a resilient and media-educated society can deal with.
Until this extremely important, but long-term investments really bear fruit, rules, even at the risk of being seen as restrictions, must be put in place now.
Alina Tatarenko: Thank you very much. Thank you very much, Aneta. So we have heard from the regulators from Ukraine and Moldova. Our division also has a big project working with the Western Balkans, specifically supporting Western Balkan regulators in regulating harmful content online, and to speak about it we invited today Amela Odobašić, who is the Head of Broadcasting at the Communication Regulatory Agency of Bosnia and Herzegovina. Please, Amela.
Amela Odobašić: Thank you very much, Alina, and good morning, still good morning from me. I will speak on behalf of the Communications Regulatory Agency, which is a converged regulatory authority in Bosnia and Herzegovina, and as Alina said, I will also touch upon the practices and experiences, and mainly the challenges, that we, the regulatory authorities in the Western Balkans region, really face when it comes to tackling both disinformation and harmful content online. But before I make my introduction, I really feel that I should mention that, having heard our colleagues from Ukraine and Moldova, and as you all know, Bosnia and Herzegovina went through the war 30 years ago, my first impression was just like, okay, in a way, if you can be thankful for something, it is this: thank God there were no online media and no social media then. And we really understand the hardship that our colleagues are experiencing, both as regulatory authorities and as the public, the general public, especially of Ukraine. So, as you all know, the countries of the Western Balkans are enlargement countries. And what does that mean? It means that as part of the accession process towards the EU, which is not an easy ride for the countries of the Western Balkans, and for the regulators a very challenging ride, we are obliged to transpose the EU legislative framework into our national legislation. These processes are not without risks, given the very complex political, social and economic instability and fragile institutions in the region. However, we, the regulator in Bosnia and Herzegovina, as well as our colleagues in the Western Balkans, are not sitting silent and just observing what is happening.
Our biggest task, really, when we started tackling the topic of disinformation and harmful content online, was first of all to define: okay, so what is what? What falls under the regulatory competencies and what does not? So, naturally, being the regulator to whom the public files complaints, we started to receive so many complaints that were basically complaints about disinformation and not about harmful content. And as such, of course, we do not have jurisdiction, we do not have a mandate, to deal with disinformation. But does it mean that the regulatory authorities are just sitting back and observing all these really very disturbing events happening? No, we are not. For example, in Bosnia and Herzegovina, because we do have our department for media and information literacy, we are tackling disinformation through the activities that fall under the umbrella of media and information literacy. However, when it comes to harmful content online, the first initial reaction of regulatory authorities in the Western Balkans, in Bosnia and Herzegovina as well, is: okay, do we really have competencies for online content, full stop? Because, you know, if you look at the legal framework in Bosnia and Herzegovina, the law on communications dates back to 2001. So how could you possibly have competencies when it comes to online content altogether? But then, on the other hand, there is the EU legislation. And again, as a candidate country, of course, there is a set of directives, there is the Audiovisual Media Services Directive, that we are obliged to transpose into our legal framework.
I will not go into the details now, I will focus more on the challenges, but I hope, Alina, that we will come back to this issue, because the inaction of policy makers in ensuring that legal frameworks are in place is really one of the biggest challenges in Bosnia and Herzegovina, as well as in some other countries, when it comes to tackling harmful online content. And then, along the way, we discovered that because we did transpose the Audiovisual Media Services Directive into our bylaws, the rule on video-sharing platforms has been adopted. All of a sudden we realized: yes, we have a law that dates back to 2001, but we also have bylaws, and we have the rule on video-sharing platforms, which actually makes us competent, to a certain extent, for online media. I can go back to that in more detail, but speaking of challenges for the regulatory authorities in the Western Balkans, and in light of those challenges, we are very grateful to the Council of Europe for the project that is helping us strengthen our capacities so that we can act as hands-on regulatory authorities and respond to all these challenges; when it comes to harmful online content, this is hugely important to us. But apart from this challenge, which concerns the lack of a legal framework or the ineptness of our policy makers, there is also the question of how small countries, such as those in the Western Balkans, establish collaboration or communication with big online platforms. That is a challenge to which we are still trying to find an answer, or a way out. Again, through the Council of Europe project, there may be a way out, but it may be a small step: strengthening regional collaboration and then making some progress in that area.
And also, we mustn’t forget that the regulators in the Western Balkans, as well as some regulators in the European Union, face problems such as limited internal capacities and resources, both financial and human, needed to expand their capacities to respond adequately to all these events. I’ll stop here and I hope to go back into more detail later on.
Alina Tatarenko: Thank you very much, Amela. Yes, so, lack of legal framework is a common problem for many countries. The problem of the gap between countries which are covered in the EU by the DSA and non-EU member states is an issue that you mentioned. And, of course, lack of resources and capacity also is common for so many countries, for the Council of Europe member states. We are also working, not just with the Council of Europe member states, we are also working and trying to support the journalists in exile. And specifically, we have a project with the Belarusian journalists in exile. And I’m very happy to see that today we have in our panel Alina Koushyk, who is the director of Belsat TV, which is a Belarusian media outlet in exile. And they have a specific, very interesting and very particular set of issues that they would like to share with us today. Please, Alina.
Alina Koushyk: Thank you very much, Alina. Good morning, ladies and gentlemen. Good morning, dear colleagues. Let me greet you in the Belarusian language as well, because the issue of the Belarusian language is one of the most urgent, but let me come back to this a little later. Thank you for inviting me and making the voice of free Belarusian media heard here in the Council of Europe. Belarus is not a member of the Council of Europe, it is not a member of the EU, but I will tell you how we operate in the EU and how we can still influence the situation in my country. Belarus is a country of almost 9 million people, de facto still independent, which stands as a buffer between aggressive Russia and European democracies. The territory of Belarus was also used to start the war in Ukraine, and we, as free Belarusians, of course do not support these actions. And the role of independent media for Belarusian society is incredible. Today, up to 88% of Belarusian independent media outlets are closed in Belarus. Some of them are continuing their work in exile. We have around 45 titles which are covering Belarus from exile. And these independent Belarusian media, including Belsat, are covering one third of the Belarusian population, which is quite a lot, taking into consideration the really difficult conditions in which we operate. We also face a really difficult situation because of the USAID and American turbulences, let’s say, that caused budget cuts of up to 60% in our media sector, which is really a lot, and which may cause a lot of losses in our media ecosystem. Moreover, Belarus was the most dangerous country in Europe for journalists until Russia’s full-scale invasion of Ukraine, according to Reporters Without Borders. Now, 30 Belarusian media workers are imprisoned, just for doing their job, just for telling the truth.
I believe that here in our room we have more than 40 people, but can you imagine that half of you would be imprisoned in my country at the moment? Eleven of my colleagues, just from Belsat, are behind bars right now. They received from two and a half up to eight years, just for doing their job. My name is Alina Koushyk and I am editor-in-chief of Belsat TV, the only independent Belarusian satellite television, the only television which speaks every day in the Belarusian language. We have been broadcasting from Warsaw for 17 years, every day in our national Belarusian language. And 17 years ago I had the honor of presenting the first news service on Belsat. I was a presenter, and now I am heading the entire channel. But independent Belarusian television is not about stars, it is not about glamour, it is not about headlines. On a daily basis it is very often about fear, resistance and extraordinary care and courage. Today, we, the independent Belarusian media in exile, both Belsat and other colleagues, are facing double threats: authoritarian repression and algorithmic suppression. Can you imagine that in my country people can receive up to five or seven years in prison just for commenting, sharing or liking our content, Belsat content? Over 500 Belarusian sites were labeled as extremist. I myself am a triple extremist in Belarus: first, for being a member of the United Transitional Cabinet, then for being a journalist of Belsat, and the third time as myself. 1,400 Belarusian websites were blocked in the country. That is why we are using mirrors and other platforms, trying to reach the audience with free information. Despite all these difficulties, over 75% of our audience is still in Belarus. Silently, anonymously, without subscribing, without commenting, they are just watching free information. One woman told me a story of how she watches Belsat.
She goes to the bathroom, closes the door, watches the news, then erases the story, opens the door and goes back to her family. Most Belarusians consume independent media now after 8 p.m. Why? Because it is too dangerous during the day. It is dangerous to open the site of Belsat or other media outlets at your job or even on public transport; if you are scrolling the news, people are afraid that somebody can see that they are reading Belsat. Moreover, in your workplace or even in the street, the militia, the police, can check your phone and see if you have any subscriptions to any so-called extremist media outlets. One of these is Belsat. And of course, if they find one, you can be in trouble. Moreover, we know of cases where they take phones and make likes from your phone on Belsat or another Belarusian media outlet and say: you see, you have likes here. And you cannot prove that they just did it to make you appear guilty of using media in exile. That is why, of course, the audience of Belarusian media is a little lower than it was before. But still, we are covering one third of the country, and I believe that in these extremely difficult conditions this is quite big. But what is important for us? We operate mainly through the social media platforms already mentioned here. For video content, YouTube is the main platform for longer videos, and TikTok is the main platform for short videos. As Belsat, we operate 10 YouTube channels with around 1 million subscribers at the moment. This is a huge number; please remember that Belarus has fewer than 9 million people. But many of the people who are watching us cannot subscribe, cannot comment, cannot share, just because of fear. And that makes it very difficult to build communities around our media.
Moreover, while watching Belsat, sometimes interesting advertisements appear. For example, confession videos. What is a confession video? It sounds innocent, but what is it? We have more than 1,500 political prisoners, who are tortured every day behind bars. So they make people say: sorry, I was wrong, I was supporting Tikhanovskaya, that was a mistake, Lukashenko is the only good president. These so-called confession videos are made under enormous violence behind bars. And these videos appear on YouTube as advertisements. So the regime is paying a big platform for this kind of content. Of course, when we see such content, we knock on YouTube’s door and say: please block it. Sometimes they block it, sometimes not, but you can only deal with it case by case, by hand. Moreover, there is an important issue, which matters very much to me personally: Belarusian-language content is de-prioritized by the algorithms of social media, especially on YouTube, even on the largest channels such as Belsat. Why? For example, shorts on our biggest channel, where we have almost half a million subscribers, have fewer than 1,000 viewers. Why? Because the algorithms are not supporting the Belarusian language. If you do shorts in Russian, you will easily get a million views. But if you do them in Belarusian, you will get fewer than 100. What can we do? For us, it is extremely important to keep broadcasting in the Belarusian language, to talk to our people in our own language. But they are pushing us to make content in Russian. That is why some clever Belarusian media are trying to run separate channels doing the same job but in Russian. But why should we choose the Russian language? We do not want to. That is why I really want to call on digital platforms to stop penalizing the Belarusian language and to make their algorithms help us provide truly free information according to professional journalistic standards.
Because this is for us the main basis for an authentic, democratic and resilient media space in our country and, moreover, in our region. It is also critical to emphasize that quality, independent journalism, as Andrin actually said, is the foundation of any strategy against disinformation. Without well-supported media, we cannot counter disinformation, we cannot counter propaganda. That is why the Digital Services Act and the European Media Freedom Act matter not only for you but also for exiled media and European democratic countries. Belarusian media in exile are registered in European countries and operate according to European laws. That is why, for us, it is absolutely important to be visible in these acts and these processes. And the last point. I am a godmother of a Belarusian political prisoner, Ihar Alinevich. He is a Belarusian anarchist and also an author of books. He received 20 years in prison for standing up against the dictatorship. And he once said: whoever is silent is defeated. So do not let Lukashenko and Putin silence the free voices of Belarusians. And let me finish with: glory to Ukraine, and long live Belarus.
Alina Tatarenko: Thank you. Thank you very much, Alina. We will try to discuss that later in more detail, because it is really important to stop de-prioritizing the Belarusian language and to do something about it. And platforms do have a lot of power to make that happen. Next, we continue with Julie Posetti, who is a professor at City, University of London and who has conducted very interesting research. She has a very well-informed opinion on what we can do and how we can work with the platforms. Thank you.
Julie Posetti: So I co-led a study for UNESCO and the ITU called Balancing Act: Countering Digital Disinformation While Respecting Freedom of Expression, which was published in 2020 at the height of the pandemic. It recommended a range of regulatory, legislative and normative actions, some of which parallel what we are discussing here today. But in 2025, the threat to information integrity is so much worse than it was in 2020, and the need for proactive rather than reactive, and creative, responses to disinformation is even more urgent. The broligarchy, as the obscenely wealthy tech bros in power are now more frequently collectively referred to, is choking democracy. Journalists, fact-checkers, and other public interest information providers, whom I think of as the cleaners in the toxic information ecosystem and a protective force for democracy, human rights, and the rule of law, are retreating from visibility on big tech platforms to avoid abuse, harassment, and threats. Meanwhile, citizens increasingly don’t know what to believe, nor whom to trust, with devastating consequences for truth, facts, and a shared reality, as Nobel laureate Maria Ressa forewarned. I wish more audience members, more citizens, would behave like your Belarusians in the bathroom after 8 p.m. But unfortunately, they do not. I also led a global study for UNESCO on online violence against women journalists called The Chilling, and I just want to highlight three statistics from that research which continue to be meaningful and resonant. 73% of the roughly 1,000 women journalists we surveyed said they had experienced online violence in the course of their work. 37% of those said that political actors were the main perpetrators of online violence. And 41% said they were targeted in what they believed to be coordinated disinformation campaigns.
According to our research, political actors, disinformation purveyors, and networked misogynists are the primary perpetrators of online violence against journalists and other public information producers, among them activists and human rights defenders. Women and minorities are both the most at risk and the most prolifically targeted in these online violence campaigns, and as I have said, the campaigns are often fueled by disinformation and hate speech. These attacks tend to be prioritized algorithmically due to high levels of engagement, in the same way that angry and divisive speech is prioritized. And the reason for that is ultimately profit. The attacks are designed to undercut trust in truth, facts, and fact-based analysis, imperiling democracy, the rule of law and human rights norms in parallel. They are also designed to expose their targets to greater risk. And it is important to note, I think, that impunity for online violence aids and abets impunity for crimes against journalists and other human rights defenders. Also notable: big tech actors are the vectors or facilitators of these attacks, and in some cases big tech oligarchs have also proven themselves to be perpetrators. I now lead a project for ICFJ called Disarming Disinformation, which is studying counter-disinformation work in five countries in the context of democratic backsliding; we are studying both editorial and audience responses to this problem. In parallel, I lead a project, funded primarily by the UK government, which is developing an AI-assisted online violence early warning system designed to monitor online threats in real time and to help predict the escalation of online violence into offline harm, with a view to ultimately preventing crimes against journalists and human rights defenders.
The system is also designed to help news organisations, legal teams and civil society organisations document the attacks, to help them hold big tech actors and other perpetrators accountable. It is a human-rights-by-design approach to responding to a critical threat, initially to the safety of journalists. But I really wish it had not been necessary to do this work, because the reason this effort has been outsourced to us is that the tech oligarchs failed to make safe products, in the interests of maximising already obscene profits, and then the US failed to effectively regulate them, with devastating consequences not just for democracy and genuine freedom of expression in the US, but globally, especially in places like Myanmar and Ukraine. And now, in a climate of American free speech absolutism, where freedom of expression rights have been rebranded as censorship, we see these terms being weaponised against journalists and human rights defenders in order to silence them and also endanger them. These big tech actors have been emboldened in this context to abandon and roll back their already limited trust and safety systems. In parallel, the Trump administration has cracked down on counter-disinformation work while defunding so very many international programs that support public interest media, and Meta has abandoned fact-checking in the US initially, but with a global cancellation foreshadowed. The result is that online violence perpetrators and disinformation purveyors can now act with impunity, while the risk management is increasingly outsourced to news organizations and civil society. But we struggle simply to access the data we need to effectively monitor and respond to these threats.
And the threats are only escalating in the context of generative AI tools, as we have heard, which supercharge the speed of production and distribution of abusive and disinformational content, as well as hate speech, all of which is more believable as a result of these tools. Meanwhile, we risk legal action from big tech actors if we try to work around the obfuscation to access the necessary publicly available data, which is at the source of attacks on journalists, fact-checkers and human rights defenders. And if we can get access, it is incredibly expensive to fund it. That is a really important point to highlight. So the sustainability and security of journalism and democracy are intertwined, and both are dependent on the integrity of information, which is under unprecedented attack. Based on all of the research we have done over the past decade or more, we have concluded that the time for self-regulation has passed as it applies to big tech actors. It would be naive as well, in our view, to assume that the tech oligarchs will meaningfully participate in co-regulation. So we are calling on Europe to hold the line against the broligarchy. We need European legislators and regulators to double down on efforts to make big tech responsible, accountable, and transparent. And that needs to happen through legal obligation, litigation, and punitive action. This approach needs to be collaborative, creative and proactive rather than reactive, while of course respecting global standards and international human rights law with regard to freedom of expression. I’ll leave it there.
Alina Tatarenko: Wow. Thank you, Julie. So, you are saying that the platforms have failed so far to self-regulate and that we need collective action to counter the disinformation threat, and for that we need legislators, governments, regulators, and efforts from international organizations. I would like to take advantage of the presence of my colleague here, who is the head of the division for the execution of judgments of the European Court of Human Rights; maybe he can comment on this from the judicial perspective and mention what the European Court’s case law says about it and what can be done from the judicial point of view. We know of cases where judges have ruled on those issues.
Pavlo Pushkar: Thank you. Thank you so much, Alina. Thank you so much for the invitation to be here and to speak about disinformation and propaganda. I will be rather brief, as the Court’s case law is also rather brief on the subject of disinformation and propaganda. In my work at the Department for the Execution of Judgments, we also have cases which largely relate to instances of, let’s put it this way, disguised propaganda that is not accepted as a valid reason for interference with freedom of expression. This relates to a number of different instances of disguised propaganda, such as terrorist-related or separatist propaganda in favor of prohibited organizations, or the so-called propaganda of homosexuality, where states use these reasons to counteract expression in a disproportionate manner or not lawfully. But most importantly, there are several cases to which I was going to refer. More generally, under the Convention on Human Rights itself, as interpreted by the Strasbourg Court in its case law, states have a wide margin of appreciation in matters of combating disinformation and propaganda, based on the legitimate and well-justified needs to protect the interests of national security, territorial integrity and public safety, to prevent disorder or crime, and to protect health and morals, as happened in several cases relating to COVID as well. However, interference with propaganda and disinformation disguised as free expression has to be based on relevant and sufficient reasons and has to pursue legitimate aims, which need to be invoked by the authorities as well. At the other end of the spectrum of these discussions on disinformation and propaganda, we see valid thoughts and strong arguments which firmly suggest that propaganda and disinformation are not views or value judgments.
They do not constitute forms of protected expression covered by the requirements of freedom of expression under the European Convention on Human Rights. And indeed, while combating disinformation and propaganda is a valid objective, we might find, as I mentioned, instances in the case law of the Court where attempts to qualify certain acts as propaganda have no reasonable justification, lack a lawful basis, or are disproportionate, and this relates to the instances I have referred to. This leads us to discussions, also under the Convention and the Court’s case law, as to the proportionality of sanctions imposed in certain cases for spreading disinformation and propaganda, which is still seen in some instances as a form of expression covered by the protection of Article 10 of the Convention. But then again, the main idea mentioned continuously in the case law of the Court is that the aim of interference and sanctions should not totally discourage open debate on matters of public concern; rather, the aim is to take robust measures to protect freedom of expression and the public discussion space, whether offline or online, from the harmful influences of disinformation and propaganda. Two cases which I think would be rather interesting for you to look into: the first is the, to a certain extent, interesting case of Kirkorov v. Lithuania, about prohibiting the musician from entering the territory of Lithuania because he was involved in spreading Russian disinformation and propaganda. And similarly, the harmful and tragic effects of propaganda are recognized in another case concerning Ukraine, discussed in the recent judgment of the Court in Vyacheslavova and Others v. Ukraine, concerning the tragic events at Kulikovo Pole in Odessa, which led to a number of deaths.
So this is, quite briefly, what I wanted to say from the point of view of the case law of the Court. As regards the enforcement of the Court’s judgments, fortunately we have not had judgments of that kind, because the Court takes a rather robust stance on the issues of disinformation and propaganda, and these instances do not come to the attention of the Committee of Ministers for exactly that reason.
Alina Tatarenko: Thank you very much, Pavlo, very interesting. So we see really very good representation here from different actors. I would like to open the floor. We do not have much time left, but we can go a little over time. Do we have questions from online? And then we will take some. Shall we take all the questions and comments first, and then answer, right?
Moderator: This way we will try to give the floor to as many people as possible. The first question is from Siva Subraminna: in the process of dealing with harmful content, some measures had to be taken to a seemingly excessive degree in times of war. Are these measures designed to be reversible at a later date, to restore liberties, or are they set in stone?
Alina Tatarenko: Okay, can we go on? Can you maybe read all of the questions? Okay, does anyone want to address the harmful content question? Yes, okay. So while you are thinking about how to address this question, we will go on and take more questions, please.
Mykyta Poturaiev: Okay, so it won’t be a question so much as a remark. My name is Mykyta Poturaiev, I am the head of the Ukrainian Parliament Committee for Humanitarian and Information Policy. So, colleagues, do we understand where it is all happening? I think yes: on social platforms. Do we have any protection from what is happening there? No. Does the DSA work for now? No. Does the EMFA work for now? No. Will they work for non-member states? No. So do social platforms care about the DSA and the EMFA? No, because they are not in the jurisdiction of the European Union, and they will not care. Does any one of us have the possibility to protect our good names if our reputations are attacked on social platforms? The answer is no, in no court. We have nowhere to go to protect our names. Okay, so can we protect our children from bullying on social platforms? No. Well, the answer for Ukraine is 100% no; Valentin will confirm. Can we protect women from hate speech? No. Can we protect sexual minorities from hate speech? No. I do not know about all European countries, but I know that for Ukraine, no. In traditional media, yes. I am one of the key authors of the new Ukrainian law on media, so yes, we regulated everything for traditional media. We really have good articles protecting women, children, sexual minorities, all groups. Does it work for social platforms? No. Will it work for social platforms? No. Okay, media literacy. It is a good idea in itself, and I know a couple of countries which are champions, like the Nordic and Baltic countries. When did they start? The answer is 15 to 20 years ago. Do we have this time in other countries which did not care about it? The answer is no, we do not have this time. Do we know what to do? No. Is fact-checking, which is of course also very important, working to protect us? The answer is no. Do you know why?
Because according to all sociological surveys, Ukrainian and European, ordinary people do not care about fact-checking. They either trust some media or they do not. But unfortunately, the inconvenient truth is that they mostly trust anonymous and other accounts on social platforms, not traditional media. Traditional media are losing their audiences in every country: in Ukraine, in every European country, everywhere. Okay, so do we have answers at the political or governmental level? I am not sure. I am in communication with my colleagues from the European Parliament; I am also a vice-president of the OSCE Parliamentary Assembly, so I am in communication. We do not have an answer. Why? Because we are all afraid. Because if we are going to make social platforms accountable, what will happen in Washington? Maybe someone will wake up and write on Twitter: hey, these people in Europe are against freedom of speech, so I will apply 50% tariffs against them, 100% tariffs against them. What will our governments do in such a moment? What is more valuable, taxes or what? Freedom of expression, freedom of speech, freedom of media, accountability? Well, let’s calculate. And let me finish with the results. And the results are the following. We do not know; maybe an ultra-right candidate will win the Romanian elections, and maybe it will help an ultra-right candidate win the presidential election in Poland, and maybe it will help a pro-Russian revanche in Moldova, and put the ultra-right in first place in Germany, and the ultra-right in first place in France. So, as long as we keep discussing, as long as we have no answers and no practical decisions, we are losing this battle, and we are very, very close to losing this war, the information war. And then, well, let’s answer my question: what are we going to do in this new, horrible world? Thank you.
Alina Tatarenko: Thank you very much. It is, of course, a big question and a very optimistic intervention. Thank you. You have immediately sparked a couple of reactions online. I think we will give the floor very briefly to Ofcom from the UK, then RNW Media, and then there is a participant at the back. Thank you.
Jordan Ogg: Thank you very much for giving me the opportunity to make a short comment. Ofcom is the UK’s independent communications regulator, and reflecting on comments made by earlier discussants about the importance of strengthening the foundations of the information ecosystem, and that quality information is the best antidote to some of the challenges we have heard about today, I want to mention that Ofcom is currently conducting a review into public service media and public service broadcasting in the UK. I raise this partly to share some interim findings: the huge increase in the consumption of news online, while delivering a range of benefits to users, including greater choice and personalization, is also raising huge challenges in relation to how people discover, consume, and are able to judge high-quality and accurate news, such as we believe is provided by public service broadcasters, at least in the UK, but of course also in other parts of the world. We also know that audiences are at much greater risk of exposure to misinformation and disinformation when consuming news online, and within that, we think public service broadcasters have a really important role to play in countering those effects. So I just want to note that we will be publishing policy recommendations aimed at supporting public service broadcasting, which will be available in the summer, and also to put a question to anyone in the room who would like to answer it: how important do you think public service broadcasting can be, and how can it be supported in this context? I’ll stop there. Thank you very much.
Alina Tatarenko: Thank you. Yes, public service broadcasting is absolutely important, and we also have a Council of Europe recommendation on public service broadcasting, which is instrumental in countering disinformation. Please, RNW Media.
Giovana Fleck: Hi, everyone. I hope you can hear me. My name is Giovanna. I represent a Dutch organization called RNW Media, and through RNW Media I represent dozens of journalists and media workers worldwide. I want to make a few remarks based on what was said here, and also to emphasize some of the issues related to disinformation that reach a global scale, beyond European borders, and how that is also relevant for the discussions within the EU. One thing said early on in this discussion was a remark on fragmentation and how the fragmentation of platforms and the internet correlates with disinformation being weaponized. We have seen the results of that in the form of information disorder on a global scale, especially at the height of the COVID pandemic, but we also continue to see it in transnational narratives and key issues related to delegitimizing democracy and human rights worldwide. And this is not exclusive to one specific country. These are global effects that take place inside information ecosystems shaped to elevate harms instead of positive or trustworthy information. If we are thinking about protecting journalistic voices, and if we are thinking about using journalism as a base to counter these issues, we also need to think about the sustainability of journalism in our time. And as one of the colleagues here said, that costs a lot of money, resources, and time. Most of all, we need to be aware of specific trends targeting journalists. It was also said here that female and minority journalists in particular are constantly under attack. And that is absolutely true: the number of online attacks on female journalists is disproportionate in comparison to other colleagues. And a lot of that also happens because their jurisdictions, also in Europe, do not protect those claims.
Women who are attacked and doxxed online while doing their jobs as journalists go to the local police to seek help, and they are often not helped. That also relates to SLAPPs, lawsuits that specifically target journalists and try to limit their efforts and resources when reporting. And it also relates to coordinated inauthentic harm as well. So, to conclude my intervention, but also to call attention to this myriad of difficulties in the ecosystem, I think we also need to think about a question of agency for journalism, allowing journalists to put their work on a sustainable footing and aim at the future in general, but also a question of agency for civil society, to take them as participants in this information ecosystem, not only as a part of society that is reactive to everything that is happening. That is related to literacy, and to ways of building a healthier information ecosystem, and we cannot shy away from those initiatives either. Thank you very much.
Alina Tatarenko: Thank you very much. Agency for journalism and civil society. Of course, we all agree with that. Please, there was a question there at the back.
Marilia Maciel: Thank you very much. Good morning, everyone. My name is Marilia Maciel. I am Director of Digital Trade and Economic Security at Diplo Foundation, but I speak in my personal capacity. I would like to focus a little bit more on a subset of the problem we are discussing here, which is a growing disinformation industry motivated by financial gain. It has become global, and this industry is skillfully taking advantage of platforms' business models, but I think it is a problem of its own. With the support of GIZ, we have conducted research on misinformation, trying to identify lessons learned in a number of countries, and in this exercise we also came across researchers and investigative journalists who have identified companies, for example, based in Spain that were selling disinformation services to Latin America, or companies based in the UK selling services to South Africa, and they were very clearly promising to change election results, postpone elections, cause confusion, and so on. So, looking beyond platforms, which is a very important aspect but not the only one: how do you think it is possible to cut the financial resources of the disinformation industry? Do you think there is space for international cooperation, perhaps cooperation with law enforcement? And in your view, is this something on which we could collaborate with platforms, to the extent that there seems to be a threshold beyond which information disorder is also detrimental to platforms themselves, as the example of Parler shows? Thank you.
Alina Tatarenko: Thank you very much. Very interesting question. Can law enforcement agencies be used to cut the resources of the disinformation industry? We'll take maybe one more. There was one more here, one more there. Okay, and in the meantime, maybe the panelists can think about answering the questions, please.
Luljeta Aliu: Hello. My name is Luljeta Aliu. I'm a member of the Independent Media Commission in Kosovo. Thank you for having me here. I wanted to thank Ms. Amela Odobašić for her presentation. She pointed out a lot of challenges that we ourselves are facing right now in Kosovo. Just a few days ago, a new law on the Independent Media Commission was repealed by the Constitutional Court; it had been submitted to the Constitutional Court by media rights groups and media representatives. You pointed out the challenges, like asking ourselves whether we have the right to regulate and so on. I was wondering, did you also experience the challenge of media rights groups or civil society groups sometimes being used as an instrument to oppose regulation, mostly by calling it politically motivated censorship? This is what we are going through right now, and it is a really difficult position, caught between two fires: wanting to regulate for the people and for the citizens, while on the other hand having the same NGOs sometimes used, or abused, to oppose these regulations. And the other question: do you think there could also be an influence of Russia or something similar? Do you have any cases in this direction?
Alina Tatarenko: Thank you so much for your presentation, it was really good, thank you. Thank you, we will give Amela two minutes to think about it, and there was a question here somewhere on this side.
Oksana Prykhodko: Thank you very much. Oksana Prykhodko, from Ukraine, the international non-governmental organization European Media Platform. American institutions played a very important role in counteracting disinformation. Now the new American administration has closed a lot of projects. I understand that the Council of Europe and the European Union lack the money to replace all such projects. Can we discuss any other asymmetrical responses? Thank you very much. Sorry, one more time, asymmetrical responses to? To the closed American projects.
Alina Tatarenko: Okay, I take note of that. Okay, I’ll take, so if you have time we can stay for a couple more minutes, otherwise, yeah, we can. So if someone needs to go, you can go. Those who want to stay are welcome to stay because it’s very interesting and we can continue. Please, there was another question there.
Giacomo Mazzone: Yes, Giacomo Mazzone, member of EDMO, the European Digital Media Observatory. I have two questions. The first is exactly about the observatory. As EDMO we have put into practice a lot of activities exactly in this field, and Andrew knows that we have worked on this together. We have a successful example in what happened during the European elections last year. That was a process that went quite smoothly, and there was room for cooperation with the platforms, which were another world at that time, one year ago. I think we need to use this model and try to replicate it elsewhere. The second aspect: I remember that the Council of Europe some years ago opened a dialogue with the platforms based on goodwill, let's say. Is there any sign within this platform of dialogue that the changes we are seeing in the US are going to be replicated in Europe as well?
Alina Tatarenko: Okay, thank you very much. Yes, on the dialogue of the Council of Europe with digital platforms: we do have some people here who are involved in this, and maybe we can talk about it later. But in the meantime, we have a reaction from Ips.
Moderator: Yeah, also from myself, as part of the EFD, representing the young voices in Europe on internet governance. I would like to ask a few questions. We often criticize disinformation and propaganda by mentioning that it only comes from Russia or China. But are we sufficiently aware and critical of disinformation that originates from within our own countries or from our closest allies? Isolating ourselves from different narratives undermines our access to information and influences our own decisions, and that is censorship. Or do we want to say that the US and Europe don't generate propaganda and misinformation? If we had voices here from countries where we and our allies have financed foreign interventions, they wouldn't agree. Maybe they would mention how the West has justified multiple military interventions with misleading claims about weapons of mass destruction, or by claiming that the West would bring them democracy, only to leave their countries completely torn apart. How can we ensure that our efforts to combat disinformation remain balanced, credible, and fair by addressing misinformation and propaganda irrespective of its source? Because the narrative I'm listening to here is simply a push for censorship. Shouldn't we be more self-critical? And I also have a question for Alina and Raul. Why do you think a country would allow a foreign entity to finance media on its own territory that is pushing a narrative that threatens its national sovereignty, through USAID, for instance? That is far from independent media. At the same time that you are asking for free information, you are also requesting to ban content that goes against your narratives on YouTube. Isn't that hypocritical?
Alina Tatarenko: Thank you. Thank you. Very interesting. I take note. We'll try to address it later. I was told that we have a request from the ex-representative online. Hakim, can you raise your hand, please? I understood that you wanted to speak, if you are listening online. Does anyone else want to say something or ask a question? I'll take one more, then we'll go back to the panel to address the questions, and then we'll close, I promise, please.
Oleksandr Shevchuk: Oleksandr Shevchuk, Institute of International Relations, Ukraine. What do you see as improved instruments and mechanisms to fight against Russian propaganda in Ukraine, especially historical propaganda, and how do you assess the effectiveness of such actions by the European countries and the Council of Europe?
Alina Tatarenko: Okay, thank you very much. I will probably now ask the panel to try to answer at least some of the questions and comments that were here. Maybe we can start with Amala, because there was one directly addressed to her, and then we’ll go around, and Julie.
Amela Odobašić: Thank you, Alina, and my colleague just came back in time. Okay, so to go back to your question: you asked me whether media organizations or representatives of civil society reacted by opposing the legal solutions that we provided. In a nutshell, no. Okay. Especially in the media industry? No, no, no. I mean, there is a whole story behind how to implement the rule that we have in place, the one on sharing platforms, because we do not have platforms as such registered in Bosnia and Herzegovina. But anyway, we can discuss it outside of this meeting. Then, if you allow me, perhaps I can wrap up and give additional information as a response. When we were preparing for this panel, Julie was brilliant when she said, oh, so you are experimenting with co-regulation. And that is exactly what we, the regulators, are doing. In Bosnia and Herzegovina, we were very lucky that the Council of Europe produced a study for us as the regulator. The topic of the study is the mapping of stakeholders in Bosnia and Herzegovina towards regulating harmful content online, and I think it is the first study of that kind produced in the countries of the Western Balkans region. It is an excellent study and it is available; I can also send it to you. With that study as step one, this is exactly what we are doing: we are experimenting with co-regulation. At least we as the regulator have very much adopted this co-regulatory approach when it comes to harmful content online, which does not mean that we as the regulator should have all the power; we simply can't. It is basically about developing a network or platform of all stakeholders who should have their say, in line with their competencies. It's going to be a very interesting process. 
We have already started talking to the government and bringing that idea to them, because it's their business. Okay, it's a national topic. It's not something that the regulatory authority or some other state institution should do alone; it's the government that should lead the process. So we will let you know how it is going, because it is going to be a process, but I think we are on the right track when it comes to that. And allow me just to take another ten seconds. Look, it's easy for all of us to say no to everything that is happening and conclude that we are somehow unable to respond to whatever is happening in the online world. In that case, we should simply give up our jobs and stop being paid for what we are doing. Yes, the situation is quite serious, but I believe we should all really join together. Whether we are going to beat the platforms or the big tech companies, we do not know, but we can't give up, because the only way is to keep pushing. And for us, developing countries, it's even more challenging. But if we are willing to go that extra mile, then the members of the European Union should really serve as a role model for us. Because, yes, it is true that the DSA and DMA are not going to solve all these problems, such as hate speech in the online space, etc. But yes, they are going to benefit our national regulatory and legal frameworks a lot in combating these phenomena. And I'll stop here. This could be my closing remark as well, Alina, if you allow me.
Alina Tatarenko: Thank you very much. I agree with you. It's basically like crime: it will always exist. There will always be crimes, there will always be problems, but we can do things to mitigate the damage. We can do something to help prevent and reduce it. We will not eliminate it completely; we will not eliminate any problem completely. Whatever we can do, let's do it. And as you know, the Council of Europe provided an opinion on the IMC law, and maybe we can discuss it in more detail later. Julie wanted to address some questions. Thanks. Yeah, I'll just try and pull some threads from a variety of questions.
Julie Posetti: Somebody talked about the emphasis on Russian and Chinese disinformation in this discussion. I think myself and others have observed that the rise of US-led disinformation is a legitimate threat, especially in the context of the weaponization of counter-disinformation work by the US administration, with the defunding of any research programs emphasizing disinformation that involve any kind of state-related funding, for example. So some have argued that the disinformation networks seeded by the US are among the most dangerous we are currently dealing with. I think that is both shocking to contemplate and something that requires deeper investigation and close monitoring. Because if all of our efforts are focused on the obvious geopolitical actors with a known history in this space, then we risk failing to respond to the emerging threats. And it is also true that these are transnational threats. They may have been born in the US, and I am speaking here not about US-generated disinformation or misinformation, but about the function of US-owned platforms and big tech companies. And we have barely touched AI in terms of the industry; we will do that tomorrow and through the rest of the day, and we can come back to it. But they are also monetized. So the economy of disinformation needs to be better understood, and perhaps there are some creative options there in terms of legal action, where there has been a monetization of disinformation. I agree with what colleagues have said about both the sense of despondency, which I share to an extent with our Ukrainian colleague, and an insistence built on optimism, associated with the ongoing belief in defending human rights, the rule of law, and democracy: we cannot give up the fight. 
I know it seems absurd for an Australian-British woman to sound like I'm lecturing Ukrainians who are literally holding the line physically, but collectively, I think it is vital that we maintain the fight with eyes wide open, especially to the information operations that are funded and run by the big tech companies themselves to resist any form of regulation, and also to try to shut down these sorts of critical conversations as though they represent acts of censorship. That goes to the point our young colleague made, which I respectfully but vehemently disagree with. What we are talking about here is not censorship. It is about our role and our responsibility to defend human rights, the rule of law, and democracy. That is why the Council of Europe exists. It is also what the UN seeks to defend in its efforts on freedom of expression, press freedom, and the safety of journalists and human rights defenders, in reference to the Universal Declaration of Human Rights and especially Article 19, which does not give any person on the internet a right to abuse, harass, and threaten people who are trying to exercise their democratically rooted rights to engage publicly in democratic deliberation, driving them out through genuine fear, especially if you are a woman or a minority, or if you are trying to speak truth to power in a country which has forced you into exile, like our colleague here from Belarus. And I would argue that public interest media, to our colleague from the UK, are increasingly important, especially public media and public broadcasting, which must be reinforced with appropriate funding. To the UK, I speak directly to what I understand is under consideration, which is massive budget cuts to the BBC World Service. These could not come at a worse time, when we see the Voice of America and RFE/RL being defunded and having to fight for their very existence to continue. 
And the funding for those services, again, is not about imposing some foreign perspective on an individual country. It is about upholding democracy, human rights, and the rule of law. Those are not partisan pursuits; they are fundamental and foundational. And I'll end it there.
Alina Tatarenko: Thank you, Julie. Andrin, Valentin, Andrin, please.
Valentyn Koval: I just wanted to answer maybe most of the questions that were raised here. I believe that we should stop looking for money to combat disinformation as fake news, because we will always be secondary, and we will disseminate that disinformation more and more times while trying to debunk it. Instead, we need to look for money to create new information. We should understand that the information space is not chaotic; it is a tube which people use to get new information, and we should fill this tube with truthful, new, reliable, and verified information. We should simply replace fake news with real information. This is the only way to change the game. Because those who make disinformation, whether we take it as a result, as fake news, or as a process of creation, dissemination, reception, and so on, will always have that money, with the oil, gas, and other things that Russia and others have in large amounts. So we just need to replace, not just like the UAE is now replacing the UK in the EU, but to replace disinformation with real information, just like what Belsat is doing, because, as I understand it, they are the only way for Belarusian people to get the real truth about the Belarusian situation. We should look for money for this: not for combating disinformation as a fact, but for filling this information tube with reliable information. Thank you. Thank you very much, Valentyn, and I'm terribly sorry, but I was just told that we really have to stop.
Alina Tatarenko: I'm sorry about this, but let's continue our discussions outside. We are all here, and we will all be here for the whole two or three days, so please come up to us and let's talk in the corridor, over coffee. Thank you. Thank you. And please don't forget, if you really need some paper, it's about two. Thank you.
Amela Odobašić
Speech speed
139 words per minute
Speech length
1630 words
Speech time
700 seconds
Experimenting with co-regulation approaches
Explanation
Amela Odobašić discussed the approach of experimenting with co-regulation in Bosnia and Herzegovina. This involves developing a network or platform of all stakeholders who should have a say in regulating harmful content online, in line with their competencies.
Evidence
She mentioned a study on mapping stakeholders in Bosnia and Herzegovina towards regulating harmful content online, which is being used as a basis for this co-regulatory approach.
Major discussion point
Proposed solutions and approaches
Agreed with
– Julie Posetti
– Valentyn Koval
– Andrin Eichin
Agreed on
Need for proactive and creative approaches to combat disinformation
Julie Posetti
Speech speed
136 words per minute
Speech length
1770 words
Speech time
777 seconds
Need for proactive and creative regulatory responses
Explanation
Julie Posetti argued for the need for proactive and creative regulatory responses to combat disinformation and protect democracy. She emphasized that these efforts are not about censorship, but about defending human rights, the rule of law, and democracy.
Major discussion point
Role of governments and international organizations
Agreed with
– Alina Koushyk
– Jordan Ogg
Agreed on
Importance of supporting independent media and journalists
Disagreed with
– Valentyn Koval
Disagreed on
Approach to countering disinformation
Valentyn Koval
Speech speed
118 words per minute
Speech length
847 words
Speech time
429 seconds
Focusing on creating and disseminating truthful information
Explanation
Valentyn Koval argued for a shift in focus from combating disinformation to creating and disseminating truthful, reliable, and verified information. He emphasized the need to fill the information space with real information rather than constantly reacting to and debunking fake news.
Evidence
He cited the example of Belsat providing real truth about the Belarusian situation to its people.
Major discussion point
Proposed solutions and approaches
Agreed with
– Julie Posetti
– Amela Odobašić
– Andrin Eichin
Agreed on
Need for proactive and creative approaches to combat disinformation
Disagreed with
– Julie Posetti
Disagreed on
Approach to countering disinformation
Alina Koushyk
Speech speed
121 words per minute
Speech length
1497 words
Speech time
742 seconds
Supporting exiled media and journalists
Explanation
Alina Koushyk emphasized the importance of supporting exiled media and journalists in countering disinformation. She highlighted the challenges faced by Belarusian media in exile, including threats to journalists and algorithmic suppression of content.
Evidence
She mentioned that 88% of Belarusian independent media outlets are closed in Belarus, with 45 titles continuing their work in exile. She also noted that 30 Belarusian media workers are currently imprisoned.
Major discussion point
Role of governments and international organizations
Agreed with
– Julie Posetti
– Jordan Ogg
Agreed on
Importance of supporting independent media and journalists
Giovana Fleck
Speech speed
141 words per minute
Speech length
483 words
Speech time
204 seconds
Addressing transnational nature of disinformation threats
Explanation
Giovana Fleck highlighted the need to address the transnational nature of disinformation threats. She emphasized that disinformation is a global issue that goes beyond European borders and affects information ecosystems worldwide.
Evidence
She mentioned the global effects of information disorder seen during the COVID-19 pandemic and in transnational narratives delegitimizing democracy and human rights.
Major discussion point
Role of governments and international organizations
Marilia Maciel
Speech speed
168 words per minute
Speech length
253 words
Speech time
90 seconds
Cutting financial resources to disinformation industry
Explanation
Marilia Maciel proposed focusing on cutting the financial resources of the disinformation industry. She suggested exploring international cooperation and collaboration with law enforcement agencies to address this global issue.
Major discussion point
Proposed solutions and approaches
Andrin Eichin
Speech speed
142 words per minute
Speech length
1063 words
Speech time
446 seconds
Developing media literacy and fact-checking initiatives
Explanation
Andrin Eichin emphasized the importance of comprehensive digital literacy efforts and fact-checking as key practices for information integrity. He argued that these efforts should be available for all age groups to build critical thinking and resilience against disinformation across society.
Evidence
He cited the Council of Europe guidance note recommendations on supporting fact-checking organizations and implementing digital literacy programs.
Major discussion point
Proposed solutions and approaches
Agreed with
– Julie Posetti
– Amela Odobašić
– Valentyn Koval
Agreed on
Need for proactive and creative approaches to combat disinformation
Implementing platform design solutions and safety by design
Explanation
Andrin Eichin highlighted the need for platforms to adopt human rights by design and safety by design principles. He emphasized that platforms should conduct and publish human rights impact assessments for new features and policies, and design systems that take into account the risk profile of specific contents or audiences.
Evidence
He referenced the Council of Europe guidance note recommendations on platform design solutions.
Major discussion point
Proposed solutions and approaches
Jordan Ogg
Speech speed
156 words per minute
Speech length
302 words
Speech time
115 seconds
Strengthening public service broadcasting
Explanation
Jordan Ogg emphasized the importance of public service broadcasting in countering misinformation and disinformation. He argued that public service broadcasters play a crucial role in providing high-quality and accurate news, especially in the context of increased online news consumption.
Evidence
He mentioned Ofcom’s ongoing review of public service media in the UK, which aims to support public service broadcasting in the face of challenges posed by online news consumption.
Major discussion point
Proposed solutions and approaches
Agreed with
– Alina Koushyk
– Julie Posetti
Agreed on
Importance of supporting independent media and journalists
Giacomo Mazzone
Speech speed
138 words per minute
Speech length
159 words
Speech time
68 seconds
Maintaining international cooperation and dialogue with platforms
Explanation
Giacomo Mazzone emphasized the importance of maintaining international cooperation and dialogue with platforms in combating disinformation. He suggested using successful models, such as the European Digital Media Observatory’s work during the European elections, and replicating them elsewhere.
Evidence
He mentioned the successful cooperation with platforms during the European elections last year, which resulted in a smooth process.
Major discussion point
Proposed solutions and approaches
Pavlo Pushkar
Speech speed
136 words per minute
Speech length
703 words
Speech time
309 seconds
Importance of protecting freedom of expression while countering disinformation
Explanation
Pavlo Pushkar highlighted the importance of balancing efforts to counter disinformation with the protection of freedom of expression. He emphasized that states have a wide margin of appreciation in combating disinformation, but interventions must be based on relevant and sufficient reasons and pursue legitimate aims.
Evidence
He cited the European Court of Human Rights case law, which recognizes the need for robust measures to protect freedom of expression and public discussion space from harmful influences of disinformation and propaganda.
Major discussion point
Role of governments and international organizations
Aneta Gonta
Speech speed
125 words per minute
Speech length
617 words
Speech time
295 seconds
Balancing national security concerns with media freedoms
Explanation
Aneta Gonta discussed the challenge of balancing national security concerns with media freedoms in the context of combating Russian disinformation in Moldova. She emphasized the need for proportionate and necessary measures to maintain democracy while ensuring freedom of expression.
Evidence
She mentioned the recent discovery of two powerful Russian propaganda networks targeting Moldova and the significant investment (over 200 million euros) by Russia in online disinformation campaigns against the country.
Major discussion point
Role of governments and international organizations
Mykyta Poturaiev
Speech speed
127 words per minute
Speech length
661 words
Speech time
311 seconds
Ensuring accountability of tech platforms through legislation
Explanation
Mykyta Poturaiev argued for the need to ensure accountability of tech platforms through legislation. He emphasized that current self-regulatory approaches are insufficient and that stronger legal frameworks are needed to address issues like hate speech, bullying, and disinformation on social platforms.
Evidence
He pointed out that while Ukraine has regulated traditional media effectively, these regulations do not work for social platforms.
Major discussion point
Role of governments and international organizations
Disagreed with
– Luljeta Aliu
Disagreed on
Role of regulation in combating disinformation
Oksana Prykhodko
Speech speed
122 words per minute
Speech length
75 words
Speech time
36 seconds
Finding alternatives to defunded counter-disinformation programs
Explanation
Oksana Prykhodko raised the issue of finding alternatives to counter-disinformation programs that have been defunded by the new American administration. She emphasized the need for asymmetrical responses to fill the gap left by these closed projects.
Major discussion point
Role of governments and international organizations
Luljeta Aliu
Speech speed
110 words per minute
Speech length
215 words
Speech time
116 seconds
Challenge of media rights groups opposing regulation
Explanation
Luljeta Aliu highlighted the challenge of media rights groups and civil society organizations opposing regulation efforts, often framing them as politically motivated censorship. This creates a difficult position for regulators trying to balance protecting citizens while respecting media freedoms.
Evidence
She mentioned a recent case in Kosovo where media rights groups submitted a new law on the Independent Media Commission to the Constitutional Court, resulting in its repeal.
Major discussion point
Challenges in combating disinformation and harmful content online
Disagreed with
– Mykyta Poturaiev
Disagreed on
Role of regulation in combating disinformation
Alina Tatarenko
Speech speed
145 words per minute
Speech length
1848 words
Speech time
763 seconds
Importance of mitigating damage from disinformation
Explanation
Alina Tatarenko emphasized that while completely eliminating disinformation may not be possible, efforts to mitigate its damage and reduce its impact are crucial. She argued for taking whatever actions possible to help prevent and reduce the spread of disinformation.
Major discussion point
Proposed solutions and approaches
Moderator
Speech speed
145 words per minute
Speech length
347 words
Speech time
142 seconds
Need for balanced approach to combating disinformation
Explanation
The moderator raised concerns about the focus on Russian and Chinese disinformation, arguing for a more balanced approach that also addresses misinformation from Western sources. They emphasized the importance of being self-critical and addressing disinformation regardless of its source.
Evidence
They mentioned examples of Western misinformation, such as justifications for military interventions based on misleading claims about weapons of mass destruction.
Major discussion point
Challenges in combating disinformation and harmful content online
Oleksandr Shevchuk
Speech speed
126 words per minute
Speech length
48 words
Speech time
22 seconds
Improving mechanisms to counter Russian propaganda in Ukraine
Explanation
Oleksandr Shevchuk inquired about potential improvements in instruments and mechanisms to fight against Russian propaganda in Ukraine, particularly historical propaganda. He also asked about the effectiveness of actions taken by European countries and the Council of Europe in this regard.
Major discussion point
Proposed solutions and approaches
Agreements
Agreement points
Need for proactive and creative approaches to combat disinformation
Speakers
– Julie Posetti
– Amela Odobašić
– Valentyn Koval
– Andrin Eichin
Arguments
Need for proactive and creative regulatory responses
Experimenting with co-regulation approaches
Focusing on creating and disseminating truthful information
Developing media literacy and fact-checking initiatives
Summary
Speakers agreed on the importance of proactive, creative, and multi-faceted approaches to combat disinformation, including regulatory responses, co-regulation, creating truthful content, and developing media literacy.
Importance of supporting independent media and journalists
Speakers
– Alina Koushyk
– Julie Posetti
– Jordan Ogg
Arguments
Supporting exiled media and journalists
Need for proactive and creative regulatory responses
Strengthening public service broadcasting
Summary
Speakers emphasized the crucial role of independent media, including exiled journalists and public service broadcasters, in countering disinformation and providing reliable information.
Similar viewpoints
These speakers shared the view that tech platforms need to be held accountable through various means, including legislation, financial measures, and design solutions.
Speakers
– Mykyta Poturaiev
– Marilia Maciel
– Andrin Eichin
Arguments
Ensuring accountability of tech platforms through legislation
Cutting financial resources to disinformation industry
Implementing platform design solutions and safety by design
These speakers highlighted the challenge of balancing efforts to counter disinformation with protecting freedom of expression and media rights.
Speakers
– Aneta Gonta
– Pavlo Pushkar
– Luljeta Aliu
Arguments
Balancing national security concerns with media freedoms
Importance of protecting freedom of expression while countering disinformation
Challenge of media rights groups opposing regulation
Unexpected consensus
Addressing disinformation from all sources, including Western countries
Speakers
– Julie Posetti
– Moderator
Arguments
Need for proactive and creative regulatory responses
Need for balanced approach to combating disinformation
Explanation
Despite different perspectives, both Julie Posetti and the Moderator acknowledged the importance of addressing disinformation from all sources, including Western countries, which was an unexpected area of agreement given the focus on Russian disinformation in much of the discussion.
Overall assessment
Summary
The main areas of agreement included the need for proactive and multi-faceted approaches to combat disinformation, support for independent media, and the importance of balancing security concerns with freedom of expression.
Consensus level
There was a moderate level of consensus among speakers on the need for action against disinformation, but differences emerged in specific approaches and priorities. This suggests that while there is broad agreement on the importance of addressing disinformation, developing and implementing effective solutions remains challenging and requires further discussion and collaboration.
Differences
Different viewpoints
Approach to countering disinformation
Speakers
– Valentyn Koval
– Julie Posetti
Arguments
Focusing on creating and disseminating truthful information
Need for proactive and creative regulatory responses
Summary
Koval argues for focusing on creating and disseminating truthful information rather than combating disinformation directly, while Posetti emphasizes the need for proactive regulatory responses to combat disinformation.
Role of regulation in combating disinformation
Speakers
– Mykyta Poturaiev
– Luljeta Aliu
Arguments
Ensuring accountability of tech platforms through legislation
Challenge of media rights groups opposing regulation
Summary
Poturaiev argues for stronger legal frameworks to ensure platform accountability, while Aliu highlights the challenges posed by media rights groups opposing regulation efforts.
Unexpected differences
Perception of efforts to combat disinformation
Speakers
– Julie Posetti
– Moderator
Arguments
Need for proactive and creative regulatory responses
Need for balanced approach to combating disinformation
Explanation
While most speakers focused on combating disinformation from specific sources, the moderator unexpectedly argued for a more balanced approach that also addresses misinformation from Western sources, challenging the prevailing narrative.
Overall assessment
Summary
The main areas of disagreement revolve around the approach to countering disinformation (regulatory vs. content-focused), the role and extent of regulation, and the sources of disinformation that should be prioritized.
Disagreement level
The level of disagreement is moderate. While there is general consensus on the need to address disinformation, speakers differ on specific strategies and priorities. These disagreements reflect the complexity of the issue and the challenges in finding a unified approach to combating disinformation while balancing other concerns such as freedom of expression and national security.
Takeaways
Key takeaways
Combating online disinformation and harmful content remains a major challenge, with insufficient legal frameworks and enforcement mechanisms in many countries
Self-regulation by tech platforms has largely failed, necessitating more proactive government and regulatory approaches
A multi-stakeholder approach involving regulators, platforms, civil society, and media is needed to address the complex challenges
Supporting quality journalism, public service media, and exiled/independent media is crucial for countering disinformation
Media literacy initiatives and fact-checking, while important, are not sufficient on their own to address the scale of the problem
There is a need to balance freedom of expression protections with measures to combat harmful content and disinformation
Resolutions and action items
Experiment with co-regulation approaches involving multiple stakeholders
Develop and implement platform design solutions and ‘safety by design’ principles
Strengthen public service broadcasting and independent journalism
Continue international cooperation and dialogue with tech platforms
Focus on creating and disseminating truthful information rather than just combating disinformation
Unresolved issues
How to effectively regulate global tech platforms from individual countries with limited jurisdiction
Addressing the transnational nature of disinformation threats
Finding sustainable funding models for counter-disinformation efforts
Balancing national security concerns with media freedoms
Dealing with algorithmic suppression of certain languages and content
Effectively combating the growing disinformation industry motivated by financial gain
Suggested compromises
Implementing proportionate regulation tailored to a platform’s size and risk profile rather than blanket approaches
Using friction-based mechanisms to reduce reach of harmful content rather than outright content removal
Balancing proactive creation of truthful content with targeted fact-checking and debunking efforts
Thought provoking comments
Quality information is the most effective long-term antidote for disinformation.
Speaker
Andrin Eichin
Reason
This comment shifts the focus from reactive measures to proactive ones, emphasizing the importance of creating a strong foundation of reliable information.
Impact
It set the tone for discussing systemic solutions rather than just content removal, leading to conversations about media literacy, fact-checking, and platform design.
Ukraine lacks this institutional media heritage, and its journey towards a stable democratic media environment is still in progress.
Speaker
Valentyn Koval
Reason
This provides important context about the unique challenges faced by countries with less established democratic institutions in combating disinformation.
Impact
It broadened the discussion to consider how different countries’ historical and institutional contexts affect their ability to combat disinformation.
Moldova is now in this moment the most targeted country in the region by these campaigns, more than 50 times the average of harmful propaganda in Western Europe.
Speaker
Aneta Gonta
Reason
This highlights the disproportionate impact of disinformation campaigns on smaller, more vulnerable countries.
Impact
It drew attention to the need for tailored approaches and international support for countries facing intense disinformation campaigns.
Belarusian language content is de-prioritized by social media algorithms, especially on YouTube, even in the largest channels such as Belsat.
Speaker
Alina Koushyk
Reason
This reveals how platform algorithms can inadvertently suppress minority languages and cultures, compounding the challenges faced by exiled media.
Impact
It introduced a new dimension to the discussion about platform responsibility and the unintended consequences of content algorithms.
The time for self-regulation has passed as it applies to big tech actors. It would be naive as well to assume that the tech oligarchs will meaningfully participate in co-regulation, in our view.
Speaker
Julie Posetti
Reason
This strong statement challenges the effectiveness of current approaches to regulating tech platforms.
Impact
It sparked a more critical examination of regulatory approaches and led to discussions about the need for more stringent, legally binding measures.
We don’t know, maybe an ultra-right candidate will win the Romanian elections, and maybe it will help an ultra-right candidate to win the presidential election in Poland, and maybe it will help a pro-Russian revenge in Moldova, and bring the ultra-right to first place in Germany and in France.
Speaker
Mykyta Poturaiev
Reason
This comment starkly illustrates the potential real-world political consequences of unchecked disinformation.
Impact
It injected a sense of urgency into the discussion and highlighted the broader societal stakes of the disinformation problem.
Overall assessment
These key comments shaped the discussion by broadening its scope from theoretical concepts to real-world impacts and challenges. They highlighted the complexity of the disinformation problem, touching on issues of language, algorithms, geopolitics, and the limitations of current regulatory approaches. The discussion evolved from general principles to specific challenges faced by different countries and media outlets, emphasizing the need for urgent, multifaceted, and internationally coordinated responses to disinformation.
Follow-up questions
How can algorithms be adjusted to stop penalizing content in minority languages like Belarusian?
Speaker
Alina Koushyk
Explanation
Important for preserving minority languages and cultures in the digital space
How can public service broadcasting be supported and strengthened to counter disinformation?
Speaker
Jordan Ogg
Explanation
Public broadcasters play a key role in providing quality information to counter disinformation
How can financial resources be cut off from the global disinformation industry?
Speaker
Marilia Maciel
Explanation
Addressing the financial incentives could help reduce the production of disinformation
What asymmetrical responses can be developed to replace closed American projects countering disinformation?
Speaker
Oksana Prykhodko
Explanation
Important to find new ways to continue counter-disinformation efforts after the loss of US funding
How can efforts to combat disinformation remain balanced and address misinformation from all sources, including Western countries?
Speaker
Moderator
Explanation
Ensuring a balanced approach is important for credibility of anti-disinformation efforts
What instruments and mechanisms can improve the fight against Russian propaganda in Ukraine, especially historical propaganda?
Speaker
Oleksandr Shevchuk
Explanation
Specific tactics are needed to address Russian propaganda targeting Ukraine
How can the economic aspects of disinformation be better understood and potentially used for legal action?
Speaker
Julie Posetti
Explanation
Understanding the financial incentives could reveal new ways to combat disinformation
Disclaimer: This is not an official session record. DiploAI generates these resources from audiovisual recordings, and they are presented as-is, including potential errors. Due to logistical challenges, such as discrepancies in audio/video or transcripts, names may be misspelled. We strive for accuracy to the best of our ability.