Pre 6: Countering Disinformation and Harmful Content Online
12 May 2025 09:00h - 10:15h
Session at a glance
Summary
This discussion was a pre-session of the Eurodig focused on countering harmful content and disinformation online, hosted by the Council of Europe. The panel brought together regulators, media professionals, and experts to discuss strategies for addressing online misinformation while respecting freedom of expression under Article 10 of the European Convention on Human Rights.
Andrin Eichin presented the Council of Europe’s guidance note on countering disinformation, which emphasizes three key pillars: fact-checking as a cornerstone of information integrity, platform design solutions that incorporate human rights by design principles, and user empowerment through digital literacy and verification tools. The guidance stresses that quality journalism and reliable information sources are the most effective long-term antidotes to disinformation.
Regulators from Ukraine, Moldova, and Bosnia-Herzegovina shared their experiences with Russian propaganda and disinformation campaigns. Valentyn Koval from Ukraine highlighted how democratic societies’ openness makes them vulnerable to disinformation, emphasizing the need for proactive measures that flood information spaces with verified content rather than merely reacting to false narratives. Aneta Gonta from Moldova described how Russia has invested over 200 million euros in disinformation campaigns targeting her country, making it the most targeted nation in the region. Amela Odobašić from Bosnia-Herzegovina discussed the challenges Western Balkan countries face in transposing EU legislation while lacking adequate legal frameworks and resources.
Alina Koushyk from Belsat TV, a Belarusian media outlet in exile, presented the unique challenges faced by independent media operating under authoritarian regimes. She highlighted how algorithms suppress content in Belarusian language while promoting Russian content, and how her journalists face imprisonment simply for doing their work. Julie Posetti emphasized that self-regulation by tech platforms has failed and called for stronger European regulatory action, noting that online violence against journalists, particularly women and minorities, is often fueled by disinformation campaigns.
The discussion revealed a consensus that reactive approaches to disinformation are insufficient and that systemic solutions strengthening the overall information ecosystem are needed. Participants agreed that while complete elimination of disinformation is impossible, coordinated efforts involving regulators, platforms, civil society, and international organizations can significantly mitigate the damage and protect democratic institutions.
Key points
## Major Discussion Points:
– **Council of Europe Standards and Multi-Pillar Approach**: Discussion of the Council of Europe’s guidance note on countering disinformation, which emphasizes three key areas: fact-checking (with independent, transparent organizations), platform design solutions (human rights by design, process-based regulation), and user empowerment (digital literacy, tools for content control).
– **Regional Challenges and Russian Disinformation**: Extensive focus on how different countries face disinformation threats, particularly Russian propaganda targeting Ukraine, Moldova, and Belarus. Speakers highlighted the weaponization of disinformation during conflicts, electoral interference, and the challenges faced by media in exile.
– **Platform Accountability and Regulatory Gaps**: Critical examination of social media platforms’ role in spreading disinformation, including algorithmic suppression of certain languages (like Belarusian), lack of effective content moderation, and the limitations of current regulations like the DSA for non-EU countries.
– **Attacks on Journalists and Information Integrity**: Discussion of coordinated online violence against journalists, particularly women and minorities, and how these attacks undermine trust in factual information. The “broligarchy” (wealthy tech executives) was identified as choking democracy through platform design that prioritizes engagement over truth.
– **Systemic Solutions vs. Reactive Measures**: Debate over moving beyond reactive fact-checking to proactive approaches, including strengthening independent journalism, improving media literacy, cutting financial resources to disinformation industries, and filling information spaces with quality content rather than just debunking false information.
## Overall Purpose:
The discussion aimed to examine strategies for countering harmful content and disinformation online while respecting freedom of expression rights under the European Convention on Human Rights. Participants sought to share experiences, identify challenges, and explore solutions for protecting democratic discourse in the digital age.
## Overall Tone:
The discussion began with a measured, academic tone as experts presented frameworks and standards. However, it became increasingly urgent and concerned as speakers from conflict-affected regions shared their experiences. The tone shifted to frustration and alarm, particularly with a Ukrainian parliamentarian’s stark assessment that “we are losing this battle.” Despite moments of pessimism about the scale of the challenge, the discussion concluded on a note of determined resilience, with speakers emphasizing the need to continue fighting for human rights, democracy, and the rule of law rather than surrendering to the magnitude of the disinformation threat.
Speakers
**Speakers from the provided list:**
– **Alina Tatarenko** – Head of the Division for Cooperation and Freedom of Expression at the Council of Europe
– **Mykyta Poturaiev** – Head of the Ukrainian Parliament Committee for Humanitarian and Informational Policies
– **Giovana Fleck** – Representative of RNW Media (Dutch organization representing dozens of journalists and media workers worldwide)
– **Pavlo Pushkar** – Head of the division for execution of judgments of the European Court of Human Rights
– **Luljeta Aliu** – Member of the Independent Media Commission in Kosovo
– **Jordan Ogg** – Representative of Ofcom, UK’s independent communications regulator
– **Amela Odobašić** – Head of broadcasting at the Communication Regulatory Agency of Bosnia and Herzegovina
– **Alina Koushyk** – Director of Belsat TV (Belarusian media outlet in exile), Editor-in-chief
– **Oleksandr Shevchuk** – Institute of International Relations, Ukraine
– **Moderator** – Daniel Michos, remote moderator of the session
– **Oksana Prykhodko** – Ukraine, international non-governmental organization, European media platform
– **Andrin Eichin** – Senior Policy Advisor on Online Platforms, Algorithms and Digital Policy at the Swiss Federal Office of Communications, part of the Council of Europe experts group
– **Valentyn Koval** – First deputy chair of the National Council of Television and Radio Broadcasting of Ukraine
– **Giacomo Mazzone** – Member of EDMO (European Digital Media Observatory)
– **Aneta Gonta** – Deputy Chair of the Audiovisual Council of the Republic of Moldova, member of the Council of Europe Committee on Media and Information Society
– **Julie Posetti** – Professor and global director of research at the International Center for Journalists, professor of journalism at the City University of London
– **Marilia Maciel** – Director of Digital Trade and Economic Security at Diplo Foundation
**Additional speakers:**
– **Representative from EFD** – Representing young voices in Europe on internet governance (name not provided in transcript)
Full session report
# Comprehensive Report: Eurodig Pre-Session on Countering Harmful Content and Disinformation Online
## Executive Summary
This pre-session of the Eurodig, hosted by the Council of Europe, brought together regulators, media professionals, and policy experts to examine strategies for addressing online misinformation whilst respecting freedom of expression under Article 10 of the European Convention on Human Rights. The discussion, moderated remotely by Daniel Michos, featured speakers from across Europe sharing frameworks, regional challenges, and practical experiences in countering disinformation.
Participants presented the Council of Europe’s three-pillar approach to countering disinformation, examined the devastating impact of Russian disinformation campaigns across Eastern Europe and the Western Balkans, and discussed the systematic failures of platform self-regulation. The session revealed both areas of strong consensus, particularly regarding the failure of platform self-regulation and the need for quality journalism, and ongoing challenges in developing effective regulatory responses that protect democratic values whilst addressing urgent threats.
## Council of Europe Framework and Standards
Andrin Eichin from the Swiss Federal Office of Communications presented the Council of Europe’s comprehensive guidance note on countering disinformation and protecting freedom of opinion and expression. As he noted, he had to “read the whole title every time because it’s a bit of a mouthful,” but the document establishes a three-pillar approach that provides practical guidance for member states.
The framework emphasises fact-checking as a cornerstone of information integrity, requiring independent and transparent organisations to verify information. The second pillar focuses on platform design solutions that incorporate human rights by design principles, moving beyond reactive content moderation to systemic improvements in how platforms operate. The third pillar centres on user empowerment through digital literacy programmes and tools that allow users to control their information environment.
Eichin stressed that the guidance represents a paradigm shift from content removal towards systemic solutions that strengthen the overall information ecosystem’s resilience. “Quality information is the most effective long-term antidote for disinformation,” he argued, emphasising that well-funded independent journalism serves as the foundation of any effective counter-disinformation strategy. However, he acknowledged a critical paradox: “The complexity of the situation means that people’s fear of misinformation often generates as much polarisation and distrust as the problematic content itself.”
Alina Tatarenko from the Council of Europe reinforced these principles, noting that countering Russian propaganda online represents a Europe-wide and global challenge requiring coordinated responses. She emphasised that the Council’s approach maintains strict adherence to human rights obligations whilst providing member states with practical tools for addressing disinformation threats.
Pavlo Pushkar, head of the division for execution of judgments of the European Court of Human Rights, provided legal context by explaining that whilst states have a wide margin of appreciation in combating disinformation, any interference with freedom of expression must be proportionate and necessary in a democratic society.
## Regional Challenges and Russian Disinformation Campaigns
The discussion revealed the devastating impact of systematic disinformation campaigns across Eastern Europe and the Western Balkans. Aneta Gonta from Moldova’s Audiovisual Council presented particularly alarming statistics, revealing that Russia has invested over 200 million euros in disinformation campaigns targeting her country, making Moldova the most targeted nation in the region. She detailed how Russian disinformation networks like Matryoshka and Pravda specifically target Moldova and its president, creating a constant barrage of false narratives designed to undermine democratic institutions. Gonta referenced Resolution 2567 from 2024 in her presentation of these challenges.
Valentyn Koval from Ukraine’s National Council of Television and Radio Broadcasting provided a sobering assessment of his country’s challenges. He argued that Ukraine lacks the institutional media heritage necessary to effectively counter disinformation, leaving authorities in a perpetually reactive position. “A core challenge of countering disinformation is that most responses are reactive,” he explained. “First comes the fake news and only later fact-checking… Worse, attempts to debunk falsehoods can sometimes amplify them, especially when the debunking lacks credibility.”
Koval advocated for a fundamental shift in strategy, arguing that authorities should abandon reactive fact-checking in favour of proactively flooding the information space with verified, truthful content. He mentioned the availability of a specific study, the “Guide for Risk Management in the Context of Emergencies, Armed Conflicts, and Crisis,” which provides additional guidance on these approaches. He contended that social media platforms are structurally incapable of supporting truthful narratives during crises, requiring alternative approaches that don’t rely on platform cooperation.
Alina Koushyk, director of Belsat TV, a Belarusian media outlet operating in exile, provided a harrowing account of the challenges faced by independent media under authoritarian regimes. She revealed that 88% of Belarusian independent media outlets have been closed, with 45 titles now operating from exile. Koushyk, who had “an honor to present the first news service” and worked as a presenter, documented systematic algorithmic discrimination against Belarusian language content: “If you do shorts in Russian, you will have a million easily. But if you do it in Belarusian, you will have less than 100.”
This algorithmic bias forces Belarusian media outlets to choose between maintaining their cultural identity and reaching their audience, effectively supporting Russian cultural dominance. Koushyk also revealed a particularly disturbing development: confession videos of Belarusian political prisoners are being used as advertisements on YouTube, monetising political persecution. She noted that 30 Belarusian media workers are currently imprisoned simply for doing their jobs, whilst audiences consume independent media secretly after 8 PM due to fear of persecution.
## Platform Accountability and Regulatory Failures
The discussion revealed deep frustration with current approaches to platform regulation, with several speakers arguing that self-regulation has fundamentally failed. Julie Posetti, a professor and global director of research at the International Center for Journalists, delivered a particularly forceful critique, introducing the concept of the “broligarchy” – obscenely wealthy tech executives who are “choking democracy” through platform design that prioritises engagement over truth.
“The time for self-regulation has passed as it applies to big tech actors,” Posetti declared, calling for legal obligations and punitive measures against platforms that fail to address harmful content. She presented alarming statistics showing that 73% of women journalists experience online violence, with 37% targeted by political actors and 41% subjected to coordinated disinformation campaigns. These attacks, she argued, systematically undermine trust in factual information and silence democratic voices.
Posetti also mentioned her “Disarming Disinformation” project and the development of an AI-assisted online violence early warning system designed to help protect journalists from systematic harassment.
Mykyta Poturaiev, head of the Ukrainian Parliament Committee for Humanitarian and Informational Policies, provided perhaps the most pessimistic assessment of current regulatory approaches. He systematically dismantled the effectiveness of existing frameworks: “Do we have any protection from what is happening there? No. Does DSA work for now? No. Does MFA work for now? No… So do we have answers on political level or governmental level? I’m not sure.”
Poturaiev argued that social platforms operating outside EU jurisdiction simply ignore regulations like the Digital Services Act and Media Freedom Act, leaving users without protection from reputation damage, children vulnerable to bullying, and women exposed to hate speech. His stark conclusion – “we are losing this battle, and we are very, very close to lose this war, informational war” – highlighted the urgency of the challenges faced.
## Regulatory Gaps and Implementation Challenges
Western Balkan regulators highlighted significant challenges in implementing effective counter-disinformation measures. Amela Odobašić from Bosnia and Herzegovina’s Communication Regulatory Agency explained that small countries struggle to establish meaningful communication with global platforms, lacking the market influence necessary for effective engagement. She identified the lack of adequate legal frameworks as a major obstacle, noting that Western Balkan countries face particular difficulties in transposing EU legislation whilst lacking the resources and institutional capacity for effective implementation.
Odobašić referenced a Council of Europe study on “mapping of stakeholders in Bosnia-Herzegovina towards regulating harmful content online” and advocated for co-regulatory approaches involving all stakeholders rather than relying solely on regulatory authorities. However, she acknowledged the practical difficulties of implementing such frameworks.
The regulatory gap between EU member states covered by the Digital Services Act and non-EU countries creates additional vulnerabilities, leaving many nations exposed to disinformation without adequate legal protections.
Luljeta Aliu from Kosovo’s Independent Media Commission revealed another layer of complexity: constitutional court challenges to new media laws create additional regulatory obstacles. She noted that civil society organisations, potentially influenced by external actors, sometimes oppose legitimate regulatory efforts, creating a complex multi-front battle for effective governance.
## Financial Dimensions and Sustainability Challenges
Marilia Maciel from the Diplo Foundation highlighted the growing commercialisation of disinformation, describing a global industry motivated by financial gain that operates across borders to influence elections and undermine democratic processes. Companies now sell disinformation services internationally, creating a marketplace for democratic interference that transcends national boundaries.
This commercialisation requires new approaches focused on cutting financial resources to disinformation operations through international cooperation and law enforcement coordination. However, Oksana Prykhodko from a Ukrainian international non-governmental organisation noted that US funding cuts to counter-disinformation programmes have created significant resource gaps just as threats are escalating.
The financial challenges extend to supporting quality journalism and fact-checking organisations, which require sustainable funding models to serve as effective antidotes to disinformation. Giovana Fleck from RNW Media reinforced these concerns, noting that sustainability challenges for journalism require providing agency for both journalists and civil society organisations.
## Media Literacy and Public Service Broadcasting
Whilst acknowledging the importance of media literacy, several speakers noted significant limitations in current approaches. Andrin Eichin observed that media literacy efforts must be comprehensive across all age groups but acknowledged that current programmes are often sparse and opaque. Traditional media literacy approaches require 15-20 years to implement effectively, making them inadequate for addressing immediate disinformation threats.
Jordan Ogg from Ofcom UK highlighted the important role of public service broadcasting in countering misinformation effects, noting that these institutions provide trusted sources of quality information that serve as natural antidotes to disinformation. However, public service broadcasters require adequate funding and policy support to fulfil this role effectively.
The challenge extends beyond individual critical thinking skills to systemic issues with information distribution and consumption. Even with improved media literacy, audiences increasingly trust anonymous social media accounts over traditional media sources, suggesting that technical and regulatory solutions are necessary alongside educational approaches.
## Critical Perspectives and Balanced Analysis
A young participant representing young voices in Europe on internet governance as part of the European Forum for Democracy introduced an important challenge to the discussion framework, questioning whether the focus on Russian and Chinese disinformation created blind spots about Western propaganda. “Are we sufficiently aware and critical of this information that originates from within our own countries or from our closest allies?” they asked.
This intervention prompted Julie Posetti to acknowledge legitimate concerns about US-led disinformation, demonstrating how challenging assumptions can lead to more honest and comprehensive analysis. The participant’s critique highlighted the risk that counter-disinformation efforts might become censorious rather than genuinely protective of information freedom.
## Areas of Consensus and Ongoing Challenges
Despite disagreements on specific approaches, participants demonstrated consensus on several fundamental issues. All speakers agreed that quality journalism and independent media represent the most effective long-term antidotes to disinformation, requiring investment in reliable information sources rather than merely reactive fact-checking or content removal.
There was universal agreement that platform self-regulation has failed and that stronger regulatory measures are necessary. The systematic nature of Russian disinformation campaigns was acknowledged by all participants from affected regions, who provided consistent evidence of well-funded, coordinated threats to democratic institutions.
Participants also agreed that women journalists and minorities face disproportionate online violence and targeting, requiring specific attention and resources to address systematic harassment designed to silence democratic voices.
However, several critical issues remain unresolved, including how to effectively regulate global platforms operating outside EU jurisdiction and how small countries can establish meaningful communication with big tech companies. The fundamental business model of social media platforms that incentivises engagement over truth represents a systemic challenge requiring innovative regulatory approaches.
## Conclusion
This discussion revealed both the urgency of disinformation threats and the complexity of developing effective responses that respect democratic values. The systematic targeting of journalists, algorithmic discrimination against minority languages, and the commercialisation of disinformation represent serious threats to democratic discourse that require coordinated international responses.
The session highlighted the need for sustainable funding models for independent journalism and fact-checking organisations, particularly given funding cuts to counter-disinformation programmes. Questions about balancing freedom of expression with necessary restrictions during crises remain contentious, requiring ongoing attention and refinement.
Despite sobering assessments of current challenges, participants emphasised the importance of continuing to defend human rights, democracy, and the rule of law. The call for proactive rather than reactive approaches, combined with recognition of the need for sustainable funding for quality journalism, suggests a potential path forward that emphasises strengthening democratic information ecosystems rather than merely combating false narratives.
Session transcript
Alina Tatarenko: This will be a pre-session of the Eurodig, dedicated to the discussion on how to counter harmful content and disinformation online. My name is Alina Tatarenko, I’m the head of the Division for Cooperation and Freedom of Expression here at the Council of Europe. We are the division which helps our member states to implement the recommendations and the standards of the Council of Europe in the area of freedom of expression, which also includes helping our member states to find ways to counter disinformation. Before we begin and before I introduce our panel, I would like to give the floor to the Eurodig moderator, Daniel, who will quickly explain the rules of the session. My name is Daniel Michos and I’ll be remote moderating this session. More information about the session and speakers is available on the Eurodig wiki. We encourage you to raise your hand if you would like to present a question, but if you would like me to ask your question for you, please write Q in front of it. These are the session rules. Please enter your full name. To ask a question, raise your hand using the Zoom function. You will be unmuted when the floor is given to you. When speaking, switch on the video, state your name and affiliation, and do not share links to the Zoom meeting, not even with your own colleagues. Thank you. Thank you very much. So I will introduce our panel. We will start with Andrin Eichin, who is the Senior Policy Advisor on Online Platforms, Algorithms and Digital Policy at the Swiss Federal Office of Communications and who is also a part of the group of Council of Europe experts who developed the Council of Europe guidance note on countering disinformation. Then we will have three regulators present here with whom we work through our Council of Europe projects. One is the representative of the Ukrainian regulator, Valentyn Koval. Then we will have a representative of the Moldovan regulator, Aneta Gonta. Then we will have a representative of the regulator from Bosnia and Herzegovina, Amela Odobašić. And then we also have the director of Belsat TV, which is a Belarusian media outlet in exile, Alina Koushyk. And then we will have Julie Posetti, who is a professor and a global director of research at the International Center for Journalists and also a professor of journalism at the City University of London. They will make short statements, present their arguments, after which we will open the floor to questions and then we will circle back to our panelists to go into more detail, propose solutions and make their conclusions. Thank you very much. And with that, we begin with Andrin, please. Could you tell us about the latest Council of Europe standards and the latest guidance on countering disinformation online? Yes, thank you very much.
Andrin Eichin: Thank you for having me. I’ll try to keep this brief and give you the main elements of this Council of Europe guidance note on countering the spread of online mis- and disinformation through fact-checking and platform design solutions in a human rights compliant manner. Even though I was on the expert committee, I still have to read the whole title every time because it’s a bit of a mouthful. I had the pleasure to chair this expert committee that developed these guidelines; they were presented to the Steering Committee on Media and Information Society, the CDMSI, in December 2023 and have since been adopted. Now, what does the Guidance Note aim to do? The goal is to outline available strategies to address the challenges of dis- and misinformation online, all while complying with the Convention and in particular with the right to freedom of expression, which is enshrined in Article 10. Now, I would like to begin with a few words on the problem. We are all aware of the risks of mis- and disinformation, the risks they pose to democracy: they erode trust in public institutions, they distort public debate and they challenge the credibility of the media. Our citizens now operate in an information space that is fragmented, that is fast-moving, with content flowing through platforms often lacking any editorial oversight. And this sometimes leads to a perception that disinformation is omnipresent and that it exists at alarming levels. However, the expert committee was very clear and it highlighted very specifically that empirical data on the actual reach and impact of disinformation is still limited and the reality is often more nuanced. We do have clear evidence in certain areas, and I’m sure we will hear about it today, where disinformation is prevalent and being weaponised: Russian interference in the context of the full-scale invasion of Ukraine, misinformation that we had with regard to COVID-19. But the complexity of the situation means that people’s fear of misinformation often generates as much polarisation and distrust as the problematic content itself. Emerging technologies, especially generative AI, only exacerbate this problem, as they make false content more scalable and convincing. And this further blurs the line between legitimate, misleading and deliberately false content. The expert committee believes that countering this complexity demands policies that go beyond just identifying and removing bad actors or problematic content. We need systemic solutions that strengthen the resilience and integrity of the overall information ecosystem. And the Guidance Note tries to offer some recommendations, particularly in three areas, and I will just highlight some of the most important ones for you. The first area that we’re looking into is fact-checking. The expert committee highlights that fact-checking must be recognized as a key practice for information integrity, regardless whether it is integrated in the journalistic process, so before the content is published, or whether it’s executed in an independent professional capacity after the information has been made available. And the Guidance Note recommends that member states should create and support conditions for financial independence, transparent governance, and public service orientation of fact-checking organizations. Platforms should actively cooperate with fact-checking organizations to debunk and contextualize dis- and misinformation.
And fact-checking operations themselves must operate free from state or commercial influence, maintain clear and high standards, as well as transparency in methods and funding. Fact-checkers serve as guardians of information integrity, both before and after its dissemination, and as such they serve not just to correct falsehoods, but to reinforce a culture of accuracy and trust in our systems. The second element we were looking at is platform design solutions. Platform architecture and design play a vital role. The Guidance Note insists that platforms must adopt human rights by design and safety by design principles. This means they have to conduct and publish human rights impact assessments for new features and policies. They need to design systems that take into account the risk profile of specific contents or audiences and adapt accordingly. They need to make moderation practices transparent and open to appeal. But importantly, the Guidance Note also focuses on recommendations for member states, especially when they consider regulatory frameworks. They should focus on process-based regulation rather than targeting individual content. They should apply proportionate regulation tailored to a platform’s size and risk profile. And they should treat content removal as the last possible resort, never the default. Instead, they should be aware that platforms can also use other, more friction-based mechanisms to reduce the reach and impact of content. The third pillar is user empowerment, and it’s a concept that is often cited but very rarely put in place. The Guidance Note highlights that users need tools to be able to control what content is shown and recommended. They need tools that allow them to verify sources and that allow them to seek redress when they feel that their human rights have been limited by platform decisions. Currently, these tools are very sparse, their implementation is opaque, and often they depend on the goodwill of platform providers. The Guidance Note also emphasizes the need for comprehensive digital literacy efforts, again something that is very often cited. But, and this is important, these efforts need to be available for all age groups. Only this allows us to build critical thinking and resilience against dis- and misinformation across society. And perhaps most importantly, we must strengthen the foundations. Invest in reliable independent journalism and build healthy media ecosystems. It is something that is easy to forget when being confronted with disinformation from the outside. But the most important task is not a new one. It’s something that has been at the forefront of the Council of Europe for years. We need to create the structural conditions to ensure that there is a steady and reliable supply of quality information by recognized trustworthy sources. Quality information is the most effective long-term antidote for disinformation. So what must happen now? Member states should integrate these recommendations into their national frameworks with consistent alignment to human rights obligations. Platforms must make meaningful steps to reform system design, not just rely on post-facto moderation. And policymakers and researchers should collaborate to evaluate the impact of these measures and adjust to new technological threats. As I mentioned before, we still need a lot of research and evidence in this area. Thank you.
Alina Tatarenko: Thank you very much, Andrin. Yes, just to summarize quickly, the three main parts of the Council of Europe guidance are about developing and emphasizing media literacy, about fact-checking, and about platform design solutions, which are addressed directly to the platforms to ensure that safety is incorporated by design into every algorithm. How do they comply with that? Maybe our regulators can let us know, and that’s why I want to give the floor now to Valentyn Koval, who is the first deputy chair of the National Council of Television and Radio Broadcasting of Ukraine. Please, Valentyn. Thanks so much. Hi to everybody.
Valentyn Koval: Democracies, by their very nature open societies, wary of censorship and bound by bureaucratic inertia, are fertile ground for disinformation. While long-standing democracies in Europe rely on independent and fair traditional media, Ukraine lacks this institutional media heritage, and its journey towards a stable democratic media environment is still in progress. Ironically, Ukraine has been criticized for taking undemocratic steps, such as banning Russian and pro-Russian television channels. Yet, these moves were essential for defending media pluralism. It is precisely the diverse media voices in Ukraine that twice helped resist Russian attempts to assert political control, leading eventually to Russia’s full-scale invasion as a last-ditch effort. A core challenge of countering disinformation is that most responses are reactive. First comes the fake news and only later fact-checking. When fake content is spread through unregulated spaces like so-called social media, it often reaches wide audiences before being addressed. Worse, attempts to debunk falsehoods can sometimes amplify them, especially when the debunking lacks credibility. Ukraine’s response includes significant legal and institutional changes. In March 2023, a new law gave the National Council of Television and Radio Broadcasting of Ukraine broad powers to regulate not only traditional broadcasters, but also online and print media, DVB network operators, and content platforms within the Ukrainian jurisdiction. However, the law lacks the enforcement strength found in EU regulations like the DSA, and Ukraine’s relatively small market limits its influence over global platforms. To address these challenges, the National Council focused on media literacy for media companies. Since disinformation spread by professional media outlets seems more credible, efforts have centered on improving journalistic standards. In partnership with organizations like the Pylyp Orlyk Institute for Democracy, Internews, and Deutsche Welle Akademie, the Council conducted studies and trainings to identify and address vulnerabilities in Ukrainian media. Additionally, the Council produces weekly programming to debunk Russian disinformation narratives in English. This way, we help our colleagues abroad to understand more in-depth how Russian propaganda works. So-called social networks pose a particular threat. Their core design incentivizes broad engagement, often from less critical users. Without reliable verification mechanisms, these platforms become breeding grounds for disinformation. Even worse, their moderation policies, driven by global community guidelines, often restrict war-related content under the guise of neutrality. This includes suppressing or deleting the documentation of war crimes and failing to block fake accounts or bot activities that push Russian narratives. One key strategy is preemption: flooding the information space with verified truthful content. The amount of information consumed by individuals is limited, not because of a lack of sources, but due to time limits and limited attention spans. This space is contested between professional media and unmoderated platforms. A 2024 study showed that social media platforms are structurally incapable of supporting truthful narratives during crises. These platforms often suppress war content, citing global rules, and fail to act against bots and AI-generated harmful content.
The study, titled Guide for Risk Management in the Context of Emergencies, Armed Conflicts, and Crisis, was conducted by International Media Support and Internews Ukraine in partnership with UNESCO and with support from Japan, in deep cooperation with National Council members and staff. It analyzes the risks of spreading truthful content during conflicts and proposes recommendations for reducing platform-related threats. The study is available here, it’s https.cat.us. And I have some copies of the printed materials, so for those who like to have paper materials, you can take them later and work with them.
Alina Tatarenko: Thank you. Thank you very much, Valentyn. And of course, countering the threats of Russian propaganda online is not just a problem for Ukraine, it is a problem everywhere in Europe and in the world. One country which is also struggling a lot with Russian propaganda is Moldova. And we have here a representative of the Moldovan regulator, Aneta Gonta, who is the Deputy Chair of the Audiovisual Council of the Republic of Moldova and also a member of the Council of Europe Committee on Media and Information Society. Please, Aneta. Thank you very much.
Aneta Gonta: Good morning, everyone, and thank you for this opportunity to be here today and to try to speak about this very complex topic, the subject of disinformation and harmful propaganda. Not just any disinformation, but the Russian one: Russian disinformation and harmful propaganda, which are probably the best in the world, and therefore the most difficult to combat. The Republic of Moldova held presidential elections and a referendum on EU accession in the fall of 2024. Also, Moldova has parliamentary elections on September 28 this year, which are very important for the European path introduced in the country’s constitution. In both cases, the electoral process has to take place under conditions of deep interference by the Russian Federation, against which my country does not have proportionate resources to fight. In April 2025, which means a couple of weeks ago, journalistic investigations revealed two extremely powerful Russian harmful propaganda networks named Matryoshka and Pravda, whose main target is the Republic of Moldova and its president, Maia Sandu. In the last two years, Russia has already invested more than 200 million euros, or more than 1% of Moldova’s GDP, in online disinformation and harmful propaganda campaigns to undermine trust in state institutions, hijack the European course and change the Kishinev government to a pro-Russian one. According to this information observatory, Moldova is at this moment the most targeted country in the region by these campaigns, with more than 50 times the average level of harmful propaganda in Western Europe. The tentacles of Russian harmful propaganda and disinformation campaigns are many and varied: from influencers, for example, paid to comment on or to promote Kremlin narratives, to priests who are offered organized visits to Jerusalem and expressly asked to deliver pro-Russian votes this year, to teachers, who have more recently been invited to visit Moscow by the NGO of a Moldovan oligarch internationally sanctioned in recent years. And last but not least, the Moldovan vloggers who distribute and amplify the Kremlin’s narratives and who, curiously enough, ardently supported the previously unknown Călin Georgescu’s candidacy in the Romanian presidential elections last fall. Under these conditions, the Moldovan authorities, together with experts from civil society and the media, are increasingly discussing the need to establish rules for online activity that would ensure fair conditions for all voices but at the same time diminish the momentum of those who do not aim to inform society in a pluralistic manner but work in favor of foreign interests and against the national security of the Republic of Moldova. The biggest challenge in this context is, of course, to ensure freedom of expression under the conditions of Article 10 of the European Convention, and for our state to demonstrate that it does not introduce censorship, as is already being alleged in Moldova by the pro-Russian camp. We believe, however, that it is important to emphasize the usefulness of existing European documents, including those of the Council of Europe, which, for example, in Resolution 2567 from 2024 on propaganda and freedom of information in Europe points out exactly what is happening now in Moldova, what happened in Romania, and how important legal, proportionate and necessary measures in a democratic society are to maintain this democracy. Moldova is now in the process of revising and aligning its legislation to the European one.
But in the meantime, battles are being fought in which legislation, rules, values and standards, but also freedom of expression, are being packaged in a populist, reductionist and generalist language, which only a resilient and media-educated society can deal with. Until these extremely important but long-term investments really bear fruit, rules, even at the risk of being seen as restrictions, must be put in place now.
Alina Tatarenko: Thank you very much. Thank you very much, Aneta. So we have heard from the regulators from Ukraine and Moldova, and we also work with the Western Balkans: our division has a big project there, and specifically we support Western Balkan regulators in regulating harmful content online. To speak about it, we invited today Amela Odobašić, who is the head of broadcasting at the Communication Regulatory Agency of Bosnia and Herzegovina. Please, Amela.
Amela Odobašić: Thank you very much, Alina, and good morning, still good morning from me. I will speak on behalf of the Communications Regulatory Agency, a converged regulatory authority in Bosnia and Herzegovina, and as Alina said, I will also touch upon the practices and experiences, and mainly challenges, that we, the regulatory authorities in the Western Balkans region, really face when it comes to tackling the topics of both disinformation and harmful content online. But before I make my introduction, I feel that I should really mention that, having heard our colleagues from Ukraine and Moldova, and as you all know, Bosnia and Herzegovina went through the war 30 years ago, my first impression was just like, okay, in a way, if you can be thankful for something, it is that, thank God, there were no online media and no social media then. And we really understand the hardship that our colleagues are experiencing, both as regulatory authorities as well as the public, the general public, especially of Ukraine. So, as you all know, the countries of the Western Balkans are enlargement countries. And what does it mean? It means that as a part of the accession process towards the EU, which is not an easy ride for the countries of the Western Balkans, and for the regulators is a very challenging ride, we are obliged to transpose the EU legislative framework into our national legislation. These processes are not without risks, given the very complex political, social and economic instability and fragile institutions in the region. However, we, the regulator in Bosnia and Herzegovina, as well as our colleagues in the Western Balkans, are not sitting silent and just observing what is happening. When we started tackling the topics of disinformation and harmful content online, our first task was really just to define, okay, so what is what? What falls under the regulatory competencies and what does not? So, naturally, being the regulator to whom the public files complaints, we started to receive so many complaints that were basically complaints about disinformation and not harmful content. And as such, of course, we do not have jurisdiction, we do not have a mandate to deal with disinformation. But does it mean that the regulatory authorities are just sitting back and observing all these really very disturbing events happening? No, we are not. For example, in Bosnia and Herzegovina, because we do have our department for media and information literacy, we are tackling disinformation through the activities that fall under the umbrella of media and information literacy. However, when it comes to harmful content online, the first initial reaction of regulatory authorities in the Western Balkans, in Bosnia and Herzegovina as well, is like, okay, do we really have competencies for online content, full stop? Because, you know, if you look at the legal framework in Bosnia and Herzegovina, the law on communication dates back to 2001. So how could you possibly have competencies when it comes to online content altogether? But then, on the other hand, there is the EU legislation.
And again, as a candidate country, of course, there is a set of directives, there is the Audiovisual Media Services Directive that we are obliged to transpose into our legal framework. I will not go into the details now, I will focus more on the challenges, but I hope, Alina, that we will come back to this issue, because actually the lack of, or should I say the inactivity of, policy makers in ensuring that legal frameworks are in place is really one of the biggest challenges in Bosnia and Herzegovina, as well as in some other countries, when it comes to tackling harmful online content. And then, yes, along the way we discovered that, well, yes, actually, we did transpose the Audiovisual Media Services Directive into our bylaws. Hence, the rule on video-sharing platforms has been adopted, and then all of a sudden we realized: yes, we do have the law that dates back to 2001, but we also have bylaws and we do have the rule on video-sharing platforms, which actually makes us competent to a certain extent for online media. I can go back to that in more detail, but speaking of challenges for the regulatory authorities in the Western Balkans, and in light of those challenges, we are very much grateful to the Council of Europe for the project that is really helping us to strengthen our capacities in order to be able to act as hands-on regulatory authorities; being able to respond to all these challenges when it comes to harmful online content is really hugely important to us. But apart from this challenge, which concerns the lack of a legal framework or the ineptness of our policy makers, there is also the question of how small countries, such as the countries in the Western Balkans, establish collaboration or communication with big online platforms. That’s a challenge to which we are still trying to find an answer, or a way out. Again, through the project of the Council of Europe, yes, there may be a way out, but still, it may be a small step: by strengthening regional collaboration and then making some progress in that area. And also, we mustn’t forget that the regulators in the Western Balkans, as well as some regulators in the European Union, face problems such as the issues of internal capacities and resources, both financial and human, needed to expand their capacities to respond adequately to all these events. I’ll stop here and I hope to go back in more detail later on.
Alina Tatarenko: Thank you very much, Amela. Yes, so, the lack of a legal framework is a common problem for many countries. The gap between countries which are covered in the EU by the DSA and non-EU member states is an issue that you mentioned. And, of course, lack of resources and capacity is also common for so many countries, for the Council of Europe member states. We are not only working with the Council of Europe member states; we are also working to support journalists in exile. And specifically, we have a project with the Belarusian journalists in exile. And I’m very happy to see that today we have on our panel Alina Koushyk, who is the director of Belsat TV, which is a Belarusian media outlet in exile. And they have a specific, very interesting and very particular set of issues that they would like to share with us today. Please, Alina.
Alina Koushyk: Thank you very much, Alina. Good morning, ladies and gentlemen. Good morning, dear colleagues. Let me greet you in the Belarusian language as well, because the issue of the Belarusian language is one of the most urgent, but let me come back to this a little bit later. Thank you for inviting me and making the voice of free Belarusian media heard here in the Council of Europe. Belarus is not a member of the Council of Europe, it’s not a member of the EU, but anyway, I will tell you how we operate, also in the EU, and how we can have influence on the situation in my country. Belarus is a country with almost 9 million people, de facto still independent, which stands as a buffer between aggressive Russia and European democracies. The territory of Belarus was also used to start the war in Ukraine, and we as free Belarusians, of course, do not support these actions. And the role of independent media in this abuse of Belarusian society is incredible. Today, up to 88% of Belarusian independent media outlets are closed in Belarus. Some of them are continuing their work in exile. We have 45 more or less titles which are covering Belarus from exile. And these independent Belarusian media, of course, including Belsat, are covering one third of the Belarusian population, which is quite a lot, taking into consideration the really difficult conditions in which we operate. Also, we have a really difficult situation because of the USAID and American turbulences, let’s say, that caused budget cuts of up to 60% in our media sector, which is really a lot, and which may cause a lot of losses in our media ecosystem. Moreover, Belarus was the most dangerous country in Europe for journalists until Russia’s full-scale invasion of Ukraine, according to information from Reporters Without Borders. Now, 30 Belarusian media workers are imprisoned, just for doing their job, just for telling the truth. I believe that here in our room we have more than 40 people; can you imagine that half of you would be imprisoned in my country at the moment? Eleven of my colleagues, just from Belsat, are behind bars at the moment. They received from two and a half up to eight years, just for doing their job. My name is Alina Koushyk and I am editor-in-chief of Belsat TV, the only independent Belarusian satellite television, the only television which speaks every day in the Belarusian language. We have been broadcasting from Warsaw for 17 years, every day in our national Belarusian language. And 17 years ago I had the honor to present the first news service on Belsat. I was a presenter. And now I am heading the entire channel. But independent Belarusian television is not about stars, it’s not about glamour, it’s not about headlines. On a daily basis, it’s very often about fear, resistance and extraordinary care and courage. Today, we as the independent Belarusian media in exile, both Belsat and other colleagues, are facing double threats: authoritarian repressions and algorithmic suppressions. Can you imagine that in my country people can receive even up to five or seven years in prison just for commenting, sharing or liking our content, Belsat content? Over 500 Belarusian sites were labeled as extremist. Me, myself, I’m a triple extremist in Belarus: first, for being a member of the Transitional United Cabinet, then for being a journalist of Belsat, and the third time as myself. 1,400 Belarusian websites were blocked in the country. That’s why we are using mirrors, we are using other platforms, trying to reach the audience with our free information.
Despite all these difficulties, over 75% of our audience is still in Belarus. Silently, anonymously, without subscribing, without commenting, they are just watching free information. One woman told me a story of how she watches Belsat. She goes to the bathroom, closes the door, watches the news, then erases the history, opens the door and goes back to her family. Most Belarusians consume independent media now after 8 p.m. Why? Because it’s too dangerous during the day. It is dangerous to open the site of Belsat or other media outlets at your job or even in public transport; you know, if you are scrolling the news, people are afraid that somebody like Vladimir can see that I’m reading Belsat. Moreover, in your workplace or even in the street, the militia or police, you know, can check your phone and see if you have any subscriptions to any so-called extremist media outlets. One of these is Belsat. And of course, if they find it, you can have some troubles. Moreover, we know cases where they take phones and make likes from your phone on Belsat or other Belarusian media outlets and say, you see, you’ve got likes here. And you cannot prove that they just did it themselves to, you know, frame you for using media in exile. That’s why, of course, we see that the audience of Belarusian media is a little bit lower than it was before. But still, we are covering one third of the country. And I believe that in these extremely difficult conditions, this is quite big. But what is important for us? We operate mainly through the social media platforms already mentioned here. And for video content, YouTube is the main platform. YouTube is the main platform for longer videos, and TikTok is the main platform for short videos. But what about YouTube? As Belsat, we operate 10 YouTube channels with around 1 million subscribers at the moment. This is a huge number. Please remember that Belarus has fewer than 9 million people at the moment. So many of the people who are watching us cannot subscribe, cannot comment, cannot share, just because of fear. And that makes it very difficult to build communities around our media. Moreover, while watching Belsat, sometimes interesting advertisements appear. For example, confession videos. What is a confession video? It sounds innocent, but what is it? We have more than 1,500 political prisoners, who are tortured every day behind bars. So, they make people say: sorry, I was wrong, I was supporting Tikhanovskaya, that was a mistake, Lukashenko is the only good president. So, these so-called confession videos are made through brutal violence behind bars. These videos are appearing on YouTube as advertisements. So, the regime is paying a big platform for this kind of content. Of course, when we see this kind of content, we are knocking on YouTube’s door and saying, please block it. And of course, sometimes they block it, sometimes not, but each case you can only handle by hand like this. Moreover, an important issue, which is very important personally for me: Belarusian-language content is de-prioritized by the algorithms of social media, especially on YouTube, even on the largest channels such as Belsat. Why? For example, shorts on our biggest channel, where we have almost half a million people, have less than 1,000 viewers. Why? Because algorithms are not supporting the Belarusian language. If you do shorts in Russian, you will have a million easily. But if you do it in Belarusian, you will have less than 100. What can we do?
For us, it is extremely important to keep broadcasting in the Belarusian language, to talk to our people in our own language. But the platforms are pushing us to make content in Russian. That is why some clever Belarusian media are trying to run separate channels doing the same job but in Russian. But why should we choose the Russian language? We do not want to. That is why I really want to call on digital platforms to stop penalizing the Belarusian language and to make their algorithms help us provide truly free information according to professional journalistic standards. For us, this is the foundation of an authentic, democratic, and resilient media space in our country, and moreover in our region. It is also critical to emphasize that quality, independent journalism, as Andrin actually said, is the foundation of any strategy against disinformation. Without well-supported media, we cannot counter disinformation, we cannot counter propaganda. That is why the Digital Services Act and the European Media Freedom Act matter not only for you, but also for exiled media and for European democracies. Belarusian media in exile also operate under European laws; exiled Belarusian media are registered in European countries. That is why it is absolutely important for us to be visible in these acts and these processes, and I really hope they can be helpful for us. And the last thing. I am a godmother of a Belarusian political prisoner, Ihar Alinevich. He is a Belarusian anarchist and also an author of books. He received 20 years in prison for standing up against the dictatorship. And once he said: whoever is silent is defeated. So let us not let Lukashenko and Putin silence the free voices of Belarusians. And let me finish with: glory to Ukraine, and long live Belarus.
Alina Tatarenko: Thank you. Thank you very much, Alina. We will try to discuss that in more detail later, because it is really important to stop de-prioritizing the Belarusian language and to do something about it, and platforms have a lot of power to make that happen. Next, we continue with Julie Posetti, who is a professor at City University of London and who has conducted very interesting research. She has a well-informed opinion on what we can do and how we can work with the platforms. Thank you.
Julie Posetti: So I co-led a study for UNESCO and the ITU called Balancing Act: Countering Digital Disinformation While Respecting Freedom of Expression, which was published in 2020 at the height of the pandemic. It recommended a range of regulatory, legislative and normative actions, some of which parallel what we are discussing here today. But in 2025, the threat to information integrity is so much worse than it was in 2020, and the need for proactive rather than reactive, and creative, responses to disinformation is even more urgent. The broligarchy, as the obscenely wealthy tech bros in power are now more frequently collectively referred to, is choking democracy. Journalists, fact-checkers, and other public interest information providers, whom I think of as the cleaners of the toxic information ecosystem and a protective force for democracy, human rights, and the rule of law, are retreating from visibility on big tech platforms to avoid abuse, harassment, and threats. Meanwhile, citizens increasingly do not know what to believe or whom to trust, with devastating consequences for truth, facts, and a shared reality, as Nobel laureate Maria Ressa forewarned. I wish more audience members, more citizens, would behave like your Belarusians in the bathroom after 8 p.m., but unfortunately they do not. I also led a global study for UNESCO on online violence against women journalists called The Chilling, and I just want to highlight three statistics from that research which continue to be meaningful and resonant. 73% of the around 1,000 women journalists we surveyed said they had experienced online violence in the course of their work. 37% of those said that political actors were the main perpetrators of online violence. And 41% of them said they were targeted in what they believed to be coordinated disinformation campaigns. According to our research, political actors, disinformation purveyors, and networked misogynists are the primary perpetrators of online violence against journalists and other public information producers, and activists and human rights defenders are among the targets. Women and minorities are both the most at risk and the most prolifically targeted in these online violence campaigns, and as I have said, the campaigns are often fueled by disinformation and hate speech. These attacks tend to be prioritized algorithmically due to high levels of engagement, in the same way that angry and divisive speech is prioritized, and the reason for that is ultimately profit. The attacks are designed to undercut trust in truth, facts, and fact-based analysis, imperiling democracy, the rule of law, and human rights norms in parallel. They are also designed to expose their targets to greater risk. And it is important to note, I think, that impunity for online violence aids and abets impunity for crimes against journalists and other human rights defenders. Also notable: big tech actors are the vectors or facilitators of these attacks, and in some cases big tech oligarchs have also proven themselves to be perpetrators. I now lead a project for ICFJ called Disarming Disinformation, which is studying counter-disinformation work in five countries in the context of democratic backsliding, looking at both editorial responses and audience responses to this problem.
And in parallel, I lead a project funded primarily by the UK government which is developing an AI-assisted online violence early warning system, designed to monitor online threats in real time and to help predict the escalation of online violence into offline harm, with a view to ultimately trying to prevent crimes against journalists and human rights defenders. The system is also designed to help news organisations, legal teams and civil society organisations document the attacks, to help them hold big tech actors and other perpetrators accountable. It is a human rights by design approach to responding to a critical threat, initially to the safety of journalists. But I really wish it had not been necessary to do this work, because the reason this effort has been outsourced to us is that the tech oligarchs failed to make safe products, in the interests of maximising already obscene profits, and the US then failed to effectively regulate them, with devastating consequences not just for democracy and genuine freedom of expression in the US, but globally, especially in places like Myanmar and Ukraine. And now, in a climate of American free speech absolutism, where freedom of expression rights have been rebranded as censorship, we see these terms being weaponised against journalists and human rights defenders in order to silence and endanger them. These big tech actors have been emboldened in this context to abandon and roll back their already limited trust and safety systems. In parallel, the Trump administration has cracked down on counter-disinformation work while also defunding so very many international programs that support public interest media, and Meta has abandoned fact-checking, in the US initially, but with a global cancellation foreshadowed. The result is that online violence perpetrators and disinformation purveyors can now act with impunity, while the risk management is increasingly outsourced to news organizations and civil society. But we struggle to simply access the data we need to effectively monitor and respond to these threats. And the threats are only escalating in the context of generative AI tools, as we have heard, which supercharge the speed of production and distribution of abusive and disinformational content, as well as hate speech, all of which is more believable as a result of these tools. Meanwhile, we risk legal action from big tech actors if we try to work around the obfuscation to access the necessary publicly available data, which is at the source of attacks on journalists, fact-checkers and human rights defenders. And if we can get access, it is incredibly expensive to fund that access; that is a really important point to highlight. So the sustainability and security of journalism and democracy are intertwined, and both depend on the integrity of information, which is under unprecedented attack. Based on all of the research we have done over the past decade or more, we have concluded that the time for self-regulation has passed as it applies to big tech actors. In our view, it would also be naive to assume that the tech oligarchs will meaningfully participate in co-regulation. So we are calling on Europe to hold the line against the broligarchy. We need European legislators and regulators to double down on efforts to make big tech responsible, accountable, and transparent, and that needs to happen through legal obligation, litigation, and punitive action.
This approach needs to be collaborative, creative, and proactive rather than reactive, while of course respecting global standards and international human rights law with regard to freedom of expression. I'll leave it there.
Alina Tatarenko: Wow. Thank you, Julie. So, you are saying that the platforms have so far failed to self-regulate, and that we need collective action to counter the disinformation threat; for that we need legislators, governments, regulators, and efforts from international organizations. I would actually like to take advantage of the presence of my colleague here, who is the head of the division for the execution of judgments of the European Court of Human Rights. Maybe he can comment on this from the judicial perspective and tell us what the European Court's case law says about it and what can be done from the judicial point of view. We know of cases where judges have ruled on these issues.
Pavlo Pushkar: Thank you. Thank you so much, Alina. Thank you so much for the invitation to be here and to speak about disinformation and propaganda. I will be rather brief, as the Court's case law is also rather brief on the subject of disinformation and propaganda. In my work at the department for the execution of judgments we also have cases which largely relate to instances of, let's put it this way, disguised propaganda that is not accepted as a valid reason for interference with freedom of expression. These relate to a number of different instances of alleged propaganda, such as terrorist-related or separatist propaganda in favor of prohibited organizations, or the so-called propaganda of homosexuality, where states use such reasons to counteract expression in a disproportionate or unlawful manner. But most importantly, there are several cases I would like to refer to. More generally, under the Convention on Human Rights itself, as interpreted by the Strasbourg Court in its case law, states have a wide margin of appreciation in matters of combating disinformation and propaganda, based on the legitimate and well-justified needs to protect the interests of national security, territorial integrity and public safety, to prevent disorder or crime, and to protect health and morals, as happened in several cases relating to COVID as well. However, interference with propaganda and disinformation disguised as protected expression has to be based on relevant and sufficient reasons and pursue legitimate aims, which need to be invoked by the authorities. At the other end of the spectrum of these discussions on disinformation and propaganda, we see strong arguments which firmly suggest that propaganda and disinformation are not views or value judgments: they do not constitute forms of protected expression covered by the requirements of freedom of expression under the European Convention on Human Rights. And indeed, while combating disinformation and propaganda is a valid objective, we find, as I mentioned, instances in the case law of the Court where attempts to qualify certain acts as propaganda had no reasonable justification or lawful basis, or were disproportionate. This leads us to discussions, under the Convention and the Court's case law, of the proportionality of sanctions imposed in certain cases for spreading disinformation and propaganda, which in some instances is still seen as a form of expression covered by the protection of Article 10 of the Convention. But then again, the main idea mentioned continuously in the case law of the Court is that the aim of interference and sanctions should not be to totally discourage open debate on matters of public concern, but rather to take robust measures to protect freedom of expression and the public discussion space, whether offline or online, from the harmful influences of disinformation and propaganda. Two cases which I think would be rather interesting for you to look into: one is the interesting case of Kirkorov v. Lithuania, about the prohibition on the musician entering the territory of Lithuania because he was involved in spreading Russian disinformation and propaganda.
And similarly, the harmful and tragic effects of propaganda are recognized in another case concerning Ukraine, discussed in the recent judgment of the Court in Vyacheslavova and Others v. Ukraine, concerning the events at Kulikovo Pole in Odessa, tragic events which led to a number of deaths. So this is, quite briefly, what I wanted to say from the point of view of the case law of the Court. As regards the enforcement of judgments of the Court, fortunately we have not had judgments of that kind, because the Court takes a rather robust stance on the issues of disinformation and propaganda, and such instances do not come to the attention of the Committee of Ministers for exactly that reason.
Alina Tatarenko: Thank you very much, Pavlo, very interesting. So we see really good representation here from different actors. I would like to open the floor. We do not have much time left, but we can go a little over time. Do we have questions from online? Shall we take all the questions and comments first, and then answer?
Moderator: This way we will try to give the floor to as many people as possible. The first question is from Siva Subraminna: in the process of dealing with harmful content, some measures had to be taken to a seemingly excessive degree in times of war. Are these measures designed to be reversible at a later date, to restore liberties, or are they set in stone?
Alina Tatarenko: Okay, can we go on? Can you maybe read all of the questions? Does anyone want to address the harmful content question? Okay, so while you are thinking about how to address this question, we will go on and take more questions, please.
Mykyta Poturaiev: Okay, so this will be more of a remark than a question. My name is Mykyta Poturaiev, I am the head of the Ukrainian Parliament Committee for Humanitarian and Information Policy. So, colleagues, do we understand where it is all happening? I think yes: on social platforms. Do we have any protection from what is happening there? No. Does the DSA work for now? No. Does the EMFA work for now? No. Will they work for non-member states? No. So do social platforms care about the DSA and EMFA? No, because they are not in the jurisdiction of the European Union, and they will not care. Does any one of us have the possibility to protect our good names if our reputations are attacked on social platforms? The answer is no, in no court; we have nowhere to go to protect our names. Can we protect our children from bullying on social platforms? No. Well, the answer for Ukraine is 100% no, Valentyn will confirm. Can we protect women from hate speech? No. Can we protect sexual minorities from hate speech? No. I do not know about all European countries, but I know for Ukraine: no. In traditional media, yes. I am one of the key authors of the new Ukrainian law on media, so yes, we regulated everything for traditional media; we really have good articles protecting women, children, sexual minorities, all groups. Does it work for social platforms? No. Will it work for social platforms? No. Okay, media literacy. It is a good idea in itself, yes, and I know a couple of countries which are champions, like the Nordic and Baltic countries. When did they start? The answer is 15 to 20 years ago. Do we have this time in other countries which did not care about it? The answer is no, we do not have this time. Do we have the answer what to do? No. Is fact-checking, which is of course also very important, working to protect us? The answer is no. Do you know why? Because according to all Ukrainian and European sociological surveys, ordinary people do not care about fact-checking. They either trust a given media outlet or they do not. Unfortunately, the inconvenient truth is that they mostly trust anonymous and other accounts on social platforms, not traditional media. Traditional media are losing their audiences in every country: in Ukraine, in every European country, everywhere. Okay, so do we have answers at the political or governmental level? I am not sure. I am in communication with my colleagues from the European Parliament, and I am also vice president of OECPA, so I am in communication. We do not have an answer. Why? Because we are all afraid. Because if we are going to make social platforms accountable, what will happen in Washington? Maybe someone will wake up and write on Twitter: hey, these people in Europe are against freedom of speech, so I will apply 50% taxes against them, 100% taxes against them. What will our governments do in such a moment? What is more valuable, taxes or what? Freedom of expression, freedom of speech, freedom of media, accountability? Well, let's calculate. And let me finalize with the results. And the results are the following: maybe this will help an ultra-right candidate win the Romanian elections, maybe it will help an ultra-right candidate win the presidential election in Poland, maybe it will help a pro-Russian revanche in Moldova, and put the ultra-right in first place in Germany and in first place in France.
So, while we keep discussing, while we have no answers and no practical decisions, we are losing this battle, and we are very, very close to losing this war, the informational war. And then, well, let's answer my question: what are we going to do in this new horrible world? Thank you.
Alina Tatarenko: Thank you very much. It is, of course, a big question and a very optimistic intervention. Thank you. You have immediately sparked a couple of reactions online. I think we will just give the floor very briefly to Ofcom from the UK, then RNW Media, and then there is a participant at the back. Thank you.
Jordan Ogg: Thank you very much for giving me the opportunity to make a short comment. Yes, Ofcom is the UK's independent communications regulator. Reflecting on comments made by earlier discussants about the importance of strengthening the foundations of the information ecosystem, and that quality information is the best antidote to some of the challenges we have heard about today, I wanted to mention that Ofcom is currently conducting a review into public service media and public service broadcasting in the UK, partly to share some interim findings. One of those findings is that the huge increase in consumption of news online, while delivering a range of benefits to users, including greater choice and personalization, is also raising huge challenges in relation to how people discover, consume, and are able to judge high-quality and accurate news, such as that provided by public service broadcasters, at least in the UK, but of course in other parts of the world as well. We also know that audiences are at a much greater risk of exposure to misinformation and disinformation when consuming news online, and we think public service broadcasters have a really important role to play in countering those effects. So I wanted to raise the fact that we will be publishing policy recommendations aimed at supporting public service broadcasting, which will be available in the summer, and also to put a question to anyone in the room who would like to answer it: how important do they think public service broadcasting can be in this context, and how can it be supported? I'll stop there. Thank you very much.
Alina Tatarenko: Thank you. Yes, public service broadcasting is absolutely important, and we also have a Council of Europe recommendation on public service broadcasting, which is instrumental in countering disinformation. Please, RNW Media.
Giovana Fleck: Hi, everyone. I hope you can hear me. My name is Giovanna, and I represent a Dutch organization called RNW Media, and through RNW Media, dozens of journalists and media workers worldwide. I want to make a few remarks based on what was said here, and also to emphasize some of the issues related to disinformation that reach a global scale beyond European borders, and how that is also relevant for the discussions within the EU. One thing said early on in this discussion was a remark on fragmentation and how the fragmentation of platforms and the internet correlates with disinformation being weaponized. I think we have seen the results of that in the form of information disorder on a global scale, especially through the height of the COVID pandemic, but we also continue to see it in transnational narratives and in key issues related to delegitimizing democracy and human rights worldwide. And this is not exclusive to one specific country; these are global effects that take place inside information ecosystems shaped to elevate harms instead of positive or trustworthy information. If we are thinking about protecting journalistic voices, and about using journalism as a base to counter those issues, we also need to think about the sustainability of journalism itself; as one of the colleagues here said, that costs a lot of money, resources, and time. Most of all, we need to be aware of specific trends targeting journalists. It was also said here that female and minority journalists in particular are constantly under attack, and that is absolutely true: the number of online attacks on female journalists is disproportionate in comparison to other colleagues. A lot of that also happens because their jurisdictions, also in Europe, do not protect those claims. Women who are attacked and doxxed online while doing their jobs as journalists go to the local police to seek help, and they are often not helped. That also relates to SLAPPs, lawsuits that specifically target journalists and try to limit their efforts and resources when reporting, and it relates to coordinated inauthentic harm as well. So, to conclude my intervention, and to call attention to this myriad of difficulties in the ecosystem: I think we also need to think about a question of agency for journalism and journalists, allowing them to put their work on a sustainable, future-oriented footing, but also a question of agency for civil society, taking them as participants in this information ecosystem, not only as a part of society that reacts to everything that is happening. That is related to literacy and to ways of building a healthier information ecosystem, and we cannot shy away from those initiatives either. Thank you very much.
Alina Tatarenko: Thank you very much. Agency for journalism and civil society. Of course, we all agree with that. Please, there was a question there at the back.
Marilia Maciel: Thank you very much. Good morning, everyone. My name is Marilia Maciel. I am Director of Digital Trade and Economic Security at Diplo Foundation, but I speak in my personal capacity. I would like to focus a little more on a subset of the problem we are discussing here, which is a growing disinformation industry motivated by financial gain. This industry has become global, and it is skillfully taking advantage of platforms' business models, but I think it is a problem of its own. With the support of GIZ, we have conducted research on misinformation, trying to identify lessons learned in a number of countries, and in this exercise we also came across researchers and investigative journalists who have identified companies, for example, based in Spain, selling disinformation services to Latin America, or companies based in the UK selling services to South Africa, very clearly promising to change election results, postpone elections, cause confusion, and so on. So, looking beyond platforms, which are a very important aspect but not the only one: how do you think it is possible to cut the financial resources of the disinformation industry? Do you think there is space for international cooperation, perhaps cooperation with law enforcement? And in your view, is this something on which we could collaborate with platforms, to the extent that there seems to be a threshold beyond which information disorder is also detrimental to platforms themselves, as the example of Parler shows? Thank you.
Alina Tatarenko: Thank you very much. Very interesting question: can law enforcement agencies be used to cut the resources of the disinformation industry? We will take maybe one more. There was one more here, one more there. Okay, and in the meantime, panelists can perhaps think about answering the questions, please.
Luljeta Aliu: Hello. My name is Luljeta Aliu. I am a member of the Independent Media Commission in Kosovo. Thank you for having me here. I wanted to thank Ms. Amela Odobašić for her presentation. She pointed out a lot of the challenges we are facing right now ourselves in Kosovo. Just a few days ago, a new law on the Independent Media Commission was repealed by the Constitutional Court, after being submitted to the Constitutional Court by media rights groups and media representatives. You pointed out the challenges, like asking ourselves whether we have the right to regulate and so on. I was wondering, have you also experienced the challenge that media rights groups or civil society groups are sometimes used as an instrument to oppose regulation, mostly by calling it politically motivated censorship? This is what we are going through right now, and it is a really difficult position, being between two fires: wanting to regulate for the people and for the citizens, and on the other hand having the same NGOs sometimes used, abused, to oppose these regulations. And a further question: do you think there could also be Russian influence there? Do you have any cases in this direction? Thank you so much for your presentation, it was really good, thank you.
Alina Tatarenko: Thank you. We will give Amela two minutes to think about it. And there was a question here somewhere on this side.
Oksana Prykhodko: Thank you very much. Oksana Prykhodko, I am from Ukraine, from an international non-governmental organization, European Media Platform. American institutions played a very important role in counteracting disinformation. Now the new American administration has closed a lot of projects. I understand that the Council of Europe and the European Union lack the money to replace all such projects. Can we discuss any other, asymmetrical responses? Thank you very much. (Alina Tatarenko: Sorry, one more time, asymmetrical responses to?) To the closed American projects.
Alina Tatarenko: Okay, I take note of that. If you have time, we can stay for a couple more minutes; if someone needs to go, you can go. Those who want to stay are welcome to stay, because it is very interesting and we can continue. Please, there was another question there.
Giacomo Mazzone: Yes, Giacomo Mazzone, member of EDMO, the European Digital Media Observatory. I have two questions. The first is exactly about the observatory. As EDMO, and Andrin knows that we have worked on this together, we have put into practice a lot of activities exactly in this field, and we have a successful example in what happened during the European elections last year. That was a process that went quite smoothly, and there was room for cooperation with the platforms, which were another world at that time, one year ago. I think we need to use this model and try to replicate it elsewhere. The second aspect: I remember that the Council of Europe some years ago opened a dialogue with the platforms based on goodwill, let's say. Is there any sign in this platform dialogue that the changes we are seeing in the US are going to be replicated in Europe as well?
Alina Tatarenko: Okay, thank you very much. Yes, on the Council of Europe's dialogue with digital platforms, we do have some people here who are involved in it, and maybe we can talk about it later. But in the meantime, we have a reaction from Ips.
Moderator: Yes, also from myself, as part of the EFD, representing the young voices in Europe on internet governance. I would like to ask a few questions. We often criticize disinformation and propaganda by saying that it only comes from Russia or China. But are we sufficiently aware and critical of disinformation that originates from within our own countries or from our closest allies? Isolating ourselves from different narratives undermines our ability to access information and influences our own decisions, and this is censorship. Or do we want to say that the US and Europe do not generate propaganda and misinformation? If we had voices here from countries where we and our allies have financed foreign interventions, they would not agree. Maybe they would mention how the West has justified multiple military interventions with misleading claims about weapons of mass destruction, or by claiming that the West would bring them democracy, only to leave their countries completely torn apart. How can we ensure our efforts to combat disinformation remain balanced, credible, and fair, by addressing misinformation and propaganda irrespective of its source? Because the narrative I am listening to here is simply a push for censorship. Should we not be more self-critical? And I also have a question for Alina and Raul. Why do you think a country would allow a foreign entity to finance media on its own territory that pushes a narrative threatening its national sovereignty, through USAID, for instance? That is far from independent media. At the same time, while you are asking for free information, you are also requesting that content that goes against your narratives be banned from YouTube. Is that not hypocritical?
Alina Tatarenko: Thank you. Thank you. Very interesting. I take note; we will try to address it later. I was told that we have a request from a representative online. Hakim, can you raise your hand, please? I understood that you wanted to speak, if you are listening online. Does anyone else want to say something or ask a question? I'll take one more, then we will go back to the panel to respond, and then we will close, I promise, please.
Oleksandr Shevchuk: Oleksandr Shevchuk, Institute of International Relations, Ukraine. What do you see as the instruments and mechanisms for fighting Russian propaganda in Ukraine, especially historical propaganda, and how do you assess the effectiveness of such actions by European countries and the Council of Europe?
Alina Tatarenko: Okay, thank you very much. I will now ask the panel to try to answer at least some of the questions and comments. Maybe we can start with Amela, because one was addressed directly to her, and then we will go around, and Julie.
Amela Odobašić: Thank you, Alina, and my colleague came back just in time. So, to go back to your question: you asked me whether media organizations or representatives of civil society opposed the legal solutions that we put in place. In a nutshell, no, not even in the media industry. There is a whole story behind how to implement the rule we have in place, the one on video-sharing platforms, because in Bosnia and Herzegovina we do not have such platforms registered, but we can discuss that outside of this meeting. Then, if you allow me, perhaps I can wrap up and give additional information as a response. When we were preparing for this panel, Julie put it brilliantly when she said: oh, so you are experimenting with co-regulation. And that is exactly what we, the regulators, are doing. In Bosnia and Herzegovina we were very lucky that the Council of Europe produced a study for us as the regulator, on mapping the stakeholders in Bosnia and Herzegovina towards regulating harmful content online. I think it is the first study of that kind produced in the countries of the Western Balkans region. It is an excellent study, it is available, and I can also send it to you. With that study as step one, this is exactly what we are doing: we are experimenting with co-regulation. At least we as the regulator have very much adopted this co-regulatory approach when it comes to harmful content online, which does not mean that we as the regulator should have all the power; we simply cannot. It is basically about developing a network or platform of all the stakeholders who should have their say, in line with their competencies. It is going to be a very interesting process. We have already started talking to the government and bringing the idea to them, because it is their business; it is a national topic, not something that the regulatory authority or some other state institution should do alone. It is the government that should lead that process. So we will let you know how the process is going, because it is going to be a process, but I think we are on the right track. And allow me just another ten seconds. Look, it is easy for all of us to say no to everything that is happening and to conclude that we are somehow unable to respond to whatever is happening in the online world. In that case, we should simply give up our jobs and stop being paid for what we are doing. Yes, the situation is quite serious, but I think we should all really join together. Whether we are going to manage to beat the platforms or the big tech companies, we do not know, but we cannot give up; the only way is to keep pushing. And for us, developing countries, it is even more challenging. But if we are willing to go that extra mile, then the members of the European Union should really serve as role models for us. Because yes, it is true that the DSA and the DMA are not going to solve all these problems, such as hate speech in the online space, but yes, they are going to greatly benefit our national regulatory and legal frameworks in combating these phenomena. And I will stop here; this can be my closing remark as well, Alina, if you allow me.
Alina Tatarenko: Thank you very much. I agree with you. I think it is basically like crime: it will always exist. There will always be crime, there will always be problems, but we can do things to mitigate the damage, to help prevent, to reduce. We will not eliminate it completely; we will not eliminate any problem completely. Whatever we can do, let's do it. And as you know, the Council of Europe provided an opinion on the IMC law, and maybe we can discuss that in more detail later. Julie wanted to address some questions.
Julie Posetti: Thanks. Yeah, I'll just try to pull together some threads from a variety of questions. Somebody talked about the emphasis on Russian and Chinese disinformation in this discussion. I think I and others have observed that the rise of US-led disinformation is a legitimate threat, especially in the context of the weaponization of counter-disinformation work by the US administration, with the defunding of any research programs on disinformation that involve any kind of state-related funding, for example. Some have argued that the disinformation networks seeded by the US are among the most dangerous we are currently dealing with. I think that is both shocking to contemplate and something that requires deeper investigation and close monitoring, because if all of our efforts are focused on the obvious geopolitical actors with a known history in this space, we risk overlooking, or failing to respond to, the emerging threats. It is also true that these are transnational threats. They may have been born in the US, and I am speaking now not about US-generated disinformation or misinformation, but about the function of US-owned platforms and big tech companies. We have barely touched on AI as an industry; we will do that tomorrow and through the rest of the day, and we can come back to it. But these threats are also monetized, so the economy of disinformation needs to be better understood, and perhaps there are some creative options there in terms of legal action where disinformation has been monetized. I agree with what colleagues have said about the sense of despondency, which I share to an extent with our Ukrainian colleague, but also with an insistence, built on optimism and the ongoing belief in defending human rights, the rule of law and democracy, that we cannot give up the fight. I know it seems absurd for an Australian-British woman to sound like I am lecturing Ukrainians who are literally holding the line physically, but collectively, I think it is vital that we maintain the fight with eyes wide open, especially to the information operations that are funded and run by the big tech companies themselves to resist any form of regulation, and to try to shut down these sorts of critical conversations as though they represent acts of censorship. That goes to the point our young colleague made, with which I respectfully but vehemently disagree. What we are talking about here is not censorship. It is about our role and our responsibility to defend human rights, the rule of law and democracy. That is why the Council of Europe exists. It is also what the UN seeks to defend in its efforts on freedom of expression, press freedom, and the safety of journalists and human rights defenders, with reference to the Universal Declaration of Human Rights and especially Article 19, which does not give anyone on the internet the right to abuse, harass and threaten people who are trying to exercise their democratically rooted rights to engage publicly in democratic deliberation, driving them out through fear, genuine fear, especially if you are a woman or a minority, or if you are trying to speak truth to power in a country which has forced you into exile, like our colleague here from Belarus. And I would argue, to our colleague from the UK, that public interest media are increasingly important, especially public media and public broadcasting, which must be reinforced with appropriate funding.
And to the UK, I speak directly to what I understand is under consideration: massive budget cuts to the BBC World Service, which could not come at a worse time, when we see the Voice of America and RFE/RL being defunded and having to fight for their very existence. The funding for those services, again, is not about imposing some foreign perspective on an individual country; it is about upholding democracy, human rights, and the rule of law. Those are not partisan pursuits; they are fundamental and foundational. I'll end there.
Alina Tatarenko: Thank you, Julie. Andrin? Valentyn? Valentyn, please.
Valentyn Koval: I just wanted to answer maybe most of the questions raised here. I think, and I do believe, that we should stop looking for money to combat disinformation as fake news, because we will always be secondary, and we will disseminate the disinformation over and over again while trying to debunk it. Instead, we need to look for money to create new information. We should understand that the information space is not chaotic; it is like a pipe that people use to get new information, and we should fill this pipe with truthful, new, reliable, and verified information. We should simply replace fake news with real information. This is the only way to change the game, because the money of those who make disinformation, whether we take it as a product, as fake news, or as a process of creation, dissemination, reception, and so on, will always be there, backed by the oil, gas, and everything else that Russia and others have in large amounts. So we need to replace disinformation with real information, just as Belsat is doing, because they are, as I understand it, the only way for Belarusian people to get the real truth about the Belarusian situation. And we should look for money for this: not for combating disinformation as such, but for filling this information pipe with reliable information. Thank you.
Alina Tatarenko: Thank you very much, Valentyn, and I am terribly sorry, but I was just told that we really have to stop. I apologize, but let's continue our discussions outside. We will all be here for the whole two or three days, so please come up to us and let's talk in the corridor, over coffee. Thank you. Thank you.
Andrin Eichin
Speech speed
142 words per minute
Speech length
1063 words
Speech time
446 seconds
Three-pillar approach: fact-checking, platform design solutions, and user empowerment
Explanation
The Council of Europe guidance note recommends a comprehensive approach with three main areas: supporting fact-checking organizations with financial independence and transparency, implementing human rights by design principles in platform architecture, and providing users with tools to control content and verify sources.
Evidence
The guidance note was developed by an expert committee and adopted by the Steering Committee on Media and Information Society in December 2023. Specific recommendations include conducting human rights impact assessments, making moderation practices transparent, and comprehensive digital literacy efforts for all age groups.
Major discussion point
Council of Europe Standards and Guidance on Countering Disinformation
Disagreed with
– Valentyn Koval
Disagreed on
Strategy focus – reactive debunking vs. proactive information creation
Need for systemic solutions that strengthen information ecosystem resilience rather than just removing bad content
Explanation
The expert committee believes that countering disinformation complexity demands policies that go beyond identifying and removing problematic content or bad actors. Instead, comprehensive systemic solutions are needed to strengthen the overall integrity and resilience of the information ecosystem.
Evidence
The guidance note highlights that empirical data on disinformation’s actual reach and impact is still limited, and the reality is often more nuanced than perceived. Examples include clear evidence of Russian interference in Ukraine and COVID-19 misinformation, but people’s fear of misinformation often generates as much polarization as the content itself.
Major discussion point
Council of Europe Standards and Guidance on Countering Disinformation
Quality information is the most effective long-term antidote for disinformation
Explanation
The guidance emphasizes that the most important task is creating structural conditions to ensure a steady and reliable supply of quality information by recognized trustworthy sources. This involves investing in reliable independent journalism and building healthy media ecosystems.
Evidence
The guidance note stresses this is not a new task but something that has been at the forefront of the Council of Europe for years. It’s described as easy to forget when confronted with external disinformation, but remains fundamental.
Major discussion point
Council of Europe Standards and Guidance on Countering Disinformation
Agreed with
– Valentyn Koval
– Julie Posetti
– Alina Koushyk
Agreed on
Quality journalism and independent media are fundamental to countering disinformation
Media literacy efforts must be available for all age groups to build critical thinking
Explanation
The guidance note emphasizes the need for comprehensive digital literacy efforts that are accessible to all age groups, not just children or young people. This comprehensive approach is necessary to build critical thinking and resilience against disinformation across all of society.
Evidence
The guidance notes that digital literacy is often cited but very rarely put in place effectively, and current tools are sparse with opaque implementation that often depends on platform providers’ goodwill.
Major discussion point
Media Literacy and Public Service Broadcasting
Pavlo Pushkar
Speech speed
136 words per minute
Speech length
703 words
Speech time
309 seconds
States have wide margin of appreciation in combating disinformation but interference must be proportionate
Explanation
Under the European Convention on Human Rights, states have significant discretion in matters of combating disinformation and propaganda based on legitimate needs to protect national security, territorial integrity, public safety, and prevent disorder. However, any interference must be based on relevant and sufficient reasons and be proportionate.
Evidence
The European Court of Human Rights case law includes cases like Kirkorov v. Lithuania (about prohibiting a musician from entering Lithuania for spreading Russian disinformation) and Vyacheslavova and Others v. Ukraine (concerning the tragic events at Kulikovo Pole in Odessa). The Court takes a robust stance on disinformation and propaganda issues.
Major discussion point
Council of Europe Standards and Guidance on Countering Disinformation
Valentyn Koval
Speech speed
118 words per minute
Speech length
847 words
Speech time
429 seconds
Ukraine lacks institutional media heritage and faces reactive responses to disinformation
Explanation
Unlike long-standing European democracies that rely on independent and fair traditional media, Ukraine lacks this institutional media heritage and its journey towards stable democratic media environment is still in progress. Most responses to disinformation are reactive, with fact-checking coming only after fake news spreads.
Evidence
Ukraine has been criticized for taking undemocratic steps like banning Russian and pro-Russian television channels, but these moves were essential for defending media pluralism. Ukraine’s diverse media voices twice helped resist Russian attempts to assert political control, leading to Russia’s full-scale invasion as a last-ditch effort.
Major discussion point
Russian Disinformation and Propaganda Threats
Agreed with
– Aneta Gonta
– Alina Koushyk
– Alina Tatarenko
Agreed on
Russian disinformation represents a systematic and coordinated threat
Social media platforms are structurally incapable of supporting truthful narratives during crises
Explanation
A 2024 study showed that social media platforms often suppress war content citing global rules and fail to act against bots and AI-generated harmful content. Their core design incentivizes broad engagement often from less critical users, making them breeding grounds for disinformation.
Evidence
The study titled ‘Guide for Risk Management in the Context of Emergencies, Armed Conflicts, and Crisis’ was conducted by International Media Support and Internews Ukraine in partnership with UNESCO and with support from Japan. It analyzes risks of spreading truthful content during conflicts and proposes recommendations for reducing platform-related threats.
Major discussion point
Platform Regulation and Accountability Challenges
Need to replace disinformation with reliable information rather than just debunking fake news
Explanation
Instead of looking for money to combat disinformation as fake news (which makes us always secondary and spreads disinformation more), we should focus on creating and disseminating new, truthful, reliable, and checked information. The strategy should be to fill the information space with verified truthful content.
Evidence
The amount of information consumed by individuals is limited due to time constraints and attention spans. This space is contested between professional media and unmoderated platforms. Ukraine’s National Council produces weekly programming to debunk Russian disinformation narratives in English to help international colleagues understand Russian propaganda.
Major discussion point
Russian Disinformation and Propaganda Threats
Agreed with
– Andrin Eichin
– Julie Posetti
– Alina Koushyk
Agreed on
Quality journalism and independent media are fundamental to countering disinformation
Disagreed with
– Andrin Eichin
Disagreed on
Strategy focus – reactive debunking vs. proactive information creation
Aneta Gonta
Speech speed
125 words per minute
Speech length
617 words
Speech time
295 seconds
Moldova is most targeted country in region with Russia investing over 200 million euros in disinformation campaigns
Explanation
According to information observatory data, Moldova is currently the most targeted country in the region by Russian disinformation campaigns, experiencing more than 50 times the average of harmful propaganda compared to Western Europe. Russia has invested over 200 million euros (more than 1% of Moldova’s GDP) in online disinformation and propaganda campaigns.
Evidence
In April 2025, journalistic investigations revealed two extremely powerful Russian propaganda networks, named Matryoshka and Pravda, targeting Moldova and its president Maia Sandu. Moldova held presidential elections and an EU accession referendum in fall 2024, and has parliamentary elections in September 2025, all under conditions of deep Russian interference.
Major discussion point
Russian Disinformation and Propaganda Threats
Agreed with
– Valentyn Koval
– Alina Koushyk
– Alina Tatarenko
Agreed on
Russian disinformation represents a systematic and coordinated threat
Disagreed with
– Moderator
– Multiple speakers including Aneta Gonta, Valentyn Koval
Disagreed on
Western approach to disinformation – balanced vs. geopolitically focused
Russian disinformation networks like Matryoshka and Pravda specifically target Moldova and its president
Explanation
These networks use various tactics including paid influencers to promote Kremlin narratives, organizing priests’ visits to Jerusalem with instructions to deliver pro-Russian votes, inviting teachers to Moscow through sanctioned oligarch NGOs, and using Moldovan vloggers who distribute Kremlin narratives.
Evidence
The tentacles of Russian propaganda include influencers paid to comment on or promote Kremlin narratives, priests organized to visit Jerusalem and asked to deliver pro-Russian votes, teachers invited to Moscow by an NGO of an internationally sanctioned Moldovan oligarch, and Moldovan vloggers who ardently supported the then-unknown Călin Georgescu's candidacy in Romania's presidential elections.
Major discussion point
Russian Disinformation and Propaganda Threats
Amela Odobašić
Speech speed
139 words per minute
Speech length
1630 words
Speech time
700 seconds
Small countries struggle to establish communication with big online platforms
Explanation
Western Balkan countries face significant challenges in establishing collaboration or communication with major online platforms due to their small size and limited market influence. This creates a substantial obstacle for effective regulation of harmful online content.
Evidence
Through the Council of Europe project, there may be solutions through strengthening regional collaboration, but this remains a challenge that regulators are still trying to address. The gap between countries covered by EU DSA and non-EU member states creates additional difficulties.
Major discussion point
Platform Regulation and Accountability Challenges
Lack of adequate legal frameworks is major challenge for Western Balkan regulators
Explanation
The biggest challenge for regulators is the inactivity of policymakers to ensure that legal frameworks are in place. Many laws are outdated – for example, Bosnia and Herzegovina’s communication law dates back to 2001, making it difficult to address online content regulation.
Evidence
Bosnia and Herzegovina discovered they had competencies for online media through bylaws and rules for video sharing platforms that transposed the EU Audiovisual Media Services Directive, despite having a main law from 2001. The lack of legal framework or ineptness of policymakers is identified as one of the biggest challenges.
Major discussion point
Legal Framework and Regulatory Gaps
Need for co-regulatory approach involving all stakeholders rather than just regulatory authorities
Explanation
Regulators are experimenting with co-regulation, which means developing a network or platform of all stakeholders who should have input according to their competencies. This approach recognizes that regulatory authorities cannot and should not have all the power to address harmful online content alone.
Evidence
Bosnia and Herzegovina has adopted a co-regulatory approach and produced a study mapping stakeholders towards regulating harmful content online – the first study of its kind in the Western Balkans region. They are working with the government to lead this process as a national topic rather than leaving it to individual state institutions.
Major discussion point
Legal Framework and Regulatory Gaps
Agreed with
– Julie Posetti
– Mykyta Poturaiev
Agreed on
Platform self-regulation has failed and stronger regulatory measures are needed
Disagreed with
– Mykyta Poturaiev
Disagreed on
Effectiveness of current regulatory frameworks
Alina Koushyk
Speech speed
121 words per minute
Speech length
1497 words
Speech time
742 seconds
88% of Belarusian independent media outlets are closed, with 45 titles operating from exile
Explanation
The vast majority of independent Belarusian media outlets have been shut down in Belarus, with only some continuing their work from exile. Despite these difficult conditions, independent Belarusian media including Belsat cover about one-third of the Belarusian population.
Evidence
30 Belarusian media workers are imprisoned just for doing their job, receiving sentences from 2.5 to 8 years. Eleven colleagues from Belsat alone are behind bars. Over 500 Belarusian sites were labeled as extremist, and 1,400 Belarusian websites were blocked in the country. Belarus was the most dangerous country in Europe for journalists until Russia's full-scale invasion of Ukraine, according to Reporters Without Borders.
Major discussion point
Media in Exile and Language Discrimination
Agreed with
– Valentyn Koval
– Aneta Gonta
– Alina Tatarenko
Agreed on
Russian disinformation represents a systematic and coordinated threat
Belarusian language content is de-prioritized by algorithms, forcing media to choose Russian for better reach
Explanation
Social media algorithms, especially on YouTube, systematically de-prioritize Belarusian language content. Shorts on Belsat’s biggest channel with almost half a million subscribers get less than 1,000 viewers, while Russian content easily gets millions of views.
Evidence
If you create shorts in Russian, you can easily get a million views, but if you do it in Belarusian, you get less than 100 views. Some Belarusian media are forced to create separate channels in Russian to reach audiences, but this undermines the goal of preserving Belarusian language and culture.
Major discussion point
Media in Exile and Language Discrimination
People consume independent media secretly after 8 PM due to fear of persecution
Explanation
Belarusians face severe risks for consuming independent media: people can receive 5 to 7 years in prison simply for commenting on, sharing, or liking Belsat content. Most consumption therefore happens secretly in the evening hours, when it is safer.
Evidence
One woman described watching Belsat by going to the bathroom, closing the door, watching the news, then erasing her viewing history before returning to her family. People are afraid to open Belsat or other independent media sites at work or on public transport. Police can check phones and even create fake likes to frame people for using 'extremist' media.
Major discussion point
Media in Exile and Language Discrimination
30 Belarusian media workers are imprisoned just for doing their job
Explanation
Belarus has become extremely dangerous for journalists, with 30 media workers currently imprisoned simply for practicing journalism and telling the truth. This represents a systematic crackdown on press freedom and independent media.
Evidence
Eleven colleagues from Belsat alone are behind bars, serving sentences of 2.5 to 8 years. The speaker notes that in a room of 40 people, half would be imprisoned in Belarus for doing journalism. Belarus was the most dangerous country in Europe for journalists until Russia's full-scale invasion of Ukraine.
Major discussion point
Media in Exile and Language Discrimination
Julie Posetti
Speech speed
136 words per minute
Speech length
1770 words
Speech time
777 seconds
73% of women journalists experience online violence, with 37% targeted by political actors
Explanation
A global UNESCO study called ‘The Chilling’ found that nearly three-quarters of women journalists surveyed experienced online violence in their work, with political actors being the main perpetrators in over one-third of cases. This represents a systematic attack on press freedom and democratic discourse.
Evidence
The study surveyed around 1,000 women journalists globally. Additionally, 41% of respondents said they were targeted in what they believed to be coordinated disinformation campaigns. Political actors, disinformation purveyors, and networked misogynists are identified as the primary perpetrators.
Major discussion point
Online Violence Against Journalists
Agreed with
– Giovana Fleck
Agreed on
Women journalists and minorities face disproportionate online violence and targeting
41% of women journalists are targeted in coordinated disinformation campaigns
Explanation
A significant portion of women journalists face coordinated attacks that combine disinformation with online violence. These campaigns are designed to undercut trust in truth and facts while exposing targets to greater risk, often fueled by disinformation and hate speech.
Evidence
The attacks are prioritized algorithmically due to high levels of engagement, similar to how angry and divisive speech is prioritized, ultimately for profit. Women and minorities are both the most at risk and most prolifically targeted in these campaigns.
Major discussion point
Online Violence Against Journalists
Big tech actors serve as vectors for attacks and sometimes as perpetrators themselves
Explanation
Technology companies not only facilitate online violence against journalists through their platforms but in some cases, big tech oligarchs have proven themselves to be direct perpetrators of attacks. This dual role makes them both enablers and active participants in undermining press freedom.
Evidence
The speaker leads a project developing an AI-assisted online violence early warning system to monitor threats in real time and predict escalation to offline harm. This work became necessary because tech oligarchs, in the interest of maximizing profits, failed to make safe products.
Major discussion point
Online Violence Against Journalists
Impunity for online violence aids impunity for crimes against journalists
Explanation
The lack of accountability for online attacks against journalists creates a culture of impunity that extends to offline crimes against media workers. This connection between online and offline violence represents a serious threat to press freedom and journalist safety.
Evidence
The speaker notes that attacks are designed to expose targets to greater risk, and there’s a clear link between online harassment and physical harm. The early warning system being developed aims to help predict when online violence might escalate to offline harm.
Major discussion point
Online Violence Against Journalists
Time for self-regulation has passed; need legal obligation and punitive action against big tech
Explanation
Based on extensive research over the past decade, the conclusion is that big tech companies have failed at self-regulation and cannot be trusted to meaningfully participate in co-regulation. Legal obligations, litigation, and punitive actions are now necessary to make them accountable.
Evidence
Meta has abandoned fact-checking in the US, with global cancellation foreshadowed. The Trump administration has cracked down on counter-disinformation work while defunding international programs supporting public interest media. Tech companies have been emboldened to abandon their already limited trust and safety systems.
Major discussion point
Platform Regulation and Accountability Challenges
Agreed with
– Mykyta Poturaiev
– Amela Odobašić
Agreed on
Platform self-regulation has failed and stronger regulatory measures are needed
Disagreed with
– Andrin Eichin
Disagreed on
Approach to platform regulation – self-regulation vs. legal obligation
Mykyta Poturaiev
Speech speed
127 words per minute
Speech length
661 words
Speech time
311 seconds
Social platforms don’t care about DSA and MFA as they’re not in EU jurisdiction
Explanation
In countries outside the European Union, social media platforms fall outside EU jurisdiction and are under no obligation to comply with the Digital Services Act (DSA) or the Media Freedom Act (MFA). This creates a regulatory gap in which platforms can ignore European standards and regulations with impunity.
Evidence
The speaker, head of the Ukrainian Parliament's Committee on Humanitarian and Information Policy, notes that these acts do not work for non-member states, and that platforms pay them no heed because such countries are not under EU jurisdiction. Traditional media can be regulated effectively, but social platforms remain beyond reach.
Major discussion point
Platform Regulation and Accountability Challenges
Agreed with
– Julie Posetti
– Amela Odobašić
Agreed on
Platform self-regulation has failed and stronger regulatory measures are needed
Disagreed with
– Amela Odobašić
Disagreed on
Effectiveness of current regulatory frameworks
Current system provides no protection for reputations, for children facing bullying, or for women facing hate speech on social platforms
Explanation
Unlike traditional media where regulations exist to protect various groups, social media platforms offer no meaningful protection for individuals’ reputations, children facing bullying, women experiencing hate speech, or sexual minorities facing discrimination. There are no courts or mechanisms for redress.
Evidence
The speaker confirms that while Ukraine’s new media law has nice articles protecting women, children, and sexual minorities in traditional media, none of these protections work for social platforms. People have nowhere to go to protect their names or seek justice for online harms.
Major discussion point
Platform Regulation and Accountability Challenges
Traditional media losing audiences to anonymous social media accounts
Explanation
According to sociological surveys in Ukraine and across Europe, people increasingly trust anonymous accounts on social platforms more than traditional media. This shift represents a fundamental challenge to established journalism and fact-based reporting.
Evidence
The speaker notes that people either trust some media or they don’t, but unfortunately they trust mostly anonymous accounts on social platforms rather than traditional media. Traditional media are losing audiences in every country, including Ukraine and all European countries.
Major discussion point
Media Literacy and Public Service Broadcasting
Marilia Maciel
Speech speed
168 words per minute
Speech length
253 words
Speech time
90 seconds
Growing disinformation industry motivated by financial gain operates globally
Explanation
There is an emerging global disinformation industry that operates for profit, skillfully taking advantage of platform business models. This represents a distinct subset of the broader disinformation problem that requires specific attention and countermeasures.
Evidence
Research supported by GIZ identified companies based in Spain selling disinformation services to Latin America, and companies based in the UK selling services to South Africa. These companies openly promise to change election results, postpone elections, and sow confusion.
Major discussion point
Financial Aspects of Disinformation
Companies sell disinformation services across borders to influence elections
Explanation
Commercial disinformation operations work internationally, with companies in one country providing disinformation services to clients in other countries specifically to manipulate electoral processes. This represents a form of information warfare conducted for profit.
Evidence
Researchers and investigative journalists have identified specific examples of cross-border disinformation commerce, including Spanish companies serving Latin American clients and UK companies serving South African clients, with explicit promises to influence election outcomes.
Major discussion point
Financial Aspects of Disinformation
Need to cut financial resources to disinformation industry through international cooperation
Explanation
Addressing the commercial disinformation industry requires targeting its financial foundations through coordinated international action, potentially involving law enforcement cooperation. This approach focuses on the economic incentives that drive disinformation rather than just the content itself.
Evidence
The speaker suggests there may be space for collaboration with platforms since there appears to be a threshold beyond which information disorder becomes detrimental to platforms themselves, as demonstrated by the example of Parler.
Major discussion point
Financial Aspects of Disinformation
Jordan Ogg
Speech speed
156 words per minute
Speech length
302 words
Speech time
115 seconds
Public service broadcasters play important role in countering misinformation effects
Explanation
Public service broadcasters provide high quality and accurate news that serves as a counterbalance to the risks of misinformation and disinformation that people face when consuming news online. They have a crucial role in maintaining information integrity in the digital age.
Evidence
Ofcom is conducting a review of public service media in the UK and has found that while increased online news consumption delivers benefits like greater choice and personalization, it also raises challenges in how people discover, consume, and judge high quality news. Audiences face much greater risk of exposure to misinformation when consuming news online.
Major discussion point
Media Literacy and Public Service Broadcasting
Giovana Fleck
Speech speed
141 words per minute
Speech length
483 words
Speech time
204 seconds
Sustainability challenges for journalism require agency for both journalists and civil society
Explanation
Protecting journalism voices and using journalism to counter disinformation requires significant resources, money, and time. The sustainability of journalism must be addressed alongside giving both journalists and civil society greater agency to participate meaningfully in the information ecosystem.
Evidence
The speaker represents dozens of journalists and media workers worldwide through RNW Media. Female and minority journalists face disproportionate attacks online, and many jurisdictions in Europe do not protect journalists who seek help from local police when attacked online. This also relates to SLAPP lawsuits, which target journalists and drain their resources.
Major discussion point
Online Violence Against Journalists
Agreed with
– Julie Posetti
Agreed on
Women journalists and minorities face disproportionate online violence and targeting
Luljeta Aliu
Speech speed
110 words per minute
Speech length
215 words
Speech time
116 seconds
Constitutional court challenges to new media laws create additional regulatory obstacles
Explanation
Media regulators face challenges not only from platforms and lack of legal frameworks, but also from constitutional court decisions that can overturn new regulatory laws. This creates additional uncertainty and obstacles for effective regulation of harmful online content.
Evidence
Kosovo’s Independent Media Commission had a new law repealed by the Constitutional Court after it was challenged by media rights groups and media representatives. This creates a difficult position for regulators who are ‘between two fires’ – wanting to regulate for citizens while facing opposition from civil society groups.
Major discussion point
Legal Framework and Regulatory Gaps
Oksana Prykhodko
Speech speed
122 words per minute
Speech length
75 words
Speech time
36 seconds
US funding cuts to counter-disinformation programs create resource gaps
Explanation
The new American administration has closed many projects that played important roles in counteracting disinformation. While the Council of Europe and European Union lack sufficient funds to replace all such projects, there is a need to discuss alternative asymmetrical responses to fill this gap.
Evidence
American institutions previously played a very important role in counteracting disinformation, but the new administration has discontinued many of these programs, creating a funding and capacity gap that European institutions cannot fully replace.
Major discussion point
Financial Aspects of Disinformation
Giacomo Mazzone
Speech speed
138 words per minute
Speech length
159 words
Speech time
68 seconds
EDMO successfully managed cooperation with platforms during the European elections
Explanation
The European Digital Media Observatory (EDMO) implemented activities during the European elections that went smoothly and demonstrated effective cooperation with platforms. This represents a successful model of collaboration that existed before recent changes in the platform landscape.
Evidence
The cooperation during the European elections represented 'another world' compared to current conditions, suggesting that effective platform cooperation was possible and successful just one year earlier, during the electoral process.
Major discussion point
European Digital Media Observatory Success
Model should be replicated elsewhere for effective platform cooperation
Explanation
The successful EDMO model from the European elections should be used as a template and replicated in other contexts and regions to maintain effective cooperation with platforms for countering disinformation and ensuring information integrity.
Evidence
The speaker worked with Andrin Eichin on developing this model and emphasizes its success during the European elections as proof of concept for effective platform cooperation.
Major discussion point
European Digital Media Observatory Success
Changes in US approach may affect platform dialogue in Europe
Explanation
There are concerns that the changes occurring in US policy toward platforms and disinformation may impact the Council of Europe’s dialogue with platforms, potentially affecting the cooperative relationships that had been established based on goodwill.
Evidence
The Council of Europe had previously opened dialogue with platforms based on goodwill, but there are signs that changes in the US approach to platform regulation and disinformation may be replicated in Europe, affecting these collaborative relationships.
Major discussion point
European Digital Media Observatory Success
Moderator
Speech speed
145 words per minute
Speech length
347 words
Speech time
142 seconds
Discussion focuses too much on Russian/Chinese disinformation while ignoring Western propaganda
Explanation
The discussion disproportionately emphasizes disinformation from Russia and China while failing to adequately address or critically examine disinformation that originates from within European countries or from their closest Western allies. This creates an unbalanced perspective on the global disinformation landscape.
Evidence
The speaker notes that if voices from countries where the US and allies had financed foreign interventions were present, they would disagree with the Western narrative. Examples include misleading claims about weapons of mass destruction and promises to bring democracy that left countries ‘completely torn apart.’
Major discussion point
Criticism of Western Approach
Disagreed with
– Multiple speakers including Aneta Gonta, Valentyn Koval
Disagreed on
Western approach to disinformation – balanced vs. geopolitically focused
Narrative appears to push for censorship rather than balanced approach to misinformation
Explanation
The overall discussion narrative seems to advocate for censorship measures rather than promoting a balanced, credible, and fair approach to addressing misinformation regardless of its source. This raises concerns about the true intentions behind counter-disinformation efforts.
Evidence
The speaker questions the hypocrisy of requesting free information while simultaneously asking to ban content that goes against certain narratives on platforms like YouTube. They argue this represents a push for censorship rather than genuine information freedom.
Major discussion point
Criticism of Western Approach
Need for more self-critical assessment of disinformation from all sources
Explanation
There should be greater self-reflection and critical examination of how Western countries and institutions may contribute to disinformation and propaganda. The focus should be on addressing misinformation regardless of its origin rather than targeting specific geopolitical actors.
Evidence
The speaker argues that isolating from different narratives undermines capabilities of access to information and influences decision-making, which could itself constitute censorship. They question whether the US and Europe generate their own propaganda and misinformation.
Major discussion point
Criticism of Western Approach
Oleksandr Shevchuk
Speech speed
126 words per minute
Speech length
48 words
Speech time
22 seconds
Need for improved instruments to fight Russian historical propaganda in Ukraine
Explanation
There is a specific need to develop better tools and mechanisms to combat Russian historical propaganda targeting Ukraine. This represents a particular subset of disinformation that requires specialized approaches and instruments.
Major discussion point
Historical Propaganda Assessment
Assessment needed of European countries’ and Council of Europe’s effectiveness against propaganda
Explanation
There should be an evaluation of how effective European countries and the Council of Europe have been in their actions against Russian propaganda, particularly historical propaganda targeting Ukraine. This assessment would help improve future strategies and approaches.
Major discussion point
Historical Propaganda Assessment
Alina Tatarenko
Speech speed
145 words per minute
Speech length
1848 words
Speech time
763 seconds
Three main parts of Council of Europe guidance: media literacy, fact-checking, and platform-based solutions
Explanation
The Council of Europe guidance focuses on developing and emphasizing media literacy, supporting fact-checking organizations, and implementing platform-based solutions that incorporate safety by design into algorithms. These three pillars work together to address disinformation comprehensively.
Evidence
The guidance addresses platforms directly to ensure that safety is incorporated into every algorithm from the initial design stage, and works with regulators on compliance.
Major discussion point
Council of Europe Standards and Guidance on Countering Disinformation
Gap between EU DSA coverage and non-EU member states creates regulatory challenges
Explanation
There is a significant problem with the gap between countries covered by the EU’s Digital Services Act and non-EU member states. This creates unequal regulatory frameworks and enforcement capabilities across different jurisdictions.
Evidence
The issue was mentioned in context of Western Balkan countries and other non-EU states struggling with platform regulation and harmful content online.
Major discussion point
Legal Framework and Regulatory Gaps
Countering Russian propaganda online is a Europe-wide and global problem
Explanation
The threats of Russian propaganda online extend far beyond Ukraine and affect countries throughout Europe and worldwide. This represents a shared challenge that requires coordinated international response.
Evidence
Examples provided include Moldova being heavily targeted, and the discussion of how Russian disinformation affects multiple countries in the region.
Major discussion point
Russian Disinformation and Propaganda Threats
Agreed with
– Valentyn Koval
– Aneta Gonta
– Alina Koushyk
Agreed on
Russian disinformation represents a systematic and coordinated threat
Council of Europe supports journalists in exile through specific projects
Explanation
The Council of Europe works not just with member states but also provides support to journalists in exile, recognizing their unique challenges and importance for democratic discourse. This includes working with Belarusian journalists operating from exile.
Evidence
Specific mention of a project with Belarusian journalists in exile and the invitation of the Belsat TV director to share their experiences and challenges.
Major discussion point
Media in Exile and Language Discrimination
Problems exist but mitigation and prevention efforts are worthwhile despite inability to eliminate issues completely
Explanation
Like crime, disinformation and online harms will always exist to some degree, but this doesn’t mean efforts to mitigate damage and prevent problems are futile. Whatever can be done to help reduce and prevent these issues should be pursued.
Evidence
Comparison made to crime as an analogy – crime will always exist but we still work to prevent and reduce it.
Major discussion point
Council of Europe Standards and Guidance on Countering Disinformation
Agreements
Agreement points
Quality journalism and independent media are fundamental to countering disinformation
Speakers
– Andrin Eichin
– Valentyn Koval
– Julie Posetti
– Alina Koushyk
Arguments
Quality information is the most effective long-term antidote for disinformation
Need to replace disinformation with reliable information rather than just debunking fake news
Sustainability and security of journalism and democracy are intertwined, and both are dependent on the integrity of information, which is under unprecedented attack
Quality and independent journalism is, as Andrin actually said, a foundation of any strategy against disinformation
Summary
All speakers agree that investing in quality, independent journalism and reliable information sources is more effective than reactive fact-checking or content removal approaches to combating disinformation
Topics
Human rights | Sociocultural
Platform self-regulation has failed and stronger regulatory measures are needed
Speakers
– Julie Posetti
– Mykyta Poturaiev
– Amela Odobašić
Arguments
Time for self-regulation has passed; need legal obligation and punitive action against big tech
Social platforms don’t care about DSA and MFA as they’re not in EU jurisdiction
Need for co-regulatory approach involving all stakeholders rather than just regulatory authorities
Summary
Speakers agree that platforms cannot be trusted to self-regulate and that legal obligations, enforcement mechanisms, and multi-stakeholder approaches are necessary to hold them accountable
Topics
Legal and regulatory | Human rights
Russian disinformation represents a systematic and coordinated threat
Speakers
– Valentyn Koval
– Aneta Gonta
– Alina Koushyk
– Alina Tatarenko
Arguments
Ukraine lacks institutional media heritage and faces reactive responses to disinformation
Moldova is most targeted country in region with Russia investing over 200 million euros in disinformation campaigns
88% of Belarusian independent media outlets are closed, with 45 titles operating from exile
Countering Russian propaganda online is a Europe-wide and global problem
Summary
All speakers from affected regions agree that Russian disinformation campaigns are well-funded, systematic, and pose existential threats to democratic institutions and media freedom across multiple countries
Topics
Human rights | Sociocultural | Legal and regulatory
Women journalists and minorities face disproportionate online violence and targeting
Speakers
– Julie Posetti
– Giovana Fleck
Arguments
73% of women journalists experience online violence, with 37% targeted by political actors
Sustainability challenges for journalism require agency for both journalists and civil society
Summary
Both speakers emphasize that female and minority journalists face systematic targeting and that jurisdictions often fail to provide adequate protection, requiring specific attention and resources
Topics
Human rights | Sociocultural
Similar viewpoints
Both speakers advocate for the comprehensive three-pillar Council of Europe approach that combines fact-checking, platform design improvements, and user empowerment/media literacy as the most effective strategy
Speakers
– Andrin Eichin
– Alina Tatarenko
Arguments
Three-pillar approach: fact-checking, platform design solutions, and user empowerment
Three main parts of Council of Europe guidance: media literacy, fact-checking, and platform-based solutions
Topics
Human rights | Legal and regulatory | Sociocultural
Both Western Balkan regulators face similar challenges with outdated legal frameworks and institutional obstacles that prevent effective regulation of online harmful content
Speakers
– Amela Odobašić
– Luljeta Aliu
Arguments
Lack of adequate legal frameworks is major challenge for Western Balkan regulators
Constitutional court challenges to new media laws create additional regulatory obstacles
Topics
Legal and regulatory
Both speakers from post-Soviet countries highlight how platform algorithms systematically disadvantage their content and truthful information, particularly during conflicts or for minority languages
Speakers
– Valentyn Koval
– Alina Koushyk
Arguments
Social media platforms are structurally incapable of supporting truthful narratives during crises
Belarusian language content is de-prioritized by algorithms, forcing media to choose Russian for better reach
Topics
Sociocultural | Human rights | Legal and regulatory
Unexpected consensus
Proactive information creation over reactive debunking
Speakers
– Andrin Eichin
– Valentyn Koval
Arguments
Need for systemic solutions that strengthen information ecosystem resilience rather than just removing bad content
Need to replace disinformation with reliable information rather than just debunking fake news
Explanation
Despite coming from different perspectives (policy expert vs. regulator from conflict zone), both speakers independently arrived at the same conclusion that proactive information strategies are more effective than reactive content removal
Topics
Human rights | Sociocultural
Limitations of media literacy as immediate solution
Speakers
– Andrin Eichin
– Mykyta Poturaiev
Arguments
Media literacy efforts must be available for all age groups to build critical thinking
Traditional media losing audiences to anonymous social media accounts
Explanation
Both speakers acknowledge media literacy’s importance but also recognize its limitations – Eichin notes current efforts are sparse and opaque, while Poturaiev points out that even with literacy efforts, people still trust anonymous accounts over traditional media
Topics
Sociocultural | Human rights
Overall assessment
Summary
Strong consensus exists on the failure of platform self-regulation, the systematic nature of Russian disinformation threats, the importance of quality journalism, and the need for comprehensive regulatory approaches. Speakers also agree on the disproportionate targeting of women journalists and minorities, and the challenges faced by smaller countries in regulating global platforms.
Consensus level
High level of consensus among speakers, particularly on fundamental issues. This suggests broad agreement across different stakeholder groups (regulators, journalists, academics, policy experts) on the core problems and general direction of solutions. The consensus strengthens the legitimacy of calls for stronger regulatory action and increased support for independent journalism, while highlighting the urgent need for coordinated international responses to systematic disinformation campaigns.
Differences
Different viewpoints
Approach to platform regulation – self-regulation vs. legal obligation
Speakers
– Julie Posetti
– Andrin Eichin
Arguments
Time for self-regulation has passed; need legal obligation and punitive action against big tech
Platform architecture and design play a vital role. The Guidance Note insists that platforms must adopt human rights by design and safety by design principles
Summary
Julie Posetti argues that self-regulation has failed and legal obligations with punitive actions are necessary, while Andrin Eichin’s Council of Europe guidance still emphasizes working with platforms through design principles and cooperation
Topics
Legal and regulatory | Human rights
Strategy focus – reactive debunking vs. proactive information creation
Speakers
– Valentyn Koval
– Andrin Eichin
Arguments
Need to replace disinformation with reliable information rather than just debunking fake news
Three-pillar approach: fact-checking, platform design solutions, and user empowerment
Summary
Koval advocates for abandoning reactive fact-checking in favor of flooding information space with truthful content, while Eichin maintains fact-checking as a key pillar alongside other measures
Topics
Sociocultural | Human rights
Western approach to disinformation – balanced vs. geopolitically focused
Speakers
– Moderator
– Multiple speakers including Aneta Gonta, Valentyn Koval
Arguments
Discussion focuses too much on Russian/Chinese disinformation while ignoring Western propaganda
Moldova is most targeted country in region with Russia investing over 200 million euros in disinformation campaigns
Summary
The moderator criticizes the discussion for focusing disproportionately on Russian/Chinese disinformation while ignoring Western propaganda, while other speakers provide evidence of specific Russian targeting
Topics
Human rights | Sociocultural
Effectiveness of current regulatory frameworks
Speakers
– Mykyta Poturaiev
– Amela Odobašić
Arguments
Social platforms don’t care about DSA and MFA as they’re not in EU jurisdiction
Need for co-regulatory approach involving all stakeholders rather than just regulatory authorities
Summary
Poturaiev argues current frameworks like DSA are ineffective for non-EU platforms, while Odobašić advocates for co-regulatory approaches as viable solutions
Topics
Legal and regulatory
Unexpected differences
Criticism of the entire discussion framework as potentially censorious
Speakers
– Moderator
– Multiple panelists
Arguments
Narrative appears to push for censorship rather than balanced approach to misinformation
Need for more self-critical assessment of disinformation from all sources
Explanation
Unexpected because the moderator (representing young voices in Europe) fundamentally challenged the premise of the entire discussion, arguing it was hypocritical and censorious rather than genuinely addressing information freedom
Topics
Human rights | Legal and regulatory
Constitutional court challenges undermining regulatory efforts
Speakers
– Luljeta Aliu
Arguments
Constitutional court challenges to new media laws create additional regulatory obstacles
Explanation
Unexpected because it reveals that even when regulators successfully create new frameworks, they can be undermined by constitutional courts responding to civil society challenges, creating a complex multi-front regulatory battle
Topics
Legal and regulatory
US funding cuts creating resource gaps for counter-disinformation work
Speakers
– Oksana Prykhodko
– Julie Posetti
Arguments
US funding cuts to counter-disinformation programs create resource gaps
The Trump administration has cracked down on counter-disinformation work while also defunding many international programs that support public interest media
Explanation
Unexpected because it reveals how changes in US policy directly impact European and global counter-disinformation efforts, creating new vulnerabilities just as threats are escalating
Topics
Economic | Human rights
Overall assessment
Summary
The discussion revealed significant disagreements on fundamental approaches to countering disinformation, from regulatory strategies (cooperation vs. legal force) to tactical approaches (fact-checking vs. content creation) to the very framing of the problem (geopolitical focus vs. balanced criticism)
Disagreement level
High level of disagreement with significant implications – the lack of consensus on basic approaches suggests the counter-disinformation field is still in a formative stage with competing paradigms. The fundamental challenge to the discussion’s premise by the young moderator indicates deep philosophical divisions about what constitutes legitimate counter-disinformation work versus censorship. This fragmentation may undermine coordinated responses to disinformation threats.
Takeaways
Key takeaways
The Council of Europe has developed a three-pillar approach to counter disinformation: fact-checking, platform design solutions, and user empowerment, emphasizing systemic solutions over content removal
Russian disinformation campaigns pose an existential threat to European democracies, with Moldova being the most targeted country (over 200 million euros invested) and Ukraine facing structural challenges due to lack of institutional media heritage
Social media platforms are structurally incapable of supporting truthful narratives during crises and current self-regulation has failed, requiring legal obligations and punitive measures
There is a significant regulatory gap between EU member states covered by DSA/MFA and non-EU countries, leaving many vulnerable to disinformation without adequate legal frameworks
Media in exile face unique challenges including algorithmic discrimination against minority languages (Belarusian) and severe persecution, with 88% of Belarusian independent media outlets closed
Online violence against journalists, particularly women (73% experience online violence), is systematically used to silence democratic voices and undermine information integrity
Quality information and well-funded independent journalism are the most effective long-term antidotes to disinformation, requiring investment in reliable sources rather than just reactive fact-checking
A global disinformation industry motivated by financial gain operates across borders, requiring international cooperation to cut funding sources
Media literacy must be comprehensive across all age groups, but traditional approaches take 15-20 years to implement effectively
Resolutions and action items
Member states should integrate Council of Europe recommendations into national frameworks with consistent alignment to human rights obligations
Platforms must make meaningful steps to reform system design beyond post-facto moderation, implementing human rights by design principles
Policymakers and researchers should collaborate to evaluate impact of counter-disinformation measures and adjust to new technological threats
European legislators and regulators should double down on efforts to make big tech responsible, accountable, and transparent through legal obligation and punitive action
Focus resources on creating and disseminating reliable information to fill the ‘information tube’ rather than just debunking disinformation
Strengthen regional collaboration among Western Balkan regulators to address platform communication challenges
Develop co-regulatory approaches involving all stakeholders rather than relying solely on regulatory authorities
Address algorithmic discrimination against minority languages like Belarusian on digital platforms
Establish international cooperation mechanisms to cut financial resources to the disinformation industry
Unresolved issues
How to effectively regulate global platforms that operate outside EU jurisdiction and don’t comply with DSA/MFA requirements
How small countries can establish meaningful communication and enforcement mechanisms with big tech platforms
How to protect individuals’ reputations, children from bullying, and vulnerable groups from hate speech on social platforms where traditional legal remedies don’t apply
How to replace American funding cuts for counter-disinformation programs and media support initiatives
How to balance freedom of expression with necessary restrictions during wartime and crisis situations
Whether measures taken during emergencies can be designed to be reversible to restore liberties later
How to address the fundamental business model of social media platforms that incentivizes engagement over truth
How to distinguish between legitimate counter-disinformation efforts and censorship accusations
How to address Western/US-generated disinformation and propaganda while focusing on Russian and Chinese threats
How to make media literacy effective in the short term when traditional approaches require decades to implement
How to ensure sustainability of independent journalism and fact-checking organizations amid resource constraints
Suggested compromises
Adopt co-regulatory approaches that involve government, regulators, civil society, and platforms rather than top-down regulation
Focus on process-based regulation rather than targeting individual content, treating content removal as last resort
Apply proportionate regulation tailored to platform size and risk profile rather than one-size-fits-all approaches
Use friction-based mechanisms to reduce reach and impact of harmful content rather than outright removal
Strengthen existing institutions like public service broadcasting while developing new regulatory frameworks
Combine short-term regulatory measures with long-term investments in media literacy and quality journalism
Develop regional cooperation mechanisms for smaller countries to collectively engage with global platforms
Balance immediate security needs during crises with long-term democratic values and human rights protections
Thought provoking comments
The complexity of the situation means that people’s fear of misinformation often generates as much polarisation and distrust as the problematic content itself.
Speaker
Andrin Eichin
Reason
This comment is deeply insightful because it identifies a paradox at the heart of counter-disinformation efforts – that the very awareness and fear of misinformation can become as damaging as the misinformation itself. It challenges the assumption that simply raising awareness about disinformation is beneficial.
Impact
This observation set the tone for a more nuanced discussion throughout the session, moving beyond simple ‘good vs. bad information’ narratives and acknowledging the psychological and social complexities involved in information consumption.
A core challenge of countering disinformation is that most responses are reactive. First comes the fake news and only later fact-checking… Worse, attempts to debunk falsehoods can sometimes amplify them, especially when the debunking lacks credibility.
Speaker
Valentyn Koval
Reason
This comment exposes a fundamental flaw in traditional counter-disinformation strategies and introduces the concept of the ‘amplification paradox’ – where efforts to combat false information inadvertently spread it further.
Impact
This shifted the discussion from focusing on reactive measures to proactive strategies, leading to Koval’s later emphasis on ‘flooding the information space with verified truthful content’ rather than just debunking false content.
Belarusian language content is de-prioritized by algorithms of social media, especially in YouTube… If you do shorts in Russian, you will have a million easily. But if you do it in Belarusian, you will have less than 100.
Speaker
Alina Koushyk
Reason
This comment reveals how algorithmic bias can constitute a form of cultural suppression and highlights an often-overlooked dimension of platform governance – how technical systems can inadvertently support authoritarian narratives by marginalizing minority languages.
Impact
This brought a new dimension to the discussion about platform responsibility, moving beyond content moderation to algorithmic fairness and cultural preservation. It demonstrated how technical design choices can have profound political implications.
The broligarchy, as the obscenely wealthy tech bros in power are collectively referred to more frequently now, is choking democracy… the time for self-regulation has passed as it applies to big tech actors.
Speaker
Julie Posetti
Reason
This comment reframes the discussion from technical solutions to power dynamics, introducing the concept of ‘broligarchy’ and directly challenging the prevailing narrative that platforms can be trusted to self-regulate.
Impact
This marked a turning point in the discussion toward more confrontational approaches to platform governance, moving away from collaborative solutions toward regulatory enforcement and legal obligations.
Do we have any protection from what is happening there? No. Does DSA work for now? No. Does MFA work for now? No… So do we have answers on political level or governmental level? I’m not sure… We don’t know maybe an ultra-right candidate will win… we are losing this battle, and we are very, very close to lose this war, informational war.
Speaker
Mykyta Poturaiev
Reason
This intervention was provocative because it systematically dismantled the optimism of previous speakers, presenting a stark assessment that current approaches are failing and democracy itself is at risk. It challenged the entire premise of incremental reform.
Impact
This created a pivotal moment in the discussion, forcing other participants to confront the limitations of their approaches. It sparked immediate reactions and led to more urgent calls for action, while also prompting defensive responses about the importance of continuing the fight despite challenges.
Because we often criticize this information propaganda mentioning that it only comes from Russia or China. But are we sufficiently aware and critical of this information that originates from within our own countries or from our closest allies?… How can we ensure our efforts to combat disinformation remain balanced, credible, and fair by addressing misinformation and propaganda respective of its source?
Speaker
Young participant from EFD
Reason
This comment challenged the fundamental framing of the entire discussion by questioning whether the focus on external threats (Russia, China) was creating blind spots about domestic and allied disinformation. It forced participants to confront potential bias in their approach.
Impact
This intervention created visible tension in the room and prompted Julie Posetti to acknowledge the legitimacy of concerns about US-led disinformation, demonstrating how challenging assumptions can lead to more honest and comprehensive analysis.
Overall assessment
These key comments fundamentally shaped the discussion by progressively challenging assumptions and deepening complexity. The session began with technical and procedural approaches to counter-disinformation but evolved into a more critical examination of power structures, systemic failures, and the limitations of current approaches. The most impactful comments either revealed hidden paradoxes (like fear of misinformation causing harm), exposed systemic biases (algorithmic discrimination against minority languages), or challenged the entire framing of the problem (questioning Western-centric perspectives on disinformation). The discussion’s trajectory moved from optimistic policy recommendations toward a more sobering acknowledgment of the scale of the challenge and the need for more fundamental changes in how platforms are governed and regulated. The interventions created a productive tension between hope and realism, ultimately resulting in a more nuanced understanding of the disinformation challenge.
Follow-up questions
How do small countries, such as the countries in the Western Balkans, establish collaboration or communication with big online platforms?
Speaker
Amela Odobašić
Explanation
This is a critical challenge for smaller regulatory authorities who lack the market influence to effectively engage with global platforms, requiring exploration of regional collaboration approaches.
How can we cut the financial resources to the disinformation industry and is there space for international cooperation with law enforcement?
Speaker
Marilia Maciel
Explanation
Understanding how to disrupt the economic incentives behind disinformation operations is crucial for effective counter-measures, particularly for companies selling disinformation services across borders.
How important is public service broadcasting in countering disinformation and how can it be supported in this context?
Speaker
Jordan Ogg (Ofcom UK)
Explanation
Public service broadcasting plays a vital role in providing quality information as an antidote to disinformation, but requires policy support and adequate funding.
Are measures adopted to deal with harmful content in times of war designed to be reversible at a later date to restore liberties, or are they set in stone?
Speaker
Siva Subraminna (online participant)
Explanation
This addresses concerns about the temporary versus permanent nature of emergency measures that may restrict freedoms during conflicts.
How can we ensure our efforts to combat disinformation remain balanced, credible, and fair by addressing misinformation and propaganda respective of its source?
Speaker
EFD representative
Explanation
This challenges the focus on certain sources of disinformation while potentially overlooking others, raising questions about bias in counter-disinformation efforts.
What are the improvement instruments and mechanisms to fight against Russian propaganda in Ukraine, especially historical propaganda, and how effective are actions by European countries and the Council of Europe?
Speaker
Oleksandr Shevchuk
Explanation
This seeks specific strategies and effectiveness assessments for countering Russian propaganda operations, particularly those targeting historical narratives.
What asymmetrical responses can be developed to replace closed American counter-disinformation projects?
Speaker
Oksana Prykhodko
Explanation
With American funding cuts to counter-disinformation initiatives, alternative approaches and funding sources need to be identified and developed.
Are there signs that changes in US platform policies will be replicated in Europe, and what is the status of Council of Europe dialogue with platforms?
Speaker
Giacomo Mazzone
Explanation
Understanding whether US policy changes regarding content moderation and fact-checking will affect European operations is crucial for regulatory planning.
How can digital platforms be made to stop penalizing Belarusian language content and support algorithms that help provide free information according to professional journalistic standards?
Speaker
Alina Koushyk
Explanation
This addresses algorithmic bias against smaller languages and the need for platforms to support linguistic diversity in content distribution.
How can media rights groups and civil society organizations avoid being used as instruments to oppose legitimate regulation while maintaining their watchdog role?
Speaker
Luljeta Aliu
Explanation
This explores the complex dynamics where legitimate regulatory efforts may be opposed by civil society groups, potentially influenced by external actors.
What empirical data exists on the actual reach and impact of disinformation, and how can more research be conducted in this area?
Speaker
Andrin Eichin
Explanation
The expert committee noted that empirical data on disinformation’s actual impact is still limited, requiring more comprehensive research to inform policy decisions.
How can policymakers and researchers collaborate to evaluate the impact of counter-disinformation measures and adjust to new technological threats?
Speaker
Andrin Eichin
Explanation
This emphasizes the need for ongoing assessment and adaptation of counter-disinformation strategies as technology evolves.
Disclaimer: This is not an official session record. DiploAI generates these resources from audiovisual recordings, and they are presented as-is, including potential errors. Due to logistical challenges, such as discrepancies in audio/video or transcripts, names may be misspelled. We strive for accuracy to the best of our ability.