How prevent external interferences to EU Election 2024 – v.2 | IGF 2023 Town Hall #162
Disclaimer: It should be noted that the reporting, analysis and chatbot answers are generated automatically by DiploGPT from the official UN transcripts and, in case of just-in-time reporting, the audiovisual recordings on UN Web TV. The accuracy and completeness of the resources and results can therefore not be guaranteed.
Full session report
Audience
Analysis of the audience statements shows several concerns and inquiries, centred on TikTok, platform APIs and data access, engagement with overseas and partner countries, fake news and disinformation, algorithm transparency, and online content moderation.
One of the main concerns regards TikTok’s censorship and the “bubbling” of user information: videos from certain countries, such as Russia and Ukraine, reportedly cannot be viewed by users in other countries, even with a direct link. Instead, the links lead to unrelated content, such as videos of cats and dogs. This has triggered negative sentiment among users and raised concerns about control over user information.
Additionally, there is an inquiry about how TikTok’s policy and mechanism for controlling viewer access are regulated. Although no supporting facts are provided, the question reflects a neutral stance and highlights the need to understand how TikTok manages and controls access to its platform.
Another concern relates to platform APIs and data access. The de-emphasis of CrowdTangle, restrictions on APIs, and the costs these impose on research organisations are mentioned as supporting facts. These issues have generated negative sentiment, with worries about the limitations and expenses associated with platform APIs and data access.
Furthermore, the speakers express curiosity about engagement with overseas and partner countries. While one speaker mentions work done with these countries, no details are provided about the nature and extent of this engagement. Nonetheless, this topic is approached with a neutral sentiment, indicating an interest in learning more about the engagement process.
The increase in fake news and disinformation in Taiwan also raises concerns. It is highlighted that private sector platform providers play a crucial role in enforcing regulations and dealing with such information. This negative sentiment reflects worries about the impact of fake news and disinformation on society.
Another argument put forth is the desire for transparency in content-recommendation algorithms. No supporting facts are mentioned on this issue, but the neutral sentiment reflects a general interest in making recommendation algorithms more transparent.
There is also a speaker who wants to understand how online content moderation systems work. While no supporting facts are provided, this neutral stance suggests a curiosity about the mechanisms and processes involved in content moderation on platforms like TikTok.
Lastly, there is an inquiry into whether a post or video can be retrieved once it has been removed by the content moderation system. No additional information is provided on this topic, but the neutral sentiment implies a desire to explore the possibility of content recovery.
In conclusion, the concerns and inquiries presented in the statements cover a wide range of topics: TikTok’s user privacy and information control, policy regulation and control over viewer access, platform APIs and data access, engagement with overseas and partner countries, fake news and disinformation, algorithm transparency, online content moderation systems, and content recovery. These matters touch on platform management, user experience, and the impact of social media platforms on society. The analysis helps identify the speakers’ viewpoints and concerns while emphasising the need for further insights and information on these subjects.
Paula Gori
The European Digital Media Observatory (EDMO) is an independent consortium of organizations that focuses on fact-checking, academic research, and media literacy. Although funded by the European Commission, EDMO operates autonomously. It aims to combat misinformation by providing a platform where experts can collaborate on addressing this issue.
One of the main objectives of EDMO is to provide tools and evidence to counter disinformation. The organization establishes networks of fact-checkers who work together to identify false narratives and share information with one another. This collaborative approach allows for quicker and more efficient debunking of misleading information, especially when done within the first 24 hours.
In addition to combating disinformation, EDMO also focuses on mapping and evaluating media literacy initiatives. It strives to thoroughly understand the impact and effectiveness of these initiatives, ensuring that efforts to enhance media literacy are productive and fruitful.
An important consideration for EDMO is data accessibility. It has produced a code of conduct for accessing online platform data and is working towards creating an independent intermediary body to handle requests for such data. EDMO recognizes the necessity of granting access to platform data for research purposes while fully respecting GDPR regulations.
However, there are challenges in accessing platform data, particularly for researchers from smaller universities and countries with minority languages. Data access is more readily available to well-established universities, which amplifies the inequality in research opportunities between larger and smaller educational institutions.
Paula Gori advocates for accessible platform data, especially for researchers from smaller universities and countries with minority languages. She points out the difficulty these institutions face in accessing data and emphasizes the importance of equitable research opportunities. She also notes that data accessibility is not the only concern: having the infrastructure needed to handle and manage the data is equally crucial.
In conclusion, EDMO plays a significant role in addressing misinformation by providing a collaborative platform for experts in fact-checking, research, and media literacy. Their efforts to combat disinformation, map media literacy initiatives, and promote data accessibility are commendable. However, challenges remain in terms of accessing platform data, particularly for researchers from smaller universities and minority language contexts. It is essential to address these challenges and create a level playing field for all researchers to contribute to the fight against misinformation.
Erik Lambert
The European Commission is currently engaged in the process of regulating artificial intelligence (AI) with a specific focus on preventing the manipulation of public opinion. These regulations aim to curb coordinated activities by foreign powers or specific groups seeking to influence public sentiment. It is important, however, that these regulations do not impede freedom of speech.
According to Erik Lambert, an expert in the field, the younger generation’s trust in social media platforms is shifting. Platforms like Facebook and Twitter, which have traditionally dominated the digital sphere, are experiencing a decline in trust. Instead, younger people are turning to platforms such as TikTok that offer more personal experiences. This shift underscores the need for social media platforms to adapt and address the concerns of their user base.
Furthermore, Lambert emphasizes the importance of understanding and evolving our approach to public opinion formation in the 21st century. The rise of digital platforms, social media, and the rapid dissemination of information have changed the way public opinion is shaped. It is essential to recognize and adapt to these changes in order to effectively engage with the public and address their needs and concerns.
In conclusion, the efforts of the European Commission to regulate AI and combat the manipulation of public opinion are commendable. However, it is crucial to strike the right balance between preserving freedom of speech and preventing coordinated activities that aim to deceive or manipulate the public. Additionally, social media platforms must adapt to the changing trends in trust among the younger generation. Finally, understanding and evolving our approach to public opinion formation is essential for effective engagement with the public in the 21st century.
Esteve Sanz
Esteve Sanz highlights the crucial role of the Internet Governance Forum (IGF) in discussing critical issues related to disinformation and internet governance on a global scale. The attendance of the Vice President of the European Commission further emphasizes the importance placed on the forum and the seriousness with which disinformation is being addressed.
At the IGF, countries exchange ideas and concerns about disinformation, demonstrating collaborative efforts to combat its spread and the need for international cooperation. Esteve Sanz emphasizes that the IGF provides a substantial and concrete platform for these discussions.
One specific concern raised is the increasing influence of generative Artificial Intelligence (AI) in amplifying disinformation. Policymakers are urged to be alert and proactive in countering this issue. The affordability and ease with which generative AI can produce disinformation campaigns make it a significant threat. The European Commission is considering measures such as watermarking AI-generated content to tackle this challenge.
Esteve Sanz also emphasizes the importance of a clear definition of disinformation within the European Union (EU). It is argued that disinformation is an intentional action carried out by specific actors. This aligns with the EU’s human-centric approach to digital policies and underscores the need for accurate understanding and identification of disinformation to effectively combat it.
In conclusion, Esteve Sanz’s stance on the IGF underscores its critical role in addressing global disinformation and internet governance issues. The attendance of the Vice President of the European Commission and the exchange of concerns among countries highlight the significance placed on the forum. The threat posed by generative AI in amplifying disinformation calls for heightened alertness from policymakers. Moreover, a clear definition of disinformation is deemed essential within the EU, reflecting its human-centric approach to digital policies. These insights shed light on the international and regional efforts to combat disinformation and ensure the integrity of online information exchanges.
Stanislav Matejka
The European Regulators Group for Audiovisual Media Services (ERGA) plays a vital role in enforcing and implementing the Audiovisual Media Services Directive, with a strong focus on effectiveness. ERGA’s members have the responsibility of not only enforcing European legislation but also their own national legislation, ensuring comprehensive media regulation.
ERGA pays particular attention to political advertising: its members establish rules for advertising in general and have consistently directed their efforts towards political advertising since the first code of practice was created in 2018. Their aim is to ensure fair and transparent political campaigns.
ERGA also places significant importance on election integrity and transparency. They have introduced a code of practice that includes transparency obligations and commitments to publish transparency reports. ERGA emphasizes the effective enforcement of platforms’ own policies and closely monitors this aspect. Transparency is key to protecting election integrity and ensuring accountability.
To combat misinformation on online platforms, ERGA supports the establishment of reporting mechanisms. They propose the creation of functional reporting mechanisms for regulators, researchers, and anyone else who wishes to report or flag instances of misinformation. This initiative aims to address the spread of false information and provide a platform for accountability.
Access to data is crucial for ERGA in promoting public scrutiny through independent research. They recognize the significance of data for the research community in informing the enforcement of regulatory frameworks. ERGA supports the idea that independent research should have access to relevant data, enabling a more informed analysis and evaluation of media services.
In summary, ERGA is dedicated to effectively implementing the Audiovisual Media Services Directive. Their focus on political advertising, transparency in elections, reporting mechanisms for misinformation, and access to data for independent research are essential aspects of their work. By addressing these areas, ERGA aims to ensure fair and transparent media services in Europe.
Giovanni Zagni
The European Digital Media Observatory (EDMO) has recently established a new task force with a specific focus on addressing disinformation during the 2024 European elections. This task force aims to build upon the success of a previous one that focused on tackling disinformation during the Ukraine war. Comprising 18 members from various sectors, the task force is committed to understanding the nature of disinformation and disseminating valuable insights to combat its harmful effects.
One of the key objectives of the task force is to review past electoral campaigns, analyze their outcomes, and identify the main risks associated with the upcoming European elections in 2024. Through this process, they seek to develop strategies and frameworks to counteract disinformation and safeguard the integrity of the electoral process. Additionally, the task force plans to disseminate best practices from the media and information literacy world. By sharing successful approaches, they hope to enhance media awareness and empower citizens to critically evaluate and navigate the information landscape.
Giovanni Zagni, a strong advocate for democracy and inclusivity, fully supports this initiative. He emphasizes the need for a democratic and inclusive approach in addressing disinformation, ensuring that the diverse issues faced by each country are properly represented. Zagni highlights the task force’s role in facilitating the exchange of best practices and experiences in combating disinformation, thereby enhancing the effectiveness of efforts to promote peace, justice, and strong democratic institutions.
In conclusion, the establishment of the new task force by EDMO represents a significant step in addressing disinformation during the 2024 European elections. Building on the success of the previous task force, they aim to develop comprehensive strategies to tackle disinformation, review past electoral campaigns, and disseminate best practices. With the support of individuals like Giovanni Zagni, the task force aims to foster a democratic and inclusive environment where diverse issues are adequately considered. Through these collective efforts, they hope to reinforce media literacy, combat disinformation, and uphold the integrity of the electoral process.
Caroline Greer
TikTok actively participates in the Code of Practice on Disinformation, taking a leading role in developing structural indicators. They, along with other platforms, recently published their second reports on tackling disinformation. As a signatory of the Code of Practice on Disinformation, TikTok co-chairs the election working group, demonstrating their dedication to addressing disinformation during elections.
TikTok advocates for a multi-stakeholder approach to combat disinformation, promoting partnerships with fact-checkers, civil society, and other actors. They are part of a larger ecosystem that encourages collaboration in combating disinformation.
To ensure the integrity of elections, TikTok has a comprehensive global election integrity program in place. They work with local experts for each election and provide authoritative information about the election on their platform. Additionally, TikTok collaborates with external partners to gather additional intelligence.
TikTok has a strict policy against political advertising, which it has upheld for several years, and restricts the activities of political parties and politicians during elections, including campaign fundraising.
TikTok runs media literacy campaigns to promote critical thinking and verification of information. They sometimes partner with fact-checkers to enhance the effectiveness of these campaigns.
TikTok applies community guidelines globally, which help create a safe and inclusive environment for users.
In response to the Russia-Ukraine war, TikTok has implemented special measures to mitigate the spread of harmful content.
TikTok offers features to enhance user experience, such as the ability to refresh the content feed for a broader range of content. They have also introduced a second recommender system as required by the Digital Services Act, which presents popular videos based on the user’s location.
The Digital Services Act (DSA) plays a crucial role in promoting transparency in online platforms, including TikTok. Platforms must provide a detailed explanation of their recommender systems and reasons for any action taken. Users have the right to appeal platform decisions, and transparency reports are published to provide insights into content moderation practices.
In summary, TikTok actively engages in combatting disinformation, ensuring election integrity, promoting media literacy, and enhancing user experience. They adhere to policies and regulations such as the Code of Practice on Disinformation and the Digital Services Act, upholding transparency and fostering trust. Through collaboration and effective measures, TikTok creates a safe and engaging platform.
Albin Birger
The European Union (EU) is taking comprehensive action to combat disinformation. This includes implementing measures in three key areas: legislation, external actions, and communication. The EU institutions, such as the Commission and the European External Action Service, reflect these actions in their institutional architecture. Albin Birger represents DG Connect, the Directorate-General of the European Commission responsible for legislation regarding disinformation.
The EU is strengthening its regulatory framework with the introduction of the Digital Services Act (DSA), which mandates that online platforms be accountable for content moderation, advertising, and algorithmic processes. The Commission has been granted extensive investigatory and supervisory powers under the DSA.
Furthermore, the Code of Practice on disinformation, a voluntary and industry-based measure, plays a significant role in combating disinformation. Established in 2018 and strengthened in 2022, the Code aims to reduce financial incentives for those spreading disinformation and empower users to better understand and report disinformation content.
The EU is particularly focused on addressing disinformation related to electoral processes. To tackle this issue, a specific working group has been established. This group aims to exchange information and develop actions that can be implemented during elections to effectively counter disinformation-related risks.
The European Digital Media Observatory (EDMO) also plays a crucial role in the EU’s fight against disinformation. This observatory supports the development of a multi-disciplinary community of independent fact-checkers and academic researchers. EDMO operates as a central system, with national or regional hubs covering the EU territory and population. Additionally, EDMO has a specific task force for elections that carries out risk assessments ahead of European elections.
The DSA adds an additional layer of accountability for large online platforms, introducing mechanisms to audit the data and information provided by these platforms. Failure to comply with DSA obligations may result in enforcement measures and fines based on a percentage of the platform’s global turnover.
While signing the code of practice is voluntary for online platforms, it serves as a tool to demonstrate their compliance with DSA obligations. Even if platforms choose not to sign, they can still align their actions with the expectations outlined in the code of practice.
In conclusion, the European Union is taking comprehensive action against disinformation through legislation, external actions, and communication. The implementation of the Digital Services Act and the Code of Practice on disinformation provides a framework for accountability and empowers individuals to combat disinformation. The EU’s focus on tackling disinformation related to electoral processes, along with the support of the European Digital Media Observatory, further strengthens its efforts in this area.
Giacomo Mazzone
This town hall meeting focused on the upcoming European election in 2024 and the measures being taken to secure the elections and minimize interference. Representatives from the European Commission, the European Digital Media Observatory (EDMO), the regulatory body ERGA, TikTok, and civil society were present.
The European Commission, as the main proponent of this initiative, discussed the broader framework of the election and the role of independent regulators. They emphasized the importance of securing the elections and minimizing interference while enabling voters to freely express their views.
EDMO, responsible for tackling disinformation, addressed concerns from other regions about the creation of a “minister of truth.” It was clarified that the involvement of independent regulators, such as ERGA, ensures a multi-stakeholder approach and prevents any monopolization of truth.
A representative from civil society questioned the effectiveness of self-assessment reports from big tech companies in preventing social harm on digital platforms. They discussed additional measures and actions that need to be taken for better results.
TikTok’s representative highlighted the platform’s commitment to preventing harm and maintaining a safe environment during the elections. They emphasized the responsibility of platforms like TikTok to proactively address harmful content and uphold the integrity of the democratic process.
The issue of what happens if large platforms refuse to comply with the code of practice was also discussed. The European Commission representative addressed this concern and assured that remedial actions would be taken to prevent significant harm.
Research in the field was another topic raised in the meeting. The EDMO representative acknowledged the importance of research in understanding and addressing election security and disinformation.
The meeting briefly discussed concerns about European citizenship modules and their impact on the election process. The need to address these concerns and provide clarity was mentioned, though no specific solutions were discussed.
Overall, the meeting aimed to provide valuable insights into securing elections, minimizing interference, and combating disinformation during the European election in 2024. The multi-stakeholder approach, involving the European Commission, regulators, platforms like TikTok, and civil society, demonstrated a collective commitment to ensuring the integrity of the electoral process.
Session transcript
Giacomo Mazzone:
Yes, okay. So thank you, everybody, for being with us. Thanks to the people in Europe who are with us even though they woke up not long ago. So this is a town hall meeting dedicated to a very specific topic, a regional topic let’s say, but one that we hope could be a learning experience for other regions of the world. As you know, it is dedicated to the European elections 2024, which will take place in June next year, and to the measures that have been put in place by the European Union and other stakeholders in order to secure these elections and make them a normal process, with not too much interference, or at least where the interference that does happen is reduced to the minimum in terms of impact on the freedom of the voters to express their views and their opinions. To discuss this complex topic we have many of the actors working this complex machinery to try to ensure the security of the elections. We have two representatives of the European Commission, because the European Commission is the main actor that promoted this initiative. One is with us here in the room in Kyoto, Esteve Sanz, and another one is in Brussels, Albin Birger if he’s German or Berger if he’s French, we don’t know, this is the beauty of Europe. And then we have the chair of the task force that the European Union, through EDMO, has put in place to deal with disinformation issues during the elections, Giovanni Zagni. We have Paula Gori in Florence, the person behind EDMO, the European Digital Media Observatory, which is the body charged by the European Union with putting this task force in place. Then we have a representative of ERGA, because, as I said, this is a multilateral, multi-stakeholder effort, so the regulatory bodies play a very important role; we have Stanislav Matejka from Slovakia with us, who will explain the role of the regulator in that. And of course, last but not least, we have two other important components: the industry, represented here by Caroline Greer from TikTok, welcome Caroline, and civil society, represented by Erik Lambert in Rome. So an all-European panel, but very multi-stakeholder in its composition. I would start by giving the floor to the initiator of the process, the European Commission, because this initiative is not a stand-alone initiative but part of a larger framework that was mentioned, by the way, the other day by Commissioner Jourova, who was here at the opening, and Esteve was with her. Please, Esteve.
Esteve Sanz:
Thank you so much, Giacomo, for inviting the European Commission to this event. I will give the floor very quickly to my colleague Albin, who is a real expert on disinformation for the Commission; I’m the head of Internet Governance in DG Connect. From the point of view of Internet governance, of course, the IGF is a critical institution for us in the multi-stakeholder system, and when it comes to disinformation, which is such a crucial development in the societies that we live in, what we have seen precisely these days is how good a platform the IGF is for discussing these critical issues. This testifies to the health of the IGF, to how the IGF is really ready to discuss all these critical issues in ways that are very concrete and very substantial. And you mentioned that our Vice President was indeed here, which also testifies to how important the IGF is for the European Commission. She had the chance not only to participate in the high-level panel on disinformation, but also to exchange with the multi-stakeholder community, with all stakeholders, including governments, about disinformation, and I can tell you that there seems to be agreement that this is a very strong concern in every country that we had the possibility to exchange with. There are similar, very clear disinformation campaigns going on that potentially relate to electoral processes, and really the IGF, I think, provided the VP a very good venue to take the pulse of this global phenomenon, which is for sure not only European and which is really having an impact across the globe. I would maybe just recall one of the things that the VP said during this high-level session on disinformation before Albin comes to more concrete aspects of how the Commission is tackling the phenomenon. The VP put a lot of emphasis on the definition of disinformation. She said very clearly that everything we do in the EU policymaking process on disinformation starts with a definition, which is basically that disinformation has to be intentional. It is something that happens because some actors engage intentionally in a disinformation action or disinformation campaign. And what she said as well, something that has also been part of the overall discussion at this IGF, is that with generative AI these intentional elements of disinformation are basically amplified, up to the point where producing disinformation campaigns is becoming extremely cheap. With that phenomenon, the alertness of policymakers on the issue should be all the higher, as it is in the EU. In the EU, and she was very clear about that, there is of course a human-centric approach to technologies and digital policies, and this is also involved in this process. On the one hand, she said, we don’t give rights to AI. We don’t give free speech rights to AI. We don’t give copyright to AI. But at the same time, we do give tools and rights to citizens in relation to this phenomenon. And she emphasised how positively the European Commission views measures, including in the AI Act, for watermarking AI-generated content, to basically help users and citizens identify when content has been produced by these technologies. So overall, it was very powerful to have the VP here. Of course, she could only engage in general messages about our policymaking process.
I think that my colleague Albin will provide you more concrete aspects of the framework in which the Commission is operating these days on the disinformation landscape.
Giacomo Mazzone:
So Albin, you have been put on the spot. We are hanging on your words. Thank you very much. Good morning. Ask me for the slides.
Albin Birger:
I have here the tool for advancing the slides. Okay. Thank you very much. Indeed, a pleasure to be with you. Thanks, Esteve. Thank you, Giacomo, for the introduction and for having me. Indeed, we could move to the presentation. I cannot see it right now here, but I leave it up to you. As you said, Esteve, the idea is to set the scene on how the EU is addressing disinformation. Basically, in very broad terms, the EU is taking action in three different fields, which essentially reflect the institutional architecture of the EU institutions: on the one side you have the Commission, but also the European External Action Service. Three fields: legislation, external actions, and communication. Again in broad terms, and due to the division of tasks by policy area within the European Commission, each Directorate-General takes care of a different field; I represent DG Connect today. When I mention legislation, and I’ll come back to that in more detail in the next slide, probably, I’d like to just give you a brief overview of the other aspects first. Regulation, legislation, or co-regulation when it comes to the Code of Practice on Disinformation; regulation being the Digital Services Act and the Digital Markets Act. But also, in broader terms, when it comes to funding projects, we are supporting the European Digital Media Observatory, EDMO, and Paula will talk about that in a moment, I suppose. When it comes to the EEAS, well, for many years the EEAS has been addressing and tackling foreign information manipulation and interference, that is, basically, how external actors may affect the discourse or public opinion in the EU. A number of tools have been set up, putting in touch the European institutions and the member states, but also wider stakeholders at the international level, to ensure or seek a more systematic information exchange with those stakeholders, be it the G7, be it NATO, and in other forums. The EEAS also has a very operational division, STRATCOM, which looks in more detail into data analysis and media monitoring to identify and expose cases originating in media or covert influence operations by external state or non-state actors. Finally, from the Commission side, which is a bit more internally oriented but worth mentioning, the aim is also to address or pre-bunk possible narratives that are developing in various policy areas and to put in touch the various responsible DGs per policy area, be it climate change or migration; DG Communication seeks to establish communication channels and possibly also pre-bunk or debunk developing narratives. On the next slide, I’ll get into more of the DG Connect part that I mentioned. The Digital Services Act, adopted and currently in force, is the new EU regulation establishing standards for the accountability of online platforms regarding illegal content, disinformation, and other societal risks: accountability for how they moderate content, for advertising, and for algorithmic processes. Through this, very large online platforms and very large online services, sorry, search engines, have to address the risks related to disinformation, and the Commission is equipped with wide-ranging investigatory and supervisory powers. Linked to that, but in a sense not regulation as such, is the Code of Practice, which is a self-regulatory and voluntary tool that is not totally new.
It was established in 2018, then revamped and strengthened in 2022, and it really is the industry’s attempt to establish commitments and measures at a granular level to address the various aspects that are pertinent when one aims to tackle the disinformation phenomenon. Here I mention a few areas or chapters of the Code. Demonetisation: the aim is to cut financial incentives for purveyors of disinformation, so signatories in that field take commitments to avoid, for instance, the placement of advertising next to disinformation content on their services, or to avoid disseminating advertising that contains disinformation or links to disinformation sources. Fact-checking and access to data for researchers are also important fields. One should add user empowerment, through tools and initiatives that help users better understand, identify, and flag disinformation content. One could also mention integrity of services, which is basically what was already mentioned a bit earlier by Esteve: for instance, preventing manipulative behaviours on their services in the form of deepfakes or AI-generated content. The core regulatory aspect, if you want, of the Code of Practice is an important innovation that links to the DSA, in the sense that for certain signatories of the Code, the major online platforms, the Code of Practice aims to become a code of conduct under the DSA, basically a possible means for them to demonstrate that they comply with their obligation to mitigate risks. And finally, the last but not least pillar of our approach is indeed EDMO, the European Digital Media Observatory. Through EU financing, we support the development of a cross-border, multidisciplinary community of independent fact-checkers and academic researchers. This comprises, if you want, a central system and digital platform, combined with national or regional hubs covering the EU territory and population. But I think I will leave it to Paula to get into that more specifically. The last slide, if you may. Getting more into EU elections: we have an important calendar ahead in the EU, as you already mentioned, with national elections culminating in the European elections in spring, and, as part of the Code of Practice but more generally as part of the enforcement of the DSA as well, an important part of our work will be to focus on countering disinformation-related risks in these election periods. When it comes to the Code of Practice, this has prompted the signatories to strengthen the exchange and setting out of all the actions they are expected to take during elections, so we established a specific working group. To tie in with what was said earlier about generative AI being a challenge, including in that particular context, there will also be work carried out on that, in particular with dedicated subgroups. So this illustrates again our multi-stakeholder approach, as everyone has responsibilities and tools and needs to take up the fight against disinformation. The same applies, if you want, to the EDMO task force on elections, which has been set up to carry out a risk assessment ahead of the European elections and to foster participation at a more expert level on those aspects. And being mindful of time, I think I will leave it here, but I am happy to take any questions during the discussion. Thank you very much.
Giacomo Mazzone:
Thank you very much, Albin. Now Stanislav Matejka who, as I said before, is the representative of the regulators, and who within ERGA has a specific task linked to the election process. So Stanislav, one question just to introduce you. When my colleagues from other regions of the world hear about this initiative, they say: but this will mean creating a minister of truth. And I say no, because we have other stakeholders on board and we have the regulators, independent regulators, who are the arbiters of this process. So could you explain better what your role is, because this is a key question for the other regions of the world. Thank you.
Stanislav Matejka:
Thank you very much, Giacomo. And thank you, Albin, for setting the scene here so that I don’t have to go into details that you have already described. First of all, I should say that ERGA stands for the European Regulators Group for Audiovisual Media Services, which is an expert European body that focuses on effective implementation. Specifically, the body was created to enforce and implement in the most effective way the Audiovisual Media Services Directive. So this is the core mission of the whole group, and its members are charged with enforcing both European and their own national legislation when it comes to media regulation. For several decades, audiovisual media regulators have been focusing on broadcasting, by which basically everybody means television and radio, but over the last decade we have also entered the field of digital media. The media regulators cover rules for advertising in general, and political advertising in particular is something very important on the agenda of the media regulators in Europe. Building on this, we have started to look at the protection of election integrity, and we have put it high up on our agenda as well. We have focused on this issue ever since the first code of practice came into existence, as Albin mentioned, in 2018. ERGA was tasked by the European Commission at the time with monitoring the initial code of practice, and it published several reports. We focused a lot on political advertising, because there is a lot of interest and a lot of expertise on the side of the regulators. So those of you interested in the role of regulators and their assessment of the initial code of practice can go to the ERGA website and read all the findings we arrived at together. Now, since the code was revamped in 2022, as Albin mentioned, ERGA has taken a very active role both in providing expertise to the Commission in the negotiations of the code of practice and in coming together with the industry and other stakeholders, fact-checkers and researchers, in the newly created task force under the Code of Practice on Disinformation. And I feel that this is a very important step in actually making the multi-stakeholder approach a reality. So we are sitting around the table as regulators together with EDMO, with the European Commission, with the industry, and with the fact-checkers and researchers. In this multi-stakeholder approach there are many independent bodies, independent from government and from industry, such as regulators, researchers, and fact-checkers, that have a say in how the code of practice should look in the first instance, and then in how it should be understood, interpreted, and implemented, not necessarily enforced, since it is self-regulation, as Albin mentioned. Our approach, as ERGA, to the protection of the integrity of elections, and this is my interpretation of the code of practice, is through transparency. The code of practice introduces transparency obligations, or at least commitments, to publish transparency reports on the measures taken by the platforms to protect elections or to fight misinformation and disinformation in general. One more area that we focus on a lot. I hope you can hear me, my video just froze. Yes, we can hear you. You are frozen, but we can hear you. I’m not sure why, let me just check.
But I wanted to talk a bit also about the upcoming regulation on the targeting of political advertising. This is not yet in force, it is still in the legislative process, but ERGA is taking an active role in providing expertise there as well. When it comes to the measures that we as ERGA would propose for protection against misinformation, specifically in the context of elections, we very much appreciate the focus on transparency; this new regulation, together with the code of practice, concentrates on this aspect very much. For example, the regulation introduces obligations to publish transparency notices next to political advertising, so that users and citizens in the EU can be informed that the advertising they are seeing on online services is in fact political advertising. Another measure that we focus on is monitoring the effective enforcement of the platforms’ own policies on what is known in certain circles as TTPs, tactics, techniques and procedures: manipulative behaviour, coordinated inauthentic behaviour or content. All of these areas are now actually well covered in the terms of service of most of the very large online platforms, and our role within this whole context is to oversee and monitor how effective they are in enforcing their own policies. Then, as a measure to protect elections against misinformation and manipulation, we want to see functioning reporting mechanisms for regulators and researchers, or anybody actually, citizens, to report or flag to the platforms that there is misinformation happening at the moment, and that we want them to look at it and enforce the policies in place. This is very relevant for regulators. Another, maybe a bit technical but still very important, issue is of course effective repositories of political advertising, to be scrutinised by the regulators and by independent researchers as well. One very key area, and I think Paula will probably also touch upon this, is access to data. This also has a link to the DSA, the Digital Services Act. Access to data is crucial for public scrutiny through independent research, and I think this is key for us as regulators as well, to have the input from the research community to inform the enforcement of the regulatory framework that we have in place. I will stop here and apologise for the missing camera; apparently, I can’t connect the camera back. Sorry for that.
Giacomo Mazzone:
Thank you, Stanislav. So, you deny being the Minister of Truth in this case. Very much. If it’s not you, then probably it will be Paula who has to play this role. She is our next speaker. Can you hear me well? Yes, and we can see your slides. Super.
Paula Gori:
Thank you very much. Spoiler: I’m not the Minister of Truth, and I’ll tell you why. Hello, everybody. I’m Paula, the Secretary-General of EDMO, the European Digital Media Observatory, which was already mentioned by my colleagues on this panel. Why do we need this observatory? There is agreement, and it was also mentioned previously, that when it comes to disinformation, a multi-stakeholder and multidisciplinary approach is needed. There is no single solution; there are many different approaches and solutions that together build, if you want, a macro solution. Let me just mention some keywords that show how important this approach is; some of these concepts were already mentioned by the other colleagues. Think of human rights, research, AI, fact-checking, content moderation, media literacy. But I am also adding other things here: for example, the role played by emotions in the sharing of disinformation, or the impact that visuals have compared to text, and the fact that, of course, we have to analyse the data. These are just a few keywords to show you how many kinds of expertise we need. When I go back, for example, to emotions: I have a legal background, so I cannot give you evidence of the role played by emotions, but neuroscientists can. And the fact-checking organisations, and later on you will hear from Giovanni, don’t work in silos; they work with other experts, and what they produce is very important for citizens, for research, and so on. As you see, there are many different fields, many different experts, and many different kinds of expertise. That is basically the idea behind EDMO. We have a platform, EDMO, which is funded by the European Commission but, as was previously said, acts completely independently. The platform gathers the stakeholders and the expertise I was mentioning previously and, when possible, also provides evidence and tools. So basically, think of a big platform where all the experts come together and where tools are offered to those experts to work in the best way. These are the partners of EDMO: we are a consortium of different organisations, and our main activity focuses on fact-checking, academic research, and media literacy. How do we do that? We have secure collaborative platforms for this community, a secure online place where they can gather, share best practices, and work together. We work on maps and repositories, for example of scientific articles, or we map the media literacy initiatives in the EU member states, so that the experts can have comparable data or can access evidence of interest to them. We are working, and I will get back to that, on a framework for accessing the data of the online platforms for research purposes. We have a training programme; our trainings are all online and free of charge, on specific topics and on disinformation. We carry out policy analysis, we have specific task forces, and so on. Let me now focus on the main activities. For fact-checking, here are just a couple of examples, but let me say that we have a network of fact-checkers that apply to join the network and must respect a given set of criteria that we have identified.
This network of fact-checkers is really something very precious because, I mean, it probably looks like something very obvious, but when you have a network you can really advance to the second level, if you want. Let me give an example, again about the war in Ukraine. The moment the war started, the fact-checkers were united in sharing the disinformation they were detecting in their countries, and they were informing the other fact-checkers in the other countries. That helped us a lot. I have to say here that my colleagues at Pagella Politica are doing incredible work in coordinating, that is, in gathering the disinformation narratives detected in the member states and sharing them with the others. And considering that we know fact-checking is most effective when it is done in the first 24 hours, the fact that you can count on a colleague in another member state saying, you know what, here in my country today we realised that there is this disinformation narrative, be prepared, it may arrive in your country as well, you can understand how important such a network is. Here you see our database, which is regularly updated with the disinformation debunked by the fact-checkers in our network. Out of the work they are doing, we also publish monthly briefs in which we recap the main disinformation narratives detected in a given month in the EU. On media literacy, for those who are familiar with it, this is a very large field because it involves many different actors, has different target audiences, and so on. So here what we are trying to do is to put some order, if you want, precisely because it is implemented by so many different actors, with different techniques and different approaches. We started by mapping the main media literacy initiatives in the various countries, and what we are working on a lot now is, for example, understanding how to assess the impact of a media literacy initiative: not only whether a media literacy initiative was implemented, but whether it actually had an impact, whether it was, if you want, useful for society. And then there is research, and Stanislav already mentioned the importance of accessing data. For those who are less familiar with it, we are talking about data that the online platforms hold on their users, which could be accessed for research purposes to understand, for example, various behaviours or trends, or how disinformation spreads and who spreads it, and so on. Of course, this needs to be done in full respect of the GDPR, and this is why EDMO had a working group that released a report including a code of conduct on the basis of which this access could be given. What we are doing now is thinking about how we could structure an independent intermediary body that would, on one side, vet the researchers asking for access to those data and, on the other side, ensure that this access is given and that everything goes as it should.
This whole work is chaired by Dr Rebekah Tromble, and it is indeed something that, when we talk to international stakeholders, is seen with very much interest, because it looks like something quite new in the sector. We really hope that this work is helpful, especially considering that, as Albin was saying, the Digital Services Act now provides that this access needs to be given. And then you see here, for example, the repository I was mentioning previously, which includes scientific articles on disinformation. Here again, it is multidisciplinary, so we have many different approaches to the topic. Then, as I was mentioning, the policy debate: EDMO, as part of this whole European strategy to tackle disinformation, is also part of the task force within the code of practice, together with ERGA, for example. The aim of the task force is basically to make sure that the code of practice keeps being aligned with developments and also, if you want, to work on some implementation parts of the code. One of the main tasks for EDMO within the code of practice is actually to propose structural indicators. Structural indicators are indicators that help us understand whether the code is indeed having an impact on the information ecosystem and reducing disinformation or not. And then, of course, we also produce policy analysis; I mention some examples here. Last but not least, as was also mentioned, our hubs. We are lucky enough to have hubs covering all member states. These are either national or multinational, and they are really the doers, in the sense that they implement media literacy initiatives in their countries, do their local research, are in contact with the national regulatory authorities, and do local fact-checking. So we at EDMO, as a platform, gather what they are doing, and this is really what we think is the added value of such a platform, because, as we know, disinformation has no borders, but there are clearly local specificities: how it spreads, which messages are more impactful, and so on. So having on-the-ground experts in all member states is really a plus for a platform like EDMO. And I think I will now pass it over to Giovanni, because, as was already mentioned, we have established a task force in view of the European Parliament elections next year, and Giovanni, who chairs this task force, will tell you more about it. Thank you very much.
Giacomo Mazzone:
Thank you, Paula.
Giovanni Zagni:
Thank you. Let me start by introducing a similar effort that was conducted in the context of the war in Ukraine and that was already mentioned by Paula before me; I’ll give a couple more details. On March 3rd, 2022, the European Digital Media Observatory, EDMO, established a task force on disinformation and the war in Ukraine. It was chaired by Dr. Claire Wardle, and it included 18 members representing academia, journalism, media, and civil society. The task force met weekly for three months to discuss developments and trends relating to disinformation in the context of the war in Ukraine and to design and steer different projects, some of which were just mentioned before me. Considering the mission of EDMO, the work of the task force did not focus primarily on the security or foreign interference aspects of disinformation related to the war, but rather on understanding the phenomenon more generally, for example by focusing on the analysis of content circulating in those weeks, by examining the role of public interest journalism, and by researching efforts to build resilience across societies. The task force published three statements about urgent issues related to the war in Ukraine, concerning cyber security, foreign propaganda disguised as fact-checking, and the mental well-being of disinformation investigators, as well as a final report listing 10 recommendations for policymakers, technology companies, newsrooms, and civil society, based on the observations, research activities, and discussions carried out in the previous three months. In addition, the task force facilitated the circulation of other content, for example monthly briefs on detected disinformation and specific cooperative investigations produced by the EDMO fact-checking network, which were very much appreciated by many stakeholders, including institutional ones. With the 2024 European elections approaching, EDMO decided to replicate the experience in some sense, and in January 2023 its executive board established a new task force, this time with a focus on the elections. The task force’s composition, and this is a partial difference from the previous one, closely reflects the network of national and regional EDMO hubs, with one representative from each, plus three members from the advisory council. The total number of members is again 18, and it reflects EDMO’s role as a multidisciplinary and multi-stakeholder platform to support and coordinate activities between the relevant expert communities. Among the members there are representatives from the media, fact-checkers, academics, and policy and media literacy experts. This task force is carrying out one line of activities focused on the past, one on the present, and one on the future. About the past: we have been reviewing the electoral campaigns that took place over the past year around Europe, about a dozen, in order to understand the most relevant disinformation narratives at the national level and the dynamics of what happened then. We hope that these insights will be useful ahead of next year’s elections, since the European elections can also be interpreted, and in many countries are actually perceived, as the sum of 27 different national ones. Secondly, about the present: the EDMO hub representatives in the task force were asked to contribute an overview of the main risks they see stemming from their own country or region in relation to the elections.
The result of this ongoing exercise will be a preliminary risk assessment report, to be published by the end of the year, that will point out the main issues we can reasonably foresee ahead of the elections. But the European parliamentary elections are still eight months ahead, after all, so a good deal of what the task force will be called on to do lies in the future. To better prepare, the EDMO fact-checking network is starting to collect information on mis- and disinformation trends regarding Europe, and at the same time the task force is engaging in a challenging round of consultations with other stakeholders that are monitoring the elections in Europe, including institutions and civil society organizations. With this, it also plans to facilitate the dissemination of best practices and useful experiences from the media and information literacy world. The overall goal is to tackle the issue in a democratic and inclusive way, giving proper representation to the diversity of issues on the ground. For that, we will need the cooperation of the expert community, but also of technological platforms and civil society organizations. And this is a nice segue to who's coming after me. Thank you.
Giacomo Mazzone:
So, we are still looking for the Minister of Truth; apparently we cannot find one. Eric, you represent civil society, and I see that you put some questions in the chat. But before giving the floor to you, I want to read one question in the chat from Kete Van, who says: new and upcoming EU regulations focus more on preventing potential social harm when it comes to digital platforms, but still mostly depend on self-assessment reports from big tech companies. Do you think that this approach will be effective? What more could be done in this direction? So, probably this is a question that is close to what you want to say.
Erik Lambert:
Yes, thank you, Giacomo. A word about Eurovisioni, which I'm representing. Eurovisioni is an association of Italian origin, European in scope, mostly interested in the idea of public service in the media, starting from television but now looking at what public service means in the era of the internet and social platforms. One question I have concerns regulation, because I'm listening with great interest to the presentations. As I said at the beginning, the European Commission, in its regulation, and especially in terms of artificial intelligence, is looking not at limiting freedom of speech for individuals, but at limiting coordinated activities from a foreign power or specific groups that try to influence and manipulate public opinion. But we are confronted with a phenomenon that seems to be very strong among the younger generation. If you follow the results of the Reuters Institute survey, younger generations don't trust the social media platforms as we knew them, the old Twitter, for example, or Facebook, any more than the old media. They trust much more the new forms like TikTok, which are based on personal experience, where it doesn't seem at the moment that any actor has been able to create, strictly speaking, coordinated inauthentic activities. If this trend continues, many of the approaches described here could be insufficient to counter waves of disinformation, waves of false narratives, if those narratives come from the direct perception of users uploading short recordings of their own life, of their own perception. So this is the problem of the necessary evolution of our view of how public opinion is formed. We are no longer in the 19th or the 20th century; the 21st century seems to be changing the way public opinion is formed.
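Erik's distinction between individual speech and coordinated manipulation is what detection systems try to operationalise. As a purely illustrative sketch, and not any platform's actual method, coordination is often approximated by looking for clusters of distinct accounts posting near-identical content within a short time window; the thresholds and data below are invented.

```python
# Illustrative heuristic for coordinated inauthentic behaviour: many distinct
# accounts posting the same message within a short window.
from collections import defaultdict

WINDOW_SECONDS = 300   # 5 minutes (assumed threshold)
MIN_ACCOUNTS = 3       # minimum cluster size to flag (assumed threshold)

def find_coordinated_clusters(posts):
    """posts: list of (account_id, timestamp_seconds, text) tuples."""
    by_text = defaultdict(list)
    for account, ts, text in posts:
        by_text[text.strip().lower()].append((ts, account))
    clusters = []
    for text, items in by_text.items():
        items.sort()                      # order by timestamp
        accounts = {a for _, a in items}
        span = items[-1][0] - items[0][0]
        if len(accounts) >= MIN_ACCOUNTS and span <= WINDOW_SECONDS:
            clusters.append((text, sorted(accounts)))
    return clusters

posts = [
    ("a1", 0,   "Vote X, the only honest party!"),
    ("a2", 60,  "Vote X, the only honest party!"),
    ("a3", 120, "Vote X, the only honest party!"),
    ("b1", 500, "Lovely weather in Kyoto today."),
]
print(find_coordinated_clusters(posts))
# [('vote x, the only honest party!', ['a1', 'a2', 'a3'])]
```

Note how Erik's point survives the sketch: organic, first-person videos by real users would not trip a coordination heuristic like this one, which is exactly why he questions whether such approaches suffice.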
Giacomo Mazzone:
Thank you. Thank you very much. You put some of the questions on the table, but now let's go to the last speaker of the session, who is Caroline Greer. You have been put on the spot by many. Everybody refused to take the role of Minister of Truth, so does this mean that the platform is the Minister of Truth?
Caroline Greer:
Absolutely not, and I'm not sure who to pass that one to after me, but no, of course we're not the Ministry of Truth. But yeah, good afternoon everyone, good morning from Brussels; really sorry not to be in Kyoto. Really interesting discussion. Maybe just a couple of words on the infrastructure and the environment that was described by institutional colleagues. TikTok is a signatory of the Code of Practice on Disinformation, and a very active signatory: we are co-chairing the elections working group that was mentioned, and we took a leading role in the work on structural indicators. All platforms published their second reports a couple of weeks ago; you will find them on the Transparency Centre website, disinfo-code.eu. TikTok's alone has more than 2,600 data points and many, many pages. So there's a lot of meat there, and if anybody wants to delve deep into how we tackle disinformation and elections as part of that, the report is there for the reading. But all to say, we really appreciate this ecosystem, this infrastructural model that has developed around disinformation, because, as was said, I think, by the first speaker, we really think that this is a multi-stakeholder effort. It's a really dynamic, complex area to tackle, and certainly we have a big role as platforms, but we can't do it alone. We need the support of fact-checkers, of civil society, of other actors within the ecosystem. So it is really important that we're coming together under the auspices of the Code. I thought I would just say a few words about how we at TikTok tackle elections, since that's the subject of the panel, and let you know what happens at a grand level, as it were. TikTok has a global election integrity program, but we also add a layer of local flavour to that. We work with local experts for each election, because we really feel we need the expertise. While there is a template, if you like, of things that we do in each election, obviously each election comes with its own flavour, its own nuances, its political sensitivities, cultural sensitivities, et cetera. So that local approach is really important. Planning for any election begins many, many months in advance. We have an election calendar; obviously elections are happening globally, so we're working around the clock, basically, always moving on to the next election. We have the Polish elections coming up this weekend, and we've just been through the Slovakian ones. So there's always an election, and obviously next year is going to be a huge year: the EU elections, 27 countries all at once, plus UK and US elections. It's going to be a very busy year. So what do we do as TikTok? Number one, we have election policies. We have our community guidelines, which set out the rules of TikTok, if you like: what you can post, what you can't post, what is appropriate behaviour on the platform. The election policies are a subset of that. For example, we don't allow political advertising on TikTok; that was a decision we took some years ago, and we've stuck with it. We restrict the activity of political parties and politicians around elections, so campaign fundraising, for example, is something that we put the brakes on. The external partners that I mentioned are really important, so we work with third-party organisations that might give us additional intelligence around threats or trends or narratives.
We have our fact-checkers, who are really important partners for us in this work, so we make sure that we're fully staffed up and resourced with our fact-checkers. We have authoritative information about the elections that we put on our platform: for every election we have an election hub, with information about how and where to vote, and we typically link to the national authority that has that authoritative information. We typically run a media literacy campaign, sometimes partnering with the fact-checkers, and our trusted flaggers are really important as well. So it is a multi-faceted approach: a local approach for elections and a cross-functional approach internally. We have more than 40,000 staff working on trust and safety within TikTok, and a large part of them are also working on elections. So I'll pause there; I know we're getting close to time, so thank you very much.
Giacomo Mazzone:
Thank you very much to you. So we are still looking for the Minister of Truth. I would ask the room to be ready to raise questions and, eventually, to take the microphone at the back. I start with a question from the chat that is quite direct and probably for the EU representative: what happens if one of the large platforms refuses to follow the Code of Practice, or even to sign it? Do you have remedies swift enough to prevent substantial harm from being done? Albin, I think this is for you. There is no name attached, but we can imagine who we are talking about.
Albin Birger:
But indeed, the Code of Practice is a voluntary instrument. I mentioned the fact that under the DSA it may become a code of conduct, which then links very much to the enforcement regime of the DSA, but even under the DSA, adhering to the Code of Practice or to a code of conduct will remain a choice for those very large online platforms. Of course, it is one means of trying to demonstrate their compliance with their obligations under the DSA. If they choose not to sign the Code of Practice, or code of conduct, they will still have to be able to demonstrate that the actions they are taking are in the range of those expected and able to mitigate those risks. Of course, monitoring and transparency are key for the Code and for the DSA. This is not an instant tool, but the reporting is regular, the assessment is being done, and the exchanges within the Code take place on a regular basis. Bringing together a number of relevant actors under the umbrella of the Code's task force has precisely the objective of making a decisive step towards addressing, or at least discussing, emerging risks. Again, it is not about being a Ministry of Truth; we are not there to discuss what is true and what is false. It is all about bringing information into context: this might be a risk, so how is it addressed? That is possibly the role of the fact-checkers, who essentially provide context and try to make it understood that a risky narrative might be evolving. From there, indeed, if it has to be enforced under the DSA regime for some of the signatories, the way to address it would be through the DSA enforcement tools, which provide for the Commission to ask for additional information and possibly open investigations into specific observed failures or concerns. And then, to try to answer the second question a little as well: of course this is self-reporting under the Code, and to a certain extent it is also self-reporting under the DSA, where transparency is very much a key objective too. The DSA adds a layer of auditing of the data and information provided by the very large online platforms and search engines, so there you can also expect some additional tools on top of what is in the legislation.
Giacomo Mazzone:
Thank you very much. So this means that you have tools under the DSA. The Code of Practice is a voluntary subscription, so there are the limits of the Code of Practice, but the DSA is mandatory, so especially for the seven platforms that are under observation by the European Commission, they have to respect a certain number of rules. If they don't respect them, they get a warning, and if the warning doesn't produce any effect, then you can enforce through what? Fines? Closing the platform to European citizens? What are the tools, the teeth, that you can use in this battle?
Albin Birger:
In broad terms, indeed, there might be measures proposed, or asked to be implemented by the platforms, to address specific identified risks or concerns. In all eventuality, this could also lead to potential fines which represent, and there I would turn to Caroline maybe to know the exact amount, a percentage of the global turnover of a platform, possibly 5%, but I would leave that to other informed colleagues on the panel.
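For scale: the final DSA text caps fines at 6% of a platform's total worldwide annual turnover (the speaker hedges with "possibly 5%"). A trivial worked example, with an invented turnover figure:

```python
# Worked example of the DSA fine ceiling. The 6% cap is the figure in the
# final DSA text; the turnover amount is invented for illustration.
DSA_MAX_FINE_RATE = 0.06

global_turnover_eur = 10_000_000_000  # hypothetical: EUR 10 billion
max_fine = DSA_MAX_FINE_RATE * global_turnover_eur
print(f"Maximum fine: EUR {max_fine:,.0f}")  # Maximum fine: EUR 600,000,000
```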
Giacomo Mazzone:
Okay, thank you very much. We have a question from the room, so if we can give the mic to the person there.
Audience:
Can you hear me? Yeah, okay. So we are actually searching for the Minister of Truth here, I guess, and my question is posed directly to TikTok, because I know there is a mechanism of bubbling users in some kind of information bubble. How does it work? Basically, the situation is that if you are in Ukraine, you cannot view TikToks from Russia, and vice versa: if you are in Russia, you cannot view TikToks from Ukraine, even if you have a direct link, because if you follow the direct link, you will see some cats-and-dogs video instead of the real content posted by the user from the other country. Somebody could call it censorship; I'm not discussing that, because there are certain pros and cons to this mechanism in the context of Russian-Ukrainian relations. But there is still plenty of shadow in public about how this mechanism and this policy are regulated by TikTok. Can you please shed some light on that? Thank you.
Giacomo Mazzone:
So I guess this is a question for Caroline,
Audience:
but there is another question in the room, please. Yeah, hi, Dan Arnato from the National Democratic Institute. I’m curious, particularly from Edmo’s perspective, but maybe others on the panel, how you are thinking about approaching issues with platform APIs and data access. Generally, you’re seeing kind of de-emphasis of CrowdTangle, you’re seeing restrictions on APIs, X is becoming essentially unaffordable for ordinary research organizations. So I’m curious about that, and also if particularly Edmo has any engagement with accession countries or potentially future partner countries, because we do a lot of work with them and would be interested to hear if you have any coordination or programming there. Thank you. So the first is for Caroline,
Giacomo Mazzone:
the second is for Paul, I guess, unless somebody else wants to intervene. Please, Caroline.
Caroline Greer:
Yes, certainly. So for TikTok, the bible, as it were, is our community guidelines, which are applied globally; you'll find them on our website. I will say that the Ukraine-Russia situation is quite unique. There's a war going on, so we do have some measures there to ensure that we are protecting our users and making sure that the content is appropriate. But this is a very unique situation; ordinarily, our community guidelines and our policies are what apply. You mentioned filter bubbles. We have a couple of mechanisms to try to break through those. You can actually refresh your feed on TikTok: if you feel you're starting to see more and more of a particular type of content, because the algorithm sees that you're engaging with that content and delivers more of it, you can hit refresh and simply reset, almost start again. We've also introduced a second recommender system, which was required under the DSA. This is a recommender system with a non-personalized feed: basically, popular videos in your local area. So these are mechanisms where we try to nudge people away from content if they're falling into a bit of a rabbit hole. I hope that answers the question.
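As a purely illustrative sketch of the difference Caroline describes, and not TikTok's actual implementation, a personalized feed ranks by inferred user affinity, while the non-personalized alternative required by the DSA can be as simple as ranking by popularity in the user's area. Data and scoring below are invented.

```python
# Illustrative contrast between a personalized and a non-personalized feed.
videos = [
    {"id": "v1", "region": "JP", "views": 900,  "topic": "politics"},
    {"id": "v2", "region": "JP", "views": 5000, "topic": "cats"},
    {"id": "v3", "region": "FR", "views": 7000, "topic": "cats"},
]

def personalized_feed(videos, user_topic_affinity):
    """Rank by how much the user has engaged with each topic."""
    return sorted(videos, key=lambda v: user_topic_affinity.get(v["topic"], 0),
                  reverse=True)

def non_personalized_feed(videos, user_region):
    """DSA-style alternative: popular videos in the user's area, no profiling."""
    local = [v for v in videos if v["region"] == user_region]
    return sorted(local, key=lambda v: v["views"], reverse=True)

affinity = {"politics": 0.9, "cats": 0.2}  # inferred from past engagement
print([v["id"] for v in personalized_feed(videos, affinity)])  # ['v1', 'v2', 'v3']
print([v["id"] for v in non_personalized_feed(videos, "JP")])  # ['v2', 'v1']
```

In this toy model, the "refresh your feed" feature corresponds to resetting the affinity mapping to empty, after which the personalized ranking no longer reflects past engagement.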
Giacomo Mazzone:
Thank you. So, Paola, what can you answer to the question about research?
Paula Gori:
Yeah, that was a very good question, because indeed in my presentation I focused on access to personal data, which is, if you want, something that is not happening yet and should happen soon. On public data, indeed, the platforms have different approaches. And as you mentioned, unfortunately, there is one elephant in the room which is charging researchers quite a lot. Of course, this cuts off research projects, because research budgets cannot afford that. Another issue is that, as we have learned, access is very often granted more easily to the big, famous universities than to universities in smaller countries, working in minority languages, and so on. So we are definitely aware of that. What we are doing is, of course, holding regular meetings, and I have to say we really appreciate the platforms here, because together with them we have already started trainings that are accessible online. They explain to researchers how to access their data, what the requirements are, and so on, and they show how it works. We did this also with Meta, on a new product, a user interface I think. So EDMO has a good collaboration with the signatories of the Code and with the platforms in general, and this is something we offer to the research community. Clearly, what is often said is that it is not only about accessing the data, but also about having the infrastructure to manage it. This was also an outcome of the task force that Giovanni mentioned earlier: in the EU, we need the research community to be better equipped, technically speaking, to work on and with this data. So, to sum up what EDMO is concretely doing: we are organizing activities with researchers to gather their feedback and understand how access works; we are working on a table that recaps how you access the data of the various platforms, and we ask our community whether they had trouble following those procedures; and in parallel, of course, there is the work I mentioned on personal data.
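For readers unfamiliar with what "accessing platform data" involves in practice, here is a minimal sketch of a researcher client for a paginated data API. The endpoint URL, parameter names, and authentication scheme are hypothetical; each platform documents its own, which is exactly the kind of variation the table Paula mentions would recap.

```python
# Hypothetical sketch of paginated access to a platform research API.
# The URL, parameter names, and auth header are invented for illustration.
import requests

API_URL = "https://platform.example/research/v1/posts"  # hypothetical endpoint
TOKEN = "YOUR-RESEARCH-ACCESS-TOKEN"                    # issued after vetting

def fetch_all(query: str, max_pages: int = 5) -> list:
    """Collect up to max_pages of results, following a pagination cursor."""
    headers = {"Authorization": f"Bearer {TOKEN}"}
    params = {"q": query, "cursor": None}
    results = []
    for _ in range(max_pages):
        resp = requests.get(API_URL, headers=headers, params=params, timeout=30)
        resp.raise_for_status()
        payload = resp.json()
        results.extend(payload.get("data", []))
        cursor = payload.get("next_cursor")
        if not cursor:          # no more pages
            break
        params["cursor"] = cursor
    return results

# posts = fetch_all("election")  # requires a real endpoint and token
```

Even this toy client shows why infrastructure matters as much as access: real collections run to millions of records, so storage, rate-limit handling, and GDPR-compliant processing all have to be in place on the researcher's side.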
Giacomo Mazzone:
Thank you, Paola. There is one more question in the room.
Audience:
Yes. I'm Chen from the ISOC Taiwan chapter. As everyone can see, the information manipulation situation is getting worse in Taiwan right now; we're facing more and more fake news and disinformation in our online discourse environment. So I think the private sector, the platform service providers, are very key players in this situation, because they are the ones who enforce this kind of regulation and deal with this information. I have two questions. First, is there a way to make the content recommendation algorithm more transparent, so that we can know why a given piece of information or short video gets onto my feed? And second, is there any way to reveal how the online content moderation system, or the team behind it, works? What is the process for online content moderation? And if something happens, like my post or video getting deleted, is there any way to get it back? That's my question.
Giacomo Mazzone:
Thank you. I think that is mostly for Caroline, but I don't know if the Commission wants to add something. Please, Caroline.
Caroline Greer:
So yeah, thanks for those very good questions. I will say the Digital Services Act is here to provide all the answers that you need. On content moderation, and questions that you might have on decisions taken on content: number one, you can appeal any content decision. Under the DSA, platforms need to provide you with a full statement of reasons outlining what action we took, why we took it, and the basis for taking it. And again, you can appeal that if you don't like the information that you see. Not only that, but we need to send that statement of reasons to a European Commission database that is publicly available. So all that information is there; there must be millions of reports in that database, about every single content moderation decision taken by a platform. It's all open there. So information provision and the ability to appeal are in the DSA. Also, at the end of October we need to publish transparency reports, which will outline how we moderate content, giving much more detail around that, including language capabilities, et cetera. This is for the EU region, of course, but maybe other regions will be inspired by this. So more information is coming on that. On the recommender system, again, it is in the DSA: we were asked to provide more information around the parameters of the recommender system, really explaining in a lot of detail how it works. TikTok has a European online safety hub; you can find the link on our website, and we post all that information there. So we want to be as transparent as possible. There's a lot of information being made available under the DSA, and we hope folks take the time to read it, because I think the DSA has done a great job in setting up the rules of transparency and facilitating these transparency efforts by platforms.
Giacomo Mazzone:
So your suggestion to our speaker in the room is that he has to move to Europe to be more protected?
Caroline Greer:
Well, you know, that's a question of the Brussels effect. The influence that some EU regulation has on other global pieces of legislation is also an interesting topic. But yeah, the DSA in the first instance is, of course, for the EU and EEA.
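The publicly available database Caroline refers to publishes platforms' statements of reasons and offers bulk downloads. As a hedged sketch of how a researcher might work with such data locally, the file layout and field names below ("platform_name", "decision_ground") are assumptions for illustration, not the database's documented schema.

```python
# Hedged sketch: tallying a local dump of statements of reasons by the stated
# ground for each decision. File format and field names are assumed; consult
# the database's own documentation for the actual schema and download format.
import json

def count_by_ground(path: str, platform: str) -> dict:
    """Count records for one platform, grouped by decision ground."""
    counts: dict[str, int] = {}
    with open(path, encoding="utf-8") as fh:
        for line in fh:                       # assume one JSON record per line
            record = json.loads(line)
            if record.get("platform_name") != platform:
                continue
            ground = record.get("decision_ground", "unknown")
            counts[ground] = counts.get(ground, 0) + 1
    return counts

# print(count_by_ground("sor-dump.jsonl", "TikTok"))
```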
Giacomo Mazzone:
Okay, he's already getting the forms to apply for European citizenship. Thank you to the speakers. We are very late, we are beyond the schedule, so unless any of you has something urgent to share with the world, this is the last occasion. If not, I thank all the speakers and the people in the room, even if they were hiding far from the camera so that you could not see them; you have heard them at the mic. Thank you very much, and I hope that you have learned some interesting information through this session. Thank you.
Speakers
Albin Birger
Speech speed
129 words per minute
Speech length
1847 words
Speech time
860 secs
Arguments
The EU is taking action against disinformation across three fields, including legislation, external actions, and communication.
Supporting facts:
- Actions are reflecting the institutional architecture of the EU institutions, including the Commission and the European External Action Service.
- Each Directorate-General (DG) of the European Commission is responsible for a different field.
- The DG Connect, whom Albin Birger represents, is taking care of legislation regarding disinformation.
Topics: Disinformation, European Union, Legislation, Communication
The Digital Services Act is a new EU regulation that establishes accountability for online platforms regarding illegal content, disinformation, and other societal risks.
Supporting facts:
- The Act mandates online platforms on content moderation, advertising, and algorithmic processes.
- The Commission is equipped with comprehensive investigatory and supervisory powers as per the Act.
Topics: Digital Services Act, EU Regulation, Online Platforms, Illegal Content, Disinformation
The Code of Practice on disinformation is a voluntary, industry-based measure to combat various aspects of disinformation.
Supporting facts:
- Established in 2018, the Code was strengthened in 2022.
- It aims to cut financial incentives for disinformation spreaders and empower users in understanding and flagging disinformation content.
Topics: Code of Practice, Disinformation, Voluntary Measures, Industry Commitments
Under the Digital Services Act (DSA), the code of practice may become a code of conduct, bringing it under DSA’s enforcement regime.
Supporting facts:
- The DSA is set to add an additional layer of auditing the data and information provided by large online platforms.
Topics: DSA, Online Regulation
DSA provides mechanisms to enforce compliance amongst very large online platforms.
Supporting facts:
- DSA enforcement tools allow the European Commission to ask for additional information and potentially open cases of investigation if there are observed failures or concerns. In case of non-compliance, it may lead to measures and potentially fines, based on a percentage of the global turnover of a platform.
Topics: DSA, Online Regulation
Report
The European Union (EU) is taking comprehensive action to combat disinformation. This includes implementing measures in three key areas: legislation, external actions, and communication. The EU institutions, such as the Commission and the European External Action Service, reflect these actions through their institutional architecture.
Albin Birger represents DG Connect, the Directorate-General of the European Commission responsible for legislation regarding disinformation. The EU is strengthening its regulatory framework with the introduction of the Digital Services Act (DSA), which mandates that online platforms be accountable for content moderation, advertising, and algorithmic processes.
The Commission has been granted extensive investigatory and supervisory powers under the DSA. Furthermore, the Code of Practice on disinformation, a voluntary and industry-based measure, plays a significant role in combating disinformation. Established in 2018 and strengthened in 2022, the Code aims to reduce financial incentives for those spreading disinformation and empower users to better understand and report disinformation content.
The EU is particularly focused on addressing disinformation related to electoral processes. To tackle this issue, a specific working group has been established. This group aims to exchange information and develop actions that can be implemented during elections to effectively counter disinformation-related risks.
The European Digital Media Observatory (EDMO) also plays a crucial role in the EU’s fight against disinformation. This observatory supports the development of a multi-disciplinary community of independent fact-checkers and academic researchers. EDMO operates as a central system, with national or regional hubs covering the EU territory and population.
Additionally, EDMO has a specific task force for elections that carries out risk assessments ahead of European elections. The DSA adds an additional layer of accountability for large online platforms, introducing mechanisms to audit the data and information provided by these platforms.
Failure to comply with DSA obligations may result in enforcement measures and fines based on a percentage of the platform’s global turnover. While signing the code of practice is voluntary for online platforms, it serves as a tool to demonstrate their compliance with DSA obligations.
Even if platforms choose not to sign, they can still align their actions with the expectations outlined in the code of practice. In conclusion, the European Union is taking comprehensive action against disinformation through legislation, external actions, and communication. The implementation of the Digital Services Act and the Code of Practice on disinformation provides a framework for accountability and empowers individuals to combat disinformation.
The EU’s focus on tackling disinformation related to electoral processes, along with the support of the European Digital Media Observatory, further strengthens its efforts in this area.
Audience
Speech speed
142 words per minute
Speech length
526 words
Speech time
222 secs
Arguments
Concern about TikTok’s ‘censorship’ and user information ‘bubbling’
Supporting facts:
- The user mentioned a situation in which TikToks from certain countries (like Russia and Ukraine) cannot be viewed by users in the other country, even with a direct link
- Videos from direct links are replaced with unrelated content like ‘cats and dogs’ videos.
Topics: TikTok, censorship, user information, data privacy
Concerns over platform APIs and data access
Supporting facts:
- De-emphasis of CrowdTangle
- Restrictions on APIs
- Expenses of APIs for research organizations
Topics: Platform APIs, Data Access, CrowdTangle, Research
Engagement with overseas countries
Supporting facts:
- Work done with them
Topics: Overseas Engagement, Partner Countries, Accession Countries
Chen is concerned about the increase in the manifestation of fake news and disinformation in Taiwan
Supporting facts:
- The information manipulation situation is getting worse in Taiwan.
- Private sector platform providers are crucial in enforcing regulations and dealing with this information.
Topics: Fake News, Disinformation, Content Moderation
The speaker wants to make the content recommendation algorithm more transparent
Topics: Algorithm Transparency, Content Recommendation
Chen wants to understand how online content moderation systems work
Topics: Online Content Moderation
Chen inquires if there is a possibility to retrieve a post or a video once it’s been removed by the content moderation system
Topics: Online Content Moderation, Content Recovery
Caroline Greer
Speech speed
175 words per minute
Speech length
1469 words
Speech time
503 secs
Arguments
TikTok is a signatory of the Code of Practice on Disinformation and is co-chairing the election working group.
Supporting facts:
- TikTok is actively involved in the Code of Practice on Disinformation, leading roles in work on structural indicators
- TikTok, among other platforms, published their second reports on tackling disinformation a couple of weeks ago
Topics: Code of Practice on Disinformation, Election Working Group
TikTok has a global election integrity program and works with local experts for each election.
Supporting facts:
- TikTok has a detailed election calendar covering global elections
- For every election, TikTok has authoritative information about the election on their platform as well as external partners for providing additional intelligence
Topics: Election Integrity Program, Local Expertise
TikTok runs media literacy campaigns, sometimes in partnership with fact-checkers.
Topics: Media Literacy Campaigns, Fact-checkers
TikTok applies its community guidelines globally
Supporting facts:
- TikTok’s ‘bible’ is its community guidelines, which are applied globally
Topics: TikTok, Community Guidelines
TikTok has introduced measures in response to the Ukraine-Russian situation
Supporting facts:
- The Ukraine-Russian situation is unique and involves war, requiring TikTok to implement special measures
Topics: TikTok, Ukraine-Russia situation
TikTok offers ways to refresh and reset content feed
Supporting facts:
- Users can refresh their feed to reset the algorithm and see a broader range of content
Topics: TikTok, Content feed
TikTok has included a second recommender system as required by the DSA
Supporting facts:
- A second recommender system that presents popular videos in the user’s local area has been introduced
Topics: TikTok, DSA, recommender system
The Digital Services Act (DSA) provides answers and guidelines about content moderation and appeals.
Supporting facts:
- Under the DSA, a platform needs to provide a full statement of reasons explaining an action taken and the basis for it.
- The platform’s decision can be appealed by users.
- The platform’s statement of reasons must be sent to a publicly available European Commission database.
- End of October marks the time for publishing transparency reports.
Topics: Digital Services Act, Content Moderation, Appeals, Transparency
The Digital Services Act (DSA) requires platforms to offer more details about their recommender systems.
Supporting facts:
- DSA mandates detailed explanation on how the recommender system works.
- TikTok has a European online safety hub where such information is posted.
Topics: Digital Services Act, Recommender Systems
Report
TikTok actively participates in the Code of Practice on Disinformation, taking a leading role in developing structural indicators. They, along with other platforms, recently published their second reports on tackling disinformation. As a signatory of the Code of Practice on Disinformation, TikTok co-chairs the election working group, demonstrating their dedication to addressing disinformation during elections.
TikTok advocates for a multi-stakeholder approach to combat disinformation, promoting partnerships with fact-checkers, civil society, and other actors. They are part of a larger ecosystem that encourages collaboration in combating disinformation. To ensure the integrity of elections, TikTok has a comprehensive global election integrity program in place.
They work with local experts for each election and provide authoritative information about the election on their platform. Additionally, TikTok collaborates with external partners to gather additional intelligence. TikTok has a strict policy against political advertising, which they have upheld for several years.
They restrict the activities of political parties and politicians during elections, including campaign funding. TikTok runs media literacy campaigns to promote critical thinking and verification of information. They sometimes partner with fact-checkers to enhance the effectiveness of these campaigns. TikTok applies community guidelines globally, which help create a safe and inclusive environment for users.
In response to the Ukraine-Russia situation, TikTok has implemented special measures to mitigate the spread of harmful content. TikTok also offers features to enhance user experience, such as the ability to refresh the content feed for a broader range of content.
They have also introduced a second recommender system as required by the Digital Services Act, which presents popular videos based on the user’s location. The Digital Services Act (DSA) plays a crucial role in promoting transparency in online platforms, including TikTok.
Platforms must provide a detailed explanation of their recommender systems and reasons for any action taken. Users have the right to appeal platform decisions, and transparency reports are published to provide insights into content moderation practices. In summary, TikTok actively engages in combatting disinformation, ensuring election integrity, promoting media literacy, and enhancing user experience.
They adhere to policies and regulations such as the Code of Practice on Disinformation and the Digital Services Act, upholding transparency and fostering trust. Through collaboration and effective measures, TikTok creates a safe and engaging platform.
Erik Lambert
Speech speed
118 words per minute
Speech length
324 words
Speech time
164 secs
Arguments
Regulations for AI should not limit freedom of speech but should restrict coordinated activities from foreign powers or specific groups aiming at manipulating public opinion
Supporting facts:
- The European Commission is working on this line of regulation as mentioned by Erik Lambert
Topics: Artificial Intelligence, Regulation, Freedom of Speech, Public Opinion
Necessity to evolve the approach to understand how public opinion is formed in the 21st century
Supporting facts:
- Erik Lambert emphasized on the changing trend of public opinion formation
Topics: Public Opinion, 21st Century, Information Dissemination
Report
The European Commission is currently engaged in the process of regulating artificial intelligence (AI) with a specific focus on preventing the manipulation of public opinion. These regulations aim to curb coordinated activities by foreign powers or specific groups seeking to influence public sentiment.
It is important, however, that these regulations do not impede freedom of speech. According to Erik Lambert, an expert in the field, the younger generation’s trust in social media platforms is shifting. Platforms like Facebook and Twitter, which have traditionally dominated the digital sphere, are experiencing a decline in trust.
Instead, younger people are turning to platforms such as TikTok that offer more personal experiences. This shift underscores the need for social media platforms to adapt and address the concerns of their user base. Furthermore, Lambert emphasizes the importance of understanding and evolving our approach to public opinion formation in the 21st century.
The rise of digital platforms, social media, and the rapid dissemination of information have changed the way public opinion is shaped. It is essential to recognize and adapt to these changes in order to effectively engage with the public and address their needs and concerns.
In conclusion, the efforts of the European Commission to regulate AI and combat the manipulation of public opinion are commendable. However, it is crucial to strike the right balance between preserving freedom of speech and preventing coordinated activities that aim to deceive or manipulate the public.
Additionally, social media platforms must adapt to the changing trends in trust among the younger generation. Finally, understanding and evolving our approach to public opinion formation is essential for effective engagement with the public in the 21st century.
Esteve Sanz
Speech speed
137 words per minute
Speech length
653 words
Speech time
287 secs
Arguments
Esteve Sanz emphasized that the IGF is a crucial platform for discussing critical issues related to disinformation and internet governance, and are impacting globally
Supporting facts:
- The VP of the European Commission attended the IGF, reflecting its importance for the European Commission
- Disinformation is a shared concern of many countries exchanged at the IGF
- The IGF provides a substantial, concrete platform for these discussions
Topics: Internet Governance Forum, Disinformation, Global Impact
Esteve Sanz underlines the fact that the fight against disinformation in the EU starts with a clear definition that it is an intentional action carried out by certain actors.
Supporting facts:
- VP of the European Commission also emphasized on this definition
- The EU’s human-centric approach to digital policies emphasizes on this definition
Topics: Disinformation, Definition, Intentionality
Report
Esteve Sanz highlights the crucial role of the Internet Governance Forum (IGF) in discussing critical issues related to disinformation and internet governance on a global scale. The attendance of the Vice President of the European Commission further emphasizes the importance placed on the forum and the seriousness with which disinformation is being addressed.
At the IGF, countries exchange ideas and concerns about disinformation, demonstrating collaborative efforts to combat its spread and the need for international cooperation. Esteve Sanz emphasizes that the IGF provides a substantial and concrete platform for these discussions. One specific concern raised is the increasing influence of generative Artificial Intelligence (AI) in amplifying disinformation.
Policymakers are urged to be alert and proactive in countering this issue. The affordability and ease with which generative AI can produce disinformation campaigns make it a significant threat. The European Commission is considering measures such as watermarking AI-generated content to tackle this challenge.
Esteve Sanz also emphasizes the importance of a clear definition of disinformation within the European Union (EU). It is argued that disinformation is an intentional action carried out by specific actors. This aligns with the EU’s human-centric approach to digital policies and underscores the need for accurate understanding and identification of disinformation to effectively combat it.
In conclusion, Esteve Sanz’s stance on the IGF underscores its critical role in addressing global disinformation and internet governance issues. The attendance of the Vice President of the European Commission and the exchange of concerns among countries highlight the significance placed on the forum.
The threat posed by generative AI in amplifying disinformation calls for heightened alertness from policymakers. Moreover, a clear definition of disinformation is deemed essential within the EU, reflecting its human-centric approach to digital policies. These insights shed light on the international and regional efforts to combat disinformation and ensure the integrity of online information exchanges.
Giacomo Mazzone
Speech speed
142 words per minute
Speech length
1210 words
Speech time
512 secs
Report
This town hall meeting focused on the upcoming European election in 2024 and the measures being taken to secure the elections and minimize interference. Representatives from the European Commission, the European Digital Media Observatory (EDMO), the regulatory body ERGA, TikTok, and civil society were present.
The European Commission, as the main proponent of this initiative, discussed the broader framework of the election and the role of independent regulators. They emphasized the importance of securing the elections and minimizing interference while enabling voters to freely express their views.
EDMO, responsible for tackling disinformation, addressed concerns from other regions about the creation of a “minister of truth.” They clarified that involvement of independent regulators, like ERGA, ensures a multi-stakeholder approach and prevents any monopolization of truth. A representative from civil society questioned the effectiveness of self-assessment reports from big tech companies in preventing social harm on digital platforms.
They discussed additional measures and actions that need to be taken for better results. TikTok’s representative highlighted the platform’s commitment to preventing harm and maintaining a safe environment during the elections. They emphasized the responsibility of platforms like TikTok to proactively address harmful content and uphold the integrity of the democratic process.
The issue of what happens if large platforms refuse to comply with the code of practice was also discussed. The European Commission representative addressed this concern and assured that remedial actions would be taken to prevent significant harm. Research in the field was another topic raised in the meeting.
The EDMO representative acknowledged the importance of research in understanding and addressing election security and disinformation, including the question of researcher access to platform data. The session closed on a lighter note, with a joke about an audience member applying for European citizenship so as to benefit from the protections of the DSA.
Overall, the meeting aimed to provide valuable insights into securing elections, minimizing interference, and combating disinformation during the European election in 2024. The multi-stakeholder approach, involving the European Commission, regulators, platforms like TikTok, and civil society, demonstrated a collective commitment to ensuring the integrity of the electoral process.
Giovanni Zagni
Speech speed
157 words per minute
Speech length
801 words
Speech time
307 secs
Arguments
The European Digital Media Observatory (Edmo) established a new task force focused on the 2024 European elections, after a successful one that focused on disinformation during the Ukraine war.
Supporting facts:
- The Ukraine war task force included 18 members representing various sectors and focused on understanding disinformation more generally.
- The task force facilitated the circulation of several content including monthly briefs on detected disinformation.
Topics: Edmo, Task Force, 2024 European elections, Ukraine war
The new task force is carrying out activities focused on the past, present and future.
Supporting facts:
- They are reviewing past electoral campaigns, providing an overview of the main risks related to the elections, and intending to disseminate best practices from the media and information literacy world.
Topics: Edmo, Task Force, Disinformation
Report
The European Digital Media Observatory (EDMO) has recently established a new task force with a specific focus on addressing disinformation during the 2024 European elections. This task force aims to build upon the success of a previous one that focused on tackling disinformation during the Ukraine war.
Comprising 18 members from various sectors, the task force is committed to understanding the nature of disinformation and disseminating valuable insights to combat its harmful effects. One of the key objectives of the task force is to review past electoral campaigns, analyze their outcomes, and identify the main risks associated with the upcoming European elections in 2024.
Through this process, they seek to develop strategies and frameworks to counteract disinformation and safeguard the integrity of the electoral process. Additionally, the task force plans to disseminate best practices from the media and information literacy world. By sharing successful approaches, they hope to enhance media awareness and empower citizens to critically evaluate and navigate the information landscape.
Giovanni Zagni, a strong advocate for democracy and inclusivity, fully supports this initiative. He emphasizes the need for a democratic and inclusive approach in addressing disinformation, ensuring that the diverse issues faced by each country are properly represented. Zagni highlights the task force’s role in facilitating the exchange of best practices and experiences in combating disinformation, thereby enhancing the effectiveness of efforts to promote peace, justice, and strong democratic institutions.
In conclusion, the establishment of the new task force by EDMO represents a significant step in addressing disinformation during the 2024 European elections. Building on the success of the previous task force, they aim to develop comprehensive strategies to tackle disinformation, review past electoral campaigns, and disseminate best practices.
With the support of individuals like Giovanni Zagni, the task force aims to foster a democratic and inclusive environment where diverse issues are adequately considered. Through these collective efforts, they hope to reinforce media literacy, combat disinformation, and uphold the integrity of the electoral process.
Paula Gori
Speech speed
190 words per minute
Speech length
2153 words
Speech time
681 secs
Arguments
The European Digital Media Observatory (EDMO) provides a platform where experts can gather and collaborate on issues related to misinformation, offering tools and evidence where possible.
Supporting facts:
- EDMO is a consortium of different organizations, focusing on fact-checking, academic research and media literacy.
- EDMO is funded by the European Commission but operates independently.
Topics: Misinformation, EDMO, Collaboration
EDMO establishes networks of fact-checkers who provide each other with information and assistance, thus enabling quicker and more efficient debunking of disinformation narratives.
Supporting facts:
- EDMO’s fact-checkers work together to identify disinformation narratives and inform each other about them.
- Fact-checking is most effective when done in the first 24 hours.
Topics: Fact-checking, Disinformation, Collaboration
EDMO works to map and evaluate media literacy initiatives to better understand their impact and eventual effectiveness.
Supporting facts:
- EDMO is working on understanding how to assess the impact of a media literacy initiative.
Topics: Media literacy, Assessment
Access to online platform data for research purposes must be granted while fully respecting GDPR.
Supporting facts:
- EDMO has produced a code of conduct for accessing such data and is currently working on structuring an independent intermediary body that would handle requests for data access.
Topics: Data access, Research, GDPR
Platforms charge a lot for researchers accessing public data
Supporting facts:
- This cuts off research projects because research budgets cannot afford the cost
Topics: Research, Data Accessibility, Budgeting
Access to platform data is more easily available to big famous universities than to universities in smaller countries in minority languages
Supporting facts:
- Data access is easier for big universities compared to smaller ones
Topics: Research, Data Accessibility, Equality
EDMO is organising activities with researchers to gather their feedback on data access
Supporting facts:
- EDMO is organizing activities with the researchers to gather their feedback and understand how it works
Topics: Research, Data Accessibility, EDMO
Report
The European Digital Media Observatory (EDMO) is an independent consortium of organizations that focuses on fact-checking, academic research, and media literacy. Although funded by the European Commission, EDMO operates autonomously. It aims to combat misinformation by providing a platform where experts can collaborate on addressing this issue.
One of the main objectives of EDMO is to provide tools and evidence to counter disinformation. The organization establishes networks of fact-checkers who work together to identify false narratives and share information with one another. This collaborative approach allows for quicker and more efficient debunking of misleading information, especially when done within the first 24 hours.
In addition to combating disinformation, EDMO also focuses on mapping and evaluating media literacy initiatives. It strives to thoroughly understand the impact and effectiveness of these initiatives, ensuring that efforts to enhance media literacy are productive and fruitful. An important consideration for EDMO is data accessibility.
They have produced a code of conduct for accessing online platform data and are working towards creating an independent intermediary body that handles requests for such data. EDMO recognizes the necessity of granting access to platform data for research purposes while fully respecting GDPR regulations.
However, there are challenges in accessing platform data, particularly for researchers from smaller universities and countries with minority languages. Data access is more readily available to well-established universities, which amplifies the inequality in research opportunities between larger and smaller educational institutions.
Paula, in her stance, advocates for the accessibility of platform data, especially for researchers from smaller universities and countries with minority languages. She points out the difficulty faced by these institutions in accessing data and emphasizes the importance of ensuring equitable research opportunities.
Paula also acknowledges the need for proper infrastructures to effectively handle and manage data, highlighting that data accessibility is not the only concern; having the necessary infrastructure is equally crucial. In conclusion, EDMO plays a significant role in addressing misinformation by providing a collaborative platform for experts in fact-checking, research, and media literacy.
Their efforts to combat disinformation, map media literacy initiatives, and promote data accessibility are commendable. However, challenges remain in terms of accessing platform data, particularly for researchers from smaller universities and minority language contexts. It is essential to address these challenges and create a level playing field for all researchers to contribute to the fight against misinformation.
Stanislav Matejka
Speech speed
161 words per minute
Speech length
1028 words
Speech time
383 secs
Arguments
ERGA (European Regulators Group for Audiovisual Media Services) is focusing on the effective implementation of the Audiovisual Media Services Directive
Supporting facts:
- ERGA was created to enforce and implement in the most effective way the Audiovisual Media Services Directive
- ERGA’s members are charged to enforce both the European and their own national legislation when it comes to media regulation
Topics: ERGA, Audiovisual Media Services Directive
Political advertising is a key focus area for ERGA
Supporting facts:
- Media regulators cover rules for advertising in general, and political advertising in particular
- ERGA has focused on this issue ever since the first code of practice that came to existence, in 2018
Topics: ERGA, Political advertising
ERGA’s approach to election integrity protection is through transparency. Transparency is key for effective enforcement.
Supporting facts:
- Code of practice introduces transparency obligations and commitments to publish transparency reports
- Monitoring the effective enforcement of the platform’s own policies is emphasized by ERGA
Topics: ERGA, Election integrity, Transparency
Access to data is crucial for public scrutiny through independent research
Supporting facts:
- ERGA emphasizes on the importance of access to data for the research community in order to inform enforcement of the regulatory framework
Topics: Access to data, Independent research
Report
The European Regulators Group for Audiovisual Media Services (ERGA) plays a vital role in enforcing and implementing the Audiovisual Media Services Directive, with a strong focus on effectiveness. ERGA’s members have the responsibility of not only enforcing European legislation but also their own national legislation, ensuring comprehensive media regulation.
ERGA is particularly focused on political advertising, establishing rules for advertising in general and paying particular attention to political advertising. Since the creation of the first code of practice in 2018, ERGA has consistently directed its efforts towards this issue. Their aim is to ensure fair and transparent political campaigns.
ERGA also places significant importance on election integrity and transparency. They have introduced a code of practice that includes transparency obligations and commitments to publish transparency reports. ERGA emphasizes the effective enforcement of platforms’ own policies and closely monitors this aspect.
Transparency is key to protecting election integrity and ensuring accountability. To combat misinformation on online platforms, ERGA supports the establishment of reporting mechanisms. They propose the creation of functional reporting mechanisms for regulators, researchers, and anyone else who wishes to report or flag instances of misinformation.
This initiative aims to address the spread of false information and provide a platform for accountability. Access to data is crucial for ERGA in promoting public scrutiny through independent research. They recognize the significance of data for the research community in informing the enforcement of regulatory frameworks.
ERGA supports the idea that independent research should have access to relevant data, enabling a more informed analysis and evaluation of media services. In summary, ERGA is dedicated to effectively implementing the Audiovisual Media Services Directive. Their focus on political advertising, transparency in elections, reporting mechanisms for misinformation, and access to data for independent research are essential aspects of their work.
By addressing these areas, ERGA aims to ensure fair and transparent media services in Europe.