Sharing “Existing Practices against Disinformation (EPaD)” | IGF 2023 Day 0 Event #202
8 Oct 2023 07:15h - 08:15h UTC
Event report
Speakers and moderators
Speakers
- Mr. Shinichi Yamaguchi, Executive Research Fellow/Associate Professor, GLOCOM, International University of Japan, Japan (Onsite)
- Ms. Madeline Shepherd, Digital Safety Lead, Microsoft Operations Pte Ltd, Australia (Online)
- Ms. Chay F. Hofileña, Editor, Rappler, Philippines (Online)
- Mr. Aribowo Sasmito, Co-Founder and Fact-Check Specialist, MAFINDO, Indonesia (Onsite)
Moderator
- Mr. Daisuke Furuta, Editor-in-chief, Japan Fact-check Center
Disclaimer: It should be noted that the reporting, analysis and chatbot answers are generated automatically by DiploGPT from the official UN transcripts and, in case of just-in-time reporting, the audiovisual recordings on UN Web TV. The accuracy and completeness of the resources and results can therefore not be guaranteed.
Full session report
Chay F. Hofileña
Chay F. Hofileña shares her experiences with Rappler in its ongoing fight against disinformation, highlighting the role of Facts First PH. This initiative takes a multi-sector approach, combining fact-checking, research, and accountability work across media, civil society, academia, and the legal profession to tackle the rising issue of disinformation.
Hofileña emphasises the significance of digital engagement for reaching readers, especially considering that the bulk of Rappler’s readership falls within the 18-34 age range. She describes the use of platforms like TikTok and of visual content, as in the run-up to the 2022 presidential elections, when cartoons were created and shared to make fact-checking more appealing and engaging.
However, she voices genuine concerns over the intensifying harassment and intimidation of journalists, which weaken media institutions and undermine democratic structures. Specific instances, such as Filipino journalists being accused of communist associations (‘red-tagging’) under the Duterte administration, illustrate these concerns. She also points to legal pushback, such as the case filed by the Movement Against Disinformation to compel Meta to disclose information about anonymous accounts targeting a provincial publication’s editor-in-chief.
Strengthening the journalism industry is suggested as an effective way to mitigate the spread of disinformation. Hofileña proposes that newsrooms and journalists be equipped with adequate tools and resources for online investigations. Moreover, she supports the concept of ‘pre-bunking’, a proactive approach to prevent the spread of misinformation, acknowledging the financial implications associated with its implementation.
Given the uneven distribution of skills among journalists within the Philippines and the wider region, Hofileña proposes collaboration for skill enhancement as a viable solution, exemplified by Rappler’s fellowship programmes for journalists’ upskilling, facilitated by grants and funding. She similarly highlights the benefits of exchanges among reporters from different newsrooms to encourage the sharing of skills and knowledge.
Lastly, alerting the journalism community to forthcoming challenges, Hofileña alludes to the looming threat that Artificial Intelligence (AI) poses to newsrooms. Specific details about this threat remain unspecified, yet her warning underscores the urgency of preparing journalists and newsrooms for it. Overall, her remarks underscore the necessity of multi-pronged strategies to tackle disinformation, from digital adaptation and cross-industry collaboration to better resourcing and preparation for emerging technologies like AI.
Madeline Shepherd
Microsoft is making a significant impact in the fight against disinformation through its proactive use of technology and collaborative strategies. This is evidenced by its Democracy Forward Initiative, which focuses on cultivating a healthier information ecosystem, and by the introduction of its pilot information integrity principles. These initiatives demonstrate a proactive approach and a commitment to bolstering societal resilience against disinformation.
A defining aspect of Microsoft’s strategy lies in partnerships and alliances. The corporation has secured partnerships with NewsGuard, providing credibility ratings for news websites. This collaborative endeavour empowers users to identify credible sources, aiming to build a more reliable digital landscape. Furthermore, as a founding member of the Coalition for Content Provenance and Authenticity, Microsoft reinforces the shared corporate responsibility to tackle the widespread issue of disinformation.
Microsoft’s commitment also extends to broader discussions concerning the role of the private sector in supporting democratic institutions, reflecting an understanding of the significant influence that technology giants can exert. On its own platforms, Microsoft adopts stringent measures, removing fake accounts on LinkedIn and promoting authoritative information in Bing search results.
Education is a central pillar of Microsoft’s approach, with efforts to build resilience against disinformation through information literacy. Highlighting this, Microsoft utilises its advertising spaces to promote resources and skills in information literacy. Additionally, they plan to introduce an innovative educational game using Minecraft to further augment these efforts.
Microsoft also places strong importance on utilising technology, expressed in its willingness to establish alliances with newsrooms and fact-checkers. With AI transforming the terrain of news, Microsoft highlights the need to leverage these beneficial advancements to enhance the capabilities of fact-checkers, thereby improving the quality of news output.
Finally, Microsoft addresses the future implications of AI, underlining the importance of education in digital resilience and information literacy. The corporation identifies these skills as crucial for younger generations growing up amidst AI technology and emphasises the importance of responsible technology usage and safe digital habits.
In summary, Microsoft’s holistic approach embodies a balanced combination of technology, collaborative partnerships, and educational initiatives. This reflects its commitment to maintaining information integrity and promoting digital resilience, ultimately contributing to the Sustainable Development Goals of Industry, Innovation and Infrastructure (SDG 9) and Peace, Justice and Strong Institutions (SDG 16).
Shinichi Yamaguchi
Misinformation and disinformation have a significant impact on citizens’ views, particularly in political spheres. Specifically, ‘mild supporters’, who comprise a large portion of voters, are greatly susceptible to the sway of such false information. This underlines the potency of misinformation and disinformation in altering public opinion, evidenced by individuals significantly changing their perspectives upon exposure to fabricated data.
Moreover, the challenge of false information is not confined to the digital realm. It has permeated all areas of society, with 15% to 35% of individuals reportedly disseminating the misinformation they received, often through direct conversations with their peers. This illustrates that the issue of misinformation and disinformation extends beyond the internet, serving as a comprehensive societal problem necessitating holistic solutions.
The issue is further amplified by the advent of AI-generated disinformation. This novel form of questionable content, produced by generative AI, adds another dimension of complexity: it makes it easy to manipulate public sentiment through deceptively real images and videos. Regrettably, humans often find it difficult to distinguish credible information from AI-generated misinformation, and the volume of such misinformation could grow exponentially, outpacing content produced by humans.
The problem of false information is not contained within borders; rather, it casts a global shadow, capable of crossing national boundaries and creating chaos on an international scale. This underscores the need for global cooperation and concerted efforts in tackling this issue. Utilising tools like advanced technology, quality education, and robust stakeholder engagement can prove instrumental in such endeavours.
An instance of this approach is observed in the collaboration of IT companies with the Japanese government for conducting verification studies. Similarly, fostering critical thinking skills through educational materials can arm students effectively against disinformation. Additionally, hosting events that encourage dialogue between different stakeholders can facilitate meaningful progress in the fight against misinformation.
There is an urgent call for standardised development processes and regulations for AI, prioritising safety and security. Such guidelines would aid in mitigating the detrimental impacts of AI-generated disinformation, highlighting the necessity for regulations that align with technological advancements.
International collaboration is essential, particularly for streamlining fact-checking processes, making them more efficient and trustworthy. Going a step further, techniques for detecting misinformation and disinformation should be globally shared, fostering broader international knowledge-sharing and cooperation. Through implementing these measures, societies can build a solid defence against the onslaught of disinformation and misinformation.
Moderator
Across G7 countries and the European Union, extensive countermeasures are being implemented to tackle the growing issue of disinformation. These measures fall into four principal categories: civil society, social media platforms, government bodies, and research entities. An array of projects, tools, and initiatives are being developed within these groups to educate the public, fact-check information, and ensure the accountability of online platforms.
A key part of these counter-disinformation strategies revolves around promoting media literacy. For example, the UK has supported projects such as E-Internet Citizens and Find a Fake to enhance media literacy among its citizens. Germany has pursued a similar path, providing training for journalists to interpret and accurately report statistics, a critical skill in the fight against disinformation.
Social media platforms are not only a significant battleground for addressing disinformation but also crucial in mitigation efforts. Countries like the UK, France, and those within the EU have implemented various measures to hold these platforms accountable and promote reliable information.
Meanwhile, research entities are launching strategic projects targeting disinformation. A case in point is the University of Toronto’s Digital Public Sphere project, which has developed an educational tool named ‘Know It or Not’. Germany has integrated official statistics into bachelor’s and master’s programmes, thereby equipping learners with the skills to better interpret the information landscape.
Government bodies are also active in the fight against disinformation. The EU, for example, has introduced regulatory measures such as the Digital Services Act to ensure transparency and platform accountability. The US is developing official communication channels, while Italy has established a working group, in both cases to preserve the freedom and accuracy of information.
On the front line of the battle against disinformation is journalism. Despite instances of distrust and criticism, organisations like Rappler have taken determined strides to counter disinformation through journalism, community engagement, and tech innovation. Rappler has become a fact-check partner for Facebook in the Philippines, supported by a range of journalists’ groups, NGOs, businesses, universities, and legal entities.
In the rapidly evolving landscape of information dissemination, technology, and notably AI, has presented new challenges and opportunities. Microsoft, a tech giant, has embraced the challenge, employing technology and fostering partnerships to combat disinformation. Its proactive measures include blocking fake accounts on LinkedIn and promoting trustworthy information.
The prediction that AI-generated content will soon surpass people-generated information highlights an imminent shift and emphasises the urgent need to develop an international AI standard. It also points to the potential utility of AI in mitigating disinformation, such as in improving fact-checking efficiency.
Ultimately, it is abundantly clear that the fight against disinformation is a multi-dimensional endeavour requiring international collaboration and multi-stakeholder cooperation – from the governments that regulate, the platforms that disseminate information, the educational institutions that teach media literacy, to the tech companies blazing a trail with AI-based and technological solutions.
Aribowo Sasmito
MAFINDO, an organisation established on 19 November 2016, plays an instrumental role in combating disinformation and enhancing literacy education within Indonesia. Its work aligns with the aims of SDG 4: Quality Education and SDG 16: Peace, Justice, and Strong Institutions. Its reach extends to branches in 40 cities nationwide, supported by approximately 1,000 dedicated volunteers, showcasing growing support for these initiatives.
Enhancing their capability to tackle misinformation, MAFINDO holds a prestigious position as an authorised third-party fact checker for dominant social media platforms, such as Facebook and Instagram. In embracing technological advances, MAFINDO utilises innovative tools like the increasingly popular WhatsApp chatbots to extend their services.
In the face of the COVID-19 pandemic, MAFINDO has expanded their collaborations, allying with government and international entities like the Indonesian COVID-19 Task Force, the World Health Organisation (WHO), UNESCO, UNICEF, and the Centers for Disease Control and Prevention (CDC).
Complementing these efforts, MAFINDO has established relationships with over 20 national media outlets through the collaborative platform Cekfakta.com to collectively address misinformation, including health misinformation, significantly contributing to SDG 3: Good Health and Well-being.
MAFINDO utilises a strategic combination of debunking and pre-bunking tactics, addressing the spread of false information both after and before it circulates. It offers proactive pre-bunking training in various cities ahead of the general election, equipping citizens with the knowledge to counter misinformation.
The analysis underscores the global nature of battling disinformation and misinformation, advocating for deepened international cooperation. These issues require joint global efforts and collaboration amongst members of the internet ecosystem to effectively address misinformation. The summary suggests a consensus around this call-to-action.
In sum, the analysis sheds light on MAFINDO’s crucial and proactive work in mitigating disinformation and misinformation, fostering literacy education, and building collaborative partnerships, all while prioritising public well-being in the face of major global events. There is a pressing need for shared responsibility and global action within the internet ecosystem, both in Indonesia and worldwide, to ensure robust mitigation of the potential damage caused by disinformation and misinformation.
Session transcript
Moderator:
Madeline Shepherd from Microsoft and Shinichi Yamaguchi, welcome from Japan. I will ask each of you to introduce how you work on countermeasures against disinformation. Okay, so let’s start. First, I would like to share the EPaD. Okay, G7 countries and the EU are practicing many measures. So not all of them can be presented today. So please use this QR code to see the list of all measures. The measures can be divided into four categories, civil society, social media platforms, research entities, and government. Today, I will introduce only those unique measures. Oh, I will show the QR code later again. So don’t worry. First, about civil society. From Germany, they support projects such as the European Statistics Contest for Young Students and a webinar on common mistakes in dealing with statistics for journalists. This is unique statistics training for journalists. You know, many journalists, including me, not very good at numbers. So yeah, it’s really useful. And then from the UK, for media literacy education, they developed resources and projects to help build resilience to misinformation. For example, E-Internet Citizens by the Institute for Strategic Dialogue, the Fake News and Misinformation Advice Hub, and Find a Fake, a game by Internet Matters. Media literacy education is, of course, essential to prevent sharing disinformation. And from Japan, the Japan Fact-Check Center, yeah, it’s my organization, JFC, was established. Now, we are a signatory of the International Fact-Checking Network, so each country needs its own fact-check organization. And about social media platforms, the social media platforms are such a big part of this issue, and so many countries have been introducing some measures. I think this is a significant change during these years. From the UK, promoting reliable information in search functions, for example, directing users towards Gov.UK or the Electoral Commission in the lead-up to elections. And from the EU, improving large platforms’ accountability through the Digital Services Act, the DSA. And from France, strengthening the accountability of platforms by requiring them to analyze the systemic risk generated by the operation of their services regarding misinformation. And then research entities: from Canada, Know It or Not, this is a tool for education made by Digital Public Sphere, a project by the University of Toronto. And MediaSmarts, it’s an NPO. And from Germany, integration of the topic of official statistics in bachelor’s and master’s programs. And from Japan, they released videos to raise awareness of anti-fake news in April, along with the G7-related event, Fake News and Japan. And from government: the EU, regulatory or co-regulatory measures to ensure transparency and platform accountability, the Code of Practice on Disinformation, the Digital Services Act. And from the US, developed official digital communication channels that ensure credible, fact-based information is publicly available. And from Italy, AGCOM, the Communications Regulatory Authority, established a working group aimed at fostering pluralism and freedom of information also on digital platforms. Okay, so yeah, that’s just a brief summary of the EPaD, so please jump to this QR code I mentioned earlier for a list of all measures. The list includes a variety of initiatives by multi-stakeholders such as governments, platforms, civil society, and research institutions. Okay, I will now ask each of you presenters and speakers to share what your own organization is doing to ensure a healthy information space. So first, Ali.
Aribowo Sasmito:
This one is on, thanks. So good afternoon, Kyoto, Japan time, and good day, good morning, good evening, to whatever time you are in right now, for those who are joining online. My slides, please. So for this occasion, I’d like to first thank the MIC for inviting me to be able to attend such an important event. So what I would like to present, since it’s going to be just a few minutes, I’m going to present just the highlights. So the title of my presentation would be the highlight of MAFINDO’s role in today’s information ecosystem. So, MAFINDO, we were established officially on November 19, 2016. MAFINDO stands for Masyarakat Anti Fitnah Indonesia, or the Indonesian Anti-Hoax Community. We are a non-profit organization, mainly fighting disinformation and providing literacy education. At the moment our chapters or branches are established in 40 cities with approximately 1,000 volunteers. So this is how I would like to describe the information ecosystem. We have the platforms, we have the government and related, we have the media, and we have the consumers as the users, also commonly called the netizens. And we are thankful, we are grateful to be able to work with every member of the information ecosystem. Let’s move on to the first member of the ecosystem, the platforms. With Google, we are working on various programs. You will see some of them in upcoming slides again. Since this is just a highlight, it is not possible for me to share every program that we are running with the platforms. With Meta, on Facebook and Instagram, as an IFCN-certified organization, MAFINDO is one of the 3PFC, or the third-party fact-checker partners. By the way, the IFCN is the International Fact-Checking Network. With WhatsApp, this is also quite common with other fact-checker organizations. The WhatsApp chatbot is currently quite popular because in many countries, including Indonesia, we are providing the services through a WhatsApp chatbot. At the moment, this is the most popular platform. With TikTok, starting from early this year, we are working on several programs. Some of them are the safety workshop trainings, the training of introduction to disinformation for content creators, FGD sessions, the focus group discussions, expert roundtables, the NGO day event, etc. With the government, this is the next member of the information ecosystem. During the pandemic infodemic, we were working not just with the Indonesian COVID-19 Task Force, but also with WHO, UNESCO, UNICEF, and CDC, and not only on fact-checking; we were also working on some backstage work unseen by the public, quote-unquote, meaning the work is not displayed, is not shared in our social media accounts. The activities include the misinformation inoculation training and providing SML, or social media listening, data. The next member of the ecosystem is the media. Cekfakta.com is quite a unique platform. This is a collaboration platform where MAFINDO works with more than 20 of Indonesia’s national media, supported by AMSI, which is the Indonesian Cyber Media Association, and also with AJI, the Independent Journalist Alliance of Indonesia, and also with the GNI, the Google News Initiative. On a daily basis, we are sharing resources, we are coordinating, we are sharing fact-check articles, and also other activities such as fact-checking trainings for journalists and fact-checkers, and also digital literacy and fact-checking trainings for the general public. 
The last member, but not least, of the information ecosystem is the consumers. Working with CSOs, NGOs, or communities, as the representation of the public, MAFINDO is working with Siberkreasi, this is the Indonesian National Digital Literacy Movement, where more than 100 organizations are collaborating for digital literacy education. Also, we have a program called MEDIA, which is Media Empowerment for Democratic Integrity and Accountability. This is our program with USAID. We established PESAT, it’s the Paguyuban Ekosistem Informasi Sehat, or the community of a healthy information ecosystem, in several cities to harness the existing communities and local potential, such as the leaders and influencers, to gather them in a collaboration vessel. And not just with the representation, sorry, representations, we are also directly working with the public on a daily basis, which is the fact-checking or the debunking. These are the full-timers working with community fact-checkers on the crowdsourcing aspect. Their work is stored in a central database. We have the turnbackhoax.id site, which is, it is open, it is accessible by the public through secure HTTP, meaning you can browse it using a browser. Also it’s available as an RSS feed, and also you can ask us for an API key. Generally this is related with webmasters or website owners. And from turnbackhoax.id, we publish it through social media accounts, we have Twitter, now it’s called X. We have Facebook, we have Instagram, we have TikTok, and also other social media accounts. Not only that, we are working with national radio’s weekly program. Some radio, some of them also have the podcast, so a podcast is something that you can play back many, many times, it depends on when you want to listen to it. And also we built an application called the HBT, or the Hoax Buster Tools app, and also again we have a chatbot, a WhatsApp chatbot. And beyond debunking, something that has become a trend at the moment is called pre-bunking. Pre-bunking is a preventive way to empower by inoculating against misinformation. So it’s kind of like debunking misinformation before the misinformation actually appears. Pre-bunking is proactive, and debunking is reactive. So pre-bunking is actually something that… able to prevent misinformation. So with CekFakta.com, again, this consists of AJI Indonesia, AMSI, the Google News Initiative, and MAFINDO. We provide pre-bunking training in several cities. This time it is for the purpose of preparation for the general election next year, because usually misinformation is often tied directly to disturbances to elections and political events. So one thing that most likely you already know is that the members of the ecosystem depend on and influence each other, which means there is no single cause of the current condition of the media ecosystem. So there is no single cause, everyone contributes and depends on each other. So please do keep in mind something that most likely you already know, because in the previous sessions, we were also reminded that governments, private sectors, technology sectors, private companies, and everyone in CSOs and other movements need to collaborate, collaborate, collaborate.
Moderator:
Thank you. Yes, thank you, Ali. Yes, now, fact-check organizations, including my Japan Fact-Check Center, we are working not only on fact-checks, but also on media literacy education, and we need collaboration. MAFINDO is a great example. I learned a lot from MAFINDO when I started the Japan Fact-Check Center. Thank you. So, next, Chay. Next, please.
Chay F. Hofileña:
Let me share my slides for a moment. Give me a minute, please. Thank you. Okay. Sorry. Are you seeing my slides? Okay. Yes. Hi, I’m Chay Hofileña. I’m the investigative editor of Rappler. I also handle training, and I’m also one of the founders of Rappler, along with three other women. We started Rappler about 10-11 years ago, and I’d like to share with you the journey of Rappler in the fight against disinformation. Rappler started on Facebook in 2011, but we created our own website in 2012. When we started out, we identified three important pillars of the organization. One is journalism, the second is community, and the third is tech and data. We believe then, as we do now, that journalism needs a community to thrive, most especially now when the trust levels for journalists and journalism as a profession as a whole have dropped tremendously and dangerously. As Rappler journalists, we’ve told our young reporters that they need to be comfortable with technology and data if they want to be ahead of the game and they want to do cutting-edge journalism. So by mutually supporting and reinforcing each other, Rappler journalists working with community and using technology and data, we hope to be able to build stronger and better communities of action that can bring about change. This is, after all, the essence of journalism. So we are purely online and we turned 11 only last January. We’ve been recognized for the investigative and data stories that we do. And we’re also a verified signatory of the IFCN code of principles. And we are one of two fact-check partners of Meta, or Facebook, here in the Philippines. The other one being VERA Files. We also have remained independent despite the partnership with Meta, and we’ve done investigations on the platforms since 2016. And for accountability purposes, we have adopted a corrections policy since 2012. Let me tell you about what we’ve tried to build these past few years. This did not happen instantaneously. Even as I speak, we continue to build and to create and to expand this network. So what we’ve created is what we call Facts First PH. It’s really a community built around fact-checking and facts-based reporting. There are different layers, as you can see in the pyramid. The first layer is fact-checking. The second layer is mesh. The third layer is research and the fourth layer is accountability. So essentially it’s media, civil society, academe, and even lawyers. As Maria Ressa said in her Nobel speech, without facts, you can’t have truth. Without truth, you can’t have trust. Without trust, we have no shared reality and no democracy. So it’s really anchored on truth-telling and democracy. So you will see that at the very base of the pyramid is fact-checking. What we’ve done here is we’ve brought together different newsrooms and journalist groups, not only in Metro Manila, but even in the provinces. And we’ve also brought in, we’ve worked with volunteers to help us in the fact-checking effort. To expand this network, we’ve also done webinars online. We’ve found that it’s more efficient and we’re able to reach more students, teachers, and professionals when we do the fact-check webinars. And once they graduate, those who go through the webinars, the fact-checking webinars, many of them become volunteers, although with different, varying degrees of activity. But beyond the newsrooms, we’ve also expanded the group, the effort. 
So we’ve tapped NGOs, business, and faith groups that help spread awareness about disinformation and disinformation operations. This, as I said earlier, is still a work in progress. Universities, we’ve also brought in universities and researchers because, after all, professors and researchers from academe are also interested in disinformation, except that they have difficulty popularizing their research. So we’ve teamed up with some universities and we’ve popularized, or what we call storyfied, their research and published it on our website. And finally, we also have pulled in lawyers and other legal groups. They help journalists who have been attacked, who have been trolled and threatened online. Filipino journalists, most especially under the Duterte administration, have been accused of being communists. So the term that we use there is red-tagging. So that’s very, very common. So lawyers have come to the defense of some of these journalists because they believe that journalism must survive if democracy in the Philippines must survive. So Facts First PH in general is really a multi-sectoral approach to fighting the infodemic. The appeal of this community is to fight the lies that weaken institutions and ultimately democracy. We also have a very, very young population. The audience, the Rappler audience, is very diverse, but the majority of our readers come from the 18 to 24 age group, extending a bit to the 24 to 34 age brackets. We have felt the need to go beyond text. Text just does not work anymore, especially for a population that has a very, very short attention span. The younger generation don’t read. They don’t read long form. They are attracted to video. They like things that are very, very visual. So we’ve adapted to that. We’ve adjusted and we’ve used visuals like cartoons, as you can see on this slide. These were cartoons that were created and shared during the campaign period preceding the 2022 presidential elections. The hope here was that we would make fact-checking a little more interesting and engaging. So the cartoonists and the comics creators became very, very active during the campaign period. Whether or not they were successful is another question altogether because we know who won the presidential elections in this country. We also tapped what we call influencers online. We wanted to go beyond our usual echo chambers. What was important in working with select influencers is that there was supposed to be a shared value for truth-telling and certain principles that they also shared. So the objective is to go beyond the echo chambers and to reach communities that these influencers have access to. So through them, we were hoping to reach new audiences. So they cut across age groups, from the young to the more senior readers and followers. These influencers have established a degree of credibility also among the youthful and even the more mature audiences. And we invited them to be part of the community and they obliged. We also found that TikTok, as was shared earlier, has become an exponentially popular platform where we need to be to reach a more diverse audience. The messages essentially have focused on debunking falsehoods and providing useful information. So we turned to TikTok. 
We realize that fighting disinformation is not just reacting to lies that are being spread online, but it also means having to condense, to explain, to summarize, and to popularize very, very complex issues. Not very easy to do, because how can you explain a very complex issue in one minute or 30 seconds or one minute and 30 seconds? It’s really, really very, very challenging, but we’ve been forced to adapt and to adjust to our audience and our readers and to use the platform. I will admit that initially at the start we were very, very hesitant to use TikTok because of privacy and data concerns, but our audience has shifted there and we know that, just like on Meta, we are also very dependent on TikTok’s algorithm, but it’s a difficult choice. We said, we decided that we cannot not be on TikTok. So today our young researchers and reporters alike and the more senior Rappler journalists also use the platform to explain issues such as the fact-checking process, which you see on the left. In the middle, one of our researchers tried to explain the use of, or rather the misuse of, confidential funds in the budget, and this was in reference to Vice President Sara Duterte. And on the third, we also tried to explain the importance of making audit reports very transparent. So far, the feedback has been quite positive and the views have just been tremendous. So maybe we just have to balance things, but so far, so good. Finally, we’ve also tapped legal groups and lawyers. And we have, as I mentioned earlier, these lawyers are concerned about journalists, especially those who have been harassed and intimidated and been accused of being communists. There’s one particular group of lawyers very active today. This is called, their group is called The Movement Against Disinformation. They helped file a case against Meta to compel it to disclose information about anonymous accounts that attack the editor-in-chief. The editor-in-chief is the guy on the left, the first image there. He’s the editor-in-chief of a provincial publication. So he said, let me know who my anonymous attackers are. But of course, Meta has refused to disclose the information, but at least the effort is there and we will see where it will go. Another case involves a former government official and her co-host who had defamed another journalist and accused him also of being a communist. So this is the fad nowadays. You’re a communist if you criticize government. So what he did was he filed a civil case and, take note, it was not a criminal case, but a civil case for damages, because he does not believe in criminalizing. This is essentially part of the pushback against the spread of disinformation and the aggressive attempt to further weaken the rule of law and suppress democratic discourse. It’s a work in progress and we hope that the community continues to grow. Thank you. Yeah, thanks, Chay. This is another case of collaboration, Facts First PH working with journalists, civil society, academia, lawyers, and other professionals. Especially, it is essential to work with influencers on social media like TikTok to reach out to young people who are vulnerable to disinformation.
Moderator:
Okay, Maddy, next please.
Madeline Shepherd:
Thank you so much, Daisuke-san, and thank you so much for the opportunity to present today. I’m just going to share my screen. Can I get a thumbs up if that has worked? Yes, fantastic. Okay. So today, I’m going to just provide a very brief overview of Microsoft’s existing efforts to combat disinformation. And it’s really heartening to hear about the fantastic work of organizations like MAFINDO and Rappler, because our view really is centered on the fact that – sorry, can you still see my screen there? No, okay, sorry, let me go back. Apologies. Apologies. Okay, thank you for your patience. Okay, so absolutely, Microsoft believes that the private sector has a responsibility to proactively and constructively engage in supporting democratic institutions and democracies around the world. And the importance of a collaborative approach has been alluded to already. So we will just build on that in our overview. So on the screen are the five principles that really guide our work when it comes to preventing disinformation. And they illustrate our role as to where we think the private sector can add value, in addition to the important work that civil society organizations and government are already doing in this space. We think it’s absolutely crucial to be leveraging technology to help democratic institutions because quite often it is technology that is causing some of the challenges in the first place. We want to play a leadership role in industry and make sure that other companies and other parts of the private sector are also doing their part. We think it’s very important to develop strategic partnerships that do cut across sectors, including partnerships with civil society and government. Of course, be nonpartisan in our efforts and always be working to support democracies around the world. So all of our efforts when it comes to disinformation at Microsoft come out of what we call the Democracy Forward Initiative. And this initiative works to preserve, protect and advance the fundamentals of democracy by promoting a healthy information ecosystem, by safeguarding open and secure democratic processes and by advocating for corporate civic responsibility, both from ourselves and from other companies in this space. I think we all acknowledge very strongly that disinformation erodes trust in the information that we rely on to keep us alive often. And unfortunately, the local news outlets that many of us previously turned to are disappearing. And so Microsoft and many other companies are dedicated to supporting a healthy information ecosystem where we can still access news that is trusted and information that is credible. In June 2022, Microsoft actually announced its pilot information integrity principles, which outline how we approach disinformation from foreign actors across our products and services. And just quickly, these four principles are freedom of expression. So really making sure that we uphold our customers’ ability to create, publish, and search for information using our platforms. The importance of authoritative content. So we’re always trying to prioritize the surfacing of content that will counter foreign cyber influence operations or disinformation campaigns. Demonetization. So we will never willfully profit from cyber influence, content, or disinformation actors. And then the fourth principle is proactive efforts. 
So we’re always exploring opportunities to work more proactively to prevent our platforms and products from being used to amplify foreign cyber influence or disinformation campaigns. So the Democracy Forward initiative collaborates with teams all across Microsoft, but also external partners, to increase societal resilience against disinformation and develop technical solutions and drive impactful thought leadership. So we do this under a number of different areas. The first one being societal resilience. And here we’re really focused on the development of partnerships across industries to create whole-of-society approaches to address the challenge that is disinformation, which as we all know is really a whole-of-society problem. One example of a partnership here is our partnership with NewsGuard, which is a third-party site that provides credibility ratings and detailed nutrition labels for thousands of news and information websites around the world. And these websites at the moment are quite concentrated in Europe and English-speaking countries. And in fact, the websites account for 95% of online engagement across the United States, the United Kingdom, Germany, France, and Italy. So there’s a lot of value in being able to provide what we call nutrition labels around the content that people are seeing in these countries. We are also, of course, signatories to the European Union’s Code of Practice on Disinformation. And we’ve actually just published our report for the first half of 2023. The report notes that more than 6.7 million fake accounts were blocked on LinkedIn or prevented from being created in the first place in the first half of 2023. And that Bing search promoted authoritative information or downgraded less authoritative information in relation to almost 800,000 searches relating to the war in Ukraine. So they’re just some examples of proactive efforts that we have leveraged on our own products and services. Another key area of our work in the information integrity space is really data integration and working with internal and external stakeholders to detect and learn from disinformation campaigns and leverage these findings to develop new solutions to help take these actors offline. We’re consistently conducting research and creating reports on threats and the attacks that Microsoft and our Digital Threat Analysis Center have taken action against. And we’re increasingly looking at the intersection between cyber attacks or security breaches and information influence operations. And I think it’s fair to say that the traditional techniques used by… information and cyber attacks are now being deployed by those running information influence operations and targeted disinformation campaigns as well. And then finally, technical solutions are a really important part of the information integrity approach. Microsoft is actually a founding member of the Coalition for Content Provenance and Authenticity, the C2PA, alongside other companies such as Adobe, Intel, Twitter, the BBC, and many other companies. Earlier this year, the coalition actually launched its first version of an open source content provenance tool, which allows creators to claim authorship while empowering consumers to make informed decisions about what kind of digital media they should trust. And we think this is really important: as more and more of us are using generative AI, there needs to be a lot of transparency around what content has been generated by AI so consumers have that knowledge. 
Another really important aspect of our work is around information literacy. And our goal here is really to build trust in the information ecosystem by enhancing the skills that consumers have when it comes to media literacy and also just consuming information. We see this part of our work as helping to address what we call the demand side of disinformation. So obviously there’s lots of work we’re doing on the supply side to try and target the campaigns and gain intel. But on the demand side, we really need to be building resilience in the population to be able to actually intake information in a critical manner. And that’s in addition to the great work that other civil society organizations do. Here, we have a multi-layered approach and that includes partnering with lots of different organizations to… embed information literacy campaigns and concepts into products and training around the world, utilizing our own platforms to help educate consumers on how to find and consume trusted information in a correct way, and also sourcing, developing, and sharing best practices based on industry research, both internally across our company, but also with external partners across the information space. A couple of quick examples of how we’ve done this: by providing in-kind advertising space across various platforms that we have, including Microsoft Start and Outlook, to organizations that promote information literacy resources and skills. In the program’s first 12 months, we actually reached over 130 million Microsoft consumers with information literacy resources and skill campaigns. And in 2023, a little bit later this year, we’re very excited to be launching a Minecraft education information literacy game, along with accompanying educator materials, which we know will be very popular amongst younger children, starting to give them the skills and resilience they need to be critical consumers of information. So I will leave it there, and I’m looking forward to the discussion. Thank you. Yeah, thanks Maddy.
Moderator:
Yeah, as Maddy said, technical solutions are very important. Disinformation spreaders are using AI, and so we need AI and other technologies to prevent it, and platforms are the best at these things. Okay, next, Shin-san, please.
Shinichi Yamaguchi:
Thank you, Furuta-san. Hello everyone, I’m Shinichi Yamaguchi. Today I’d like to talk about misinformation and disinformation in our society. I’m sorry, but I’d like to speak in Japanese, so could you please use this one? Thank you. At the beginning, let me introduce myself. I am working for GLOCOM, an international research institute. It does social science research. I have an economics Ph.D., and computational economics is my field. Using this method, I look at disinformation, misinformation, and online bullying. These are some of the empirical studies I do on real society. And when it comes to this topic today, together with the MIC of the Japanese government, I have engaged in various joint research, and media literacy textbooks were also edited with the government. What I want to talk about today is the work I did with Google Japan. It is Innovation Nippon, some of the outcomes of this research. This joint project started in 2015, and from 2019, we shifted our focus to misinformation and disinformation. Every year, we do surveys of more than 10,000 respondents to analyze people’s behaviors. The thing I want to share today focuses on 2022 and 2023, the latest outcomes of our research. This was about COVID-19 vaccinations and political information. We picked up six misinformation items and six conspiracy theories. In total, we have 18 survey items to analyze people’s behavior. The first thing we understood is that, for political misinformation and disinformation, only 13% of the respondents who encountered it understood that it was wrong, that it was incorrect information. Even for the conspiracy theories and the COVID vaccine misinformation, only about 40% of those who read or were exposed to this information realized that they were getting wrong information. And also, people in their 50s and 60s are more vulnerable to this misinformation and disinformation. So this issue is not limited to young people alone. After reading this misinformation and disinformation, 15% to 35% of them shared the information. And as for how they shared this information, the most frequent method was a direct conversation with the people around them. So misinformation and disinformation are spreading beyond the internet. This is a total ecosystem problem of an entire society. Now, we can use mathematical models to analyze how this information spreads. If people believe misinformation and disinformation, they are more likely to spread it compared with those who thought that it was incorrect information. And also, people with lower literacy are more likely to spread disinformation and misinformation. Our information environment is greatly affected by people with lower literacy and also those who are deceived into believing misinformation and disinformation. Then what kind of impact does this misinformation and disinformation have on society? We used two particular pieces of disinformation and misinformation, and we compared people’s perceptions before and after reading them. One was misinformation that is negative against conservative politicians. The other was misinformation that is negative for progressive politicians. And quite a few people changed their opinions after reading the disinformation. They no longer supported these politicians after reading it. 
And especially the mild supporters are swayed much more by reading disinformation and misinformation. As you may know, these mild supporters are actually the majority of the voters. They have a big impact on the result of the elections. These people are swayed by misinformation and disinformation, and they change opinions easily. This means that disinformation and misinformation are very influential on the result of the elections. Now, recently, generative AI is widely used, and this is going to intensify this issue because now everyone can manipulate public opinion by using fake images and videos. One concern that I have is that AI-generated misinformation and disinformation can no longer be identified as such by humans. We are not able to tell whether they are AI-generated or not. So we need new technology to detect what is AI-generated information and misinformation. Now, I would like to talk about what our organization is doing. We do these verification studies together with the IT companies and also the Japanese government. And the outcomes are shared with many stakeholders. Beyond that, as you can see on the left, with the Japanese government, we prepare this kind of educational material to help students fight against disinformation. And also, we have a collaboration with YouTube creators on the topic of misinformation for education purposes. And also, as you can see on the right-hand side, we hold some events where many stakeholders can have a discussion. And the outcomes of the discussions are shared generally with the public. There are many other initiatives in Japan. For example, multiple stakeholders are invited to a committee to have a discussion on misinformation. And there is the Japan Fact-Check Center, of which Mr. Furuta is the chairperson and which is also a member of the IFCN. There are many things that we can do going forward. First of all, we need to secure transparency, not only at the global level but also at the local, Japan level. Different countries also have to have their own transparency platforms, media and information literacy education has to be expanded, and we need to come up with technology to counteract these problems. Number four, we need to set up a mechanism to prevent ad revenue from flowing to disinformation sites. Number five, we need to do fact-checking initiatives more efficiently, and then number six, we need to engage a lot of stakeholders for collaboration, and of course international cooperation is very important, and the Ministry of Communications has announced the EPaD, and I think that will play a critical role as a foundation. Thank you very much.
Moderator:
Thank you. This data shows not only that literacy is important but also what kind of literacy is useful and how. International sharing of such data is very important to make measures more effective. Okay, now we only have seven minutes for discussion, so I have a lot of questions for each of you, but I think I can ask only one question of you all. Okay, so my question to you all is, what is needed to deepen international cooperation, not only inside your country, but also globally? Any opinions would be greatly appreciated. So, how about you? Do you want to start?
Aribowo Sasmito:
So, previously, Maria Ressa, in the previous… said that it’s time to start. So I think after this good event, such an important event, like what Maria said, let’s take some concrete steps so we can actually start on handling disinformation and misinformation. Earlier I said that basically everyone is in the same boat. So disinformation and misinformation, no matter in what country, are basically the same. It’s just more of a local context. So I think this is a good start. This is a good occasion for the internet to start any type of collaboration on handling disinformation by the members of the ecosystem that I previously presented. Thanks.
Moderator:
Thank you. Actually, we sometimes exchange our knowledge on fact-checking among Asian organizations. So what do you think, Chay? How can we deepen our collaboration?
Chay F. Hofileña:
I think what’s essential really is to strengthen journalists and newsrooms because that’s what – it’s the journalists who produce the information. Earlier I just mentioned that there’s pre-bunking. So related to pre-bunking is really the ability, making sure that journalists and newsrooms, not just in specific countries but in the region and even worldwide, have the tools and have the resources to be able to do the job that they need to do. That includes investigations and being able to track players and actors, those who are part of the disinformation network. Doing this is very expensive, and not all journalists have the skills or are able to do it, and not all newsrooms have the resources to be able to do these types of online investigations. So if there are people, if there are private groups, companies, or even IT companies who have the resources and can share those with journalists, that would go a very, very long way. We have to be able to do our jobs well, and if we’re able to do our jobs well, then we will be able to proactively prevent the spread of disinformation.
Moderator:
Thank you. I have an additional question. Independence is really important for journalists and news organizations, so do you have any advice on how to deepen cooperation with other organizations while maintaining independence?
Chay F. Hofileña:
Well, at least for Southeast Asian newsrooms and even Asia, like Rappler has offered fellowships, for example, and this is also with the help of grants and funding. This is upskilling of reporters and journalists, and we’ve found that this isn’t just in the region, but even in the Philippines, the skills of journalists are very, very uneven, and if we’re able to work together and share what we know best, especially with the advent of AI, AI is going to be a very, very serious threat to newsrooms. We have to be prepared to be able to deal with it, so the collaboration and the sharing, I don’t know if this can be done through training or even exchanges. Maybe reporters can work in one newsroom for a specific period of time, just to be immersed and to know what newsrooms are capable of doing. And then the skills can be shared with other colleagues, which can probably help.
Moderator:
Thank you. Maddy, as Chay said, AI and technology are essential, but many newsrooms are not able to hire engineers. They are not good at using technology. So how can Microsoft and the platforms support newsrooms or fact-checkers with technology?
Madeline Shepherd:
Yeah, it’s a great question. It really does emphasize the importance of these kinds of conversations where we just have the opportunity to connect industry and civil society who have excellent ideas in this space, but perhaps lack the resources to make them scalable and take them to all corners of the world and to all sorts of different newsrooms. So I think these sorts of initiatives are a really important starting point to bring us all together, but then absolutely technology companies, technology has disrupted the way that people get their news. So there is an obligation for some work to be done to kind of support newsrooms as they advance into the next chapter. And AI will be beneficial to many of them, but it will also present lots of new challenges. And so those partnerships between technology companies and journalists are very important. And Microsoft has lots of those, and we’re always interested in identifying new journalist partners in other countries. So I welcome anyone to reach out to me after this panel today. But I guess the other point is that that’s all looking at sort of the supply side of the information and where the information is coming from. But I think another really important aspect for us is trying to tackle that demand side. So, you know, particularly with young children and generations growing up with AI, making sure that they develop the digital resilience and information literacy skills that they will need to actually use this technology in their lives as they move forward in a responsible way.
Moderator:
Thank you. I think many journalists and newsrooms are really interested in working with platforms on using AI and technology. Shin-san, the question for you is: from the researcher’s point of view, what collaboration and measures are needed for our information ecosystem?
Shinichi Yamaguchi:
That’s right. The problem of misinformation and disinformation is certainly not confined within one country. A problem overseas can be imported, and vice versa, so the impact spreads across borders. AI-generated content is very quickly going to exceed the volume of people-generated information, which means the volume of misinformation and disinformation can grow exponentially. So with AI as a keyword: how do we leverage AI? How do we bring safety and security to AI? How do we establish a development process and development standard for AI? We need an international standard for that, so we may have to work together to put a set of rules in place for AI development. The same goes for fact-checking: if international collaboration among fact-checking organizations is possible, I think the fact-checking process can become much more efficient. And, as someone mentioned earlier, sharing detection approaches and fact-checking know-how across borders would be very helpful. Thank you.
Moderator:
We are running out of time, so let me wrap up the session. If information transcends borders, then countermeasures must also transcend borders. What came up again and again in today’s session was multi-stakeholder cooperation. We will be happy to continue the exchange with the audience after the session. Thank you for your participation. Please give a round of applause once again to all the speakers. Thank you.
Speakers
Aribowo Sasmito
Speech speed
143 words per minute
Speech length
1263 words
Speech time
529 secs
Arguments
MAFINDO actively fights disinformation and provides literacy education in Indonesia
Supporting facts:
- MAFINDO was established on 19 November 2016.
- It has chapters or branches established in 40 cities with approximately 1,000 volunteers.
Topics: MAFINDO, Literacy Education, Disinformation
MAFINDO collaborates with several social media platforms to tackle misinformation
Supporting facts:
- They are a third-party fact-checking partner of Facebook and Instagram.
- MAFINDO provides services through WhatsApp chatbot which is currently quite popular.
Topics: MAFINDO, Misinformation, Social Media Platforms
MAFINDO works with the government and media to address the spread of fake news during the pandemic
Supporting facts:
- During the pandemic, they have worked with the Indonesian COVID-19 Task Force, WHO, UNESCO, UNICEF, and CDC.
- They also work with more than 20 of Indonesia’s national media on a platform called Cekfakta.com.
Topics: MAFINDO, Government, Media, Pandemic, Fake News
MAFINDO uses a combination of ‘debunking’ and ‘pre-bunking’ to prevent misinformation
Supporting facts:
- Debunking is reactive and involves disproving misinformation after it circulates, while pre-bunking is proactive and involves preventing misinformation before it appears.
- They provide pre-bunking training in various cities for the general election.
Topics: MAFINDO, Debunking, Pre-bunking, Misinformation
Deepening international cooperation can start with effective measures against disinformation and misinformation
Supporting facts:
- Disinformation and misinformation are issues that affect all countries
Topics: International cooperation, Disinformation, Misinformation
Report
MAFINDO, an esteemed organisation established on 19 November 2016, plays an instrumental role in combating disinformation and enhancing literacy education within Indonesia. Their work aligns with the aims of SDG 4: Quality Education and SDG 16: Peace, Justice, and Strong Institutions. The breadth of their impact extends to branches in 40 cities nationwide, marshalled by approximately 1,000 dedicated volunteers, showcasing the growing support for these initiatives.
Enhancing their capability to tackle misinformation, MAFINDO holds a prestigious position as an authorised third-party fact checker for dominant social media platforms, such as Facebook and Instagram. In embracing technological advances, MAFINDO utilises innovative tools like the increasingly popular WhatsApp chatbots to extend their services.
In the face of the COVID-19 pandemic, MAFINDO has expanded their collaborations, allying with government and international entities like the Indonesian COVID-19 Task Force, the World Health Organisation (WHO), UNESCO, UNICEF, and the Centers for Disease Control and Prevention (CDC).
Complementing these efforts, relationships have been established with over 20 national media outlets to collectively address health misinformation on the platform Cekfakta.com, significantly contributing to SDG 3: Good Health and Well-being. MAFINDO uses a strategic combination of debunking and pre-bunking tactics, addressing the spread of false information both before and after it circulates.
They offer proactive pre-bunking training in various cities before the general election, equipping citizens with knowledge to counter misinformation. The analysis underscores the global nature of battling disinformation and misinformation, advocating for deepened international cooperation. These issues require joint global efforts and collaboration amongst members of the internet ecosystem to effectively address misinformation.
The summary suggests a consensus around this call to action. In sum, the analysis sheds light on MAFINDO’s crucial and proactive work in mitigating disinformation and misinformation, fostering literacy education, and building collaborative partnerships, all while prioritising public wellness in the face of major global events.
There’s a pressing need for shared responsibility and global action within the internet ecosystem, both in Indonesia and worldwide, to ensure robust mitigation against potential damages of disinformation and misinformation.
Chay F. Hofileña
Speech speed
128 words per minute
Speech length
2247 words
Speech time
1051 secs
Arguments
Chay F. Hofileña shares her journey with Rappler in the fight against disinformation, highlighting the efforts of Facts First PH, a multi-sector approach to tackle the infodemic.
Supporting facts:
- Facts First PH includes elements of fact-checking, research, accountability, media, civil society, academia, and legal professionals.
- Rappler started on Facebook in 2011, but created its own website in 2012, identifying three important pillars: journalism, community, and tech & data.
Topics: disinformation, Facts First PH
Uses of visual content and social media platforms like TikTok are essential to reach the young population who have a short attention span and prefer video content.
Supporting facts:
- The majority of Rappler’s readers come from the 18 to 24 age group, extending into the 24 to 34 bracket.
- During the campaign period preceding the 2022 presidential elections, they created and shared cartoons to make fact-checking more interesting and engaging.
Topics: TikTok, youth engagement, visual content
It’s necessary to strengthen journalists and newsrooms to tackle disinformation
Supporting facts:
- Making sure that journalists and newsrooms have the tools and resources to do their jobs is essential
- Not all journalists and newsrooms have the resources to do online investigations
- Private groups, companies, or even IT companies who have the resources and can share those with journalists would make a big difference
Topics: Journalism, Fact-checking, Misinformation
Deepening cooperation with other organizations while maintaining independence can be done through sharing knowledge and skills
Supporting facts:
- Rappler has offered fellowships for the upskilling of journalists with the help of grants and funding
- Possible collaboration methods include training or exchanges where reporters can work in different newsrooms for specific periods to share skills
- The uneven skills among journalists within the region and in the Philippines can be addressed through such collaborations
Topics: Journalism, Collaboration, Training
Report
Chay F. Hofileña shares her experiences with Rappler in their ongoing fight against disinformation, highlighting the role of Facts First PH. This initiative utilises a multi-sector approach, incorporating aspects of fact-checking, research, and accountability from fields such as media, civil society, academia, and legal professionals, to tackle the rising issue of misinformation.
Hofileña emphasises the significance of digital engagement for reaching the reader demographic, especially considering that the bulk of Rappler’s readership falls within the 18-34 age category. She demonstrates the efficacy of employing platforms like TikTok and visual content, as seen in the 2022 presidential elections, where cartoons were created and shared to make fact-checking more appealing and engaging.
However, she voices genuine concerns over the intensifying harassment and intimidation of journalists, actions which potentially weaken media institutions and undermine democratic structures. Specific instances, such as Filipino journalists being accused of communist associations under the Duterte administration, are highlighted to illustrate these concerns.
She points out the urgent need for legal retaliation, such as the counteraction led by the Movement Against Disinformation against META for failure to disclose information regarding anonymous accounts targeting a provincial publication’s editor-in-chief. Strengthening the journalism industry is suggested as an effective way to mitigate the spread of disinformation.
Hofileña proposes that newsrooms and journalists be equipped with adequate tools and resources for online investigations. Moreover, she supports the concept of ‘pre-bunking’, a proactive approach to prevent the spread of misinformation, acknowledging the financial implications associated with its implementation.
Given the uneven dispersal of skills among journalists within the Philippines and the wider region, Hofileña moots potential collaborations for skill enhancement as a viable solution. Such collaborations are exemplified by Rappler’s fellowship programmes for journalists’ upskilling, facilitated by grants and funding.
The benefits of interdisciplinary collaboration among reporters from various newsrooms are similarly highlighted to encourage the sharing of unique skills and knowledge. Lastly, alerting the journalism community about forthcoming challenges, Hofileña alludes to the looming threat that Artificial Intelligence (AI) poses to newsrooms.
At present, specific details about this threat are unspecified, yet her warning underscores an urgency to prepare journalists and newsrooms for this impending challenge. This comprehensive summary underscores the necessity of multi-pronged strategies to tackle disinformation; from digital adaptation and cross-industry collaboration to enhanced resource funding, and preparation for emerging tech trends like AI.
Madeline Shepherd
Speech speed
162 words per minute
Speech length
1874 words
Speech time
696 secs
Arguments
Microsoft is actively involved in combating disinformation
Supporting facts:
- Microsoft has a Democracy Forward Initiative dedicated to combating disinformation and promoting a healthy information ecosystem
- Microsoft announced its pilot information integrity principles
- Microsoft’s Democracy Forward Initiative collaborates with teams and partners to increase societal resilience against disinformation
Topics: Technology, Disinformation, Corporate Responsibility
Partnerships and collaboration are crucial in combating disinformation
Supporting facts:
- Microsoft has partnered with NewsGuard for credibility ratings of news websites
- Microsoft is a founding member of Coalition for Content Provenance and Authenticity
Topics: Partnerships, Collaboration, Disinformation
Information literacy can build resilience against disinformation
Supporting facts:
- Microsoft provides ad space for promoting information literacy resources and skills
- Plans to launch a Minecraft education information literacy game
Topics: Disinformation, Information Literacy
Technology companies like Microsoft should support newsrooms and fact-checkers with technology
Supporting facts:
- AI has disrupted the way people get their news and presents both benefits and challenges
- Microsoft already has partnerships with newsrooms and is open to new journalist partners
Topics: Fake news, Fact checking, AI, Media, Microsoft
Report
Microsoft is creating a significant impact in the fight against disinformation through its proactive use of technology and collaborative strategies. This is evidenced by the implementation of its Democracy Forward Initiative, focusing on cultivating a healthier information ecosystem, and by the introduction of its experimental information integrity rules.
These initiatives demonstrate an upfront approach and commitment to bolstering societal resilience against disinformation. A defining aspect of Microsoft’s strategy lies in partnerships and alliances. The corporation has secured partnerships with NewsGuard, providing credibility ratings for news websites. This collaborative endeavour empowers users to identify credible sources, aiming to build a more reliable digital landscape.
Furthermore, as a founding member of the Coalition for Content Provenance and Authenticity, Microsoft reinforces the shared corporate responsibility to tackle the widespread issue of disinformation. Microsoft’s commitment also extends to broader discussions concerning the role of the private sector in supporting democratic institutions.
This reflects an understanding of the significant influence that technology giants can exert within such institutions. Microsoft adopts stringent measures on platforms such as LinkedIn, systematically eliminating fake accounts. This pattern also pervades their Bing platform, which systematically promotes authoritative information.
Education is a central pillar of Microsoft’s approach, with efforts to build resilience against disinformation through information literacy. Highlighting this, Microsoft utilises its advertising spaces to promote resources and skills in information literacy. Additionally, they plan to introduce an innovative educational game using Minecraft to further augment these efforts.
Microsoft also places strong importance on utilising technology, expressed in its willingness to establish alliances with newsrooms and fact-checkers. With AI transforming the terrain of news, Microsoft highlights the need to leverage these beneficial advancements to enhance the capabilities of fact-checkers, thereby improving the quality of news output.
Finally, Microsoft addresses the future implications of AI, underlining the importance of education in digital resilience and information literacy. The corporation identifies these skills as crucial for younger generations growing up amidst AI technology and emphasises the importance of responsible technology usage and safe digital habits.
In summary, Microsoft’s holistic approach embodies a balanced combination of technology utilisation, collaborative partnerships, and educational initiatives. This reflects their commitment to maintaining information integrity and promoting digital resilience, ultimately contributing to the achievement of the sustainable development targets of Industry, Innovation and Infrastructure (SDG 9) and Peace, Justice and Strong Institutions (SDG 16).
Moderator
Speech speed
137 words per minute
Speech length
1084 words
Speech time
474 secs
Arguments
Countermeasures against disinformation are being practiced across G7 countries and the EU
Supporting facts:
- The measures implemented can be divided into four categories, namely civil society, social media platforms, research entities, and government.
- Several projects, tools and actions are being developed across countries to educate, fact-check and ensure platform accountability.
Topics: Disinformation, G7 initiatives, EU initiatives
Effort is being made to promote media literacy in order to create resilience against misinformation
Supporting facts:
- The UK developed resources and projects such as E-Internet Citizens and Find a Fake to heighten media literacy.
- Germany trains journalists on dealing with statistics since they play a key role in combating disinformation.
Topics: Media literacy, Disinformation
Social media platforms play a significant role in mitigating the spread of disinformation
Supporting facts:
- Several measures have been introduced in countries like the UK, EU and France to hold large platforms accountable and to promote reliable information.
Topics: Social media, Disinformation
Research entities are contributing to the fight against disinformation through various projects
Supporting facts:
- Toronto’s Digital Public Sphere project developed an educational tool called ‘Know it or Not’.
- Germany introduced the topic of official statistics in their academic programs.
Topics: Research entities, Disinformation
Government bodies are taking stringent measures to ensure transparency and platform accountability
Supporting facts:
- EU has introduced regulatory measures like the Digital Service Act.
- The US and Italy are developing official communication channels and working groups respectively to ensure the freedom of information and facts.
Topics: Government bodies, Disinformation
Distrust in the profession of journalism has grown.
Supporting facts:
- Rappler has told its journalists they need to be comfortable with technology and data to do cutting-edge journalism.
Topics: journalism, trust, media
Rappler established the battle against disinformation on three pillars: Journalism, community, and tech and data.
Supporting facts:
- Rappler started on Facebook in 2011, and created its own website in 2012.
Topics: Rappler, journalism, community, technology, data
Fact-checking is essential in the fight against disinformation.
Supporting facts:
- Rappler and VERA Files are fact-check partners for Meta (Facebook) in the Philippines.
- Joined by collaborators such as newsrooms, journalist groups, NGOs, business, faith groups, universities, researchers, and lawyers.
Topics: fact-checking, disinformation
Facts-based information and digital literacy can strengthen democratic processes.
Supporting facts:
- The Facts First PH is built around fact-checking and facts-based reporting.
- ‘Without facts, you can’t have truth. Without truth, you can’t have trust. Without trust, we have no shared reality and no democracy.’ – Maria Ressa.
Topics: democracy, literacy, technology
Rappler uses a variety of communication methods, including cartoons and influencers to reach audiences.
Supporting facts:
- Rappler uses political cartoons during election campaigns to engage audiences.
- They collaborate with influencers that align with their values to extend their reach beyond usual echo chambers.
Topics: Media, Influencers, Cartoons
Legal groups and lawyers are essential in countering disinformation and protecting journalists and their freedom of speech.
Supporting facts:
- Movement Against Disinformation is an active group of lawyers who contribute to the fight against disinformation.
- They help journalists who have been attacked, trolled, threatened online and accused of being communist (red-tagging).
Topics: journalism, law, freedom of speech
Microsoft is dedicated to combating disinformation through collaborative efforts and technological solutions
Supporting facts:
- Microsoft’s five guiding principles in preventing disinformation include leveraging technology, leadership in industry, developing strategic nonpartisan partnerships, and supporting democracies
- Microsoft’s Democracy Forward Initiative works to preserve, protect, and advance the fundamentals of democracy
- Microsoft collaborates with external partners to increase societal resilience against disinformation
- Microsoft has taken proactive efforts in blocking fake accounts on LinkedIn and promoting authoritative information
Topics: Microsoft, Disinformation, Technology
Misinformation and disinformation in society
Supporting facts:
- Over 10,000 respondents surveyed each year to analyze behaviors related to misinformation and disinformation
- Only 13% of respondents recognize political disinformation as incorrect information
- Direct conversation is the most frequent method of spreading misinformation/disinformation, showing it extends beyond the internet
- Mathematical models used to analyze spread of misinformation/disinformation show those who believe them are more likely to spread them
- Misinformation/disinformation can have a large impact on election results
- Generative AI can intensify issues by making it easier to manipulate public opinion
- New technology is needed to detect AI-generated disinformation
- Glocom conducts verification studies with IT companies and Japanese government
Topics: Conspiracy theories, COVID-19 vaccinations, Social science research, Generative AI, Election outcomes, Media and information literacy, Fact-checking initiatives
It’s time to start handling disinformation and misinformation
Supporting facts:
- Disinformation and misinformation is a problem in all countries
- This meeting is a good start for collaboration to handle disinformation
Topics: Disinformation, Misinformation
Strengthening journalists and newsrooms is essential in combatting disinformation
Supporting facts:
- Journalists produce the information
- Investigations and tracking of disinformation networks are expensive, requiring skills and resources not all have access to
Topics: Journalism, Disinformation, Media Literacy
Upskilling of reporters and journalists is crucial for dealing with emerging technologies like AI.
Supporting facts:
- Rappler has offered fellowships for upskilling journalists with the help of grants and funding.
Topics: Artificial Intelligence, Newsroom Efficiency
Collaboration and sharing in newsrooms through trainings and exchanges can be beneficial.
Topics: Collaboration, Newsroom Efficiency
Support from technology companies and platforms can help improve the efficiency of newsrooms.
Topics: Support from Tech Companies, Newsroom Efficiency
Importance of initiatives that connect industry and civil society to make scalable resources
Supporting facts:
- Technology has changed the way news is delivered, necessitating partnerships between tech companies and journalists
Topics: Technology, Civil society, Industry collaboration
Technology and AI present new challenges to newsrooms
Supporting facts:
- Microsoft has partnerships with journalists, emphasizing the need and importance for collaborations between technology companies and journalists.
Topics: Artificial Intelligence, Journalism, Media
The need for demand-side intervention in information dissemination
Supporting facts:
- Children growing up with AI should be equipped with necessary digital resilience and information literacy skills to use the technology responsibly.
Topics: Information Literacy, Digital Resilience, Education
The problem of misinformation can spread internationally
Supporting facts:
- Impact that is spreading across the border
- Problem overseas can be imported, and the vice versa is also true
Topics: Misinformation, Disinformation, International Communication
AI-generated content will surpass people-generated information
Supporting facts:
- AI-generated content will very quickly exceed the volume of people-generated information
Topics: Artificial Intelligence, Information Generation
The need for an international standard for AI development
Supporting facts:
- There’s a necessity for a development process and development standard for AI
Topics: Artificial Intelligence, Standardization
International collaboration could make fact-checking more efficient
Supporting facts:
- If international collaboration is possible, the fact-checking process can be much more efficient
Topics: Fact Checking, International Collaboration
Report
Across G7 countries and the European Union, extensive countermeasures are being implemented to tackle the growing issue of disinformation. These measures fall into four principal categories: civil society, social media platforms, government bodies, and research entities. An array of projects, tools, and initiatives are being developed within these groups to educate the public, fact-check information, and ensure the accountability of online platforms.
A key part of these counter-disinformation strategies revolves around promoting media literacy. For example, the UK has designed projects such as E-Internet Citizens and Find a Fake to enhance the understanding of media among its citizens. Germany has pursued a similar path, providing training for journalists to interpret and accurately report statistics – a critical tool in the fight against disinformation.
Social media platforms are not only a significant battleground for addressing disinformation but also crucial in mitigation efforts. Countries like the UK, France, and those within the EU have implemented various measures to hold these platforms accountable and promote reliable information.
Meanwhile, research entities are launching strategic projects targeting disinformation. A case in point is Toronto’s Digital Public Sphere project, which has developed an educational tool named ‘Know it or Not’. Germany has infused their academic curriculum with official statistics, thereby equipping learners with the skills to better interpret the information landscape.
Active in the fight against disinformation are government bodies, too. The EU, for example, has introduced regulatory measures like the Digital Service Act to ensure transparency and platform accountability. Both the US and Italy are developing official communication channels and working groups respectively, to preserve the freedom and accuracy of information.
On the front line of the battle against disinformation is journalism. Despite instances of distrust and criticism, organisations like Rappler have taken determined strides to counter disinformation through journalism, community engagement, and tech innovation. Rappler has become a fact-check partner for Facebook in the Philippines, supported by a range of journalists’ groups, NGOs, businesses, universities, and legal entities.
In the rapidly evolving landscape of information dissemination, technology, and notably AI, has presented new challenges and opportunities. Microsoft, a tech giant, has embraced the challenge, employing technology and fostering partnerships to combat disinformation. Their proactive measures include blocking fake accounts on LinkedIn and promoting trustworthy information.
The prediction that AI-generated content will soon surpass people-generated information highlights an imminent shift and emphasises the urgent need to develop an international AI standard. It also points to the potential utility of AI in mitigating disinformation, such as in improving fact-checking efficiency.
Ultimately, it is abundantly clear that the fight against disinformation is a multi-dimensional endeavour requiring international collaboration and multi-stakeholder cooperation – from the governments that regulate, the platforms that disseminate information, the educational institutions that teach media literacy, to the tech companies blazing a trail with AI-based and technological solutions.
Shinichi Yamaguchi
Speech speed
121 words per minute
Speech length
1301 words
Speech time
645 secs
Arguments
Misinformation and disinformation have a significant impact on citizens’ perspectives, particularly on political matters.
Supporting facts:
- Comparisons were made between people’s perceptions before and after reading disinformation and a significant number changed their opinions.
- Mild supporters, who form a majority of voters, are especially influenced by disinformation and misinformation.
Topics: Misinformation, Disinformation, Politics
Disinformation and misinformation are spreading beyond the internet and is a total ecosystem problem in society.
Supporting facts:
- After reading misinformation and disinformation, 15% to 35% of the recipients shared the data, frequently through direct conversations with those around them.
Topics: Misinformation, Disinformation, Internet, Society
AI-generated misinformation and disinformation add a new level of complexity to the problem, as humans can’t easily distinguish them.
Supporting facts:
- Generative AI is now widely used and allows for easy manipulation of public opinion with fake images and videos.
- Humans are often unable to tell whether information or misinformation is AI-generated.
Topics: AI-generated Misinformation, Disinformation, Artificial Intelligence
Misinformation or disinformation is not confined to a single country, it spreads across borders
Supporting facts:
- Problems from overseas can be imported
Topics: Misinformation, Disinformation, International Impact
AI-generated content is soon to exceed people-generated information
Supporting facts:
- AI-generated misinformation and disinformation volumes can exponentially grow
Topics: AI, Content Generation, Information Volume
Need for development process and standard for AI
Supporting facts:
- To ensure safety and security in AI
- Need of international standard for AI development
Topics: AI, Safety, Security, Development Standard
Mis- and disinformation detection approaches should be shared internationally
Topics: Misinformation Detection, Disinformation Detection, Knowledge Sharing
Report
Misinformation and disinformation have a significant impact on citizens’ views, particularly in political spheres. Specifically, ‘mild supporters’, who comprise a large portion of voters, are greatly susceptible to the sway of such false information. This underlines the potency of misinformation and disinformation in altering public opinion, evidenced by individuals significantly changing their perspectives upon exposure to fabricated data.
Moreover, the challenge of false information is not confined to the digital realm. It has permeated all areas of society, with 15% to 35% of individuals reportedly disseminating the misinformation they received, often through direct conversations with their peers. This illustrates that the issue of misinformation and disinformation extends beyond the internet, serving as a comprehensive societal problem necessitating holistic solutions.
The issue is further amplified by the advent of Artificial Intelligence (AI)-created disinformation. This novel form of questionable information, produced by generative AI, adds another dimension of complexity to the situation. It facilitates easy manipulation of public sentiment through deceptively real images and videos.
Regrettably, humans often find it challenging to distinguish between credible information and AI-generated misinformation. This could result in exponential growth in misinformation volumes, outpacing content produced by humans. The problem of false information is not contained within borders; rather, it casts a global shadow, capable of crossing national boundaries and creating chaos on an international scale.
This underscores the need for global cooperation and concerted efforts in tackling this issue. Utilising tools like advanced technology, quality education, and robust stakeholder engagement can prove instrumental in such endeavours. An instance of this approach is observed in the collaboration of IT companies with the Japanese government for conducting verification studies.
Similarly, fostering critical thinking skills through educational materials can arm students effectively against disinformation. Additionally, hosting events that encourage dialogue between different stakeholders can facilitate meaningful progress in the fight against misinformation. There is an urgent call for standardised development processes and regulations for AI, prioritising safety and security.
Such guidelines would aid in mitigating the detrimental impacts of AI-generated disinformation, highlighting the necessity for regulations that align with technological advancements. International collaboration is essential, particularly for streamlining fact-checking processes, making them more efficient and trustworthy. Going a step further, techniques for detecting misinformation and disinformation should be globally shared, fostering broader international knowledge-sharing and cooperation.
Through implementing these measures, societies can build a solid defence against the onslaught of disinformation and misinformation.