Day 0 Event #265 Using Digital Platforms to Promote Info Integrity

23 Jun 2025 13:30h - 14:30h


Session at a glance

Summary

This panel discussion focused on how digital platforms, particularly TikTok, can be used to promote information integrity and combat misinformation. The session was moderated by Maurice Turner from TikTok’s public policy team and featured panelists from various organizations including a medical professional, representatives from humanitarian agencies, and advocacy groups. Dr. Ahmed Ezzat, a surgeon who creates medical content on TikTok, shared how he transitioned from casual content creation to serious public health communication after one of his posts about childhood infections reached 4.5 million views. He emphasized the importance of accountability and evidence-based practice for healthcare professionals on social media platforms.


Representatives from UNHCR, the Red Cross, and the Internet Society discussed their strategies for creating trustworthy content while facing significant challenges from misinformation campaigns. Gisella Lomax from UNHCR highlighted how misinformation directly harms refugee communities and can even contribute to forced displacement, citing the Myanmar crisis as an example. The panelists emphasized that effective content creation requires extensive expertise, collaboration across multiple disciplines, and significant time investment despite appearing simple to audiences. They stressed the importance of building trust over time rather than focusing solely on metrics like followers or views.


Key challenges identified included operating in polarized environments, competing against well-resourced misinformation campaigns, and balancing the need for quick responses with thorough fact-checking. The discussion concluded with recommendations for increased partnerships between humanitarian organizations, tech companies, and civil society groups to address information integrity challenges more effectively. The panelists agreed that while the battle against misinformation is complex and ongoing, strategic collaboration and maintaining high standards for content creation remain essential for success.


Key points

**Major Discussion Points:**


– **Platform strategies for promoting credible information**: Panelists shared how their organizations (TikTok, medical professionals, UNHCR, Red Cross, Internet Society) use digital platforms to disseminate trustworthy content, including partnerships with fact-checkers, creator networks, and evidence-based messaging to reach large audiences quickly and effectively.


– **Challenges in combating misinformation and disinformation**: Discussion covered the difficulties of fighting organized misinformation campaigns, including well-resourced disinformation “farms,” the polarized nature of online discourse especially in conflict zones, and the challenge of responding to false information once it spreads.


– **Building trust and credibility online**: Panelists emphasized the importance of accountability, transparency, using real names/credentials, engaging authentically with audiences, and building long-term relationships rather than focusing solely on metrics like views or followers.


– **Resource constraints and timing challenges**: A key tension emerged around balancing speed versus accuracy – the need to respond quickly to misinformation while taking time to verify information, assess risks, and coordinate with experts across multidisciplinary teams.


– **Collaborative approaches and partnerships**: Strong emphasis on the necessity of partnerships between humanitarian organizations, tech companies, civil society, academics, and content creators to effectively address information integrity challenges, especially given limited resources.


**Overall Purpose:**


The discussion aimed to explore how digital platforms, particularly TikTok, can be leveraged by credible organizations and creators to promote information integrity, share best practices for combating misinformation, and foster collaboration between different stakeholders in the information ecosystem.


**Overall Tone:**


The tone was professional yet conversational, with panelists sharing practical experiences and honest challenges. It remained consistently collaborative and solution-oriented throughout, with speakers building on each other’s points and emphasizing the shared nature of information integrity challenges. The discussion maintained an optimistic outlook about the potential for positive impact through digital platforms while acknowledging the serious difficulties involved.


Speakers

– **Maurice Turner**: Public policy team at TikTok, panel moderator


– **Dr. Ahmed Ezzat**: Surgeon in training doing general and breast oncoplastic surgery in London, creates clinical content and health news on TikTok


– **Eeva Moore**: Works at the Internet Society, focuses on educational content, advocacy, and community stories to help connect the remaining third of the world that’s not online


– **Gisella Lomax**: Leads UNHCR’s capacity on information integrity, works with the UN’s humanitarian agency protecting refugees, forcibly displaced persons, and asylum seekers


– **Ghaleb Cabbabe**: Marketing and advocacy manager of the IFRC (International Federation of Red Cross and Red Crescent Societies), based in Geneva


– **Audience**: Includes Bia Barbosa, a journalist from Brazil and civil society representative at the Brazilian Internet Steering Committee


Additional speakers:


None identified beyond those in the speakers list.


Full session report

# Digital Platforms and Information Integrity: A Multi-Stakeholder Perspective on Combating Misinformation


## Executive Summary


This panel discussion, moderated by Maurice Turner from TikTok’s public policy team, brought together representatives from healthcare, humanitarian organisations, and civil society to examine how digital platforms can promote information integrity and combat misinformation. The session featured Dr. Ahmed Ezzat, a surgeon creating medical content on TikTok; Gisella Lomax from UNHCR’s information integrity team; Ghaleb Cabbabe from the International Federation of Red Cross and Red Crescent Societies; and Eeva Moore from the Internet Society. The discussion highlighted both the transformative potential of digital platforms for credible information dissemination and the complex challenges organisations face when combating well-resourced misinformation campaigns.


## Key Themes and Discussions


### Platform Reach and Impact


Maurice Turner opened by highlighting TikTok’s partnerships with fact-checking organisations and media literacy programmes designed to empower users. The panellists demonstrated the unprecedented reach that digital platforms offer for credible information dissemination. Dr. Ahmed Ezzat provided a compelling example, describing how one of his posts about childhood infections reached four and a half million views and was shared 156,000 times within 24 hours at zero cost, illustrating the massive public health potential of these platforms.


Gisella Lomax emphasised that UNHCR was “the first UN agency to create a TikTok account” and noted the vital importance of digital platforms for their work with displaced populations globally. She explained that these platforms are essential for providing lifesaving protection information. Ghaleb Cabbabe outlined how the Red Cross utilises digital platforms for brand awareness, crisis communication, information sharing, and fundraising campaigns, citing recent examples including the Myanmar earthquake, Gaza, and Iran. Eeva Moore added that platforms serve a crucial role in connecting communities through educational content and stories.


### Content Creation and Trust Building


Dr. Ahmed Ezzat, who is part of the Clinical Creator Network in the UK, emphasised the importance of accountability for healthcare professionals, advocating for the use of real names and avoiding commercial endorsements that could damage credibility. He made a particularly insightful observation about public intelligence: “Actually, members of the public, regardless of what their understanding is, bearing in mind in the UK the reading age is about six or seven, the maths age is about three to four, they are phenomenally intuitive. They can sniff out right from wrong. You just need to be able to explain that information in an easy and simple way.”


Gisella Lomax outlined UNHCR’s approach to successful content creation, which requires creativity, partnerships with influencers, uplifting community voices, and making facts entertaining whilst remaining educational. She stressed the importance of building partnerships with tech companies, academic institutions, and civil society organisations.


Eeva Moore provided crucial insight into the hidden complexity of content creation, noting that expertise must be “baked into” the production process with multiple layers of review. She explained that whilst content should appear simple to audiences, it requires extensive behind-the-scenes work. Ghaleb Cabbabe reinforced this point, emphasising that organisations must rely on field experts rather than just marketing teams and prepare proactive strategies including scenario planning.


### The Challenge of Misinformation


A central theme emerged around the asymmetric nature of the battle against misinformation. Dr. Ahmed Ezzat described how misinformation creates an unfair defensive position where evidence-based voices must defend against accusations whilst being constrained by factual accuracy.


Gisella Lomax connected online misinformation to devastating real-world consequences: “Information risks such as hate speech and misinformation are directly and indirectly causing real world harm. And I mean violence, killings, persecution. It can even be a factor in forced displacement, in causing refugees… as we saw in Myanmar back in 2016, 2017, when hate speech… had a decisive role in the displacement, I think, of 700,000 Rohingya refugees into Bangladesh who are still there today.”


Ghaleb Cabbabe highlighted the resource disparity, noting that organisations face opponents with significant resources including state actors and misinformation farms operating in highly polarised environments.


### Time as the Fundamental Challenge


Eeva Moore identified time as the core challenge: “You could boil the challenge down perhaps to one word, and that’s time. If you’re in the business of trying to disrupt or break something, you get to move at a much faster pace than if you’re on the other side of that equation.”


This time constraint creates tension between speed and accuracy. Ghaleb Cabbabe advocated for balancing speed with thoroughness by assessing risks before communicating, sometimes taking time to understand how misinformation evolves rather than rushing responses. Eeva Moore emphasised the need for creative solutions and pre-prepared resources to overcome these constraints.


### Platform Responsibility and Resource Concerns


Gisella Lomax raised concerns about platform responsibility, particularly regarding content moderation in less common languages: “We have seen, to our dismay, a weakening of these capacities and perhaps less resourcing from some companies. And I would extend that, for example, to content moderation in less common languages… Are you adequately providing content moderation in less common languages in these very volatile contexts where you don’t actually have a business argument?”


This exposed a critical gap in platform safety measures for vulnerable populations who communicate in languages that aren’t commercially viable for platforms.


## Areas of Consensus and Strategic Differences


The panellists agreed that digital platforms provide unprecedented reach for information dissemination and that content creation requires multi-layered expertise across organisations. All speakers recognised that misinformation poses serious real-world threats requiring proactive, systematic responses.


However, some tactical differences emerged around response strategies. Dr. Ahmed Ezzat emphasised the defensive disadvantage of being constrained by evidence, whilst Ghaleb Cabbabe advocated for strategic patience. Eeva Moore focused on speed through creative solutions and pre-preparation.


## Audience Engagement and Future Collaboration


The discussion included audience participation, with Brazilian journalist Bia Barbosa asking questions about platform engagement. Gisella Lomax actively invited collaboration from academics, tech companies, and NGOs, specifically mentioning a Wednesday 2 PM event showcasing UNHCR’s South Africa pre-bunking project and referencing “Katie in the pink jacket” for those interested in UNHCR’s information integrity toolkit.


The panellists recommended examining comment sections on Red Cross digital platforms to understand the complexity of misinformation challenges firsthand.


## Key Recommendations


Strategic recommendations included:


– Focusing on quality of engagement rather than vanity metrics


– Building partnerships to share resource burdens


– Preparing proactive content strategies rather than only reactive responses


– Collaborating across organisations rather than building all capabilities internally


– Developing scenario planning and pre-prepared responses


## Unresolved Challenges


Several challenges remained unresolved, including how to adequately resource content moderation in less common languages, addressing weakening trust and safety capacities from some platforms, and overcoming the fundamental time asymmetry between misinformation creators and fact-based responders.


## Conclusion


Maurice Turner’s closing remarks emphasised the tensions revealed in the discussion and the key takeaways about collaboration and resource sharing. The panel demonstrated that effective information integrity work requires sophisticated understanding of platform dynamics, audience psychology, and collaborative resource management.


As Eeva Moore noted, this work “should be labour-intensive” and should “look like a light lift, but actually, in fact, be a pretty heavy one.” The discussion revealed that information integrity is not merely a technical challenge but a humanitarian imperative with life-and-death consequences, requiring sustained collaboration between humanitarian organisations, tech companies, civil society, and academic institutions.


Session transcript

Maurice Turner: Welcome to the folks in the audience here in person as well as online. We’re going to be discussing using digital platforms to promote information integrity. My name is Maurice Turner and I’m on the public policy team at TikTok. I have several panelists here and I’d like to start off with a brief introduction to how we’re facing this challenge at TikTok. And then I will get into introductions for our panelists. And we will jump right in to a back and forth that will also include Q&A from the audience. So if you have questions, please do make a note of them during this panel discussion. And towards the end, we will have microphones on both sides of the table so that you can get up and ask your questions. I’m also encouraging questions from our audience online. So if you have any questions, feel free to go ahead and type those in. And we have someone monitoring that online discussion. So your question can be asked toward the end of the session. At TikTok, we believe this conversation is important because information integrity itself is important, both to inspire our creators to promote credible information, but also to ensure that organizations are doing important work to amplify their own messages. We look forward to hearing more about organizations doing that work in our discussion later on today. At TikTok, we remain committed to using fact-checking to make sure that the content on our platform is free from misinformation. 
We partner with more than 20 accredited fact-checking organizations across 60 different markets. We do this work continuously to ensure that our platform is as free from misinformation as possible. And we also empower our users through media literacy programs to give them resources to recognize misinformation, assess content critically, and report any violative content. And now for our panelists. I’d like to go ahead and start off with introductions. Dr. Ahmed, he’s a creator that produces content. We also have Eeva from the Internet Society, as well as Gisella from UNHCR. As an icebreaker, let’s go ahead and start off with the introduction. Dr. Ahmed, how would you like to introduce yourself and talk about how you produce content on TikTok?


Dr. Ahmed Ezzat: Thank you very much for the introduction. So I’m Dr. Ahmed Ezzat. I’m a surgeon in training doing general and breast oncoplastic surgery in London. I create clinical content and my focus is health news. And I also do lots of content strategy and campaigns. And one of the draws to being on a platform like TikTok is the force of good it can portray in the way that it’s quite sensationalist, that if you get the balance right from a public health perspective, for instance, in the UK, there’s been a heat wave. You can reach a million people, half a million people within less than 24 hours at zero cost, essentially, which is a massive gain, I find. So I think there’s a massive power of good if it’s harnessed well using evidence-based information.


Maurice Turner: And Eeva, how do you use the platform?


Eeva Moore: At the Internet Society, we are really focused on a couple types of content. Educational. We work to help connect the remaining third of the world that’s not online. And that can look like building infrastructure in one of the 181 countries where we have chapters. So a lot of it is educational. There’s a lot of advocacy. And then there’s a lot of community stories. How do we demonstrate the impact that we’re having in the world without making ourselves the center of the story itself? Thanks.


Gisella Lomax: Hi, everyone. My name is Gisella Lomax, and I lead UNHCR’s capacity on information integrity. For anyone not familiar with UNHCR, and you might not be familiar because certainly this is my first time at the IGF, we’re one of the UN’s largest humanitarian agencies tasked with protecting the world’s refugees, forcibly displaced, asylum seekers. That’s 123 million people now across, I think, 133 countries. So we’re using social media in many sophisticated and extensive ways. It’s vital. As Dr. Ahmed said, the reach is massive. One, to provide vital, lifesaving protection information to communities, both in emergency context, protracted context. And the other is to inform the general public as well as specific audiences and stakeholders, including government partners, civil society, the private sector, about our work. To lift up the voices of our refugee partners as well, and amplify communities, and to inspire. And I would say as a kind of a little unknown fact is that we were the first UN agency to create a TikTok account quite a few years ago now. And I think it’s very interesting the way TikTok really seeks to make important information fun and entertaining. Although that also comes with challenges, but I think we’ll come to that.


Maurice Turner: We’re also joined online by our panelist, Ghaleb from the Red Cross. Would you please go ahead and introduce yourself and let us know how you are using TikTok?


Ghaleb Cabbabe: Sure. First, apologies for making this panel hybrid again. And it’s nice to connect even online. And hi, everyone. So I’m Ghaleb. I’m the marketing and advocacy manager of the IFRC, the International Federation of Red Cross and Red Crescent Societies, based in Geneva. How do we use digital platforms? Because we’re also tackling different fields, different topics. So depending on the situation, it would be, let’s say, different platforms. But usually the main use is, of course, for brand awareness to make sure that people know what we do, how we do it. We also use digital platforms. And this is a big part of the work in crisis mode sometimes when, for example, we have a disaster, an earthquake, let’s say, lately in Myanmar or the situation also in Gaza or Iran. So this is more in crisis mode. Sometimes it’s to share information. Sometimes it’s to try not to have misinformation. I think we’ll open also this topic more in detail in the next coming minutes. And sometimes we also use digital platforms for direct or indirect fundraising campaigns. So we’re happy to be on TikTok. And I can recall working with the Red Cross that was maybe back in 2017, where it was, we can say, the beginning of the platform. And to see where it is today, I think that’s quite a great achievement. Thank you.


Maurice Turner: I’m going to go ahead and start us off with a specific question for Dr. Ahmed. How did you get your start on TikTok? And what are some of the ways that you achieve your goal of making sure that you are getting content out on the platform and out to an audience specifically related to STEM inequality?


Dr. Ahmed Ezzat: So I think I’ll share my journey because I think it’s representative and it’s an honest one. So when I first started my journey on TikTok as a medical professional and academic surgeon, I wanted to set up my content for actually very different reasons, which is trying to do stuff that’s seemingly for fun, lighthearted. But then I absolutely did not want to be that cliche medic who’s creating content on medicine. But then it was too difficult to resist because there was such a need. And we are quite fortunate in the UK. There’s a very nice and well-organized climate of clinicians that create content. And then I had my first bit of content, which went viral and it hit four and a half million views. And this was on infections in childhood. At the time, there was an outbreak. And then this was shared 156,000 times. And I thought, well, the UK population is, you know, in the 60-plus million, four and a half million is phenomenal. So from that point on, I actually really started to look into TikTok in a very different way, because the impact is so phenomenal. So that if there were outbreaks on say, food poisoning, I’ve had agencies give me information, but say, well, we can’t disclose it publicly just to try and govern misinformation to try and help information. And so I shifted from a name that I had to begin with, which was a nickname, to actually moving on to using my real name on TikTok, which is to celebrate and leverage the whole point of the fact that you should be held accountable for the medical information that you say. And I really do also see a massive shift in healthcare professionals and colleagues of mine, who to begin with, used to look at social media, TikTok, you know, as a vanity project. 
But now, governments have been working with us, you know, institutions that are verified, you know, political organizations to charities to esteemed companies, because they see the value and the reach that you can have just by creating this content. But the really important bit, which is going to your point, how do you go about achieving your goals? TikTok, for example, has been fantastic in trying to balance freedom of information against integrity of information. And so we work together to set up the Clinical Creator Network in the United Kingdom. And it’s through having some, you know, microclimate of clinicians that are really there to do a force of good, but also using evidence-based practice in the same way that they would be accountable to the General Medical Council or whichever organization they’re at wherever in the world. I think that’s the most important thing. Because as a healthcare professional, you’re not a lifestyle creator, meaning that if I was to buy a brand of jeans tomorrow and put it on my posts, no one would care. But if I was to suddenly pick up a brand deal to try and promote weight loss medicines, for example, which is an allure that we get every day, then this would absolutely decimate credibility. And so you really have to carry the responsibility extremely carefully as a clinician.


Maurice Turner: Now, a question that I get pretty regularly is how do I make content that’s popular on the platform? And I’m not the expert in that. So I’ll leave it to others. But I think a related question is, what are some of the strategies that might be used to push out content that is actually trustworthy and credible, so that people on the platform are getting the information that they’re looking for? Gisela, do you have any sort of a response for that?


Gisella Lomax: Yeah, definitely. Well, I was going to add to how we use our strategies for social media platforms and digital channels. And I think first on the communication side, that’s very much being creative. It’s partnering with influencers and brands, uplifting refugee-led storytelling, using the facts, but trying to put them across in an entertaining, educative way. So all of this good stuff to amplify refugee voices, spark empathy, drive action offline, as well as online. However, the challenge, of course, is this is increasingly undermined and threatened by the growth of misinformation, disinformation, hate speech, these types of information risks. And so my role, and I used to work more on the communication side, and now I pay tribute to our communications colleagues and the social media team at UNHCR and at countries who do a fantastic job. My work now is on addressing these risks. So at UNHCR, we have this information integrity capacity, and then field-based projects around the world in around nine countries, basically developing a humanitarian response to mis- and disinformation and hate speech. We recently launched a toolkit. If you Google UNHCR information integrity, you’ll find it, which has got practical tools, guidance, common standards on all of this quite technical stuff, such as assessing the problem and then different types of responses. And we really have tested a plethora of responses, and I can highlight just a couple, given the time. Obviously, there’s continuing to fill the feeds with proactive, trusted, reliable, credible, accessible, multilingual, the list goes on, content and information. And working with technical, with digital platforms, including TikTok, to try and get this content uplifted, because it often naturally isn’t the sort of content that rises to the top of algorithms. And so those partnerships are key. 
But the other aspects are how do we deal with a spread of, say, hate speech that’s inciting violence, disinformation that’s undermining humanitarian action, misinformation that’s making people feel, you know, more hostile or mistrustful of communities. And I’d like to highlight one quite exciting response. And if I may, a little plug, we have a really interesting project in South Africa, testing the notion of pre-bunking. This is the inoculation theory of trying to help societies and communities become more resilient to hate and lies in partnership with Norway, South Africa. But we’ve also benefited from some technical support from TikTok and from Google and from others as well. And we have an event where we’ll be showcasing this on Wednesday at two o’clock, and my colleague Katie there in the pink jacket can tell you more. So that’s a really positive example. But as you can see, it really runs the gamut, and I’m happy to go into a bit more detail on some of those.


Maurice Turner: Ghaleb, I’ll pass it over to you. As marketing manager, what are some of the strategies that you use to get out that trustworthy information?


Ghaleb Cabbabe: It’s a very, let’s say, timely question. It’s an important one, not an easy one, because unfortunately, we don’t have today all the tools. Because if we look today at the environment of misinformation, let’s face it, we have sometimes companies, farms in some places where their only goal is to produce misinformation. And usually, it’s very hard to fight against these in terms of resources, in terms of means, in terms of targets. So what we do, of course, internally, is always check, double check, and check again, the information we want to share, the information sometimes we want to react on, we want to comment on. I’ll give you an example. It’s been unfortunately the case in the past few weeks and months with the killing of PRCS, Palestinian Red Crescent colleagues in Gaza, for example, in Iran lately as well, where we had the Iranian Red Crescent colleagues killed in the latest attacks. So first, checking the information, rechecking at local, at regional level is, of course, something that has to be done. We also rely on the experts in their own fields. Of course, we’re the marketing, we’re the social media team. But it’s not us who make sometimes the call to a certain line, to a certain reactive line or key messages that we want to share. So relying on experts is really essential. And also the monitoring part, making sure that we try as much as possible to anticipate what could go wrong, to anticipate information that’s going out, and that could lead to a misinformation situation. So the monitoring part is also important. And also, we are always trying to be, trying to be again, because it’s not an easy battle, a step ahead in terms of being proactive. This would be, for example, by preparing reactive lines, by also trying to identify the different potential scenarios and be ready if and when it happens. So these are strategies that we put in place. Again, it’s not an easy file. It’s a very essential one. 
And we’re putting resources, we’re also as much as possible trying to train staff internally, to avoid situations like this, because sometimes also, this type of misinformation is triggered by a certain, let’s say, tweet or information or post, be it on TikTok or other platforms that could also trigger a reaction, a comment that can cascade. So being proactive, training also staff is part of the different strategies. And let’s not forget that the main objective of our communication is, of course, to be as impactful as possible. And we see this information as a threat, not only to the communication that we share on social media, but also sometimes to the credibility of the organization.


Maurice Turner: Thank you. Those were quite interesting strategies. Eeva, what about you?


Eeva Moore: Kind of building on that, I think I think you have to bake expertise into the production process. Anything that you see on the social media platforms should be easily digestible, but it should be built on top of multiple layers of expertise and conversations and trust that we have across our community. We do advocacy efforts in countries that I may never have visited, which means I’m not the person to create that. And so it should be labor-intensive. I mean, dealing with these types of issues when we’re creating them, when we want to be accurate, should look like a light lift, but actually, in fact, be a pretty heavy one. And that takes, I think, just incredible amounts of conversation and listening and being very online and seeing what others are saying. I mean, you have to be consuming a lot of information. Sadly, that includes coming across the disinformation, if you’re going to understand it and to understand how it’s navigating the space. And the expertise also has to be credible, because your audience doesn’t want to be spoken down to. So when I say expertise, that is legal, that is technical, but it’s also the people who are impacted by the – I mean, you touched upon refugee communities. For us, it’s about the credibility of people. We can all be led down a disinformation path, but I think people have a pretty keen sense of whether or not they want to listen to somebody, and credibility is built into that. So I think relationships, it’s sort of the world that you inhabit in real life, how it manifests online is, I think, reflected into that work. So you just have to bake it in, and you have to have your work reviewed, and you have to be willing to produce things that maybe don’t make it out into the world. You also have to be just willing to see – to do your best and listen to others and make sure that by the time it gets out there, it’s accurate.


Maurice Turner: I think that’s so key in that it takes the consumption of so much information and the recognition that not one single person or even one organization has all the expertise. There’s a reliance on other folks in the community and on building those relationships so that you can get that information, distill it, and at the end of the day put out a product that looks like it was easy to make because it was actually easy to understand, even though there’s quite a bit of work that goes into that. I’m sure all of us, not only on stage and online but also in the audience, have come across challenges in information integrity. So I’d like to shift the conversation to hear more about some of those challenges that we face specifically, and then also how we were able to overcome those challenges. Dr. Ezzat, can you share a challenge that you faced and how you were able to overcome it in terms of information integrity, and maybe even an example where you weren’t necessarily able to overcome it?


Dr. Ahmed Ezzat: Yeah, absolutely, and some fantastic points here. I think just to reiterate, one of the strong points of social media, especially short-form social media, is that it can really reach those with lived experiences, and you are very much there, very much accessible, and very much accountable to the hundreds of comments you get. But of course, disinformation sometimes is difficult to combat. Once it’s out, it’s very difficult. You’re almost on the defense, almost trying to defend an accusation, and sometimes it is an unfair footing because you can’t go outside of the evidence. But I remember, and this was after a call by a Labour MP, who is now in government, who had said, well, wouldn’t it be perfect to do a campaign on the MMR vaccine, because there was a rise in cases, and TikTok responded and said, well, we’ll do it. And then I fronted that campaign, and I remember we filmed up in the Midlands of England and in London, with different demographics of different ethnicities, backgrounds, etc. And I actually remember the disconnect from the reality, which was that many people did believe in the importance of vaccination. But this was against the backdrop of the COVID vaccination saga, where, first of all, there was a time when these medicines were tested in Africa, which created catastrophes in terms of bringing back lots of bad historical memories. But then you had lots of top-down, suited and booted officials telling people what to do, and that made things even worse. And the result was that public health messaging has been transformed forever, in that people really don’t want to be expected to listen to officials telling them what to do. And this is where social media had risen. But then I received a massive backlash when I did the content on the MMR vaccine, with very, very loud voices, who are a very, very small minority, spreading disinformation.
And I think going back to what you’ve said, that if you build trust and people know that you are doing your best and understand your values over a period of time, as opposed to just a one-off hit, then that tone starts to soften. And the other thing I would just say is, and I’ve had this conversation with very senior officials from another part of the world, who said, well, you may say that your group of viewers are intuitive, but ours aren’t. And this is another well-developed nation, one of the most well-developed and one of the richest. And I was actually saying it’s exactly this mindset which sets us back. Actually, members of the public, regardless of what their understanding is, bearing in mind in the UK the reading age is about six or seven, the maths age is about three to four, they are phenomenally intuitive. They can sniff out right from wrong. You just need to be able to explain that information in an easy and simple way. And the last thing I’d say, so I don’t take up too much time, when you’re looking at content quality, please don’t look at followership. Don’t look at likes. Look at engagement and quality of engagement in the comments and the type of content. Because if the content is high quality, it will deliver on engagement. I wouldn’t worry too much about the followerships and the millions of views. It’s about the quality of engagement.


Maurice Turner: Ghaleb, can you share a particular challenge that you’ve faced and maybe offer some recommendations to the audience about how you were able to overcome them in this particular space?


Ghaleb Cabbabe: Sure, actually. To give you maybe an indication of the situation that we’re dealing with and that sometimes we have to face, I would invite people in the audience to go on our digital platforms, and also the digital platforms of other Red Cross and Red Crescent Movement partners like the ICRC, and check the comments section. This is where you would see how complex this environment is. Because today, unfortunately, we are moving, working, reacting, engaging in a very, very polarized world. And this is even more present because our work is in war zones, conflict zones. So this is even more polarized, and misinformation is even more present here. So I think the challenge here is really the nature of what we’re facing when we talk about misinformation. As I was saying previously, there are people whose work is only to create this type of misinformation and to feed it and to make it grow online. So sometimes the battle is not an easy one, because the opponent, if we can call it so, behind misinformation is an opponent with sometimes many resources. You sometimes have states behind it. So I think the challenge comes from there: the very polarized context, and also the people who could be behind this misinformation. How we’re trying to deal with this, again, is by sharing, cross-checking, and checking again the information before sharing it, and making sure, sometimes in very sensitive contexts, to anticipate what the risks could be. It is really very important to assess the risk before communicating. And although we are in a world, on digital platforms, not only TikTok, where you have to be prompt, you have to be really fast in terms of reaction, one of my recommendations would be that sometimes it’s better not to rush. You have to be prompt, of course. You have to be fast, reactive.
But sometimes by taking the time to assess, to see how a certain piece of misinformation is evolving online and the direction it’s taking, you get a better understanding of the situation and a better way to tackle it. So, yes, be prompt and fast, but do not rush. It’s a daily work. It’s an ongoing work. It starts, actually, before the misinformation is shared. It starts with what we plan to produce, what we plan to communicate in our statements, in our press releases, in our interviews with the press. And again, as was mentioned in the panel, it’s not only the social media team’s work. It’s the work of the whole team: experts on the legal side, on the media side, across the different areas of expertise. So it’s an ongoing work, a daily work. And again, it’s a complex one, and it’s not an easy battle.


Maurice Turner: Thank you. And it seems that not only doing the work beforehand to prepare and have a process, but also having the confidence and the patience to be able to understand what you’re going to be responding to is part of the strategy in responding to that challenge. Gisela, do you have a particular challenge related to information integrity that you’d be able to share with us and maybe some recommendations for how you’re able to respond to that?


Gisella Lomax: Thank you. And I like that you say a particular challenge. There are many, but perhaps I could just highlight three, and then there are some recommendations I can make as well. I think, building off the points that Ghaleb Cabbabe just shared, the first is to say, and sorry, this is a very jargonistic term, that this is multifunctional work. This is not just communications. This is protection, policy, operational, governance and communications. So really it’s getting that message across and then making sure these multifunctional teams are resourced. That’s more of an internal challenge for how we address it. But I would say there are three very distinct challenges. One is the protection challenge. Information risks such as hate speech and misinformation are directly and indirectly causing real-world harm. And I mean violence, killings, persecution. It can even be a factor in forced displacement, in creating refugees. In fact, we saw this in Myanmar back in 2016, 2017, when hate speech, which had already been circulating for some time, really exploded and had a decisive role in the displacement of, I think, 700,000 Rohingya refugees into Bangladesh, who are still there today. And then also, on communities, it’s normalising hostility towards refugees and migrants. So those are the harms over time and the very direct harms on the community, the refugee side. Then, on the operational side, it’s increasingly hampering humanitarians from doing their job around the world, in conflicts in Sudan, for example, or other places. Disinformation narratives about humanitarians, perhaps politicising their work, can also erode people’s trust. And, you know, trust is everything. And we know that trust can take years or decades to build and can be destroyed very, very quickly.
And I think I recognise the importance of individual voices, and I celebrate what you’re doing, Dr Ahmed, and people like you, but let’s not forget that we also need to help institutions remain trusted as well. If people can’t trust public health bodies or UN entities, then we also have a challenge. And then a third, more specific one is that of trust and safety capacities at digital platforms and tech companies. We have seen, to our dismay, a weakening of these capacities and perhaps less resourcing from some companies. And I would extend that, for example, to content moderation in less common languages. This is a question I always ask of all tech companies: are you adequately providing content moderation in less common languages in these very volatile contexts where you don’t actually have a business argument? You’re not selling ads or making money, but your platforms are still delivering fundamental information. So there are three challenges. But on to a recommendation. I think we’re sat here at an event designed to create international cooperation to solve these problems, so it’s partnerships. And speaking as a humanitarian, for us it’s increasingly about building partnerships with tech companies, digital rights organisations, and civil society NGOs that bring in the skillset, expertise and knowledge that we don’t have. And given a very dramatic and grave financial situation for the humanitarian sector, we’re not going to have more capacity. So how can we partner and collaborate to bring in that knowledge, to test these responses, innovative ones, longer-term ones like digital literacy, and more immediate ones for these kinds of quite acute harms and conflicts? So my recommendation is to please talk to us. If you’re an academic, there are many research gaps, and we can help give you access to try and fill those. Then we can take that research to inform strategies and policies.
If you’re a tech company, I’ve already mentioned some of the ways that we can help each other. If you’re a digital rights NGO, I think there are many more. So that’s both a recommendation and a plea from us, and that’s what brings me here to Oslo this week.


Maurice Turner: Eva, I’d like to hear more about maybe more of the technical side of the challenges. What are some of the challenges that you’re hearing about from the folks that you’re working with and how they’re trying to tackle information integrity, and what are some of those recommendations that seem to have surfaced to the top?


Eeva Moore: You could boil the challenge down perhaps to one word, and that’s time. If you’re in the business of trying to disrupt or break something, you get to move at a much faster pace than if you’re on the other side of that equation. So from a purely practical standpoint, that means, as comms people, leaning into the creative side of it. How can we tell a compelling story on a timeline that mirrors, say, a news cycle? We did that with the proposed TikTok ban in the United States. That was a timeline to which a long period of review did not lend itself well. But we have the experts. They had already done the work. So my very qualified, fantastic colleague Celia went in and just plucked headlines from a blog post, designed it within seconds, and put it out there. Not seconds, perhaps minutes. So it’s knowing what resources are already there that you can act on, because I agree with what Ghaleb said, that sometimes you have to wait, but there’s also the aspect of it where sometimes you can’t afford to, right? So there’s the creative side of it, which is in and of itself interactive, but that’s one way around time. The other way is simply doing the heavy lifting for your community, remembering that if you’re the person whose job it is to create these things, you are the one asking somebody else. We work with volunteers, essentially, around the world, right? These are people with lives, with families, with jobs, often in challenging circumstances as well as less challenging circumstances. It’s my job, it’s my team’s job, to try to ask as little of them as possible, so that when we do ask, they’re able to actually jump in and help. But time, it’s labor-intensive. It’s not something that people are swimming in.


Maurice Turner: It seems like that’s a poignant point to wrap up that part of the discussion: time. It’s a resource that no one quite has enough of, and as we heard earlier, there needs to be a balance of when you find the right time. Do you do it fast, or do you wait to be more thorough, and what’s appropriate for the response that you need in that particular situation? At this time, I’d like to go ahead and open up the floor for questions from the audience. Again, we have microphones, so if you have a question, feel free to step up to the microphones on the side. If you’re a little bit shy, that’s okay too. You can just raise your hand, or maybe pass a note to someone who might have a question. And if you’re online, please do type in a question there and we’ll have that moderated as well.


Audience: May I? Hi, good afternoon. Thank you for the panel. My name is Bia Barbosa. I’m a journalist from Brazil. I’m also a civil society representative at the Brazilian Internet Steering Committee. Thank you for sharing your experience producing content for TikTok specifically, if I got it right, or for social media in general, and how you face these challenges regarding information integrity. From a journalist’s perspective, this is one of our most important preoccupations at this time. And I would like to hear from you because, regarding the challenges, I didn’t see, besides Gisela mentioning it, the resources. Thank you very much.


Maurice Turner: Do we have any other questions from the audience? Feel free to step up to the microphone if you have one. Any questions from the online audience? Excellent. Well, what I’ll go ahead and do is wrap us up. Feel free to engage with our panelists as we exit the stage and close out the session, or if not now, then go ahead and find them throughout the week. I think we had an interesting discussion in hearing about how the organizations that you all represent utilize digital platforms to put out information and push back against some of the misinformation that’s out there. We also heard about some strategies for attacking the safety issues around information integrity, and we heard some of the recommendations that came through as well. I particularly enjoyed hearing about the tensions that naturally take place between having to be viewed as experts while recognizing that you also rely on a network of expertise, so it’s not just one individual or one sole organization. And also the tension in balancing that limited resource of time: being very prompt, but also allowing time for a situation to evolve so as to better understand how best to respond to it in a way that is not only authentic but also prompt, so that even when attention spans might be shortened, the message can get out effectively. So again, please do join me in thanking the panel. I’d also like to thank you all for joining this discussion, and I hope you enjoy the rest of your week at the conference. Thank you. Thank you.



Maurice Turner

Speech speed

164 words per minute

Speech length

1439 words

Speech time

525 seconds

TikTok partners with over 20 fact-checking organizations across 60 markets and empowers users through media literacy programs

Explanation

TikTok has established partnerships with accredited fact-checking organizations globally to combat misinformation on their platform. They also provide media literacy resources to help users recognize misinformation, assess content critically, and report violative content.


Evidence

More than 20 accredited fact-checking organizations across 60 different markets


Major discussion point

Using Digital Platforms for Information Integrity and Outreach


Topics

Sociocultural | Legal and regulatory


Agreed with

– Dr. Ahmed Ezzat
– Gisella Lomax
– Ghaleb Cabbabe
– Eeva Moore

Agreed on

Digital platforms provide unprecedented reach and impact for information dissemination



Dr. Ahmed Ezzat

Speech speed

166 words per minute

Speech length

1293 words

Speech time

465 seconds

Medical professionals can reach millions of people within 24 hours at zero cost using evidence-based information, representing massive public health potential

Explanation

Healthcare professionals can leverage TikTok’s reach to disseminate important public health information rapidly and cost-effectively. This represents a significant opportunity for public health communication when balanced correctly with evidence-based practice.


Evidence

During a UK heat wave, could reach a million people or half a million people within less than 24 hours at zero cost; first viral content on childhood infections hit 4.5 million views and was shared 156,000 times


Major discussion point

Using Digital Platforms for Information Integrity and Outreach


Topics

Sociocultural | Development


Agreed with

– Maurice Turner
– Gisella Lomax
– Ghaleb Cabbabe
– Eeva Moore

Agreed on

Digital platforms provide unprecedented reach and impact for information dissemination


Healthcare professionals must maintain accountability and use real names to build credibility, avoiding commercial endorsements that could damage trust

Explanation

Medical professionals creating content must be held accountable for their medical information by using their real names rather than nicknames. They must carefully avoid brand deals that could compromise their credibility, particularly those related to medical products.


Evidence

Shifted from using a nickname to real name on TikTok; receives daily offers for weight loss medicine brand deals which would ‘absolutely decimate credibility’


Major discussion point

Content Creation Strategies and Building Trust


Topics

Human rights | Sociocultural


Agreed with

– Gisella Lomax
– Ghaleb Cabbabe
– Eeva Moore

Agreed on

Content creation requires multi-layered expertise and cannot be done by single individuals or departments alone


Misinformation creates an unfair defensive position where evidence-based voices must defend against accusations while being constrained by factual accuracy

Explanation

When misinformation spreads, healthcare professionals find themselves in a defensive position trying to counter false claims. This creates an unfair advantage for misinformation spreaders who aren’t constrained by evidence, while medical professionals must stick to factual, evidence-based responses.


Evidence

Received massive backlash when creating MMR vaccine content, with very loud voices from a small minority spreading disinformation


Major discussion point

Challenges in Combating Misinformation and Disinformation


Topics

Sociocultural | Human rights


Agreed with

– Gisella Lomax
– Ghaleb Cabbabe

Agreed on

Misinformation poses serious real-world threats that require proactive, systematic responses


Disagreed with

– Ghaleb Cabbabe
– Eeva Moore

Disagreed on

Speed vs. Thoroughness in Response Strategy


Focus on quality of engagement and comments rather than follower counts or view numbers when measuring content success

Explanation

The effectiveness of content should be measured by the quality of engagement and meaningful interactions in comments rather than vanity metrics like followers or views. High-quality content will naturally deliver better engagement even without massive followership.


Major discussion point

Recommendations for Effective Information Integrity


Topics

Sociocultural



Gisella Lomax

Speech speed

156 words per minute

Speech length

1329 words

Speech time

509 seconds

Digital platforms are vital for providing lifesaving protection information to 123 million displaced people across 133 countries

Explanation

UNHCR uses social media extensively to provide critical protection information to refugees and displaced populations globally. These platforms also help inform the general public and stakeholders about their work while amplifying refugee voices.


Evidence

UNHCR protects 123 million people across 133 countries; was the first UN agency to create a TikTok account


Major discussion point

Using Digital Platforms for Information Integrity and Outreach


Topics

Development | Human rights


Agreed with

– Maurice Turner
– Dr. Ahmed Ezzat
– Ghaleb Cabbabe
– Eeva Moore

Agreed on

Digital platforms provide unprecedented reach and impact for information dissemination


Successful content requires being creative, partnering with influencers, uplifting community voices, and making facts entertaining while remaining educational

Explanation

Effective social media strategy involves creative approaches including influencer partnerships and refugee-led storytelling. The challenge is presenting factual information in an entertaining and educational format that can spark empathy and drive action.


Evidence

UNHCR works with influencers and brands, focuses on refugee-led storytelling, and tries to make facts entertaining and educative


Major discussion point

Content Creation Strategies and Building Trust


Topics

Sociocultural | Human rights


Information risks directly cause real-world harm including violence, killings, and forced displacement, as seen with Rohingya refugees in Myanmar

Explanation

Misinformation and hate speech have direct consequences including violence, persecution, and can even be factors in causing forced displacement. These information risks also normalize hostility toward refugee and migrant communities over time.


Evidence

Myanmar 2016-2017 hate speech explosion had a decisive role in displacing 700,000 Rohingya refugees to Bangladesh who remain there today


Major discussion point

Challenges in Combating Misinformation and Disinformation


Topics

Human rights | Cybersecurity


Agreed with

– Dr. Ahmed Ezzat
– Ghaleb Cabbabe

Agreed on

Misinformation poses serious real-world threats that require proactive, systematic responses


Build partnerships with tech companies, academic institutions, and civil society organizations to leverage expertise and resources that humanitarian organizations lack

Explanation

Given the dramatic financial constraints facing the humanitarian sector, organizations must collaborate with tech companies, researchers, and NGOs to access specialized knowledge and skills. These partnerships can help test innovative responses and fill research gaps.


Evidence

UNHCR has a project in South Africa testing ‘pre-bunking’ with technical support from TikTok, Google, and others in partnership with Norway and South Africa


Major discussion point

Recommendations for Effective Information Integrity


Topics

Development | Legal and regulatory


Agreed with

– Dr. Ahmed Ezzat
– Ghaleb Cabbabe
– Eeva Moore

Agreed on

Content creation requires multi-layered expertise and cannot be done by single individuals or departments alone



Ghaleb Cabbabe

Speech speed

138 words per minute

Speech length

231 words

Speech time

100 seconds

Digital platforms serve multiple purposes including brand awareness, crisis communication, information sharing, and fundraising campaigns

Explanation

The International Federation of Red Cross uses digital platforms for various strategic purposes depending on the situation. This includes building brand awareness, crisis response during disasters, combating misinformation, and supporting fundraising efforts.


Evidence

Used platforms during recent disasters in Myanmar, Gaza, and Iran; has been working with Red Cross since 2017 when TikTok was beginning


Major discussion point

Using Digital Platforms for Information Integrity and Outreach


Topics

Sociocultural | Development


Agreed with

– Maurice Turner
– Dr. Ahmed Ezzat
– Gisella Lomax
– Eeva Moore

Agreed on

Digital platforms provide unprecedented reach and impact for information dissemination


Organizations must rely on field experts rather than just marketing teams, and prepare proactive strategies including reactive lines and scenario planning

Explanation

Effective information integrity requires input from subject matter experts beyond just social media teams. Organizations need to prepare proactive strategies, develop reactive messaging in advance, and train staff to prevent situations that could trigger misinformation cascades.


Evidence

Red Cross relies on experts in their fields for key messages; prepares reactive lines and identifies potential scenarios; trains staff internally to avoid triggering misinformation


Major discussion point

Content Creation Strategies and Building Trust


Topics

Legal and regulatory | Sociocultural


Agreed with

– Dr. Ahmed Ezzat
– Gisella Lomax
– Eeva Moore

Agreed on

Content creation requires multi-layered expertise and cannot be done by single individuals or departments alone


Organizations face opponents with significant resources including state actors and misinformation farms operating in highly polarized environments

Explanation

The challenge of combating misinformation is amplified by well-resourced opponents, including organized misinformation farms and sometimes state actors. This is particularly difficult in polarized contexts involving war zones and conflict areas where the Red Cross operates.


Evidence

There are people and companies whose only work is to produce misinformation; sometimes states are behind misinformation efforts; Red Cross works in very polarized war zones and conflict zones


Major discussion point

Challenges in Combating Misinformation and Disinformation


Topics

Cybersecurity | Human rights


Agreed with

– Dr. Ahmed Ezzat
– Gisella Lomax

Agreed on

Misinformation poses serious real-world threats that require proactive, systematic responses


Balance speed with thoroughness by assessing risks before communicating, sometimes taking time to understand how misinformation evolves rather than rushing responses

Explanation

While digital platforms require prompt responses, organizations should resist rushing and instead take time to assess risks and understand how misinformation is developing. This strategic patience can lead to more effective responses even in fast-paced digital environments.


Evidence

Recommends being prompt and fast but not rushing; suggests taking time to see how misinformation evolves online and understanding its direction


Major discussion point

Recommendations for Effective Information Integrity


Topics

Sociocultural | Legal and regulatory


Agreed with

– Eeva Moore

Agreed on

Time constraints create fundamental challenges in responding to misinformation effectively


Disagreed with

– Dr. Ahmed Ezzat
– Eeva Moore

Disagreed on

Speed vs. Thoroughness in Response Strategy


Information verification requires multiple levels of checking and cross-checking, especially when dealing with sensitive situations involving colleague safety

Explanation

Organizations must implement rigorous verification processes that involve checking, double-checking, and checking again at local and regional levels. This is particularly critical when dealing with sensitive information such as attacks on humanitarian workers.


Evidence

Example of verifying information about killing of Palestinian Red Crescent colleagues in Gaza and Iranian Red Crescent colleagues in recent attacks


Major discussion point

Content Creation Strategies and Building Trust


Topics

Human rights | Legal and regulatory


Monitoring and anticipation are essential components of misinformation prevention strategy

Explanation

Organizations need to actively monitor information environments and try to anticipate potential misinformation scenarios before they occur. This proactive approach helps organizations prepare appropriate responses and identify risks early.


Evidence

Red Cross tries to anticipate what could go wrong and what information could lead to misinformation situations


Major discussion point

Recommendations for Effective Information Integrity


Topics

Cybersecurity | Sociocultural


Misinformation threatens organizational credibility and communication impact, requiring it to be treated as a serious institutional risk

Explanation

Misinformation is not just a communications challenge but poses a direct threat to organizational credibility and the effectiveness of their messaging. Organizations must view misinformation as a fundamental threat to their institutional reputation and mission effectiveness.


Evidence

Red Cross sees misinformation as a threat not only to social media communication but also to the credibility of the organization


Major discussion point

Challenges in Combating Misinformation and Disinformation


Topics

Legal and regulatory | Human rights


Red Cross has been working with TikTok since 2017 when the platform was in its beginning stages, witnessing significant platform growth

Explanation

The International Federation of Red Cross has a long history with TikTok, starting their engagement in 2017 when the platform was just emerging. This early adoption has allowed them to observe and benefit from the platform’s tremendous growth over the years.


Evidence

Working with Red Cross since 2017 when TikTok was in the beginning stages; seeing where the platform is today represents quite a great achievement


Major discussion point

Using Digital Platforms for Information Integrity and Outreach


Topics

Sociocultural | Development


Misinformation can be triggered by organizational posts and cascade across platforms, requiring staff training to prevent such situations

Explanation

Organizations must recognize that their own social media posts can inadvertently trigger misinformation campaigns that spread across multiple platforms. This requires comprehensive staff training to help employees understand how their communications might be misinterpreted or weaponized.


Evidence

Sometimes misinformation is triggered by a certain tweet or post on TikTok or other platforms that could cascade; training staff internally is part of different strategies


Major discussion point

Content Creation Strategies and Building Trust


Topics

Sociocultural | Legal and regulatory


The polarized nature of conflict zones and war environments makes misinformation challenges more complex and prevalent

Explanation

Working in conflict zones and war environments creates an especially challenging context for information integrity because these situations are inherently polarized. This polarization makes misinformation more likely to spread and more difficult to counter effectively.


Evidence

Red Cross works in war zones and conflict zones which are very polarized environments; misinformation is even more present in these contexts


Major discussion point

Challenges in Combating Misinformation and Disinformation


Topics

Human rights | Cybersecurity


Combating misinformation is an ongoing daily work that starts before misinformation appears and involves the whole organizational team

Explanation

The fight against misinformation is not just reactive but requires continuous proactive effort that begins with planning what to communicate in statements, press releases, and interviews. This work requires coordination across multiple departments and expertise areas, not just social media teams.


Evidence

It starts with what we plan to produce, communicate in statements, press releases, interviews; involves experts on legal, media, and different expertise areas


Major discussion point

Recommendations for Effective Information Integrity


Topics

Legal and regulatory | Sociocultural


The complexity of misinformation battles requires understanding that opponents may have significant institutional backing and resources

Explanation

Organizations fighting misinformation must recognize they face well-funded and organized opposition that may include state actors and dedicated misinformation operations. This creates an uneven playing field where the battle is inherently difficult due to resource disparities.


Evidence

There are farms in some places whose only goal is to produce misinformation; sometimes states are behind misinformation efforts


Major discussion point

Challenges in Combating Misinformation and Disinformation


Topics

Cybersecurity | Human rights


E

Eeva Moore

Speech speed

169 words per minute

Speech length

746 words

Speech time

263 seconds

Platforms help connect the remaining third of the world that’s not online through educational content and community stories

Explanation

The Internet Society uses digital platforms to support their mission of connecting unconnected populations globally. Their content focuses on educational material, advocacy, and community impact stories across their 181 country chapters.


Evidence

Internet Society works to connect the remaining third of the world that’s not online; has chapters in 181 countries


Major discussion point

Using Digital Platforms for Information Integrity and Outreach


Topics

Development | Infrastructure


Agreed with

– Maurice Turner
– Dr. Ahmed Ezzat
– Gisella Lomax
– Ghaleb Cabbabe

Agreed on

Digital platforms provide unprecedented reach and impact for information dissemination


Expertise must be baked into the production process with multiple layers of review, making content appear simple while requiring heavy behind-the-scenes work

Explanation

Creating trustworthy content requires extensive expertise and consultation built into the production process. While the final product should be easily digestible, it must be supported by multiple layers of expertise, conversations, and trust within the community.


Evidence

Content should look like a light lift but actually be a heavy one; requires consuming lots of information including disinformation to understand the space; involves legal, technical, and community expertise


Major discussion point

Content Creation Strategies and Building Trust


Topics

Sociocultural | Human rights


Agreed with

– Dr. Ahmed Ezzat
– Gisella Lomax
– Ghaleb Cabbabe

Agreed on

Content creation requires multi-layered expertise and cannot be done by single individuals or departments alone


Time constraints create fundamental challenges since disruption moves faster than constructive response, requiring creative solutions and pre-prepared resources

Explanation

The core challenge in information integrity is that those seeking to disrupt or spread misinformation can move much faster than those trying to provide accurate information. This requires creative approaches and having resources prepared in advance to respond quickly when needed.


Evidence

During the proposed TikTok ban in the US, they quickly adapted existing blog content into social media posts within minutes using pre-existing expert work


Major discussion point

Challenges in Combating Misinformation and Disinformation


Topics

Sociocultural | Legal and regulatory


Agreed with

– Ghaleb Cabbabe

Agreed on

Time constraints create fundamental challenges in responding to misinformation effectively


Disagreed with

– Dr. Ahmed Ezzat
– Ghaleb Cabbabe

Disagreed on

Speed vs. Thoroughness in Response Strategy


Prepare heavy-lifting resources in advance for community volunteers and maintain credibility through relationship-building and authentic expertise

Explanation

Organizations should do the intensive preparation work so that when they need community support, they’re asking as little as possible from volunteers who have their own lives and responsibilities. Credibility comes from authentic relationships and expertise that people can trust.


Evidence

Internet Society works with volunteers around the world who have lives, families, and jobs; credibility is built into relationships and people have a keen sense of whether they want to listen to somebody


Major discussion point

Recommendations for Effective Information Integrity


Topics

Development | Sociocultural


A

Audience

Speech speed

127 words per minute

Speech length

113 words

Speech time

53 seconds

Resource constraints are a major challenge for information integrity work that needs to be addressed

Explanation

A journalist from Brazil representing civil society highlighted that resource limitations are one of the most important concerns when dealing with information integrity challenges. This was noted as an area that wasn’t sufficiently covered in the panel discussion despite being a critical issue.


Evidence

Identified as a journalist from Brazil and a civil society representative on the Brazilian Internet Steering Committee


Major discussion point

Challenges in Combating Misinformation and Disinformation


Topics

Development | Legal and regulatory


Agreements

Agreement points

Digital platforms provide unprecedented reach and impact for information dissemination

Speakers

– Maurice Turner
– Dr. Ahmed Ezzat
– Gisella Lomax
– Ghaleb Cabbabe
– Eeva Moore

Arguments

TikTok partners with over 20 fact-checking organizations across 60 markets and empowers users through media literacy programs


Medical professionals can reach millions of people within 24 hours at zero cost using evidence-based information, representing massive public health potential


Digital platforms are vital for providing lifesaving protection information to 123 million displaced people across 133 countries


Digital platforms serve multiple purposes including brand awareness, crisis communication, information sharing, and fundraising campaigns


Platforms help connect the remaining third of the world that’s not online through educational content and community stories


Summary

All speakers agreed that digital platforms, particularly TikTok, offer massive reach and potential for positive impact when used strategically for information dissemination, whether for health information, humanitarian aid, or global connectivity.


Topics

Sociocultural | Development | Human rights


Content creation requires multi-layered expertise and cannot be done by single individuals or departments alone

Speakers

– Dr. Ahmed Ezzat
– Gisella Lomax
– Ghaleb Cabbabe
– Eeva Moore

Arguments

Healthcare professionals must maintain accountability and use real names to build credibility, avoiding commercial endorsements that could damage trust


Build partnerships with tech companies, academic institutions, and civil society organizations to leverage expertise and resources that humanitarian organizations lack


Organizations must rely on field experts rather than just marketing teams, and prepare proactive strategies including reactive lines and scenario planning


Expertise must be baked into the production process with multiple layers of review, making content appear simple while requiring heavy behind-the-scenes work


Summary

All speakers emphasized that effective content creation requires collaboration across multiple expertise areas, from medical professionals to legal experts to field specialists, rather than relying on single individuals or departments.


Topics

Sociocultural | Legal and regulatory | Human rights


Misinformation poses serious real-world threats that require proactive, systematic responses

Speakers

– Dr. Ahmed Ezzat
– Gisella Lomax
– Ghaleb Cabbabe

Arguments

Misinformation creates an unfair defensive position where evidence-based voices must defend against accusations while being constrained by factual accuracy


Information risks directly cause real-world harm including violence, killings, and forced displacement, as seen with Rohingya refugees in Myanmar


Organizations face opponents with significant resources including state actors and misinformation farms operating in highly polarized environments


Summary

Speakers agreed that misinformation is not just a communications challenge but poses direct threats to safety, security, and organizational credibility, requiring systematic and proactive responses.


Topics

Human rights | Cybersecurity | Legal and regulatory


Time constraints create fundamental challenges in responding to misinformation effectively

Speakers

– Ghaleb Cabbabe
– Eeva Moore

Arguments

Balance speed with thoroughness by assessing risks before communicating, sometimes taking time to understand how misinformation evolves rather than rushing responses


Time constraints create fundamental challenges since disruption moves faster than constructive response, requiring creative solutions and pre-prepared resources


Summary

Both speakers identified time as a critical constraint, noting that while misinformation spreads quickly, effective responses require careful preparation and strategic timing rather than rushed reactions.


Topics

Sociocultural | Legal and regulatory


Similar viewpoints

Both emphasized that successful content strategy should prioritize meaningful engagement and authentic community connections over vanity metrics, focusing on quality interactions and creative approaches to make factual information accessible and engaging.

Speakers

– Dr. Ahmed Ezzat
– Gisella Lomax

Arguments

Focus on quality of engagement and comments rather than follower counts or view numbers when measuring content success


Successful content requires being creative, partnering with influencers, uplifting community voices, and making facts entertaining while remaining educational


Topics

Sociocultural | Human rights


Both humanitarian organizations recognized that they operate in highly polarized, conflict-affected environments where misinformation has direct, severe consequences including violence and displacement, and where they face well-resourced opposition.

Speakers

– Gisella Lomax
– Ghaleb Cabbabe

Arguments

Information risks directly cause real-world harm including violence, killings, and forced displacement, as seen with Rohingya refugees in Myanmar


Organizations face opponents with significant resources including state actors and misinformation farms operating in highly polarized environments


Topics

Human rights | Cybersecurity


Both emphasized the intensive, multi-layered verification and review processes required to ensure information accuracy, particularly in sensitive contexts, while making the final product appear accessible and simple to audiences.

Speakers

– Ghaleb Cabbabe
– Eeva Moore

Arguments

Information verification requires multiple levels of checking and cross-checking, especially when dealing with sensitive situations involving colleague safety


Expertise must be baked into the production process with multiple layers of review, making content appear simple while requiring heavy behind-the-scenes work


Topics

Legal and regulatory | Human rights


Unexpected consensus

Strategic patience in responding to misinformation rather than immediate reaction

Speakers

– Ghaleb Cabbabe
– Eeva Moore

Arguments

Balance speed with thoroughness by assessing risks before communicating, sometimes taking time to understand how misinformation evolves rather than rushing responses


Time constraints create fundamental challenges since disruption moves faster than constructive response, requiring creative solutions and pre-prepared resources


Explanation

Despite the fast-paced nature of social media and the pressure to respond quickly to misinformation, both speakers advocated for strategic patience and thorough preparation over immediate reactions. This is unexpected given the common assumption that social media requires instant responses.


Topics

Sociocultural | Legal and regulatory


The need for institutional trust alongside individual credibility

Speakers

– Dr. Ahmed Ezzat
– Gisella Lomax

Arguments

Healthcare professionals must maintain accountability and use real names to build credibility, avoiding commercial endorsements that could damage trust


Build partnerships with tech companies, academic institutions, and civil society organizations to leverage expertise and resources that humanitarian organizations lack


Explanation

While much discussion around social media focuses on individual influencers and personal branding, both speakers emphasized the continued importance of institutional credibility and partnerships, suggesting that individual and institutional trust must work together rather than compete.


Topics

Human rights | Sociocultural | Legal and regulatory


Overall assessment

Summary

The speakers demonstrated strong consensus on the transformative potential of digital platforms for positive information dissemination, the serious nature of misinformation threats, and the need for multi-stakeholder, expertise-driven approaches to content creation and verification.


Consensus level

High level of consensus with complementary perspectives rather than conflicting viewpoints. The agreement spans across different sectors (healthcare, humanitarian, technology, civil society) suggesting broad applicability of these principles. This consensus indicates that despite different organizational contexts, there are shared challenges and effective strategies that can be applied across sectors for information integrity.


Differences

Different viewpoints

Speed vs. Thoroughness in Response Strategy

Speakers

– Dr. Ahmed Ezzat
– Ghaleb Cabbabe
– Eeva Moore

Arguments

Misinformation creates an unfair defensive position where evidence-based voices must defend against accusations while being constrained by factual accuracy


Balance speed with thoroughness by assessing risks before communicating, sometimes taking time to understand how misinformation evolves rather than rushing responses


Time constraints create fundamental challenges since disruption moves faster than constructive response, requiring creative solutions and pre-prepared resources


Summary

Dr. Ahmed emphasizes the defensive disadvantage of being constrained by evidence when responding to misinformation, while Ghaleb advocates for strategic patience and taking time to assess situations. Eeva focuses on the need for speed through creative solutions and pre-preparation, representing different approaches to the time-versus-accuracy dilemma.


Topics

Sociocultural | Legal and regulatory


Unexpected differences

Platform Algorithm Engagement Strategy

Speakers

– Dr. Ahmed Ezzat
– Gisella Lomax

Arguments

Focus on quality of engagement and comments rather than follower counts or view numbers when measuring content success


Successful content requires being creative, partnering with influencers, uplifting community voices, and making facts entertaining while remaining educational


Explanation

While both work on the same platform (TikTok), Dr. Ahmed advocates for focusing on engagement quality over metrics, while Gisella emphasizes creative partnerships and entertainment value. This represents different philosophies about how to effectively use social media algorithms – organic engagement versus strategic content optimization.


Topics

Sociocultural


Overall assessment

Summary

The panel showed remarkable consensus on core challenges and goals, with disagreements primarily centered on tactical approaches rather than fundamental principles. Main areas of difference included response timing strategies and platform engagement methods.


Disagreement level

Low to moderate disagreement level. The speakers demonstrated strong alignment on identifying challenges (misinformation threats, resource constraints, need for expertise) and broad goals (information integrity, community protection, credible content creation). Disagreements were primarily tactical and complementary rather than contradictory, suggesting different but potentially compatible approaches to shared challenges. This level of agreement is significant as it indicates a mature understanding of the field where practitioners can focus on refining methods rather than debating fundamental approaches.


Takeaways

Key takeaways

Digital platforms like TikTok offer unprecedented reach for credible information – medical professionals can reach millions within 24 hours at zero cost, making them powerful tools for public health and humanitarian communication


Trust and credibility are built over time through consistent, accountable content creation – healthcare professionals must use real names and avoid commercial endorsements that could damage credibility


Effective content creation requires extensive behind-the-scenes work – multiple layers of expertise, fact-checking, and community input must be ‘baked into’ the production process to make content appear simple


Misinformation creates asymmetric warfare where bad actors with significant resources (including state actors) can move faster than those trying to provide accurate information


Quality of engagement matters more than follower counts – focus should be on meaningful comments and interactions rather than vanity metrics


Information risks cause real-world harm including violence, displacement, and erosion of trust in institutions, as demonstrated by the Myanmar Rohingya crisis


Partnerships are essential – humanitarian organizations, tech companies, academic institutions, and civil society must collaborate to leverage complementary expertise and resources


Time is the fundamental challenge – balancing the need for prompt response with thorough fact-checking and risk assessment


Resolutions and action items

UNHCR invited collaboration from academics, tech companies, and NGOs, specifically requesting partnerships to fill research gaps and develop strategies


Gisella Lomax promoted a Wednesday 2 PM event showcasing UNHCR's South Africa pre-bunking project, which tests inoculation theory


Recommendation to check UNHCR’s information integrity toolkit by Googling ‘UNHCR information integrity’


Suggestion for audience members to examine comment sections on Red Cross digital platforms to understand the complexity of misinformation challenges


Unresolved issues

How to adequately resource content moderation in less common languages, especially in volatile contexts where platforms don’t have business incentives


The ongoing challenge of weakening trust and safety capacities from some digital platforms and tech companies


How to effectively combat well-resourced misinformation operations including state actors and misinformation farms


The fundamental time asymmetry between those creating misinformation and those trying to counter it with factual information


How to maintain institutional trust while individual voices gain prominence in information sharing


Balancing the need for speed in digital communication with thorough fact-checking and risk assessment processes


Suggested compromises

Balance speed with thoroughness by preparing reactive strategies and scenario planning in advance, allowing for quick but informed responses


Focus on proactive content creation to fill information voids rather than only reactive responses to misinformation


Leverage partnerships to share the resource burden – organizations should collaborate rather than trying to build all capabilities internally


Accept that sometimes it’s better to wait and assess how misinformation evolves rather than rushing to respond immediately


Combine individual creator authenticity with institutional expertise – use real names and personal accountability while maintaining organizational standards


Thought provoking comments

Actually, members of the public, regardless of what their understanding is, bearing in mind in the UK the reading age is about six or seven, the maths age is about three to four, they are phenomenally intuitive. They can sniff out right from wrong. You just need to be able to explain that information in an easy and simple way.

Speaker

Dr. Ahmed Ezzat


Reason

This comment challenges the condescending assumption that the public lacks intelligence or intuition about information quality. It reframes the problem from ‘people are gullible’ to ‘experts need to communicate better,’ which is a fundamental shift in perspective about information integrity challenges.


Impact

This comment shifted the discussion from focusing on platform mechanics to emphasizing the importance of respectful, accessible communication. It influenced the later emphasis on building trust through authentic engagement rather than top-down messaging.


Information risks such as hate speech and misinformation are directly and indirectly causing real world harm. And I mean violence, killings, persecution. It can even be a factor in forced displacement, in causing refugees… as we saw in Myanmar back in 2016, 2017, when hate speech… had a decisive role in the displacement, I think, of 700,000 Rohingya refugees into Bangladesh who are still there today.

Speaker

Gisella Lomax


Reason

This comment elevated the stakes of the discussion by connecting online misinformation to concrete, devastating real-world consequences. It moved beyond abstract concerns about ‘information integrity’ to demonstrate how digital platform content can literally displace populations and cause humanitarian crises.


Impact

This dramatically shifted the tone and urgency of the conversation. It transformed the discussion from a technical/marketing challenge to a humanitarian imperative, influencing subsequent speakers to emphasize the life-or-death importance of their work and the need for stronger partnerships and resources.


You could boil the challenge down perhaps to one word, and that’s time. If you’re in the business of trying to disrupt or break something, you get to move at a much faster pace than if you’re on the other side of that equation.

Speaker

Eeva Moore


Reason

This insight crystallizes a fundamental asymmetry in information warfare – that destructive actors have inherent speed advantages over those trying to build trust and provide accurate information. It’s a profound observation about the structural challenges facing legitimate information providers.


Impact

This comment provided a unifying framework for understanding the challenges all panelists had described. It led to the moderator’s closing reflection on the ‘tension between balancing that limited resource of time’ and influenced the final discussion about strategic timing in responses.


We have seen, to our dismay, a weakening of these capacities and perhaps less resourcing from some companies. And I would extend that, for example, to content moderation in less common languages… Are you adequately providing content moderation in less common languages in these very volatile contexts where you don’t actually have a business argument?

Speaker

Gisella Lomax


Reason

This comment exposed a critical gap in platform responsibility – the tendency to under-resource content moderation in markets that aren’t profitable, even when those markets may be experiencing the most severe consequences of misinformation. It challenges the business model underlying content moderation.


Impact

This introduced a more critical perspective on platform responsibility that hadn’t been present earlier in the discussion, adding complexity to what had been a more collaborative tone between content creators and platforms.


It should be labor-intensive. I mean, dealing with these types of issues when we’re creating them, when we want to be accurate, should look like a light lift, but actually, in fact, be a pretty heavy one… you have to be consuming a lot of information. Sadly, that includes coming across the disinformation, if you’re going to understand it and to understand how it’s navigating the space.

Speaker

Eeva Moore


Reason

This comment reveals the hidden complexity behind seemingly simple social media content and acknowledges the psychological toll of constantly engaging with misinformation. It challenges the expectation that good content should be quick and easy to produce.


Impact

This deepened the discussion about resource allocation and the true cost of maintaining information integrity, supporting other panelists’ calls for more institutional support and partnership approaches.


Overall assessment

These key comments fundamentally elevated and reframed the discussion from a tactical conversation about social media best practices to a strategic dialogue about asymmetric information warfare and humanitarian responsibility. Dr. Ahmed’s insight about public intuition shifted the focus from platform mechanics to communication respect and accessibility. Gisella’s Myanmar example transformed the stakes from abstract ‘information integrity’ to concrete life-and-death consequences, while her critique of platform resource allocation introduced necessary tension about corporate responsibility. Eeva’s ‘time’ framework provided a unifying theory for understanding the structural disadvantages faced by legitimate information providers. Together, these comments created a progression from individual content creation strategies to systemic analysis of power imbalances, resource constraints, and humanitarian imperatives in the digital information ecosystem. The discussion evolved from ‘how to create good content’ to ‘how to address fundamental inequities in information warfare while serving vulnerable populations.’


Follow-up questions

How can digital platforms adequately provide content moderation in less common languages in volatile contexts where there’s no business argument?

Speaker

Gisella Lomax


Explanation

This addresses a critical gap in platform safety measures for vulnerable populations who communicate in languages that aren’t commercially viable for platforms but still need protection from harmful content


What are the research gaps in addressing information integrity challenges in humanitarian contexts?

Speaker

Gisella Lomax


Explanation

Academic research is needed to inform strategies and policies for combating misinformation and hate speech that directly harms refugee and displaced populations


How can the effectiveness of ‘pre-bunking’ or inoculation theory be measured in building community resilience against hate speech and misinformation?

Speaker

Gisella Lomax


Explanation

UNHCR is testing this approach in South Africa but more research is needed to understand its effectiveness and scalability across different contexts


What metrics should be prioritized when evaluating content quality beyond traditional engagement metrics like followers and likes?

Speaker

Dr. Ahmed Ezzat


Explanation

Understanding how to measure meaningful engagement and content impact is crucial for organizations trying to combat misinformation effectively


How can institutions maintain public trust in an era where people increasingly distrust official sources?

Speaker

Gisella Lomax


Explanation

This addresses the broader challenge of institutional credibility when individual voices are often more trusted than official organizations


What are the resource implications and funding challenges for organizations trying to combat misinformation?

Speaker

Bia Barbosa (audience member)


Explanation

The question was cut off in the transcript but relates to the resource constraints mentioned by multiple panelists in fighting well-funded misinformation campaigns


How can partnerships between humanitarian organizations, tech companies, and civil society be structured to effectively address information integrity challenges?

Speaker

Gisella Lomax


Explanation

Given the multifunctional nature of this work and limited resources, understanding effective partnership models is crucial for scaling solutions


Disclaimer: This is not an official session record. DiploAI generates these resources from audiovisual recordings, and they are presented as-is, including potential errors. Due to logistical challenges, such as discrepancies in audio/video or transcripts, names may be misspelled. We strive for accuracy to the best of our ability.