WS #136 Leveraging Technology for Healthy Online Information Spaces

18 Dec 2024 11:00h - 12:00h

Session at a Glance

Summary

This panel discussion at the Internet Governance Forum focused on leveraging technology for healthy online information spaces, addressing the challenges posed by big tech’s power over the digital landscape. Participants from diverse backgrounds, including civil society, international organizations, and government, shared insights on the complexities of the issue.

Key challenges highlighted included the defunding of professional journalism due to advertising revenue shifts, language-specific concerns in content moderation and fact-checking, and the impact on local news and media pluralism. The discussion emphasized the interconnectedness of these issues, linking sustainability of media to the availability of quality public interest information and democratic discourse.

Panelists stressed the importance of multi-stakeholder approaches in addressing these challenges. They discussed various initiatives, including regulatory frameworks like Switzerland’s approach to strengthening user rights and platform transparency, civil society efforts in fact-checking and media literacy, and industry collaborations to redirect advertising revenue to professional media.

The role of artificial intelligence in content moderation was examined, with calls for rigorous evaluation of AI systems, especially regarding vulnerable users. Transparency and accountability of platforms were emphasized as crucial elements in creating healthier information spaces.

The discussion concluded with a call to action for civil society to engage in global governance dialogues on AI and for continued multi-stakeholder collaboration to address the complex challenges in the digital information landscape. Participants agreed on the need for balanced approaches that empower users and protect fundamental rights while addressing the power imbalances in the digital sphere.

Keypoints

Major discussion points:

– The challenges of big tech’s power over online information spaces, including issues of content moderation, misinformation, and impacts on journalism

– The need for multi-stakeholder approaches involving governments, civil society, media, and tech companies to address these challenges

– The importance of media sustainability and funding for quality journalism in the digital age

– The role of states in regulating platforms while protecting free expression

– Opportunities to leverage technology for healthier information spaces, including fact-checking tools and AI governance

The overall purpose of the discussion was to explore the impacts of big tech on the online information landscape and identify ways to create healthier digital information spaces through multi-stakeholder collaboration and policy approaches.

The tone of the discussion was constructive and solution-oriented. Participants acknowledged the complex challenges but focused on identifying concrete actions and recommendations. The tone became more action-oriented towards the end as speakers discussed specific initiatives and calls to action.

Speakers

– Martin Samaan: Digital Communications Officer at the UN Department of Global Communications

– Claire Harring: OSCE team, Project Assistant

– Julia Haas: Advisor to the Representative on Freedom of the Media, OSCE

– Nighat Dad: Executive Director of Digital Rights Foundation, Member of Meta Oversight Board

– Isabelle Lois: Senior Policy Advisor at Ofcom (Federal Office of Communications), Switzerland

– Aws Al-Saadi: Founder of Tech4Peace and member of the International Fact-Checking Network

– Elena Perotti: Executive Director of Media Policy and Public Affairs at WAN-IFRA (World Association of News Publishers)

Full session report

Panel Discussion on Leveraging Technology for Healthy Online Information Spaces

Introduction:

This Internet Governance Forum panel brought together experts from civil society, international organizations, government, and media to address challenges in creating healthier online information spaces. The discussion focused on big tech’s impact, content moderation, journalism’s financial crisis, and potential solutions.

Key Challenges and Issues:

1. Big Tech’s Dominance and Information Control:

– Concerns about tech platforms’ power over online information

– Lack of meaningful consultation with civil society on regulations

– Platforms acting as information gatekeepers without sufficient transparency

2. Content Moderation and Linguistic Diversity:

– Insufficient language support and fact-checking for non-English content

– Over-enforcement affecting journalism, especially in non-English languages

– Lack of fact-checking programs in languages like Kurdish

– Importance of human review in content moderation

3. Financial Crisis in Journalism:

– Shift in advertising revenue from publishers to tech platforms

– Publishers’ ad revenues halved over 15 years

– In 2024, legacy media are expected to capture only about 30% of the trillion-dollar ad market

– Disintermediation problem: Tech platforms bypassing publishers to reach audiences directly

4. Media Literacy and Public Trust:

– Swiss study: Only half of the population believes independent media is essential for democracy

– Need for increased awareness about quality journalism’s role in democratic discourse

Proposed Solutions and Initiatives:

1. Multi-stakeholder Collaboration:

– Consensus on involving governments, civil society, media, and tech companies

– Swiss approach: Developing regulatory frameworks focused on transparency and user rights

– Swiss National Action Plan for the Safety of Journalists

2. Alternative Oversight Mechanisms:

– Meta’s Oversight Board as an example

– Need for institutions protecting users’ rights, especially where state regulation is problematic

– Nighat Dad’s emphasis on continuous evaluation of AI curation systems

3. Industry Collaborations for Financial Sustainability:

– Forming alliances between publishers, advertisers, and civil society

– Optimizing local media websites for programmatic advertising

4. Fact-checking and Awareness Initiatives:

– Building fact-checking coalitions, especially for non-English content

– Tech for Peace application: Developed to combat misinformation in Iraq

5. AI Governance and Ethics:

– Call for civil society engagement in global AI governance dialogues

– Meta Oversight Board’s recommendations on AI and content moderation

6. Global Digital Compact:

– Adoption of recommendations to address digital challenges

Unresolved Issues and Future Considerations:

– Effective regulation of global tech platforms from smaller countries’ perspectives

– Balancing freedom of expression with misinformation combat

– Addressing engagement-driven content amplification promoting polarization

– Ensuring local journalism sustainability in the digital age

Conclusion:

The panel emphasized the need for collaborative, multi-stakeholder approaches to create healthier online information spaces. Key actions include developing balanced regulatory frameworks, supporting linguistic diversity in content moderation, addressing journalism’s financial challenges, and promoting media literacy. The discussion highlighted the interconnectedness of these issues and the importance of considering diverse global perspectives in developing solutions.

Session Transcript

Martin Samaan: Will you also be moderating? Should I introduce you as well, since you’re on screen?

Claire Harring: In moderating, I can also turn off my camera if it’s more convenient.

Martin Samaan: No, I’ll just introduce Julia, so I might introduce you as well from the OSCE team. Do you have a title you want to give me? Project assistant.

Julia Haas: Ah, hello, hi. Here’s our fourth speaker. Excellent. So we will have an interesting conversation among ourselves, it seems.

Martin Samaan: Some people might be online too, though. I’m not sure if they’re…

Julia Haas: Yes, we see there are 10 people, at least. I don’t know how many are still able to join.

Martin Samaan: How do I pronounce your name? Nijad?

Nighat Dad: Nighat.

Martin Samaan: Nigat?

Nighat Dad: Yes. Nighat Dad, yes.

Martin Samaan: Thank you.

Nighat Dad: Sorry, I was late. I had a meeting in our parliament area.

Martin Samaan: Oh, that’s okay. I mean, it’s only two minutes past, right?

Julia Haas: Yes, I think we can still maybe give it two more minutes. But then I guess we will have to finish on time, because there will be another session in the same room at three, your time? So we probably shouldn’t wait for too long. But it looks like a very nice venue. Did you already join a few good sessions, good discussions?

Isabelle Lois: I was just saying that the venue is very nice, but you can’t even see the ceiling. That’s the real highlight. It’s very beautiful, very decorated.

Julia Haas: Okay, okay.

Martin Samaan: It’s a palace.

Isabelle Lois: It is a palace, yes.

Julia Haas: Very nice. Okay, great. And I see we have a few people now in the audience, or I don’t know if they’re still getting in or not.

Martin Samaan: I think they are working, yeah.

Julia Haas: Okay, well, maybe.

Martin Samaan: For sure, yeah.

Julia Haas: No, I mean, for us, for sure, but I meant for them. But in any case, I think probably it would be good to start.

Martin Samaan: Yeah, and you said each person has like three minutes to kind of keep their answers relatively short.

Julia Haas: Yes, I mean, now I don’t expect a lot of questions from the audience, from the floor, but it would maybe be interesting. It would be nice to just have a conversation, and then maybe you can really do these two rounds with kind of like first focusing a bit on challenges and way forward. But then maybe there’s also really a possibility for an interactive exchange, right?

Martin Samaan: Yeah, see if anybody. Yeah, and you guys can look on screen, right? I might stay here actually as well.

Julia Haas: We should be able to see. Yes, we see the screen.

Martin Samaan: So I might just ask each of you to introduce yourselves, because it’s nice to hear from everyone, so I’m not just talking for two minutes. So I’m going to turn it over. Maybe you can each introduce yourselves, and then I’ll kick it off after. So just a very quick name, title, and where you’re from.

Isabelle Lois: Okay, I’m happy to start. So my name is Isabelle Lois. I work for the…

Martin Samaan: When we start, have we started already? No, right? Not yet. Okay, so we can kick off, I think.

Julia Haas: I think we can kick off. Go ahead. Please go ahead, Martin.

Martin Samaan: Hello, everyone. My name is Martin Samaan. I’m with the United Nations Department of Global Communications. I will be moderating this session together with Julia Haas, who you see online, advisor to the Representative on Freedom of the Media at the Organization for Security and Co-operation in Europe, or OSCE for short. A warm welcome to everyone. We are here in Riyadh for the 19th Internet Governance Forum, and this workshop is led by our colleagues from the OSCE, on the screen here. This panel will focus on leveraging technology for healthy online information spaces. And with us today, we have a great panel. I’m going to ask you each to introduce yourself briefly, starting with you.

Isabelle Lois: Yes. Hi, everyone. So my name is Isabelle Lois. I am a senior policy advisor at Ofcom in Switzerland, so the Federal Office of Communications. And we work a lot on internet governance, AI governance, data governance, but also on media policy and freedom of expression online and these topics.

Aws Al-Saadi: Hi. I’m Aws Al-Saadi. I’m from Iraq, founder and president of Tech4Peace, and also a member of the IFCN, the International Fact-Checking Network. At Tech4Peace, we work on debunking mis- and disinformation, and also on digital security and digital rights.

Nighat Dad: Hi, everyone. My name is Nighat Dad. I am the executive director of the Digital Rights Foundation. We are based in Pakistan. We work on digital rights issues generally, related to freedom of expression, the right to privacy, and tech-facilitated gender-based violence. I also keep a very keen eye on regulatory frameworks emerging from different jurisdictions. And I sit on Meta’s Oversight Board, looking at the content moderation decisions of the company and holding them accountable.

Martin Samaan: Thank you so much, and welcome to the people online and those who have joined us here in the room. Today we’ll hear about the main challenges of big tech in the information landscape, and we’ll identify different stakeholders’ roles in addressing the many challenges that arise with this quickly evolving technology. For nearly two decades, the IGF, the Internet Governance Forum, has been a vital platform at the forefront of tackling the digital opportunities and challenges that face our world. And as the United Nations’ largest multi-stakeholder gathering, the IGF brings together governments, businesses, civil society, and the tech community so that we can all together shape a safer, more inclusive, and equitable digital future. So it’s great to have you here at the Internet Governance Forum. I will now pass the floor to Julia online, who is also joined by Claire Harring.

Julia Haas: Thank you so much, Martin, for kicking it off and introducing. It’s really great to join, at least online. It’s a pity I cannot share the table with you, but it’s great to really have this multi-stakeholder spirit that you referred to, Martin, I think also really reflected on this panel. It’s great to see that we have people from civil society, from international organizations, from a state, and from the Meta Oversight Board, which of course is a particular body that stands somewhere in between, right? But all fulfill a very important role. So I truly believe that this will be a very important conversation, even if many people will be at lunch. But in any case, I’m very excited to see you all here and also to hear about this UN context and the importance of the IGF, which of course the OSCE would also agree with: the challenges we’re talking about can only be addressed in a multi-stakeholder manner. So we already briefly heard the title of this session, but just to give a little bit more context: our intention with this session was really to discuss both the impact of big tech on the information landscape and how states can limit any undue power, control, or concentration that we see over information. But then also, on the other side, what can be done in contexts where maybe states would not be so eager to intervene, or where maybe we don’t have very strong democratic institutions or rule-of-law standards. So really, how can we find a way forward to address the challenges? And ideally, then in the last step, so to speak, to also see whether and how technology can actually be leveraged for a healthy online information space, which is the overarching title of today’s session. And that is also the name of a project that we are currently working on at the OSCE RFoM. In the framework of this project, we are currently working on developing recommendations for states on media and big tech.
So, as you, Martin, already rightly referred to, in the title of our organization, the Representative on Freedom of the Media, the media has a particular role, which we also acknowledge in the information landscape. It is about freedom of expression, privacy, and many words we already heard. But journalism, and independent journalism, has of course a particular role to play. So we are currently also exploring how media and big tech are interacting and intersecting, what the particular challenges for journalists are, and how to address them. And in this context we’re particularly looking at media sustainability, the accessibility and availability of public interest information, so also questions of visibility on platforms and these kinds of things, and journalistic protections. And I’m mentioning this not for purposes of self-promotion, but because the conversation we are having now is really feeding into this process. This is a multi-month process, and we are currently holding several roundtables and discussions, including the conversation we are having today. And we really want to build on all of your knowledge and all of this expertise and experience to feed into the guidance that we will be developing for states before summer next year. So having said that, I’m really very much looking forward to hearing from all of you and learning from you, and to seeing how we can empower the information spaces and also the actors that are particularly important for information in that case. But maybe before we dive into specifics or into the media question already, it would be very useful, I think, for all of us to get a little bit of help with setting the scene. And may I ask you, Nighat, to help us with this and tell us from your experience.
I mean, you, of course, already mentioned that you not only have great expertise from a civil society perspective, from the organization you founded and lead, but also a lot of insights from the Meta Oversight Board. What are the key concerns and what are the challenges? So can you maybe, at the outset of this session, tell us: what would you consider, from your expertise and your experiences, to be the main challenges of big tech’s power, so to speak, over the information spaces? Thank you.

Nighat Dad: Is this working? Okay, right. Yeah, no, thank you so much, Julia, for setting the scene. Yes, definitely, I wear another hat, which is sitting on Meta’s Oversight Board; it has now been four years, and I’m one of the founding members. And I think the board is a very unique institution, the first of its kind in terms of accountability and oversight mechanisms when it comes to big tech platforms, though in our case, it’s Meta. And I think I come from a very unique perspective: in the jurisdiction I come from, the rule of law is not really strong, right? So many of us who have been advocating for open, safe, accessible online spaces have seen the emergence of regulatory frameworks. And I think we need to keep one thing in mind: these regulatory frameworks emerging from different jurisdictions are not all the same, right? So we have the very good example of the DSA, which many of us really look towards, and we are looking forward to the enforcement of that regulatory framework from Europe on the big tech platforms. But when we look at our own jurisdictions, the concern and worry is that there are not really meaningful consultations when it comes to multi-stakeholderism, right? Civil society is not included when these frameworks are being designed or debated by the governments or the parties related to these regulatory frameworks, for instance regulators in our case as well. And then you look at other mechanisms. So you’re like, okay, we are in between two powerful actors: one is the state, and the other one is the tech giants, right? So what other mechanisms do you have, especially when it comes to users, that you can use to hold tech companies accountable? And I think that’s why I feel that the Oversight Board is independent. It is an oversight body. It started out as a unique experiment, and now it has become an institution. And I would encourage people to look into the cases that we have decided. We are a diverse group of people.
We are not, you know, only from the Global North or folks from Silicon Valley. We are from different regions, with diverse backgrounds and diverse expertise. And we actually take up cases that users appeal to us, but also where the company struggles. For instance, in some of the cases we have looked into journalists’ cases. Off the top of my head, there is one particular case that we decided at the very beginning of our term: it was about the word Taliban, which was wrongfully removed, kind of over-enforced, on Meta’s platforms. And this is very much related to South Asia and Afghanistan, you know, and folks kind of know about this in the Global North. But we took this up because the over-enforcement was putting hurdles in the way of the reporting of journalists who were actually using these platforms. We took this up, we deliberated on this case, and then we told Meta that there is an over-enforcement of this word: you really need to look into the terms in your community standards and the policies you have for regulating such terms. It was basically to make it easier for media and journalists, especially in South Asia. So this is just one example. But having said that, what I mean to say is: looking at the role of states and governments and the power of tech giants, we should also broaden our own ideas. It is not just regulatory frameworks that we need to look into; in jurisdictions where states are actually regulating on their own terms and not in a meaningful way, we need to ask what other institutions can come to the help of users when it comes to protecting users’ rights.

Julia Haas: Thank you so much for giving us this overview of the Oversight Board’s work. It’s super interesting to hear the broad variety of the work. And I think what is particularly useful for the work of people in this field, and for the digital rights community, so to speak, is really to also look at the guidance that is being developed. So it’s not only about individual cases and individual protections, which are, of course, crucially important, as you rightly point out, but also to learn from that and take it a step further and ask: how should policies maybe be adopted or adapted, and what can be done in addition? Another thing that I found extremely useful is that you mentioned the importance of understanding local perspectives, because this is something that we see over and over again with technology being deployed globally: the impact is very different, especially if we don’t have the same capacities, or even language capacities; then the situation is more difficult, or the implications might be even worse. I see now that our fifth speaker has also joined, but couldn’t yet turn on her camera. So maybe, while hopefully our technical colleagues can help with that: you spoke about this aspect of understanding local contexts, where sometimes with specific words you might have over-enforcement or under-enforcement, which are both problematic from a human rights perspective, and where the context is so crucially important. And this is something where I would like to bring you in, Aws, as you mentioned this also when we had our preliminary conversation: you have this experience that, in the contexts you work on, the big tech challenges are really different, because of the context, maybe less strong democratic institutions or fewer checks and balances, both from the state side as well as from the platforms, who don’t deploy the same kind of resources and attention to the regions.
Can you maybe build on this a little bit and tell us what your experience is or the specific challenges you would see in this context?

Aws Al-Saadi: Thank you for the question. In general, we work in both languages, Arabic and Kurdish. And in general, the tech companies and platforms are not supporting Kurdish at all. There is no third-party fact-checking program with Meta, and no global fact-checking program with TikTok, that works on fact-checking in Kurdish. So in October, when we had an election in the Iraqi Kurdistan area, there was a lot of misinformation, and even when we expose it, you cannot flag it, because there is no program supporting this language. There are more than 60 languages in their program, but still they are not covering this area. And for the Arabic content, I’ll give you an example. In Ukraine, where there is a war, there are nine organizations working as third-party fact-checkers. If you go to Spain, where there is no war, there are five organizations working as third-party fact-checkers. But if you look at the whole MENA region, you see only two organizations: one of them from a French organization, which is the EPI, and the other one for Turkey, which is the Jordanian one. And this is even though there are seven organizations from the Arabic region that are signatories of the IFCN, because one of the rules from Meta or TikTok is that if you want to be a partner for fact-checking, you need to be a signatory of the International Fact-Checking Network. Of those seven signatory organizations from the Arabic region, just one of them, FATABENU, is a third-party fact-checker. And the other side, also among the gaps that we have for the Arabic language: after 7 October 2023 and what is happening between Israel and Gaza, there is a lot of restriction on Arabic content. Even as fact-checkers, when we are exposing fake news, they flag our content as if we are spreading fake news. But in reality, it’s because of the automation.
They take the pictures that we are exposing and flag them as fake news, while the sources, sorry, the links to the fake news are not flagged. So instead of fighting the fake news, they are flagging the organizations in Arabic that are working to expose it. And because we have been a trusted partner with Meta since 2019, we sent it to them, like: look, we are exposing this, not publishing it. And then they just restored the content. But they didn’t solve it completely for the Arabic content in the region. If you have relationships, if you have connections with them, then they will restore your content. If you don’t have them, you will appeal. I know a lot of organizations and a lot of users, influencers, journalists, and human rights defenders whose accounts were stopped because they were taken down. When they appeal, it’s the automation doing it, there is no human person reviewing it, and then it comes back again as a rejection. The other thing is the AI tools. As for technology, usually most of the AI tools, including for fact-checking, start in the English language, and in general Arabic is not supported. So this gap is difficult, because if you want to build your own tools, you need funding, you need to work a lot on these technology issues, and it takes time. I’m also talking about media literacy: we have really low media literacy in our region, and around 30% of people inside the region are not connected to the internet. And I’ll give you an example for Iraq: most of the channels belong to the parties and are not independent. So if they run some kind of disinformation campaign, the people will believe it, and then in the election they will vote for them. I will stop here; there are more points, but it’s better to have more of a conversation afterwards. Thank you.

Julia Haas: Thank you so much. I think you really greatly underlined the interconnectedness of it all, right? There are so many different layers to the challenges, and technology is not necessarily helping, but might even perpetuate some of these challenges if there are biases, or no language knowledge or attention, or not even fact-checking partners and trusted partners, and all of that. So it’s really important to see what angles can be leveraged, and how the entire ecosystem can then ideally benefit from it. The second point that I found really important is that you mentioned the necessity to have a human in the loop. This is something that has been called for by civil society and many actors for the entirety of the conversation around content moderation. But I think you pointed to the fact that particularly in contexts where the language support or the automation is less specific, with regard, for example, to the Arabic language, it is even worse and more difficult. So thank you for outlining this. In this first round of better understanding the complexities of the issues around information spaces, big tech, and technology, I also wanted to really bring in the particular aspects concerning journalists, because this is something that both of you touched upon already a little bit, in the sense of how journalists are affected in your respective work and your respective areas. But we know also from the global media scene that big tech poses challenges to journalism on other levels as well, because of the dependencies that have been created, when we speak about distribution, and increasingly also about generation, as you all know. So I wanted to bring in Elena Perotti. We had a very brief introduction before you joined.
So maybe you can also start by briefly saying who you are and what your role is at WAN-IFRA, and, because you work globally, also tell us whether there are specific implications that you see big tech having on journalism, in addition to the overarching challenges that we discussed so far, or specific regional and language-specific challenges. Is there something you could help us with, to better understand the key overarching concerns about the current power that we see from big tech, and the concentration of power that impacts journalists and journalism as such? Thanks.

Elena Perotti: Julia, and I have to apologize to everybody for being late. It was the train’s fault, but again, I’m very sorry, and very happy to have managed to be here on time, kind of. Well, I’m Elena Perotti. I am executive director of media policy and public affairs at WAN-IFRA, the World Association of News Publishers, which is, imagine, a trade association for publishers all around the world. Our main constituencies are the national associations of news media publishers, therefore the bodies that lobby the European Commission, when they are European, to obtain better conditions for the business of the publishers. And my constituency in particular is exactly that, the one of directors of national associations. So I’m privileged enough to have a real global outlook on what the concerns of the industry are all around the world, and how incredibly similar they became, particularly ever since the digital era started, which I would place at the start of Google, though everything changed again in 2006 and 2007, when Facebook became mainstream. Well, you, Julia, asked specifically about journalism, whereas again, my expertise is with the publishers and the business, but I do of course also have journalists on my radar. And one very interesting point of view to remember is that at the very beginning, when publishers were already starting to get worried about this increasing power of big tech, journalists generally were not. They just weren’t, because the interests did not align, see? Whereas for publishers it was clear right away that big tech was about to eat away at most of the revenues of the industry, the journalists still saw, and I’m speaking of 15 years ago, even only 10 years ago, I would say, as much more relevant the fact that Google or Facebook, or the others who came afterwards, would allow their content to be disseminated more widely. So the interests of journalists and publishers really did not align at that point.
What I see now, since three or four years, is that journalists also realize the problem, which in our world is disintermediation. So big tech has the power of giving to the person who is not extremely interested in news enough news that they will not click through to the publishers’ websites; therefore the publishers lose money directly, and therefore they will not have enough money to fund the journalism of professional journalists. That is the macro problem. The other macro problem, of course, is that in the last 15 years, the advertising revenues of publishers all around the world have halved, literally halved. I actually prefer to read the figures out to you; of course, I don’t remember them by heart, so I’m reading. In 2024, there will be one trillion in advertising transacted, one trillion, which is an 80% increase compared to 2019, pre-pandemic. Of that one trillion, legacy media, so the professional publishers of news, will have about 30% of that ad spend, and a third of the rest is going to go to Alphabet, Meta, Amazon, and so on. Not even to speak of the new advertising revenue, because every year new advertising revenue is created; of that, and I’m not exaggerating, more than 80% goes to platforms. I know I’m speaking only money, but by speaking money, I’m also speaking democracy, and speaking professional journalism, and the ability to do investigative journalism. I’m also speaking security of journalism, the security of journalists being sent into war zones. Because all that cannot happen if the publishers are not sustainable. So, I would say, Julia, to answer your question, that the main threat big tech brings to journalism is the defunding, basically. That is what it is. And I don’t know whether big tech is in the room, I have no idea, but they have tried to work with news organizations around the world, signing contracts, of course, and so on.
They have given handouts sometimes, but that is just not enough. It is, I think, a democratic responsibility of governments, but also of those like us, to find solutions so that the defunding of professional journalism does not happen, because that is dangerous for democracy. I don't know how much time I have, Julia, I could speak of this literally for hours. So just let me know.

Julia Haas: No, excellent. This was a very good overview already. I mean, I think it's really important that you underline how this question of sustainability or funding is not just about running the business of a media outlet, right? It is really the question of what kind of information is available, what kind of investments can take place, from where and to what degree. So it is really a democratic question, as you rightly pointed out. And if we want to discuss how we can fight disinformation, as we heard before, or how we can avoid people being attacked online, or how we can make sure that people have election integrity and information available, all of this is linked to the availability of public interest information. We can only speak about visibility and accessibility of such information if it is available, and it can only be available with sustainability and funding in the background. Yes, please.

Elena Perotti: I just saw my note, you asked me whether there was any language-specific concern. Just extremely briefly: the main concern in smaller societies that speak a very specific language, so of course not English, but not even Italian, French, Spanish or German, I mean Arabic in particular, for example, is misinformation. That, I would say, is really the number one problem, because platforms do not invest at all in fact-checking and countering misinformation in small languages, if you'll pardon the term. Eighty percent of Facebook's money for dismantling disinformation goes to the English language. And as a consequence, as we know, this can have literally deadly consequences for people. And that is where local news is even more important. Because again, if you have to fight misinformation, you have to do it with professional journalism.

Julia Haas: Yeah, absolutely. Thanks for this addition. It's also what we started off with, I think, before you joined: this whole question of how difficult it is to have fact-checking if there are not sufficient trusted flaggers in specific languages or contexts. So you have all these different layers that add up to the challenges, and local journalism is, of course, also at the forefront, not only very often under threat from different actors, but also among the first to struggle with funding and advertising and all of that. So it is, again, this interrelatedness. And speaking of interrelatedness, Elena, you mentioned this democratic responsibility, which is a term I would want to build on and hand over to our state representative in the room. Isabelle, coming from the Swiss context and the Federal Office of Communications, you do, of course, a lot of work in the Swiss context, right? And also, on a more global level, Switzerland has been very engaged, and constructively engaged, in many regional and global initiatives that try to work towards this healthy online information space that we're talking about, with this journalistic component, but also with regard to fact-checking and fighting disinformation and user rights, all of the things really that we talked about. And is there something still before we move? I mean, I know we are now already maybe moving more into this direction of what can be done, and what states can do. But can you either already refer to that, or also say a few words about how you as a state really see these challenges, and, from the state perspective, how they are interlinked from this global perspective? What does it really mean from your point of view?

Isabelle Lois: Absolutely. And thank you for this very interesting question and very interesting panel. I wanted to move a little bit away from the notion of big tech, because ultimately, I think, when we're discussing these issues, they exist independently of how big the economic actors behind them are. At OFCOM, we work a lot with the media. And when we're looking at traditional media, what we want to promote is a diverse information landscape that allows real debate in the public sphere. This is really the core of having a strong media sector. The larger platforms today act as a sort of gatekeeper, because they are controlling what information is amplified or suppressed. And this might not be by design, but it is something that we have seen happen. The issue here is that there isn't enough transparency and accountability from bigger platforms about how they control the flow of information, or how we can see it flowing. That means that the public who is reading and being online does not know how and why some information is put forward, why some posts are put forward and are viewed, and why others are not. And this gives these bigger platforms a sort of de facto agenda-setting power. And this is where we see the biggest issue. This new power of putting certain issues on the map reshapes public debates, because we are, of course, I guess, all connected and all using social media and other platforms. It will prioritize certain debates, certain issues, and hide others. And the content that is engaged with the most is often polarizing or misleading content, instead of informational or educational content. And this is where we see a sort of imbalance, and we need more scrutiny. So I think this is the main perspective that we try to come with as a government.
And in Switzerland, we have identified that it is very important to work with the media sector, or the traditional media sector, if I want to put it this way, because there are, of course, and this was mentioned before, significant challenges as the business model changes, with big technology platforms dominating the market. We talked about advertising revenue that is diverted from some platforms to others, and media organizations are struggling to sustain themselves financially, which can lead to problems. One of the things that we have seen in Switzerland is a sort of consolidation of the media market. Many of the smaller or medium-sized media outlets have been bought up by larger media structures, and it is difficult to keep very local news alive. And so we try to do our best in this, but it is a complicated issue to deal with. The major problem we're seeing here is really the reduced diversity and availability of local news, the quality of the reporting, having fewer journalists who can work in it, or less money for investigative research, which means we have fewer in-depth stories being produced. So this is one of the points that we have identified, as well as online harassment, which is getting bigger and bigger when we get to these polarizing conflicts, of course. I'm moderating a session tomorrow on the safety of journalists online that will delve a little more into this point, so I'm going to stop here on that part, but I just wanted to highlight the importance of media pluralism and the importance of having it as a fundamental part of democracy. A study has been done recently that I found very interesting and also quite shocking. It has shown that only half of the Swiss population believes that independent media is essential for democracy. Only half of the population believes that.
The rest either think that it does not matter or that there is no connection; about 14%, I believe, thought that it was not essential. And this is something that is quite scary to think about, because this is a detachment from understanding how important free media is as a pillar, the fourth pillar I would say, of democracy, and if we don't see its importance, then we cannot safeguard it. And I think this is what we're working a lot on: awareness raising and capacity building, and I know capacity building is something that is discussed a lot at the IGF in many different sessions. Accessing and understanding the information you're reading, and why it's important to have one source or another, I think that is really a strong point we need to work on. And I'm happy to delve into some ways to address these challenges. I don't know if you want me to continue immediately or if you want to pass the word on to someone else, Julia, I'll let you decide.

Julia Haas: No, thanks a lot. I do have a follow-up question, but I think it's really important that you point to this almost twofold notion of literacy, right? It's not only about information literacy in the sense of understanding why something is shown, as you pointed out, but also really about media freedom and media pluralism. This is something that we are also trying to work on, and we try to coin this terminology of media freedom literacy to really say to states, I mean, it's not only about individuals, as you say, but states sometimes also need this understanding of the link between media freedom, independent media and pluralism, and then democracy, but also, more broadly, stability, peace, safety and security. So this is a very important point indeed. Thank you. Also very important is that you pointed out this engagement factor, right? Currently, content is boosted and shown more if it's engaging, but we know people might engage more with polarizing content. And if we speak about healthy information spaces and democratic deliberation, and in other contexts we have obligations with regard to diversity, also regional diversity and all of that, such content might not gain the same traction. But I think it's really important to ask whether we should have similar obligations in the digital context and in the online space, where we, and with we I mean both democratic states but also all stakeholders, should just tell platforms that it's not acceptable to prioritize content just because it's engaging and drives more advertising.
And this maybe is already the follow-up question, or the lead-up to it: where do you, as a democratic state, or coming from this perspective, see ways that states can work in that direction, while acknowledging that, of course, Switzerland still remains a fairly small market from the media perspective, and from the big tech perspective and all of that? So what are the avenues and possibilities that states can take? And before I hand back over to you, sorry for that, I also just want to offer people, if there are people in the room who have a question or a comment or want to say anything, please indicate it to us. We also already put the same in the chat for the online participants, because somebody mentioned it there. So please feel free to also think about questions after we hear from Isabelle, please.

Isabelle Lois: Thank you. It is a very challenging question to answer: what can or should states do? I think states definitely have a role to play in addressing the challenges posed by big tech, both at a national level, so what we could do in our own countries, and at the international level through coordinated efforts. I also believe it's important to say, the IGF being a multi-stakeholder platform, that it is, I think, in collaboration with other stakeholders that we can do the best work. So on this general view of what governments can do: I think we can do things, but we should not do them alone. On the national level, the Swiss perspective is that we have to protect the public's access to quality information and ensure some sort of accountability from platforms. This can be done through regulatory measures that really ensure transparency in the platforms' operational systems, clarify which obligations they have and how, and also empower users to make informed decisions, because at the end of the day, I think they are at the center of this. So state intervention is one way to address this power imbalance over information spaces, but it is not the only way, and it should be done in a very carefully balanced way that respects fundamental rights and freedoms. There is a lot of potential harm that can be done with strong state control over which information is accessible and how. So it is a very complicated line to navigate, and a lot of careful consideration has to be brought to this. In Switzerland, our approach is to strengthen users' rights and increase the transparency of the platforms instead of moderating the content.
This would ensure that state intervention does not compromise freedom of expression or overstep any other fundamental rights that are essential for us. We are currently developing our own regulatory framework for large online platforms, and it is largely based on and inspired by the European legislation, the EU Digital Services Act, the DSA. But the law we are envisioning, and it is not yet in place, focuses on due diligence requirements while strengthening transparency rules and user rights. So it is not really about controlling from above, but about giving users all of the information. I can go into more detail on this if you are interested, but there are some significant differences with the EU approach: our scope is much narrower, as we only focus on very large online platforms and very large online search engines. We are also limiting it to what we call communication platforms, because we want to enable the public dissemination of information between users for the purpose of opinion forming, entertainment, and education. And we are excluding marketplaces or booking platform sites and other things that are within the DSA. So really, our most important point is to protect the fundamental communication rights of users online. And this can be done to ensure that we have a well-informed public, because, and this goes to really the main point that we have in Switzerland, we are trying to give the tools to the users, to the public, and not control it from above. And I think this is where we could work more with other stakeholders, with the media agencies, with other countries, not looking at it in a piecemeal way. And I will just say one more thing on this. We also have a Swiss National Action Plan for the Safety of Journalists that was published last year, with a whole set of different measures, most of them addressing the safety of journalists not only online, but mostly offline.
But our main focus is to raise awareness of the importance of journalism for a functioning democracy. And this is really where we can bring added value. We're also working very closely with the Council of Europe on the safety of journalists, which I can only encourage you to look into. But yeah, these are some of the ways we see that we, and states in general, could do something. Of course, if we are looking into regulation, we have to make it balanced. We have to make sure that we are empowering the public and not controlling what is put out there. And so we are happy to discuss and engage with any stakeholders, and in meetings like today, to hear other ideas. Thank you.

Julia Haas: Excellent. Thanks so much for this. For sure, this empowering, not controlling is certainly a nice way of phrasing it. And I think this is also something that we at the OSCE have been pushing for constantly: that we don't speak about individual pieces of content, we speak about processes, right? It's the same when we speak about disinformation or fact-checking; it's very much about how we can make the processes work better, so that we have, as you say, transparency, accountability, and all of these things. Multi-stakeholder engagement, and in a meaningful way, I think this is also something that we heard already today, to address the undue power. And addressing big tech doesn't mean that we want to fill this with undue state power, right? This becomes particularly important in contexts where we maybe don't have the same democratic institutions. We only have 10 minutes left. So again, I want to ask if somebody in the audience has a question, but if not, I would ask maybe first Elena, because we already had a few sentences with regard to journalism and the public sphere. Could you build on that to say briefly what you think should be the role of the media industry in responding to the power of big tech over information spaces, and whether you also see an opportunity or possibility for technology to help in this regard?

Elena Perotti: Yes, yes, thank you, Julia. And indeed, this question comes at such a good time, because I just had an idea for a new project for WAN-IFRA that is really about that. But first of all, I wanted to thank Isabelle for her intervention. And I would like to add to what she just said that indeed only states, I think, can have the responsibility to choose what stays online and what goes; only democratic states, I should say, can have that responsibility. Leaving that responsibility to platforms just means they are going to take down anything that could even potentially bring lawsuits and economic problems for them. So the role of states is very important. We saw how important it was in all the battles for freedom of expression, and of course also for the sustainability of media, with all the laws and antitrust decisions that have been taken around the world, Australia, Canada, and so on. But I agree, Julia, that media should also do something independently, in addition to what states can do, to help its sustainability and, as a consequence, the democratic discourse which is fostered by the professional media. What we believe within WAN-IFRA is that stakeholders should pull together, the publishers, but also the advertisers and civil society and so on, to try and sustain a media environment which is safe because it brings good information to the public. What I mean is that in my first intervention I spoke of how problematic it is for publishers that all these advertising revenues go to big tech instead of to news media. But it's a problem also for advertisers, which very often find their advertising being placed by bots alongside content that is not flattering for their brands at all. It is actually a disaster.
So in the end, we have a double interest, from both advertisers and publishers, in ensuring that at least a good portion of advertising goes to good professional brands. So what I'm trying to foster and work on is an alliance between big stakeholders, which includes professional media, not only the big professional media but in particular the small local media, and advertisers, and people who can advise us, the OSCE, for example, and so on, to find a way to make sure a certain portion of advertising goes back to media done in a professional way. And in the brief that you gave me before this session, Julia, you also asked: is there anything technical that should be explored as well? Well, yes, there is, and it is really important. I just found out recently that local and regional media very often have websites that are not optimized to receive programmatic advertising. Programmatic advertising is advertising that is placed automatically onto websites that signal, essentially, "I'm waiting for advertising." But websites need to have certain technical specifications to allow this to happen. So even if we are successful in producing some sort of project that would drive more programmatic advertising to professional media, the websites have to be ready for it. We're going to try to put in place in the coming months, in the next 18 months maximum, a process to ensure that more programmatic advertising goes to professional media sources, also local ones, and we're also going to find a way to ensure that as many as possible of these websites are technically in the best possible condition to receive this programmatic advertising.

Julia Haas: Thank you very much. Yes, it's first of all already a call to action, which is very good, because this is also what it's about in the end. And it's important to also speak about advertisers, which are part of this multistakeholder approach, of course. So maybe in the last five minutes, I want to turn to the question of civil society, right? Both Nighat and Aws already explained in the beginning what you're doing and your work. But can you also tell us what you think the role of civil society is? And then, Nighat, I know that you're also on the global UN advisory body on AI. So maybe, I know there's not a lot of time left, but perhaps you can close with a call to action, also with this perspective and expertise. I'll give the floor to Aws on the more specific civil society experience, or what you think should be done. And then, Nighat, maybe you can close it with this more global AI component and a call to action in the last few minutes. Thank you so much.

Aws Al-Saadi: Thank you for the question. Now, we have built an IFCN Arabic coalition of seven organizations, so we can put some pressure on the tech companies to take us on as third-party fact-checkers and to combat disinformation, because when it is just one organization, they usually ignore it, especially when it comes from the Arabic region. The other thing that we are doing is raising awareness for people in different ways, like online: for example, at Tech4Peace we have more than 2.3 million followers, we are the biggest fact-checking organization in the MENA region. We also have a MOOC platform with different levels of courses on fact-checking, digital security, etc. And we are doing webinars, and also awareness raising on the ground, for example with IDPs, internally displaced people, in camps inside Iraq. We are also helping to build new initiatives and organizations; for example, we did this in Yemen, in Tunisia, and also in Libya, and we plan to do so in other countries in the future. And on the other side, we are building tools. There is a Tech4Peace mobile application that does not only rely on our fact-checking articles; we built it for people, so anyone can install this app in three languages, Arabic, English, and Kurdish, and do fact-checking with pictures, videos, and text. It is the only mobile application where you can fact-check videos in the three languages, because most fact-checking tools require a computer, so we developed these tools to be used by phone. We have, until now, more than 100,000 downloads, and it's still growing really, really fast. And yeah, I will stop here.

Nighat Dad: Yeah, no, thank you so much, Julia. So I'll talk a little bit about the UN Secretary-General's HLAB and the recommendations that we have given. But before that, I would also like to talk a little bit about the white paper on AI and content moderation that we released at the Oversight Board, and some of the really important things that our panelists have already highlighted. We delved into so many cases and engaged with so many stakeholders, including civil society around the world; so far, we have received 10,000 comments on our cases. And I would really encourage people to look into those deliberations, the case decisions that we have released, and the recommendations, because those recommendations actually go deeper into the tools and policies of the platform. Some of the key lessons that we learned as a board, which civil society and other tech platforms can also learn from when we talk about AI and content moderation, are, basically: curation systems must be rigorously and continually evaluated on their performance for users who are most vulnerable and most at risk. Another one is that global human rights, freedom of expression, and ethics experts should be consulted early in the process when designing and deploying new AI-powered content moderation tools. And we always say that transparency is paramount, so giving access to third parties or researchers is something that we have also been recommending a lot to Meta. Then, on the UN Secretary-General's HLAB: we gave several recommendations, and two of them, after being negotiated by 193 states, became part of the GDC, the Global Digital Compact. One was setting up an AI scientific panel, and another was establishing a global governance dialogue on AI.
And I think it's so important now for civil society to keep watching this space, because this dialogue will be designed a bit like COP or the IGF, and civil society will have a lot of space to engage with other stakeholders like governments or tech companies from the very beginning, in terms of the governance of AI, be it at the global level or at the nation-state level. I'll stop here. Thank you.

Julia Haas: Thank you so much. I acknowledge that there's no time left, so we were really on the spot, and this was really such a rich conversation with so many takeaways. We will now certainly go through our notes, take out the calls to action from your crucial input, and report back to the IGF, so we can live up to what you said: learning lessons from one another, right, learning from one another's experiences. It will also feed into our work. I just want to thank all of you at this stage for your insights and your crucial contributions, and also Martin for doing the local moderation, even if there were not so many questions. I don't know if you still want to add a closing sentence, but in any case, thank you all very, very much from our side; this was very crucial input and will be very useful for our continued work on this topic. Thank you very much.

Martin Samaan: Thank you, Julia, for your moderation. That was a really great conversation, and thanks to the panelists and the people in the room.

Julia Haas: Thank you all, and enjoy the rest of IGF. Bye. Bye.

Nighat Dad

Speech speed: 149 words per minute
Speech length: 1157 words
Speech time: 465 seconds

Lack of meaningful consultation with civil society on regulatory frameworks

Explanation: Nighat Dad points out that regulatory frameworks for big tech are often developed without sufficient input from civil society. This lack of consultation can lead to policies that don't adequately address the concerns of users and stakeholders.
Evidence: Reference to her experience in jurisdictions where rule of law is not strong
Major Discussion Point: Challenges of Big Tech in the Information Landscape
Differed with: Isabelle Lois, on: Approach to regulating big tech

Over-enforcement of content moderation affecting journalist reporting

Explanation: Dad highlights how overzealous content moderation can hinder journalists' ability to report on certain topics. This can lead to important information being suppressed or removed from platforms.
Evidence: Example of the word 'Taliban' being over-enforced on Meta platforms, affecting reporting in South Asia
Major Discussion Point: Impact on Journalism and Media
Agreed with: Aws Al-Saadi, Elena Perotti, on: Lack of language support and fact-checking for non-English content

Creating alternative oversight mechanisms like Meta's Oversight Board

Explanation: Dad suggests that alternative oversight mechanisms, such as Meta's Oversight Board, can help hold tech companies accountable. These bodies can provide independent review of content moderation decisions and policies.
Evidence: Her experience as a founding member of Meta's Oversight Board
Major Discussion Point: Potential Solutions and Ways Forward

Global governance dialogues on AI should include civil society input

Explanation: Dad highlights the importance of including civil society in global governance dialogues on AI. She argues that civil society should actively engage in shaping AI governance frameworks at both global and national levels.
Evidence: Recommendations from the UN Secretary General's High-Level Advisory Body on AI, including the establishment of a global governance dialogue on AI
Major Discussion Point: Role of Different Stakeholders

Aws Al-Saadi

Speech speed: 164 words per minute
Speech length: 1033 words
Speech time: 375 seconds

Insufficient language support and fact-checking for non-English content

Explanation: Al-Saadi points out that big tech platforms often lack adequate support for non-English languages, particularly in fact-checking programs. This leads to a disparity in content moderation and fact-checking efforts across different languages and regions.
Evidence: Example of lack of Kurdish language support in fact-checking programs and limited Arabic fact-checking partners
Major Discussion Point: Challenges of Big Tech in the Information Landscape
Agreed with: Nighat Dad, Elena Perotti, on: Lack of language support and fact-checking for non-English content

Building fact-checking coalitions and awareness-raising initiatives

Explanation: Al-Saadi discusses efforts to build coalitions of fact-checking organizations and raise public awareness about misinformation. These initiatives aim to combat the spread of false information and improve digital literacy.
Evidence: Creation of an IFCN Arabic coalition and development of a fact-checking mobile app
Major Discussion Point: Potential Solutions and Ways Forward

Civil society should pressure tech companies and raise public awareness

Explanation: Al-Saadi emphasizes the role of civil society in pressuring tech companies to improve their practices and raising public awareness about digital literacy and fact-checking. This includes building coalitions and developing tools to combat misinformation.
Evidence: Creation of an IFCN Arabic coalition and development of a fact-checking mobile app
Major Discussion Point: Role of Different Stakeholders

Elena Perotti

Speech speed: 139 words per minute
Speech length: 1493 words
Speech time: 642 seconds

Defunding of professional journalism through loss of advertising revenue

Explanation: Perotti argues that the shift of advertising revenue from traditional media to big tech platforms has led to a significant defunding of professional journalism. This financial strain makes it difficult for news organizations to sustain quality reporting and investigative journalism.
Evidence: Statistic that advertising revenues for publishers have halved over the past 15 years
Major Discussion Point: Challenges of Big Tech in the Information Landscape
Agreed with: Nighat Dad, Aws Al-Saadi, on: Lack of language support and fact-checking for non-English content

Halving of advertising revenues for publishers over 15 years

Explanation: Perotti highlights the dramatic decrease in advertising revenue for traditional publishers over the past 15 years. This loss of income has severely impacted the financial sustainability of news organizations.
Evidence: Projection that legacy media will have only about 30% of ad spend in 2024, with a third of the rest going to big tech companies
Major Discussion Point: Impact on Journalism and Media
Agreed with: Isabelle Lois, on: Financial challenges for traditional media and journalism

Forming alliances between publishers, advertisers and civil society

Explanation

Perotti suggests creating alliances between publishers, advertisers, and civil society to ensure that a portion of advertising revenue goes to professional media outlets. This collaboration aims to support the sustainability of quality journalism.

Evidence

Proposal for a project to drive more programmatic advertising to professional media sources

Major Discussion Point

Potential Solutions and Ways Forward

Media industry should work to sustain safe information environments

Explanation

Perotti suggests that the media industry should collaborate with other stakeholders to sustain a safe media environment that provides reliable information to the public. This involves finding ways to ensure that quality journalism receives adequate funding and support.

Evidence

Proposal for a project to drive more programmatic advertising to professional media sources, including technical support for local media websites

Major Discussion Point

Role of Different Stakeholders

Isabelle Lois

Speech speed

167 words per minute

Speech length

1752 words

Speech time

629 seconds

Platforms acting as gatekeepers of information without transparency

Explanation

Lois argues that large tech platforms have become de facto gatekeepers of information, controlling what content is amplified or suppressed. This power over the flow of information lacks transparency and accountability.

Major Discussion Point

Challenges of Big Tech in the Information Landscape

Difficulty sustaining local news outlets due to financial challenges

Explanation

Lois points out that financial pressures have led to the consolidation of media markets, making it difficult to sustain local news outlets. This trend reduces the diversity and availability of local news coverage.

Evidence

Observation of smaller media outlets being bought up by larger media structures in Switzerland

Major Discussion Point

Impact on Journalism and Media

Agreed with

Elena Perotti

Agreed on

Financial challenges for traditional media and journalism

Low media literacy and belief in importance of independent media

Explanation

Lois highlights the concerning trend of low media literacy and a lack of understanding about the importance of independent media for democracy. This disconnect poses a threat to the public’s ability to safeguard media freedom.

Evidence

Study showing only half of the Swiss population believes independent media is essential for democracy

Major Discussion Point

Impact on Journalism and Media

Developing regulatory frameworks focused on transparency and user rights

Explanation

Lois discusses Switzerland’s approach to developing regulatory frameworks for large online platforms. The focus is on increasing transparency and strengthening user rights rather than directly moderating content.

Evidence

Switzerland’s ongoing development of a regulatory framework inspired by the EU Digital Services Act

Major Discussion Point

Potential Solutions and Ways Forward

Differed with

Nighat Dad

Differed on

Approach to regulating big tech

States should protect access to quality information while respecting rights

Explanation

Lois argues that states have a responsibility to protect public access to quality information while ensuring accountability from platforms. However, this must be done in a way that respects fundamental rights and freedoms.

Evidence

Switzerland’s approach of strengthening user rights and increasing platform transparency

Major Discussion Point

Role of Different Stakeholders

Agreements

Agreement Points

Lack of language support and fact-checking for non-English content

Nighat Dad

Aws Al-Saadi

Elena Perotti

Insufficient language support and fact-checking for non-English content

Over-enforcement of content moderation affecting journalist reporting

Defunding of professional journalism through loss of advertising revenue

The speakers agree that there is a significant lack of support for non-English content, particularly in fact-checking and content moderation, which affects the quality and availability of information in various languages and regions.

Financial challenges for traditional media and journalism

Elena Perotti

Isabelle Lois

Halving of advertising revenues for publishers over 15 years

Difficulty sustaining local news outlets due to financial challenges

Both speakers highlight the financial difficulties faced by traditional media outlets, particularly local news, due to the shift in advertising revenue to big tech platforms.

Similar Viewpoints

Both speakers emphasize the need for mechanisms to increase transparency and accountability of big tech platforms, whether through independent oversight bodies or regulatory frameworks.

Nighat Dad

Isabelle Lois

Creating alternative oversight mechanisms like Meta’s Oversight Board

Developing regulatory frameworks focused on transparency and user rights

Both speakers advocate for collaborative approaches involving multiple stakeholders to address challenges in the information landscape and support quality journalism.

Aws Al-Saadi

Elena Perotti

Building fact-checking coalitions and awareness-raising initiatives

Forming alliances between publishers, advertisers and civil society

Unexpected Consensus

Importance of multi-stakeholder collaboration

Nighat Dad

Aws Al-Saadi

Elena Perotti

Isabelle Lois

Creating alternative oversight mechanisms like Meta’s Oversight Board

Building fact-checking coalitions and awareness-raising initiatives

Forming alliances between publishers, advertisers and civil society

States should protect access to quality information while respecting rights

Despite coming from different sectors (civil society, fact-checking organizations, media industry, and government), all speakers emphasized the importance of collaborative approaches involving multiple stakeholders to address challenges in the digital information landscape.

Overall Assessment

Summary

The main areas of agreement include the challenges posed by insufficient language support and fact-checking for non-English content, financial difficulties faced by traditional media, and the need for increased transparency and accountability of big tech platforms. There is also a strong consensus on the importance of multi-stakeholder collaboration in addressing these issues.

Consensus level

The level of consensus among the speakers is relatively high, particularly on the need for collaborative approaches and the challenges faced by the media industry. This consensus suggests that there is potential for coordinated efforts across different sectors to address the challenges posed by big tech in the information landscape. However, the specific approaches and solutions proposed by each speaker vary, indicating that while there is agreement on the problems, there may be diverse perspectives on how to solve them.

Differences

Different Viewpoints

Approach to regulating big tech

Nighat Dad

Isabelle Lois

Lack of meaningful consultation with civil society on regulatory frameworks

Developing regulatory frameworks focused on transparency and user rights

While Nighat Dad emphasizes the lack of civil society consultation in regulatory frameworks, Isabelle Lois focuses on developing frameworks that prioritize transparency and user rights without directly addressing the consultation process.

Overall Assessment

Summary

The main areas of disagreement revolve around the specific approaches to addressing big tech’s impact on the information landscape, including regulatory frameworks, oversight mechanisms, and strategies to support journalism.

Difference level

The level of disagreement among the speakers is relatively low. While they present different perspectives and solutions, their overall goals align in addressing the challenges posed by big tech in the information landscape. This suggests a potential for collaborative approaches in developing comprehensive solutions.

Partial Agreements

Both speakers agree on the financial challenges facing journalism, but Perotti focuses on the global shift of advertising revenue to big tech, while Lois emphasizes the local impact on media consolidation and sustainability of local news outlets.

Elena Perotti

Isabelle Lois

Defunding of professional journalism through loss of advertising revenue

Difficulty sustaining local news outlets due to financial challenges

Both speakers agree on the need for oversight and fact-checking, but propose different approaches. Al-Saadi focuses on building coalitions and awareness initiatives, while Dad emphasizes the role of formal oversight bodies like Meta’s Oversight Board.

Aws Al-Saadi

Nighat Dad

Building fact-checking coalitions and awareness-raising initiatives

Creating alternative oversight mechanisms like Meta’s Oversight Board

Takeaways

Key Takeaways

Big tech platforms have significant power over information spaces, acting as gatekeepers without sufficient transparency or accountability

The current digital advertising model is defunding professional journalism, threatening media sustainability and diversity

There is a lack of language support and fact-checking for non-English content on major platforms, particularly affecting smaller markets

Multi-stakeholder approaches involving governments, civil society, media, and tech companies are needed to address challenges

Empowering users through transparency, literacy, and rights protection is preferable to top-down content control

Resolutions and Action Items

Form alliances between publishers, advertisers and civil society to redirect advertising revenue to professional media

Develop regulatory frameworks focused on transparency and user rights rather than content moderation

Build fact-checking coalitions to pressure tech companies to expand language support

Include civil society input in global governance dialogues on AI

Optimize local media websites to receive programmatic advertising

Unresolved Issues

How to effectively regulate global tech platforms from the perspective of smaller countries

Balancing freedom of expression with the need to combat misinformation

Addressing the engagement-driven content amplification that can promote polarizing content

Ensuring sustainability of local journalism in the digital age

Suggested Compromises

Focus regulation on transparency and user empowerment rather than direct content control

Involve multiple stakeholders in developing solutions rather than relying solely on government or platform action

Balance the need for human oversight in content moderation with the scale of automation required

Thought Provoking Comments

We should also broaden our own ideas that they are not just regulatory frameworks that we need to look into, but we need to look into other institutions as well, where the jurisdictions where states are actually regulating on their own terms and not in a meaningful way, then what are the other institutions who can come to help to the users when it comes to protecting users’ rights on this.

speaker

Nighat Dad

reason

This comment broadens the perspective beyond just regulatory frameworks to consider other institutions that can protect user rights, especially in contexts where state regulation may be problematic.

impact

It shifted the discussion to consider alternative approaches and institutions for addressing challenges with big tech, beyond just state regulation.

In 2024, there will be one trillion in advertising that will be transacted in the year 2024. One trillion, which is an 80% increase compared to 2019 pre-pandemic. Of that, one trillion. Legacy Media, so the professional publishers of news, will have about 30% of that ad spend, and a third of the rest is going to go to Alphabet, Meta, Amazon, and so on.

speaker

Elena Perotti

reason

This comment provides concrete data on the scale of the advertising revenue shift from traditional media to big tech platforms, illustrating the financial impact on journalism.

impact

It grounded the discussion in economic realities and highlighted the urgency of addressing the sustainability of professional journalism in the digital age.

A study has been done recently that I found very interesting and also quite shocking. It has shown that only half of the Swiss population believes that independent media is essential for democracy.

speaker

Isabelle Lois

reason

This insight reveals a concerning lack of public understanding about the importance of independent media for democracy, even in a developed country like Switzerland.

impact

It highlighted the need for public education and awareness-raising about the role of media in democracy, shifting the conversation to include public perception as a key challenge.

Curation system must be rigorously and continually evaluated on their performance for users who are most vulnerable and most at risk.

speaker

Nighat Dad

reason

This comment emphasizes the importance of considering the impact of AI-powered content moderation on vulnerable users, highlighting an often overlooked aspect of technology deployment.

impact

It introduced a human rights and ethics perspective into the discussion of AI and content moderation, emphasizing the need for ongoing evaluation and protection of vulnerable users.

Overall Assessment

These key comments shaped the discussion by broadening its scope beyond traditional regulatory approaches, grounding it in economic realities affecting journalism, highlighting public perception challenges, and introducing ethical considerations in AI deployment. They collectively painted a complex picture of the challenges facing the information landscape, emphasizing the need for multi-stakeholder approaches and continuous evaluation of both policies and technologies.

Follow-up Questions

How can we address the lack of fact-checking support for smaller languages like Kurdish?

speaker

Aws Al-Saadi

explanation

This is important because the lack of fact-checking in certain languages leaves communities vulnerable to misinformation, especially during critical events like elections.

How can we improve the automation systems for content moderation in Arabic to avoid wrongful flagging of legitimate fact-checking content?

speaker

Aws Al-Saadi

explanation

This is crucial because current systems are incorrectly flagging fact-checking content as misinformation, hindering efforts to combat fake news in Arabic-speaking regions.

How can we increase media literacy, especially in regions with low internet connectivity and media independence?

speaker

Aws Al-Saadi

explanation

This is important for empowering people to critically evaluate information, particularly in areas where media is controlled by political parties.

How can we ensure more equitable distribution of advertising revenue between big tech platforms and traditional media outlets?

speaker

Elena Perotti

explanation

This is crucial for the sustainability of professional journalism and maintaining diverse, quality news sources.

How can we create effective regulatory frameworks for large online platforms that balance user rights, transparency, and freedom of expression?

speaker

Isabelle Lois

explanation

This is important for addressing the power imbalance in information spaces without compromising fundamental rights.

How can we improve public understanding of the importance of independent media for democracy?

speaker

Isabelle Lois

explanation

This is crucial because a recent study showed only half of the Swiss population believes independent media is essential for democracy, indicating a need for awareness-raising.

How can we optimize local media websites to receive programmatic advertising?

speaker

Elena Perotti

explanation

This is important for ensuring that local media can benefit from automated advertising systems and improve their financial sustainability.

How can we ensure AI-powered content moderation tools are designed with input from global human rights, freedom of expression, and ethics experts?

speaker

Nighat Dad

explanation

This is crucial for developing content moderation systems that respect human rights and consider diverse global perspectives.

How can civil society effectively engage in the upcoming global governance dialogue on AI?

speaker

Nighat Dad

explanation

This is important for ensuring that civil society has a voice in shaping global AI governance from the beginning.

Disclaimer: This is not an official session record. DiploAI generates these resources from audiovisual recordings, and they are presented as-is, including potential errors. Due to logistical challenges, such as discrepancies in audio/video or transcripts, names may be misspelled. We strive for accuracy to the best of our ability.