Lightning Talk #209 Safeguarding Diverse Independent News Media in Policy

24 Jun 2025 14:30h - 15:00h


Session at a glance

Summary

Amy Mitchell from the Center for News Technology and Innovation presented a discussion on safeguarding diverse independent news media through policy considerations. She highlighted that society is currently passing more journalism-related laws than ever before, while simultaneously facing challenges in defining what constitutes journalism in the digital age. Mitchell emphasized that 50% of journalists surveyed internationally had experienced some form of government censorship, and global press freedom scores have declined to 1993 levels.


The presentation focused on how well-intentioned policies can have unintended consequences on press independence and journalism viability. Mitchell outlined critical questions that should be explored in any digital policy, including definitional language, oversight authority, diversity protection, public service, and cross-border impacts. She presented findings from three major studies conducted by her organization.


The first study examined 32 “fake news” policies across 31 countries between 2020-2023, finding that most policies created greater risks to journalistic independence than they provided protection. Only seven of these policies actually defined what constitutes fake or illegal content, while 14 placed control directly in government hands. The second study analyzed 23 media remuneration policies designed to provide revenue streams to journalism, revealing dramatic variations in how digital usage and compensation were defined across different jurisdictions.


Mitchell emphasized the importance of considering public perspectives in policy development, noting that the public has a broad definition of journalism producers beyond traditional news organizations. She concluded by advocating for collaborative, data-driven conversations among policymakers, technology companies, media organizations, and civil society to balance technological benefits while mitigating potential harms to independent journalism.


Key points

**Major Discussion Points:**


– **Growing Policy Challenges for Journalism**: The discussion highlights how more laws affecting journalism are being passed than ever before, while it’s simultaneously becoming harder to define what constitutes journalism. This is occurring amid declining press freedoms globally, with 50% of surveyed journalists experiencing some form of government censorship.


– **Definitional Problems in Policy Language**: A critical issue identified across policies is the vague or inconsistent definition of key terms like “fake news,” “journalism,” and “illegal content.” Most policies studied (25 out of 32 fake news policies) failed to clearly define these terms, leaving interpretation to government authorities.


– **Government Authority and Control Mechanisms**: The research reveals that many policies place oversight authority directly in government hands, with 14 of 32 fake news policies giving control to central government. This raises concerns about potential misuse of well-intentioned policies for information control.


– **Media Remuneration and Financial Sustainability**: The discussion covers policies aimed at creating revenue streams for struggling journalism industries through digital platform compensation, but notes wide variation in how “usage” and compensation are defined across 23 different policies.


– **Unintended Consequences of Digital Policies**: Even well-intentioned policies designed to protect the information space can inadvertently harm press independence and diversity. The speaker emphasizes the need to consider cross-border impacts and long-term effects, particularly with emerging AI policies.


**Overall Purpose:**


The discussion aims to present research findings on how digital policies worldwide are impacting journalism and press freedom, while proposing a framework of critical questions that policymakers should consider to safeguard independent, diverse media while addressing legitimate policy concerns.


**Overall Tone:**


The tone is academic and analytical, maintaining objectivity while expressing underlying concern about threats to press freedom. Mitchell presents as a researcher sharing findings rather than an advocate, emphasizing data-driven analysis. The tone remains consistent throughout – informative and measured, though with clear implications about the risks facing independent journalism. During the Q&A, the tone becomes slightly more conversational while maintaining the same analytical approach.


Speakers

– **Amy Mitchell**: Director, Center for News Technology and Innovation (CNTI). Has 25 years of experience at Pew Research Center where she helped launch and directed the journalism line of research. Currently leads a global research center focused on enabling independent, sustainable news media, maintaining an open internet, and fostering informed public policy discussions.


– **Audience**: Multiple audience members who asked questions during the Q&A session. Areas of expertise, roles, and titles not specified.


Additional speakers:


None identified beyond those in the speakers list.


Full session report

# Safeguarding Diverse Independent News Media: Policy Considerations and Global Challenges


## Executive Summary


Amy Mitchell presented a comprehensive analysis of how digital policies worldwide are impacting journalism and press freedom. Drawing from extensive research conducted by her organisation, the Center for News Technology and Innovation (CNTI), Mitchell highlighted the unprecedented challenges facing independent media in an era where more journalism-related laws are being passed than ever before. The discussion revealed alarming trends in global press freedom, with 50% of journalists surveyed having experienced some form of government censorship, and worldwide press freedom scores declining to 1993 levels.


The presentation centred on the critical observation that well-intentioned policies designed to protect the information space often create unintended consequences that harm journalistic independence and diversity. Through analysis of fake news policies, media remuneration frameworks, and emerging AI regulations, Mitchell demonstrated how vague definitional language and inappropriate oversight mechanisms can transform protective policies into tools for information control.


## Background and Research Context


Mitchell began by establishing her background and the context for CNTI’s work. Coming from 25 years at the Pew Research Center, where she helped launch the journalism line of research, Mitchell now leads CNTI, an organisation that has existed for “not quite two years, a little over a year and a half now.” CNTI works in partnership with organisations including the Global Forum for Media Development (GFMD), the Online News Association, and several others.


The research presented emerged from collaborative efforts including a Mexico City convening co-sponsored with OEM, where stakeholders from across the journalism ecosystem gathered to examine how digital policies affect independent media. This work addresses the fundamental challenge that society is experiencing an unprecedented volume of legislation affecting journalism, both directly and indirectly, at a time when press freedoms are declining globally.


## Current State of Journalism and Policy Landscape


Mitchell established the gravity of the current situation facing journalism globally. The research revealed that 50% of the journalists surveyed had experienced some form of government censorship within the past year. Global press freedom scores have deteriorated to levels not seen since 1993, creating an environment where policies originally intended for protection are increasingly being used to imprison and control journalists.


The challenge is compounded by the evolving nature of journalism itself. In the digital age, traditional definitions of journalism and news organisations no longer capture the full spectrum of information producers that the public relies upon. This definitional ambiguity creates vulnerabilities in policy frameworks that may inadvertently exclude legitimate journalism whilst failing to address actual threats to information integrity.


Mitchell emphasised that the volume of legislation affecting journalism is unprecedented, with lawmakers worldwide grappling with how to regulate digital spaces without clear understanding of the implications for press freedom and independent media.


## Research Methodology and Key Questions


To address these challenges systematically, Mitchell outlined key questions that CNTI examines when analysing digital policies affecting journalism:


– How policies define crucial terms such as “journalism,” “fake news,” “illegal content,” and “digital usage”


– Who has oversight authority to interpret and enforce policies


– Whether policies adequately protect diverse voices in the media landscape


– How clearly policies articulate their goals for serving public information needs


– The cross-border impacts of national policies


Mitchell stressed the importance of thinking beyond national boundaries when considering policy impacts, as digital policies often have far-reaching effects through international platforms and the tendency for policies to be copied across jurisdictions, sometimes by authoritarian regimes for harmful purposes.


## Analysis of Fake News Policies


One of CNTI’s most significant studies examined 32 fake news policies across 31 countries implemented between 2020 and 2023. The findings revealed troubling patterns that suggest these policies create greater risks to journalistic independence than protection for the information space.


The most striking finding was the widespread failure to define key terms. Only seven of the 32 policies actually defined what constitutes fake or illegal content, leaving interpretation of these crucial concepts to authority figures. This definitional vacuum creates dangerous ambiguity that can be exploited for information control rather than protection.


The research revealed concerning patterns in oversight authority, with 14 of the 32 policies placing control very specifically in government hands. The penalties varied dramatically, with imprisonment terms ranging from less than one month to over three years (with Zimbabwe specifically mentioned for the longest sentences), reflecting inconsistent and often disproportionate regulatory approaches.


Mitchell emphasised that whilst the stated intentions of these policies were often laudable—protecting citizens from harmful misinformation—the practical implementation frequently created tools that could be used to suppress legitimate journalism and dissenting voices.


## Media Remuneration Policy Analysis


The second major study examined 23 media remuneration policies implemented between 2018 and August 2024, designed to create new revenue streams for struggling journalism industries through compensation from digital platforms. These policies, including various US state-level initiatives, represent attempts to address the economic challenges facing traditional media in the digital age.


The analysis showed dramatic variation in how different jurisdictions defined key concepts such as “digital content usage” and compensation criteria. This inconsistency creates confusion for both platforms and media organisations operating across multiple jurisdictions and may lead to uneven outcomes.


Particularly concerning was the finding that these policies inconsistently addressed the diversity of news media, with many appearing to favour large, established operations over smaller, independent outlets. Furthermore, most policies failed to clearly articulate how they would better serve public information needs, risking becoming mere economic transfers rather than tools for improving the information landscape.


## AI Policy Implications


Mitchell’s research also examined emerging artificial intelligence policies and their implications for journalism. While few AI policies directly address journalism, they have significant indirect impacts through their effects on content creation, distribution, and liability frameworks.


The research revealed that AI policies often place liability on users, including journalists, without providing clear definitions of appropriate use or adequate safeguards for legitimate journalistic activities. This creates uncertainty for journalists who wish to benefit from AI technologies whilst avoiding legal risks.


Mitchell emphasised the need to consider how AI policies affect journalists’ ability to harness technological benefits whilst guarding against potential risks, particularly given the cross-border nature of AI technologies.


## Public Perspective and Behaviour


A crucial element of Mitchell’s analysis focused on understanding public perspectives on journalism and information consumption. CNTI conducted surveys in four countries to understand how the public defines journalism and consumes information.


The research revealed that the public has a much broader definition of journalism than traditional policy frameworks typically recognise, including individual journalists working independently, mission-driven content creators, and principle-guided information producers. Both journalists and the public view technology as critically important for news production, gathering, dissemination, and consumption.


Importantly, Mitchell noted that substantial research shows disinformation campaigns don’t have as much impact on what people actually believe as previously thought. Instead, people’s own behaviour and choices about where to seek information appear to be more significant factors in determining what they accept as credible.


## Discussion and Key Exchanges


The presentation generated significant discussion during the question-and-answer session. Key exchanges included:


**Methodological Questions**: An audience member questioned the value of including established autocracies in policy analysis. Mitchell responded by emphasising that autocratic policies matter because they affect real people and can be copied by other countries for harmful purposes, highlighting the interconnected nature of global policy development.


**EU Policy Analysis**: When questioned about including individual EU member states rather than focusing on EU-wide legislation, Mitchell explained that they specifically looked at country-specific fake news policies rather than broader frameworks like the Digital Services Act.


**Alternative Approaches**: Discussion explored focusing on the demand side of disinformation—understanding why people believe and engage with false information—rather than concentrating primarily on supply-side regulation. Mitchell noted that psychological defence approaches, such as those employed by Sweden’s dedicated agency, could offer valuable alternatives to traditional content moderation policies.


## Key Findings and Implications


The comprehensive research yielded several critical findings:


**Definitional Failures**: The widespread failure to define key terms in digital policies creates dangerous ambiguities that can be exploited for information control, suggesting that policy development must prioritise clear, precise definitions.


**Government Oversight Concerns**: The tendency to place oversight authority directly in government hands raises serious concerns about potential misuse of well-intentioned policies.


**Diversity Challenges**: Many policies fail to adequately protect diverse voices in the media landscape, often favouring large operations over smaller, independent outlets.


**Cross-Border Policy Migration**: Policies developed in one jurisdiction often influence or are directly copied by others, sometimes for harmful purposes, emphasising the global responsibility that comes with policy development.


**Public Behaviour Complexity**: The research challenged assumptions about disinformation effectiveness, suggesting that individual choice and political bias may be more significant factors than external manipulation campaigns.


## Future Research and Recommendations


Mitchell announced that CNTI plans to spend more time examining the relationship between public behaviour and disinformation policy effectiveness. The organisation is developing a research working group focused on public response to AI content labelling and watermarking systems.


Key recommendations included:


– Collaborative, data-driven conversations among policymakers, technology companies, media organisations, researchers, and civil society


– Policy design based on clear understanding of how the public actually seeks and consumes information


– Clear articulation of desired digital information landscape goals before implementing content moderation policies


– Integration of specific safeguards against government overreach and mechanisms to protect diverse, independent voices


## Conclusion


Mitchell’s research revealed the complex and often contradictory nature of contemporary digital policy as it affects journalism and press freedom. While many policies intend to protect the information space and support quality journalism, implementation often falls short of these goals and may create new threats to press independence and diversity.


The findings suggest that effective digital governance requires more sophisticated understanding of public behaviour, global policy dynamics, and the changing nature of information production and consumption. The research provides a valuable framework for approaching these complex issues, emphasising the need for international coordination and careful consideration of unintended consequences in policy development.


As Mitchell emphasised, the challenge lies in creating policies that actually serve public information needs while protecting the diverse, independent media ecosystem that democracy requires.


Session transcript

Amy Mitchell: Hello, hello, that’s loud. I’m Amy Mitchell from the Center for News Technology and Innovation, and I look forward to talking with you today about safeguarding diverse independent news media in policy. We are at a point in our society today where we are debating and passing more laws that relate to journalism, both directly and indirectly, than we ever have. This is occurring at the same time that it is harder than ever to put borders around what journalism is and what it is not, both from the perspective of the business, legal, and policy space and in terms of what the public considers journalism and the sources they rely on to keep them informed on a daily basis. We are also seeing a growing array of issues in the policy space that relate to journalism, everything from content moderation to protection of the internet to disinformation to artificial intelligence, again, at times directly related and at other times indirectly related. In the digital landscape, policies that are passed in one country very much tend to impact and be impacted by policies that are passed in another country, so it’s very important to be thinking about these things across country and regional borders. We are seeing all of this happen amid a time when we are facing growing government encroachment on information control and on press freedoms. This comes through in data we gather from journalists themselves. What you see in front of you is an international survey of journalists that CNTI conducted with a number of partner organizations, like GFMD, the Global Forum for Media Development, the Online News Association, and several others around the world. You see here that 50% of the journalists that we surveyed, this was in the fall of last year, had experienced some form of government censorship, ranging from not being allowed to cover or access an event to complaints about their content to imprisonment.
50% had experienced at least one form of that in the last year. We’re also seeing the world press freedom scores across the board go down, according to the entities that track this data year in and year out, so much so that we are down to 1993 levels of world press freedoms. This is occurring both when we look at government censorship and at the independent protection of journalists in this space. We’re also hearing it in the conversations that we have. One of the things CNTI does is host convenings, which are really daylong working sessions with a combination of folks from journalism, technology, the policy space, research, and civil society, to talk about these issues. This is one that we held in Mexico City; OEM was our co-sponsor there. This conversation specifically focused on how to continue to produce journalism amid ongoing security threats, both on and offline. A lot of the discussion that came up from those in the rooms had to do not only with feelings of safety and other kinds of online abuse, but also with the ways that policy had been used, policy that was in theory aimed at protection of information or of journalists but was actually being used to imprison or otherwise control journalists and the information space. If we look across the policy space amid this landscape today, what we find in the data is that even the best-intentioned policy, that which is really looking to safeguard our information space and create vibrant digital landscapes, can end up having unintended consequences on the independence of our press and on journalism viability more broadly. The question becomes: how can policy address the issue areas of concern, which we’re talking about here this week, while safeguarding an independent, diverse media and the public’s access to a plurality of fact-based news? That’s where CNTI comes in. The Center for News Technology and Innovation is a global research center.
We’ve been around not quite two years, a little over a year and a half now, but we are an organization that focuses on enabling independent, sustainable news media, maintaining an open internet, and fostering more informed public policy discussions. We do this by conducting research as well as synthesizing research from others. My background is research. I come from 25 years at the Pew Research Center. I helped launch the journalism line of research there and directed it for many years. I have now decided to move into this space. We also help synthesize research. As a research community, I think we do a pretty lousy job sometimes of helping make sense of what our research all adds up to, where there are gaps, and where we need to do more. We then host convenings, like I was talking about, to try to really work through some of the challenging questions in these spaces and come up with informed solutions. Back to the question: how can policy address issue area concerns while safeguarding an independent, diverse news media and a vibrant digital landscape? Here we go to some of the research that CNTI has done over the last year, which I’m going to talk about over the next few minutes. What we do ranges from policy analysis to issue primers to surveys. We did the journalist survey. We’ve also done a four-country public survey that was a mix of focus groups and fully representative statistical surveys, convenings around the world, and more. What I put forward today is a set of questions that we have found to be critical questions to explore in any digital policy that is being debated and thought about today. Again, both those that directly relate to journalism, but also very much those where journalism and a vibrant digital landscape are likely to be impacted, even if not directly addressed in the policy. I’m going to spend more time on each of these, so I’m going to run through them very quickly right now. The first question is, what’s the definitional language?
We started a lot of our work just by asking: how is journalism being defined here? How about a journalist? How about news? For the other areas that are being talked about in this policy, what’s the definitional language, and how consistent or inconsistent might it be across policies in ways that matter? On the independence of journalism, a big question to spend a lot of time on is: who is given the oversight authority to determine the details of how a law gets practiced and enacted? Diversity: one of the things the internet brought was a diverse landscape of information in a way that served minority communities, bringing new kinds of voices into this space. How do we work to build a vibrant digital landscape in a way that still safeguards that diversity of voice, especially in the news and information space? Serving the public: we talk about doing all of this to serve the public. How are we actually serving the public in these policies, especially given the behaviors and the ways that they access and get information today? Social relevance, being forward-thinking, cross-border impacts, and also unintended consequences that may or may not be clearly evident on the surface. These questions to explore all take time, deep thought, and collaborative discussion. The first study I’m going to share a little bit about is one that we did that looked at fake news policies around the world. And we put “fake news” in quotes; many of these actually use the language of addressing fake news.
We looked at 32 policies that were proposed or enacted between 2020 and 2023. They cut across 31 countries, and I’ll say straight out that more of these were enacted in autocracies, but 11 of the countries included here are democracies, and so the findings really resonate across different government types in terms of some of the takeaways. The overall conclusion was that these policies created greater risk to journalistic independence and diversity, as well as to the public’s access to a diversity of fact-based news, than they did to actually safeguard the information space. Getting back to definitions, one of the first things we looked at here was: how is fake news or illegal content defined? And how might news or journalism, or what might be considered real news, be defined? You can see here that only seven of the 32 actually put a definition around what fake or illegal content was, and the remaining left that vague, which leaves it up to the authority figure, the one who gets to make the decision about the enactment of that policy, to put those definitions in place. Same thing on the news and journalism side: only two actually spoke to what real news or journalism might be, and this is very much a double-edged sword. We talked a lot about this with some of the folks at UNESCO. Guy Berger was an advisor on this project; he’s done a lot of work with the UN and information integrity. One of the things we talked about in this report is the degree to which defining these things in policy can help safeguard journalism and an independent press, but it can also be language that gets used against journalism and an independent press. So it’s very important both to really consider the definitional language and also to then get to the next question, which is: okay, who’s the authority figure?
Who’s the one that then gets to determine what that language means? That’s what we looked at next in this study, and you can see 14 of the 32 policies very specifically put the control of the authority in the hands of the government itself, and most of those spoke of the central government being the one in control. There were others that gave the authority to some sort of body within the government, where it was often unclear how closely associated that other body was to the central government, and the remaining left it unclear as to who had authority to arbitrate that law or that policy, which then naturally puts it back in the hands of the government. The next question then becomes: okay, what’s the punishment if you are determined to have been a part of this fake news? The bulk of these policies did have some sort of imprisonment, and it ranged from less than one month up to over three years, in Zimbabwe. So there is real government action that can be taken against journalists through the language in these policies. Before I move on to the next study, if we just broaden this out to a question about content moderation policy more generally, one of the things that’s really important to ask in that space of policy (disinformation, content moderation, etc.) is: what is the end goal for the way the content is going to look? One of the things that we’re spending more time looking at in the coming year is that very question. It’s not clear within the policy conversation space that we’ve really done a very good job at all of articulating what the digital landscape would look like if the content moderation policy we’re talking about gets put in place. What is it that’s bad, that’s out? What is it that’s good, that’s in? What’s the mix? There’s always going to be a mix of content in there.
So really take the time to think about the ultimate goals there. The second study that I’ll share a little bit about is an example of one that was very directly related to journalism and the media space. These were media remuneration policies, which are basically revenue policies looking to provide a revenue stream, some financial lifeblood, for the journalism industry, which many of you know has been having a hard time lately with its financial structures and support. This looked at 23 policies that were considered or passed from 2018 through August of 2024, and there’s a pretty wide mix. I will say it does include a number of state policies in the U.S., because the state policy space in the U.S. is very active these days, so there are quite a number of those included. The first thing we did here was create a framework, and this is something else that we really recommend when you’re in a complex policy space with a wide range of focus or orientation across the policies: ask, okay, what’s the actual financial structure or orientation of how this is going to work? You can see here that the first three are really around usage, usage criteria, digital interaction that then warrants some sort of compensation. The second set is subsidies that are either coming directly from tech platforms or, in some cases, from the government itself. And the third is a tax mechanism, which is either creating or building off of new taxes. Once you get this framework in place for whatever your subject area or policy is, then for any new policy that comes in, you can figure out where it fits into the framework that you’ve created.
Then we broke the analysis into two parts, again with these core questions that I showed you all earlier at the top of our minds. The first part was definitional, in the sense of: how is the usage and interaction of digital content determined? And once that definitional boundary is put in place, to what degree, and for what amount of content, should compensation be paid? How does that decision get made? When is it appropriate to charge for digital usage? Is compensation for digital usage applied consistently? Who benefits, who gets that money, where does that money actually go? The second half of the analysis looks at the questions around the core viability or sustainability elements of journalism. How do we keep an independent news media? How do we support diversity in this space? How do we sustain journalism that is actually serving the way the public gets informed today? I’m going to walk through this really quickly because this is a lightning talk, but there’s a lot more detail on the website if you want to go into any more of it, and I’d be happy to talk with anybody about it in greater detail as well. First, on the digital usage side, what this really shows is that when you look at how the criteria of usage are articulated across these policies, they vary dramatically. The two green circles are at the ends of the spectrum in terms of what usage could be, in time and in terms of content, and you see here inside that it ranges from things like clicking on a URL link, to having the title of the article, to usage of content for indexing, or creating an article summary and having that summary be what warrants compensation. So the definition of what usage and interaction actually means varies dramatically across these policies, and so too does the level of compensation, as well as who that compensation would be going to.
In some cases, it’s going to the organizational level. In some cases, it’s going to individual journalists. Usually, it’s actually at the organizational level, though in recent months more has shifted toward going to journalists or journalism producers themselves. So we see great variety there, which brings us back to the importance of articulating from the get-go what the goal is, and of using the best language to say clearly and consistently what we mean. The second half of the study, as I mentioned, looks at these journalistic viability elements: first independence and diversity, then public interest and access. When we talk about independence, as we saw in the fake news study, anytime you create policy you give the government a role, which isn’t a bad thing, right? That’s what policy is for. But it does mean it’s really important to think about what safety mechanisms are in place to be sure, particularly in the journalism and news space, that it doesn’t end up giving an individual government or figure, as the years progress, the ability to take control over the information space. One of the things we saw in these policies was that, in many cases, it was left unclear who had the authority to arbitrate the law, and where third-party agencies were involved, how that got determined and how it would be determined over time. On diversity, the biggest thing we saw was a really inconsistent and haphazard approach to the diversity of news media that would be covered. A lot of these policies ended up oriented around very large news operations and outlets. Some, as we got further into the policy timeline, called out ethnic or minority news outlets. Only those built around the tax mechanism focused on local journalism itself.
So how do we make sure that policies in this arena will support that diversity of voices in the journalism space that has been so valuable to the public? And then finally, public interest. I will say that on the innovation side, only the EU directive and one state policy, New Jersey’s in the U.S., spoke at all about being forward-looking, about the innovation side of technology and where it might lead us in the future. On the public interest side, all of the talk is about serving the public, and there were references to the public in there. But what was unclear was how these steps would actually do a better job of getting news and information to the public, especially considering the ways the public gets information today and the diversity of journalism producers the public is turning to, including, oftentimes, many smaller individual journalism producers the public has come to trust and rely on. The third policy area, where we are just in the early stages now, so I’m only going to give you a touch of the framework we’re using, is the AI policy space. Here there is a great deal of policy being talked about and enacted, though much of it is not yet law that can be legally enforced. Very, very few policies in this space talk about journalism, or really the news and information space, directly at all. But given what’s happening with AI and the digital landscape more broadly, there is very much an indirect, and ultimately direct, relationship between AI policy and the digital news landscape. So thinking about those things inside these other policies is really important before they get passed and too far down the line. How can they affect journalists’ ability to make use of the benefits of technology while also safeguarding against the risks? And how do these policies work across state and national borders?
So this is the framework in general, and I’m not going to go through it in detail because I want to leave time for questions. But again, we start by asking: what is the range of the kinds of policies out there? You can see here that many create a committee or an agency. Okay, do those committees or agencies include somebody from the journalism sector, someone to play a role, to be a part of that? Do they take information integrity into account? For those that focus on deepfakes and synthetic content, we look at the impact on journalists and their reporting. There is a lot of good that can come from having policies in place around deepfakes, but how does that square with some of what journalists need to do, especially in unsafe areas, to get their information out? Then there is labeling: what public relevance do labels actually have? Think about watermarks. C2PA was up here last week, and there is a lot of good in that kind of provenance identity inside content, especially AI-generated content. But what does it mean to the public? We’ve done a lot of research, and we’ve actually started a global research working group on this topic specifically, to try to make sense of what the research says about the public response to labeling and not labeling, and whether there are ways that journalists and others can make use of labels while actually fostering public trust in technology and in their work, as opposed to further diminishing it. There is also the focus on algorithmic bias and discrimination, and one thing that’s important in this area of policy, which I’ll mention as an example, is that in those policies, and in the frontier-model policies, a lot of liability falls on the user. That can be the public, it can be companies, and it can also be the journalist who is using the technology.
It’s very important to think about who that user may be, and whether there are ways the policy should define that and make very clear who can be held liable for content if some negative usage effect comes out of it. Then there is comprehensive regulation. Again, we’re just in the beginning stages of this analysis, and I look forward to sharing it when the team is done. I’m going to close with a couple more pieces of data that get to the public side of all this, because ultimately safeguarding an independent press and a diverse news media is about serving the public in the digital information space. It’s really important, as we all delve into policy deliberation and decision-making, that we don’t forget about the public we say we’re serving. How does the public think about journalism and about the ways they can get informed today, ways that have been greatly expanded by the digital landscape? You can see from this data, from the four countries where we ran this survey, that the public places great value on journalism and the role it plays in society. That really cut across the board. But we also see in the data that the public has a very broad definition of who can be a producer of journalism today. It may be somebody inside an organization; it may be an individual working on their own. What came through in the follow-up focus group discussions is that it’s mission-driven, it’s guided by principles, it’s all of those elements we associate with journalism, but it doesn’t necessarily have to be a news operation. So how does all of that work when we’re thinking about policy and its implications in this space? And finally, we can see that people today are going to individuals they consider journalists for their content.
And this is the final one, which is to also remember, as we’re making these policies, that journalists, journalism producers, and the public all see technology as critically important to their ability to produce the news, to gather it, to disseminate it, and to get informed. So again, we come back to these policy questions: how do we do the best job of enabling the benefits and meeting the needs in this space while also guarding against the potential harms? A quick wrap-up of the questions we suggest you keep in mind. These are for journalism policies specifically, but they carry through, in a slightly nuanced way, to all digital policy. All in all, what matters most in this critical policy space is that policymakers, technology companies, media companies, journalism producers, researchers, and civil society actually work together to have thorough conversations, driven by data and a seeking of knowledge, that keep the public interest in mind, so we can keep up with changing technology and determine how best to mitigate the risks while enabling the benefits. Thank you, and I’d be happy to take questions. You can sign up for our newsletter here, follow us in different places, the website is here. Any questions?


Audience: Yeah, thank you so much. I’ve got two questions, both actually relating to the sample choices of your first study, which I think covered 31 countries. I was a little surprised that, within the European Union, you took in four individual member states rather than the EU as a whole, because I think most of the aspects covered are now covered by the DSA, the Digital Services Act, so these national laws have mostly become obsolete. And the second question, to put it directly: what is the point of including really established autocracies in the sample, where it’s obvious and clear that they will use any excuse to control the information sphere and will use laws against disinformation to exercise control? To me, that is just obvious. So what’s the point of having them in there, and what conclusions can we draw from that?


Amy Mitchell: Yeah, those are two great questions. Thank you. On the EU, and I can share more on the methodology offstage, the EU Online Safety Act is the one that was in place, and that was actually broader and didn’t talk directly about fake news; it talked about online safety. What we looked at instead were specific country policies that related directly to fake news or illegal content online, as opposed to broader online safety. We have a whole footnote on that specific decision in our methodology, but it’s a good question. And really, it’s also about examples. CNTI, the Center for News, Technology and Innovation, does not advocate for or call out a specific policy. What this work does is ask: what is the range of what’s out there, and what can we learn from it, as opposed to commenting very specifically on one particular policy or another. But there is a whole methodology that goes into the detail of that decision and the timing of the cutoff for those selections. On the autocracies, good question. One, they’re countries. They have people who live in them who are affected by these laws. We should care; that’s number one. Number two, as I mentioned earlier, many policies carry impact from one country to the next, whether through copycatting or otherwise. We saw a policy that was crafted in a very well-intended, proactive way inside a democratic country and was then pulled by India to use for information control. So it’s also important to think about the ways a policy can be taken up by another entity and used for ill when it comes to the information landscape. And especially here at the UN and the IGF, where we’re talking about collaboration, it matters that we build supportive environments and stay aware of what’s happening in other countries. Do we have time for one more?


Audience: Hello. Thank you so much. That was super interesting. I specifically appreciate that you brought in that we should look more at the public perspective of this, at the people we’re actually trying to serve. And so I was wondering, because these policies, especially on disinformation, get so tricky with the definitions, as you mentioned, whether you would recommend that countries also put more emphasis on the demand side of disinformation. Why do people believe disinformation? Why do they engage with it? And why is it so easy for them to come across it? I know that, for example, in Sweden there is a psychological defense agency set up by the government that tries to prepare the population a little better to engage with disinformation and to recognize it. I was just wondering if that would be a different approach to take in terms of policy.


Amy Mitchell: Thank you for the question. It’s certainly an important element: what is the public doing with all this content? There is actually a fair amount of research, and a lot of it shows that disinformation campaigns don’t have much impact on what people actually believe or don’t believe, but that people’s own behavior, and where they choose to go, can have that impact. One of the biggest things we see, though, and this was some research I did back in my days at Pew, is that what the public would categorize as disinformation can vary greatly. We have clear evidence, at least in that one study, that what people label disinformation aligns very much with their political thinking and with the kinds of sources they turn to. That’s a broader societal question, too. So I think your question comes back to the content moderation slide I showed: articulating what the goal of the policy is, and what the goal of the information landscape is. It’s not going to be perfect; we’ve never had a perfect information landscape. So what is the goal, and what are the best mechanisms to put in place that do the best job of reaching it, as close as we can get, without other unneeded risks and harms? It’s a really tricky balancing act, and it’s an area that CNTI plans to spend more time examining in the coming year. Thank you all. My time is up. Thank you.


Amy Mitchell

Speech speed

159 words per minute

Speech length

4881 words

Speech time

1831 seconds

Growing number of laws affecting journalism directly and indirectly, making it harder to define what journalism is

Explanation

Mitchell argues that society is currently passing and debating more laws related to journalism than ever before, occurring simultaneously with increased difficulty in defining journalism boundaries. This affects both business/legal policy spaces and public perceptions of what constitutes journalism and reliable information sources.


Evidence

Growing array of policy issues from content moderation to AI protection, disinformation policies, with digital landscape policies in one country impacting others


Major discussion point

Current State of Journalism and Policy Landscape


Topics

Freedom of the press | Content policy | Legal and regulatory


50% of surveyed journalists experienced government censorship in the past year

Explanation

Based on an international survey conducted by CNTI with partnership organizations, half of the journalists surveyed had experienced some form of government censorship. This censorship ranged from being denied access to events to receiving complaints about content to imprisonment.


Evidence

International survey conducted in fall of last year with GFMD, the Global Forum for Media Development, the Online News Association, and other partner organizations


Major discussion point

Current State of Journalism and Policy Landscape


Topics

Freedom of the press | Human rights principles | Cybersecurity


World press freedom scores have declined to 1993 levels globally

Explanation

Mitchell presents data showing that global press freedom has deteriorated significantly, with current levels matching those from 1993. This decline affects both government censorship issues and independent protection of journalists.


Evidence

Data from entities tracking press freedom scores year over year, showing consistent decline in world press freedom scores


Major discussion point

Current State of Journalism and Policy Landscape


Topics

Freedom of the press | Human rights principles


Policy intended for protection is being used to imprison and control journalists

Explanation

Through convenings and discussions with journalism professionals, Mitchell found that policies theoretically designed to protect information or journalists are actually being used to imprison or control journalists and the information space. This represents a significant unintended consequence of well-intentioned policy.


Evidence

Conversations from CNTI convenings, including one in Mexico City with OEM focusing on producing journalism amid security threats, where participants discussed policy being used against journalists


Major discussion point

Current State of Journalism and Policy Landscape


Topics

Freedom of the press | Legal and regulatory | Human rights principles


Need for critical questions when analyzing digital policy: definitional language, independence, diversity, public service, and unintended consequences

Explanation

Mitchell proposes a framework of essential questions that should be explored in any digital policy debate. These questions address how journalism and related terms are defined, who has oversight authority, how diversity is maintained, how the public is served, and what unintended consequences might arise.


Evidence

CNTI research over the past year including policy analysis, issue primers, surveys, four-country public survey with focus groups and statistical surveys, and global convenings


Major discussion point

Policy Analysis Framework and Research Methodology


Topics

Legal and regulatory | Content policy | Human rights principles


Importance of examining who has oversight authority to determine how laws are enacted

Explanation

Mitchell emphasizes that a critical question in policy analysis is identifying who receives the authority to determine the details of how laws are practiced and enacted. This authority assignment significantly impacts the independence of journalism and can determine whether policies protect or harm press freedom.


Evidence

Analysis of fake news policies showing 14 of 32 policies placed control directly in government hands, with most focusing on central government control


Major discussion point

Policy Analysis Framework and Research Methodology


Topics

Legal and regulatory | Freedom of the press | Human rights principles


Cross-border policy impacts require thinking beyond national boundaries

Explanation

Mitchell argues that in the digital landscape, policies passed in one country significantly impact and are impacted by policies in other countries. This interconnectedness makes it essential to consider policy implications across country and regional borders rather than in isolation.


Evidence

Example of well-intended policy from a democratic country being adopted by India for information control purposes


Major discussion point

Policy Analysis Framework and Research Methodology


Topics

Legal and regulatory | Jurisdiction | Digital business models


Study of 32 fake news policies across 31 countries showed greater risk to journalistic independence than protection of information space

Explanation

CNTI’s analysis of fake news policies from 2020-2023 found that these policies created more risk to journalistic independence and diversity, as well as public access to fact-based news, than they provided protection for the information space. This finding applied across both democratic and autocratic countries.


Evidence

Analysis of 32 policies across 31 countries between 2020-2023, including 11 democracies, with findings consistent across different government types


Major discussion point

Fake News Policy Analysis


Topics

Freedom of the press | Content policy | Legal and regulatory


Disagreed with

– Audience

Disagreed on

EU policy analysis methodology – individual member states vs. EU-wide legislation


Only 7 of 32 policies defined what constitutes fake or illegal content, leaving definitions to authority figures

Explanation

Mitchell’s research revealed that most fake news policies failed to clearly define what constitutes fake or illegal content, with only seven policies providing definitions. The remaining policies left these crucial definitions vague, effectively placing definitional power in the hands of authority figures who implement the policies.


Evidence

Detailed analysis of definitional language in 32 fake news policies, with only 2 policies defining what constitutes real news or journalism


Major discussion point

Fake News Policy Analysis


Topics

Content policy | Legal and regulatory | Freedom of the press


14 policies placed control directly in government hands, with imprisonment penalties ranging from less than one month to over three years

Explanation

The study found that nearly half of the analyzed policies gave direct control to government entities, typically central governments, to arbitrate and enforce the laws. Most policies included imprisonment as punishment, with sentences varying dramatically from less than one month to over three years, with Zimbabwe having the longest sentences.


Evidence

Specific analysis showing 14 of 32 policies with government control, imprisonment penalties ranging from less than one month to over three years in Zimbabwe


Major discussion point

Fake News Policy Analysis


Topics

Legal and regulatory | Freedom of the press | Human rights principles


Analysis of 23 revenue-focused policies from 2018-2024 showed dramatic variation in defining digital content usage and compensation criteria

Explanation

CNTI’s study of media remuneration policies revealed significant inconsistency in how digital content usage is defined and what warrants compensation. The criteria ranged from simple URL clicks to article summaries, with equally varied compensation levels and recipient structures.


Evidence

Analysis of 23 policies from 2018 through August 2024, including US state policies, showing usage criteria ranging from clicking URL links to creating article summaries


Major discussion point

Media Remuneration Policy Analysis


Topics

Digital business models | Intellectual property rights | Legal and regulatory


Policies inconsistently addressed diversity of news media, often favoring large operations over smaller outlets

Explanation

Mitchell found that media remuneration policies took an inconsistent and haphazard approach to supporting diverse news media. Many policies ended up favoring large news operations and outlets, with only some later policies specifically addressing ethnic minority media or local journalism through tax mechanisms.


Evidence

Analysis showing policies initially favored large operations, with some later policies calling out ethnic minority media, and only tax-focused policies supporting local journalism


Major discussion point

Media Remuneration Policy Analysis


Topics

Cultural diversity | Digital business models | Legal and regulatory


Most policies failed to clearly articulate how they would better serve public information needs

Explanation

While all media remuneration policies claimed to serve the public interest, Mitchell found that they failed to clearly explain how their mechanisms would actually improve public access to news and information. The policies didn’t adequately consider how the public actually consumes information or the diversity of journalism producers the public relies on.


Evidence

Analysis showing policies referenced serving the public but lacked clear articulation of how steps would better deliver news to public, especially considering diverse journalism producers


Major discussion point

Media Remuneration Policy Analysis


Topics

Digital access | Content policy | Human rights principles


Agreed with

– Audience

Agreed on

Need for alternative approaches to disinformation beyond content restriction


Few AI policies directly address journalism, but they have indirect impacts on the digital news landscape

Explanation

Mitchell argues that while AI policies rarely mention journalism or news information directly, they have significant indirect and eventual direct relationships with the digital news landscape. This makes it important to consider journalism implications before AI policies are passed and implemented.


Evidence

Early-stage analysis of AI policy space showing very few policies directly addressing journalism or news information, but with clear indirect impacts on digital news landscape


Major discussion point

AI Policy Framework and Implications


Topics

Legal and regulatory | Future of work | Digital standards


Need to consider how AI policies affect journalists’ ability to benefit from technology while guarding against risks

Explanation

Mitchell emphasizes the importance of examining how AI policies can impact journalists’ ability to utilize technological benefits while also providing protection against potential risks. This requires careful consideration of both opportunities and threats that AI policies present to journalism.


Evidence

Framework analysis examining range of AI policies including committees, agencies, deep fakes, synthetic content, labeling, and algorithmic bias considerations


Major discussion point

AI Policy Framework and Implications


Topics

Future of work | Digital standards | Legal and regulatory


Liability often falls on users, including journalists, requiring clear policy definitions

Explanation

In AI policy analysis, Mitchell found that liability frequently falls on users, which can include the public, companies, and journalists using AI technology. This makes it crucial for policies to clearly define who constitutes a user and under what circumstances they can be held liable for content or negative usage effects.


Evidence

Analysis of algorithmic bias, discrimination policies, and frontier models showing liability placement on users, with need for clear user space definitions


Major discussion point

AI Policy Framework and Implications


Topics

Legal and regulatory | Liability of intermediaries | Future of work


Public places great value on journalism’s role in society but has broad definitions of who can be journalism producers

Explanation

Mitchell’s four-country survey revealed that the public highly values journalism’s societal role across all surveyed countries. However, the public also maintains a very broad definition of who can produce journalism, including individuals working independently, as long as the work is mission-driven and guided by journalistic principles.


Evidence

Four-country survey with focus groups and representative statistical surveys showing public value for journalism and broad definitions of journalism producers as mission-driven and principle-guided


Major discussion point

Public Perspective and Engagement


Topics

Content policy | Cultural diversity | Digital identities


Both journalists and public see technology as critically important for news production and consumption

Explanation

Mitchell’s research demonstrates that both journalism producers and the public view technology as essential for gathering, producing, disseminating, and consuming news. This mutual dependence on technology underscores the importance of policies that enable technological benefits while protecting against potential harms.


Evidence

Survey data showing both journalists and public consider technology critically important for news production, gathering, dissemination, and consumption


Major discussion point

Public Perspective and Engagement


Topics

Digital access | Future of work | Digital standards


Agreed with

– Audience

Agreed on

Importance of addressing public perspective in disinformation policy


Response that autocratic policies matter because they affect real people and can be copied by other countries for harmful purposes

Explanation

When questioned about including autocracies in policy analysis, Mitchell argued that these policies matter because they affect real people living under those governments. Additionally, policies from autocratic countries can be copied or adapted by other nations, and even well-intentioned democratic policies can be misused by autocratic regimes for information control.


Evidence

Example of well-intended policy from democratic country being adopted by India for information control, demonstrating cross-border policy copying for harmful purposes


Major discussion point

Methodological and Scope Questions


Topics

Human rights principles | Freedom of the press | Legal and regulatory


Disagreed with

– Audience

Disagreed on

Methodological approach to including autocracies in policy analysis


Audience

Speech speed

142 words per minute

Speech length

319 words

Speech time

134 seconds

Question about whether countries should focus more on demand side of disinformation – why people believe and engage with it

Explanation

An audience member suggested that countries should emphasize understanding why people believe and engage with disinformation rather than just focusing on supply-side controls. They questioned whether addressing the psychological and behavioral aspects of disinformation consumption might be more effective than content-focused policies.


Evidence

Reference to Sweden’s psychological defense agency that prepares the population to better engage with and recognize disinformation


Major discussion point

Public Perspective and Engagement


Topics

Content policy | Online education | Human rights principles


Agreed with

– Amy Mitchell

Agreed on

Importance of addressing public perspective in disinformation policy


Suggestion that psychological defense approaches, like Sweden’s agency, could be alternative policy approaches

Explanation

The audience member proposed that psychological defense mechanisms, such as Sweden’s government agency that helps prepare the population to recognize and engage with disinformation, could represent an alternative policy approach. This would focus on building public resilience rather than content restriction.


Evidence

Sweden’s psychological defense agency as an example of government efforts to prepare population for disinformation recognition and engagement


Major discussion point

Public Perspective and Engagement


Topics

Online education | Content policy | Capacity development


Agreed with

– Amy Mitchell

Agreed on

Need for alternative approaches to disinformation beyond content restriction


Question about including individual EU member states rather than EU-wide Digital Services Act in the study sample

Explanation

An audience member questioned the methodology of including four individual EU member states in the fake news policy analysis rather than examining the EU-wide Digital Services Act. They suggested that national laws may have become obsolete due to the overarching EU legislation.


Evidence

Reference to EU Digital Services Act covering most aspects that were previously handled by individual member state laws


Major discussion point

Methodological and Scope Questions


Topics

Legal and regulatory | Jurisdiction | Content policy


Disagreed with

– Amy Mitchell

Disagreed on

EU policy analysis methodology – individual member states vs. EU-wide legislation


Challenge regarding the value of including autocracies in policy analysis when their control intentions are obvious

Explanation

An audience member questioned the analytical value of including established autocracies in the policy study sample, arguing that it’s obvious these governments will use any excuse to control information and exercise control over the information sphere. They questioned what conclusions could be drawn from such predictable behavior.


Major discussion point

Methodological and Scope Questions


Topics

Freedom of the press | Human rights principles | Legal and regulatory


Disagreed with

– Amy Mitchell

Disagreed on

Methodological approach to including autocracies in policy analysis


Agreements

Agreement points

Importance of addressing public perspective in disinformation policy

Speakers

– Amy Mitchell
– Audience

Arguments

Both journalists and public see technology as critically important for news production and consumption


Question about whether countries should focus more on demand side of disinformation – why people believe and engage with it


Summary

Both speakers recognized the critical importance of understanding and addressing the public’s role in information consumption, with Mitchell emphasizing technology’s importance to both producers and consumers, and the audience member suggesting focus on why people engage with disinformation


Topics

Content policy | Online education | Digital access


Need for alternative approaches to disinformation beyond content restriction

Speakers

– Amy Mitchell
– Audience

Arguments

Most policies failed to clearly articulate how they would better serve public information needs


Suggestion that psychological defense approaches, like Sweden’s agency, could offer an alternative policy approach


Summary

Both speakers implicitly agreed that current content-focused approaches are insufficient, with Mitchell noting policies fail to serve public needs and the audience member proposing psychological defense mechanisms as alternatives


Topics

Content policy | Online education | Capacity development


Similar viewpoints

Both recognize that the public’s perspective and behavior are central to understanding and addressing information challenges, whether in defining journalism or in consuming/believing information

Speakers

– Amy Mitchell
– Audience

Arguments

Public places great value on journalism’s role in society but has broad definitions of who can be journalism producers


Question about whether countries should focus more on demand side of disinformation – why people believe and engage with it


Topics

Content policy | Cultural diversity | Online education


Unexpected consensus

Value of studying autocratic policies despite predictable outcomes

Speakers

– Amy Mitchell
– Audience

Arguments

Response that autocratic policies matter because they affect real people and can be copied by other countries for harmful purposes


Challenge regarding the value of including autocracies in policy analysis when their control intentions are obvious


Explanation

While the audience member initially challenged the value of studying autocratic policies, Mitchell’s response about cross-border policy copying and real human impact created an unexpected area of understanding about the interconnected nature of global policy effects


Topics

Human rights principles | Freedom of the press | Legal and regulatory


Overall assessment

Summary

The discussion showed limited but meaningful consensus around the importance of public-centered approaches to information policy and the recognition that current content-focused policies may be insufficient


Consensus level

Moderate consensus on methodological approaches and public engagement importance, with constructive dialogue rather than disagreement on policy analysis scope. The consensus suggests a shared understanding that effective information policy requires deeper consideration of public behavior and cross-border implications.


Differences

Different viewpoints

Methodological approach to including autocracies in policy analysis

Speakers

– Amy Mitchell
– Audience

Arguments

Response that autocratic policies matter because they affect real people and can be copied by other countries for harmful purposes


Challenge regarding the value of including autocracies in policy analysis when their control intentions are obvious


Summary

The audience member questioned the analytical value of including established autocracies in policy studies since their intention to control information is predictable, while Mitchell argued that these policies matter because they affect real people and can be adopted by other countries for harmful purposes.


Topics

Freedom of the press | Human rights principles | Legal and regulatory


EU policy analysis methodology – individual member states vs. EU-wide legislation

Speakers

– Amy Mitchell
– Audience

Arguments

Study of 32 fake news policies across 31 countries showed greater risk to journalistic independence than protection of information space


Question about including individual EU member states rather than EU-wide Digital Services Act in the study sample


Summary

The audience member questioned why the study included four individual EU member states rather than examining the EU-wide Digital Services Act, suggesting national laws may be obsolete, while Mitchell defended the methodology as focusing on specific fake news policies rather than broader online safety legislation.


Topics

Legal and regulatory | Jurisdiction | Content policy


Unexpected differences

Research methodology and scope decisions

Speakers

– Amy Mitchell
– Audience

Arguments

Cross-border policy impacts require thinking beyond national boundaries


Question about including individual EU member states rather than EU-wide Digital Services Act in the study sample


Explanation

The disagreement about research methodology was unexpected because it revealed different perspectives on how to analyze transnational policy frameworks. While Mitchell emphasized cross-border impacts and the value of examining diverse policy approaches, the audience member focused on regulatory efficiency and questioned the relevance of studying potentially obsolete national policies.


Topics

Legal and regulatory | Jurisdiction | Content policy


Overall assessment

Summary

The disagreements were primarily methodological rather than substantive, focusing on research approach and scope rather than fundamental policy principles


Disagreement level

Low to moderate disagreement level. The disagreements were constructive and focused on research methodology and analytical approaches rather than core policy goals. Both speakers appeared to share concerns about protecting press freedom and serving public interests, but differed on analytical frameworks and research scope. These methodological disagreements actually enhanced the discussion by raising important questions about how to effectively study and compare international policies.


Partial agreements

Similar viewpoints

Both recognize that the public’s perspective and behavior are central to understanding and addressing information challenges, whether in defining journalism or in consuming/believing information

Speakers

– Amy Mitchell
– Audience

Arguments

Public places great value on journalism’s role in society but has broad definitions of who can be journalism producers


Question about whether countries should focus more on demand side of disinformation – why people believe and engage with it


Topics

Content policy | Cultural diversity | Online education


Takeaways

Key takeaways

Current journalism policy landscape is unprecedented in scope and complexity, with 50% of journalists experiencing government censorship and global press freedom at 1993 levels


Well-intentioned policies often create unintended consequences that harm journalistic independence and diversity rather than protecting the information space


Critical policy analysis framework should examine definitional language, oversight authority, diversity impacts, public service goals, and cross-border effects


Fake news policies across 31 countries showed most (25 of 32) failed to define key terms, leaving interpretation to government authorities, with 14 placing control directly in government hands


Media remuneration policies vary dramatically in defining digital content usage and compensation, often favoring large outlets over diverse smaller operations


AI policies rarely address journalism directly but have significant indirect impacts on the digital news landscape through liability placement and content regulation


Public has broad definition of journalism producers beyond traditional news organizations, valuing mission-driven, principle-guided content creators


Technology is viewed as critically important by both journalists and public for news production, gathering, dissemination, and consumption


Policy impacts cross national borders through copycatting and international digital infrastructure, requiring global coordination


Need for collaborative approach involving policymakers, technology companies, media organizations, researchers, and civil society in data-driven policy discussions


Resolutions and action items

CNTI plans to spend more time in the coming year examining the relationship between public behavior and disinformation policy effectiveness


CNTI is developing a research working group focused on public response to AI content labeling and watermarking


CNTI will complete and share analysis of AI policy impacts on journalism when the team finishes their comprehensive study


Recommendation for policymakers to articulate clear goals for what the digital information landscape should look like before implementing content moderation policies


Unresolved issues

How to balance defining journalism and fake news in policy without creating tools for government control of information


What the optimal end goal should be for content moderation policies and digital information landscapes


How to ensure AI policies adequately protect journalists while enabling technological benefits


Whether demand-side approaches to disinformation (focusing on why people believe false information) should be prioritized over supply-side regulation


How to create consistent cross-border policy frameworks that respect national sovereignty while addressing global digital challenges


How to ensure media remuneration policies effectively serve public information needs rather than just supporting large media organizations


What constitutes appropriate liability distribution between AI platforms, users, and content creators including journalists


Suggested compromises

Balancing the need for policy definitions with safeguards against government overreach by carefully considering who has oversight authority


Including diverse stakeholders (journalism sector representatives) in AI policy committees and agencies rather than excluding media perspectives


Focusing on psychological defense and media literacy approaches alongside regulatory measures to address disinformation


Creating policy frameworks that enable technological benefits while implementing specific safeguards against identified risks


Developing policies that support both large and small journalism operations rather than favoring one over the other


Thought provoking comments

It’s not clear within the policy conversation space that we’ve really actually done a very good job at all of articulating what would the digital landscape look like if this content moderation policy that we’re talking about gets put in place. What is it that’s bad? That’s out. What is it that’s good? That’s in. What’s the mix? There’s always going to be a mix of content in there. So really taking the time to think about the ultimate goals there

Speaker

Amy Mitchell


Reason

This comment is deeply insightful because it exposes a fundamental flaw in policy-making: the lack of clear vision for desired outcomes. Rather than focusing on technical mechanisms, Mitchell highlights that policymakers haven’t adequately defined what success looks like in the information landscape.


Impact

This observation reframes the entire discussion from ‘how to regulate’ to ‘what are we trying to achieve.’ It introduces a meta-level critique that challenges the foundation of current policy approaches and sets up the framework for more thoughtful policy design throughout her presentation.


What’s the point of having really established autocracies among the sample, where it’s obvious and clear that they will use any excuse to control the information sphere and to use laws against disinformation to exercise control. So what’s the point of having that? That is just, to me, obvious.

Speaker

Audience member


Reason

This question is provocative because it challenges the methodology and underlying assumptions of comparative policy analysis. It forces consideration of whether studying authoritarian approaches has value when their intent to control information is predetermined.


Impact

This question shifts the discussion toward the interconnectedness of global policy and the practical implications of policy migration across different governmental systems. It leads Mitchell to articulate how well-intentioned democratic policies can be co-opted by authoritarian regimes, adding a crucial geopolitical dimension to the conversation.


I was wondering if you would recommend that actually countries should also put more emphasis on the demand side of disinformation. So why do people maybe believe disinformation? Why do people engage with disinformation? And why is it so easy for them to look at that?

Speaker

Audience member


Reason

This comment is thought-provoking because it fundamentally shifts the focus from supply-side regulation (controlling content) to demand-side intervention (addressing why people consume misinformation). It suggests a completely different policy approach focused on media literacy and psychological factors.


Impact

This question introduces a new dimension to the policy discussion, moving beyond content regulation to consider human behavior and education. It prompts Mitchell to acknowledge the complexity of public perception and the subjective nature of what constitutes ‘disinformation,’ adding nuance to the entire framework.


We also see in the data that the public has a very broad definition of who can be producers of journalism today. It may be somebody inside an organization. It may be an individual who’s working on their own or doing their own work… it’s mission-driven. It’s guided by principles. It’s all of those elements that we think about of journalism, but it doesn’t necessarily have to be a news operation.

Speaker

Amy Mitchell


Reason

This observation is crucial because it highlights the disconnect between traditional policy frameworks (which assume institutional journalism) and contemporary reality (where individual creators are considered journalists by the public). It challenges fundamental assumptions about who deserves protection under journalism policies.


Impact

This insight forces a reconsideration of how journalism protection policies should be structured. It suggests that current policy frameworks may be inadequate for protecting the diverse ecosystem of information producers that the public actually relies on, fundamentally challenging traditional approaches to media regulation.


There is actually a fair amount of research, and there’s a lot that actually shows the disinformation campaigns don’t have a whole lot of impact on what people actually believe or don’t believe, but that people’s own behavior and where they’re choosing to go can have that… what the public would categorize as disinformation can vary greatly… there are very much alignments to one’s political thinking, to the kinds of sources you turn to.

Speaker

Amy Mitchell


Reason

This comment is particularly insightful because it challenges the entire premise underlying much disinformation policy – that external disinformation campaigns are the primary problem. Instead, it suggests that individual choice and political bias are more significant factors, which would require entirely different policy approaches.


Impact

This observation fundamentally questions the effectiveness of content-focused disinformation policies and suggests that the problem may be more about political polarization and media consumption habits than external manipulation. It adds significant complexity to the policy discussion by suggesting that the problem may not be solvable through traditional regulatory approaches.


Overall assessment

These key comments collectively transformed what could have been a technical policy discussion into a fundamental examination of assumptions underlying digital governance. Mitchell’s insights about the lack of clear policy goals and the evolving nature of journalism challenged traditional regulatory frameworks, while audience questions pushed the conversation toward more nuanced considerations of global policy interconnectedness and human behavioral factors. The discussion evolved from presenting research findings to questioning the foundational premises of current policy approaches, ultimately suggesting that effective digital governance requires a more sophisticated understanding of public behavior, global policy dynamics, and the changing nature of information production and consumption. The comments created a progression from ‘what policies exist’ to ‘what should policies actually try to achieve’ to ‘are current approaches fundamentally flawed,’ resulting in a much more critical and comprehensive examination of digital policy challenges.


Follow-up questions

What would the digital landscape look like if content moderation policies get implemented – what content should be out vs. in, and what should the mix be?

Speaker

Amy Mitchell


Explanation

This is a fundamental question about policy goals that Mitchell identified as not being well-articulated in current policy discussions, which is critical for effective content moderation policy design


How do AI policies affect journalists’ ability to use technology benefits while safeguarding against risks, especially across different jurisdictions?

Speaker

Amy Mitchell


Explanation

This represents an ongoing research area that CNTI is just beginning to explore, focusing on the indirect impacts of AI policy on journalism


How can labeling and watermarking systems foster public trust with technology and journalism rather than diminish it?

Speaker

Amy Mitchell


Explanation

Mitchell mentioned they’ve started a global research working group on this topic to understand public response to labeling systems, which is crucial for effective implementation


How should liability be defined for AI users, including journalists, in frontier model policies?

Speaker

Amy Mitchell


Explanation

This is an important policy consideration as liability often falls on users, and it needs clarification for different user categories including journalists


What is the point of including autocracies in policy analysis samples when their control of information is obvious?

Speaker

Audience member


Explanation

This question challenges the methodology and value of including authoritarian regimes in comparative policy studies


Should countries focus more on the demand side of disinformation – why people believe and engage with it – rather than just content control?

Speaker

Audience member


Explanation

This suggests an alternative policy approach focusing on public education and psychological preparedness rather than content restriction


How can policies better serve the public’s actual information-seeking behaviors and their broad definition of journalism producers?

Speaker

Amy Mitchell


Explanation

Mitchell emphasized the need to understand how the public actually gets information today, including from individual journalists and diverse sources, to inform policy design


What are the best mechanisms to balance reaching information landscape goals while avoiding unintended risks and harms?

Speaker

Amy Mitchell


Explanation

This represents the core challenge of policy design that Mitchell identified as requiring more examination in the coming year


Disclaimer: This is not an official session record. DiploAI generates these resources from audiovisual recordings, and they are presented as-is, including potential errors. Due to logistical challenges, such as discrepancies in audio/video or transcripts, names may be misspelled. We strive for accuracy to the best of our ability.