[Parliamentary session 2] Striking the balance: Upholding freedom of expression in the fight against cybercrime

23 Jun 2025 11:30h - 13:00h

Session at a glance

Summary

This discussion focused on balancing cybersecurity measures and combating online harms while protecting freedom of expression and human rights in the digital space. The panel, moderated by Bjorn Ihler, brought together experts from civil society, government regulation, and the private sector to address the complex challenges of online content governance.


Paul Ash from the Christchurch Call Foundation emphasized avoiding false dichotomies between regulation and inaction, advocating for multi-stakeholder approaches that put human rights at the core of policy-making. He stressed the importance of transparency, technical expertise, and building digital resilience rather than relying solely on legislation. Mallory Knodel from the Global Encryption Coalition argued that encryption serves as a crucial democratic tool for protecting privacy and human rights, warning against policies that weaken technical safeguards in the name of security. She emphasized the need to protect people rather than protect against people, advocating for user-centric approaches to digital governance.


Cagatay Pekyour from Meta described the platform’s challenges in balancing user safety with freedom of expression across diverse global jurisdictions, highlighting their five-step due diligence process for government content removal requests. Pavel Popescu, a Romanian regulator and former parliamentarian, shared concrete examples from Romania’s recent election interference, calling for platforms to take greater responsibility for systemic risks like coordinated disinformation campaigns and cybercrime.


The discussion revealed tensions between national sovereignty over digital governance and global platform operations, with parliamentarians expressing frustration over platforms’ compliance with local laws. The panel concluded that effective digital governance requires ongoing multi-stakeholder collaboration, technical expertise, and a commitment to human rights principles rather than purely regulatory solutions.


Key points

## Major Discussion Points:


– **Balancing cybersecurity with human rights and freedom of expression**: The core challenge of creating legislation that combats cybercrime, terrorism, and online harms while preserving fundamental rights like free speech and privacy. Panelists emphasized avoiding “false dichotomies” between security and liberty.


– **Platform responsibility and regulatory compliance**: Extensive discussion on how social media platforms should be held accountable for content moderation, with debates over intermediary liability, transparency requirements, and the challenges platforms face in complying with varying national laws while respecting human rights.


– **Multi-stakeholder governance and technical expertise**: Strong emphasis on the need for collaborative approaches involving governments, civil society, technical communities, and private sector, with particular stress on including actual technical experts (not just policy makers) in decision-making processes.


– **Encryption and privacy protection**: Discussion of end-to-end encryption as a fundamental tool for protecting human rights, with civil society representatives arguing against weakening encryption even for law enforcement purposes, emphasizing user-centric security approaches.


– **Real-world implementation challenges**: Practical examples from Romania’s recent election interference, Malaysia’s licensing struggles with major platforms, and various countries’ difficulties in getting platform cooperation on illegal content removal, highlighting the gap between policy intentions and enforcement reality.


## Overall Purpose:


The discussion aimed to provide parliamentarians and policymakers with practical guidance on navigating the complex landscape of online governance, specifically how to craft effective legislation and policies that address cybercrime and online harms while upholding fundamental human rights and democratic principles.


## Overall Tone:


The discussion maintained a professional but increasingly urgent tone throughout. It began diplomatically with introductions and theoretical frameworks, but became more direct and candid as panelists shared real-world experiences. The tone shifted notably when the Romanian regulator spoke “bluntly” about platform weaponization and election interference, and audience questions brought additional tension around platform cooperation and sovereignty issues. Despite some heated moments, the overall atmosphere remained constructive and solution-oriented, with panelists emphasizing collaboration over confrontation.


Speakers

**Speakers from the provided list:**


– **Bjorn Ihler** – Director and co-founder of the Khalifa Ihler Institute, building peaceful and resilient communities online and offline. Works on countering and preventing violent extremism and terrorism. Session moderator.


– **Paul Ash** – Chief Executive Officer of the Christchurch Call Foundation, based in Wellington, New Zealand. Works on eliminating terrorist and violent extremist content online while respecting human rights.


– **Cagatay Pekyour** – Head of Community Engagement and Advocacy at META for the Africa, Middle East, and Turkey region. Focuses on understanding content risks and protecting voice and expression.


– **Pavel Popescu** – Vice President of ANCOM (Romanian telecommunications regulator). Former member of Romanian parliament who chaired the Defense and National Security Committee. Works on cybersecurity legislation and policy.


– **Mallory Knodel** – Executive Director of the Social Web Foundation and member of the Global Encryption Coalition. Former Chief Technologist at the Center for Democracy and Technology. Technical specialist focusing on encryption and human rights.


– **Audience** – Various parliamentarians and attendees who asked questions during the Q&A session.


**Additional speakers:**


– **Judge from Egypt** – Criminal circuit judge dealing with hate speech and terrorism cases, seeking guidance on legislation balancing crime prevention with human rights.


– **Ni Ching** – Deputy Minister of Communication from Malaysia, discussing platform responsibility and regulatory challenges.


– **Former Minister for IT and Telecom** – Advocating for countries to develop their own platforms according to their values and legal systems.


– **Gerasim Lora** – President of the committee on investigation of abuses, corruptions, and petitions in the Romanian parliament.


Full session report

# Parliamentary Discussion: Balancing Cybersecurity and Human Rights in Digital Governance


## Introduction and Context


This parliamentary discussion examined the complex challenges of online content governance while protecting fundamental rights. Moderated by Bjorn Ihler from the Khalifa Ihler Institute, who opened by sharing his personal experience as a survivor of the 2011 Norway terrorist attack and his subsequent work on preventing online radicalization, the panel brought together experts from civil society, government regulation, and the private sector.


Ihler emphasized that “we need to have evolved approaches” beyond traditional notice-and-takedown processes, given the scale and sophistication of online criminal activities. The discussion aimed to explore how to balance cybersecurity measures and combat online harms while preserving freedom of expression and human rights in the digital space.


## Key Participants and Their Perspectives


### Paul Ash – Christchurch Call Foundation


Paul Ash opened with a foundational observation: “Beware false dichotomies. Beware folk who say to you, we either have to legislate or we can’t do anything about this problem.” He emphasized that the Christchurch Call has grown to include 130-140 participating entities, including 56 governments and 20 tech firms, demonstrating the potential for multi-stakeholder approaches.


Ash argued that platforms “are not just the postman. They are the publisher. And the reason they’re the publisher is because the algorithmic curation that privileges some content over others on platforms is not transparent.” He advocated for placing human rights at the core of policy-making and stressed the importance of getting “good technological advice from people who actually understand how the internet works, not just high-level content issues.”


### Mallory Knodel – Social Web Foundation and Global Encryption Coalition


Mallory Knodel, Executive Director of the Social Web Foundation and member of the Global Encryption Coalition (founded in 2020), positioned encryption as a crucial democratic tool. She posed a challenging question to the panel: “Are people the ones that we are erecting these treaties and these policies and this technology to protect the people or are we protecting against people?”


Knodel consistently advocated for user-centric approaches to digital governance and warned against policies that weaken technical safeguards in the name of security. She emphasized the importance of considering whether policies empower citizens or control them.
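

Her argument about not weakening safeguards has a simple technical shape: in an end-to-end design the keys exist only on the users' devices, so there is no point in the middle where a platform, or a law compelling the platform, can read messages without redesigning the system for everyone. A minimal sketch of the concept using the PyNaCl library (illustrative only, not any particular messenger's protocol):

```python
from nacl.public import PrivateKey, Box

# Each endpoint generates its own keypair; private keys never leave the device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts to Bob's public key; the ciphertext is also authenticated.
sending_box = Box(alice_private, bob_private.public_key)
ciphertext = sending_box.encrypt(b"this conversation is private")

# Only Bob's private key (plus Alice's public key) can open it.
receiving_box = Box(bob_private, alice_private.public_key)
assert receiving_box.decrypt(ciphertext) == b"this conversation is private"

# A relaying server only ever sees `ciphertext`; any "exceptional access"
# would mean adding a third key, weakening the boundary for every user.
```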


### Cagatay Pekyour – Meta


Cagatay Pekyour described the challenges platforms face in balancing user safety with freedom of expression across diverse global jurisdictions. He noted that there are “over 120 voice-limiting regulations in Africa, Middle East and Turkey region, with 40 more in development,” highlighting the complex regulatory environment platforms must navigate.


He outlined Meta’s five-step due diligence process for government content removal requests, assessing legality, legitimacy, necessity, proportionality, and the external risks such a takedown may pose. Pekyour positioned platforms as intermediaries, stating: “This situation puts companies like META almost like an intermediary between the governments and the users.”
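

The session describes the process only at this level of detail. Purely as an illustration, the five checks can be modelled as a short-circuiting review pipeline in which a request must clear every test before content is restricted; all names and fields below are hypothetical, not Meta's internal tooling:

```python
from dataclasses import dataclass, field

@dataclass
class TakedownRequest:
    """A government content-removal request (all fields hypothetical)."""
    issuing_authority: str
    cited_law: str | None          # legal basis cited by the government
    target_content: str
    pursues_legitimate_aim: bool   # e.g. public safety, not silencing critics
    least_restrictive_means: bool  # no narrower remedy would suffice
    scope_is_proportionate: bool   # targets the item, not a whole account
    external_risk_notes: list[str] = field(default_factory=list)

def five_step_review(req: TakedownRequest) -> tuple[bool, str]:
    """Short-circuiting sketch of a legality / legitimacy / necessity /
    proportionality / external-risk review. A failed step means the request
    is pushed back rather than enforced."""
    if not req.cited_law:
        return False, "legality: no valid legal basis cited"
    if not req.pursues_legitimate_aim:
        return False, "legitimacy: aim is not a recognized ground"
    if not req.least_restrictive_means:
        return False, "necessity: a less restrictive measure is available"
    if not req.scope_is_proportionate:
        return False, "proportionality: request is broader than the harm"
    if req.external_risk_notes:
        return False, "external risk: " + "; ".join(req.external_risk_notes)
    return True, "restrict locally and log in transparency report"
```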


### Pavel Popescu – ANCOM (Romanian Regulator)


Pavel Popescu, Vice President of ANCOM and former parliamentarian, provided concrete examples from Romania’s recent experiences with election interference and cybercrime. He described how “social media platforms can be weaponised by foreign actors for electoral interference” and detailed technical solutions implemented with telecom operators to address spoofing attacks.


Popescu argued that “existing regulations worldwide are sufficient but not properly enforced; the main problem is lack of implementation.” He called for platforms to take greater responsibility for systemic risks and use AI technology more effectively to eliminate cybercrime and child exploitation.


## Major Discussion Topics


### Platform Responsibility and Content Moderation


The discussion revealed different perspectives on the appropriate level of platform responsibility. Paul Ash’s characterization of platforms as publishers rather than neutral intermediaries created tension with traditional intermediary liability frameworks. Pavel Popescu advocated for stronger proactive responsibility from platforms, arguing that existing platform terms and conditions are “extremely strict” and should already protect users if properly enforced.


Cagatay Pekyour acknowledged the complex position platforms occupy, describing the challenge of balancing user expectations for both reduced abuse and protected free expression while responding to government demands for content removal.


### Technical Expertise and Implementation


All panelists emphasized the critical importance of technical expertise in policy-making. Paul Ash stressed the need for involvement of the broader technical community that built the internet, while Pavel Popescu focused on platforms using their own AI technology to solve problems.


Popescu provided specific examples of technical solutions, describing how Romanian authorities worked with telecom operators to implement systems that could identify and block spoofing attacks in real time.
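

The report does not specify how the Romanian system works. A well-known countermeasure in this class is for international gateways to reject inbound calls that present a domestic caller ID, since a genuinely domestic call should not arrive over an international trunk (legitimate roaming subscribers are exempted using operator data). A hypothetical sketch of such a screening rule:

```python
def should_block(inbound_trunk_is_international: bool,
                 caller_id: str,
                 roaming_allowlist: set[str],
                 domestic_prefix: str = "+40") -> bool:
    """Flag a call whose caller ID claims a domestic number (+40, Romania)
    but which arrives over an international trunk -- the classic caller-ID
    spoofing pattern. Confirmed roaming subscribers are exempted via an
    allowlist supplied by the operators."""
    claims_domestic = caller_id.startswith(domestic_prefix)
    if inbound_trunk_is_international and claims_domestic:
        return caller_id not in roaming_allowlist  # block unless roaming
    return False

# A call presenting a Romanian number from an international gateway is blocked;
# the same caller ID on a domestic trunk passes.
assert should_block(True, "+40721000000", roaming_allowlist=set()) is True
assert should_block(False, "+40721000000", roaming_allowlist=set()) is False
```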


### Regulatory Challenges and Enforcement


The discussion highlighted significant frustration with the gap between regulatory intentions and enforcement reality. Pekyour’s mention of over 120 voice-limiting regulations in his region illustrated a proliferation of rules that creates complex compliance challenges.


During the audience Q&A, Ni Ching, Malaysia’s Deputy Minister of Communication, described difficulties in getting major platforms to comply with local licensing requirements, highlighting the power imbalance between global platforms and smaller jurisdictions.


## Audience Questions and Key Exchanges


### Constitutional vs. Criminal Law Approaches


An Egyptian judge asked about the structure of legislation for addressing online harms. Paul Ash responded by distinguishing between constitutional protections for freedom of expression and criminal law enforcement, emphasizing that “you need to have very clear constitutional protections around freedom of expression” while also having effective criminal law enforcement mechanisms.


### Digital Sovereignty Concerns


A former IT minister challenged the panel about digital sovereignty, asking: “How can a social media platform sitting in the west in one country who have their own laws and regulations start determining what the law of the country is in the east?” This question highlighted tensions between global platform operations and national sovereignty.


### Romanian Election Controversy


A Romanian parliament member challenged Pavel Popescu’s analysis of election interference, arguing that “the actual political parties like PSD, PNL, and UDMR made the disinformation” rather than foreign actors. This exchange illustrated how complex information environments can lead to different interpretations of the same events.


## Practical Recommendations


### For Parliamentarians


– Establish digital trust caucuses that reach across party lines


– Bring platform CEOs before parliamentary committees for direct questioning


– Demand standardized transparency reports with independent oversight (a minimal schema sketch follows this list)


– Integrate technical community input systematically into policy-making processes
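

Paul Ash’s call in the session for platform-agnostic transparency reports and standardised data formats suggests what such a demand could concretely look like. As a hedged illustration only, the record and export below sketch one possible shared schema; every field name is hypothetical:

```python
import csv
import dataclasses
from dataclasses import dataclass

@dataclass
class TransparencyRecord:
    """One row of a platform-agnostic transparency report. Field names are
    hypothetical; the point is a shared schema that regulators, researchers,
    and civil society can all parse and verify."""
    platform: str
    reporting_period: str      # e.g. "2025-H1"
    request_source: str        # "government", "court order", "user report"
    legal_basis: str | None    # statute cited, if any
    category: str              # e.g. "terrorist content", "fraud"
    items_actioned: int
    items_rejected: int
    appeals_received: int
    appeals_upheld: int

def export_csv(records: list[TransparencyRecord], path: str) -> None:
    """Emit the standardized format so independent overseers can check claims."""
    names = [f.name for f in dataclasses.fields(TransparencyRecord)]
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=names)
        writer.writeheader()
        writer.writerows(dataclasses.asdict(r) for r in records)
```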


### For Policy Development


– Develop safe harbor data sharing frameworks with multi-stakeholder oversight


– Prioritize civil society capacity building through dedicated funding


– Create voluntary and co-regulatory codes bringing together industry, government, and civil society


– Focus on systems-based regulation requiring companies to establish procedures rather than specifying content rules


### For Technical Solutions


– Provide dedicated funding for open source solutions and technical tools for resilience


– Use AI positively to counter AI-generated harms


– Implement procedural safeguards including effective rights of defense to prevent over-enforcement


## Unresolved Issues


Several critical challenges remained without clear resolution:


– The fundamental challenge of enforcing regulations when platforms refuse to comply with smaller nations’ requirements


– The tension between content-based and systems-based regulatory approaches


– The balance between platform responsibility and intermediary liability principles


– The appropriate role of fact-checking mechanisms given censorship concerns


– The effectiveness of international frameworks versus national regulatory approaches


– Questions about age restrictions for social media access


## Conclusion


The discussion demonstrated both the complexity of digital governance challenges and areas where different stakeholders can find common ground. While participants agreed on the importance of human rights, multi-stakeholder approaches, and technical expertise in policy-making, significant disagreements remain about implementation methods and the distribution of responsibility between platforms and governments.


The conversation highlighted that traditional regulatory approaches alone are insufficient for addressing digital governance challenges, pointing toward the need for more nuanced, collaborative approaches that can balance security concerns with fundamental rights protections. The practical recommendations emerging from the discussion provide concrete next steps for parliamentarians and policymakers, while the unresolved issues indicate areas requiring continued dialogue and development.


Session transcript

Bjorn Ihler: service providers and other stakeholders in improving regulatory compliance, increasing user and public safety. I’m also the director and co-founder of the Khalifa Ihler Institute, building peaceful and resilient communities online and offline. Over the past 14 years, following the terrorist attack here in Norway in 2011, I’ve been working to counter and prevent violent extremism and terrorism on and offline. A key component of this has been working with online platforms, multilateral organizations, and the wider multi-stakeholder communities to address online harms. Over the past decades, I’ve seen a rapid development of social media platforms and their societal impact. The power of online communities has changed the path of history, empowered individuals and communities to address and counter human rights abuses, and promote civil liberties around the world. Our online communities have also been abused and exploited by nefarious actors to undermine democratic processes, recruit for terrorist and violent extremist organizations, conduct coordinated information operations, spread disinformation, harmful and exploitative content, intimidate individuals and representatives participating in public discourse and political processes, and engage in other forms of cybercrime causing significant harm. Considering both the opportunities and complex challenges posed by the evolving online landscape, it is vital that we across parliaments, governments, civil society, and multilateral organizations actively continue to engage with both tech through the private sector and regulation to strike the right balance between liberty and safety, to ensure respect for human life, dignity, and rights in the online sphere. As online criminal activities increase in scope and sophistication, efforts to combat cybercrime continue to evolve. Recent legislative initiatives in this area have focused on targeting misinformation and other misconduct online. However, laws on content regulation, surveillance, and platform liability can pose serious risks to freedom of expression and access to information. Experts from media, the private sector, law enforcement, and technical and intergovernmental organizations will in this session provide a holistic view of this complex policy landscape and explore how policymakers can navigate the delicate balance between ensuring cybersecurity and upholding fundamental human rights, especially the right to freedom of expression. Without further ado, it’s therefore my privilege to introduce you to this esteemed panel consisting of Paul Ash, Chief Executive Officer of the Christchurch Call Foundation, Cagatay Pekyour, Head of Community Engagement and Advocacy at META, Pavel Popescu, Vice President of ANCOM, and Mallory Knodel, Executive Director of the Social Web Foundation and member of the Global Encryption Coalition. Paul, start with a brief introduction of yourself in one minute or so and then we move on.


Paul Ash: It’s such a pleasure to be here in Oslo and I’m doing my best to make sure that my body keeps up with my mind on that, having got here late last night after 27 hours of flying time. I’m based in Wellington, New Zealand. I’m the Chief Executive of the Christchurch Call Foundation. Some of you may recall back in 2019 a devastating attack on two mosques in Christchurch, New Zealand, after which the Christchurch Call itself was established by then Prime Minister Jacinda Ardern and President Emmanuel Macron of France. The intent of the Christchurch Call was to eliminate terrorist and violent extremist content online while respecting human rights and a free, open, and secure internet. Over the time since the call was launched, it’s grown now to somewhere around 130, 140 different participating entities, 56 governments, just on 20 tech firms, a dozen partners, and about 50 civil society organizations. Over the last year, we have established a secretariat in the Christchurch Call Foundation that is now independent of and acting outside of the New Zealand government. And I’ve been delighted to work with Bjorn and a number of the others on this panel over the time since the call was established and since the foundation was. Thanks very much for having me here today.


Bjorn Ihler: Thank you, Paul. Cagatay?


Cagatay Pekyour: Hi, everyone. My name is Cagatay Pekyour. I’m head of public policy for META for the Africa, Middle East, and Turkey region in charge of community engagement and advocacy for the region. My work involves engaging with communities and also civil society organizations to understand the risks on content when it comes to the content being harmful and also how we may protect voice and expression of these groups. Would you like me to…


Bjorn Ihler: The introduction is fine. Thank you very much. Pavel?


Pavel Popescu: Hello, everybody. Thank you for inviting me to this panel. My name is Pavel Popescu. I’m coming from Romania. I’m the vice president of our telco regulator, called ANCOM. I actually lived for a few years in this country, which is my second motherland. I served almost two terms as a member of the Romanian parliament, chairing the Defense and National Security Committee and laying down most of the legislation that we have passed in recent years on cybersecurity, which definitely includes legislation treating the problem of cybercrime. Today I’m in the position of deputy minister, as state secretary at ANCOM and as a vice president. Our institution has many hats; it’s much more than a regulator. It deals with lots of domains, starting from the DSA, telecommunications, space more recently, postal services, and cyber since December 2024, so we are actually covering lots of things, and it’s getting more and more challenging in the world we are living in. I’m looking forward to discussing this in the next minutes. Thank you.


Bjorn Ihler: Thank you Pavel. Mallory?


Mallory Knodel: Hi, thanks so much. I’m Mallory Knodel and I’m part of what’s called the Global Encryption Coalition, so I’ll spend a little time introducing that as well as myself. And just first, thanks so much to the organizers for inviting me. It’s a real privilege to be able to speak in front of you as a member of civil society and as someone who engages in the technical community. I think it’s really important that we continue these dialogues in a multi-stakeholder fashion. So the Global Encryption Coalition was founded in 2020. There was a sort of pre-meeting at the Berlin IGF, maybe some of you were there in 2019, and it was because we saw an increasing attack on end-to-end encryption at the legislative level in a few key countries, notably Five Eyes countries, and we wanted to make sure that the private sector response and the civil society response, as well as the scientific community, computer science, cryptography and so on, were aligned in trying to talk about the benefits and protect encryption. I was, in my capacity as chief technologist at the Center for Democracy and Technology, a founding secretariat member, and now I am one of the technical specialists. So the coalition is made up of civil society organizations, only small companies, mind you, we don’t have members from the large big tech companies you might imagine, small and medium-sized enterprises all over the world, and academics as well. In the Internet Engineering Task Force and the Internet Research Task Force, there’s a research group called Human Rights Protocol Considerations that I’ve chaired for a number of years. Some of these issues come up from time to time, although our mandate is much broader than free expression. But it’s a subject of the panel today that I’m really happy to talk about. And I’ll also say that on the topic of cybercrime, when I was at the Center for Democracy and Technology, we were members of the Ad Hoc Committee process as observer members, as civil society. So I also followed that for its duration and so have a lot of sort of aggregate thoughts and reactions to that process and then the way forward based on a civil society perspective, which I’m really, really, again, excited to share with you today. So, thanks.


Bjorn Ihler: Thank you, Mallory. So, as you can see, we have a really broad spectrum here, from civil society organizations, and what is now also a civil society organization, which is exciting, to government and the corporate sector as well. I think this represents how multistakeholderism should be happening, in conversation between all of those bodies. And so, I’m excited for this. Paul, the Christchurch Call is working towards eliminating terrorist and violent extremist content online. How can members of parliament take action while making sure to uphold human rights online and a free, secure, and public internet? And what alternatives and effective measures that can be drafted into legislation to counter cybercrime can you recommend?


Paul Ash: Wow, no pressure. Where to start? It’s a real easy start. I’m sliding into it here. I guess as a starting point, the first thing I’d say, if I was sitting as a member of parliament considering this challenge, would be beware false dichotomies. Beware folk who say to you, we either have to legislate or we can’t do anything about this problem. Because both of those ways are quite perilous when dealing with technology issues. The second piece I’d say is get good technological advice. Find people within your jurisdiction who actually understand the technology well, who understand the issues around the way the technical community can feed into this discussion so that you’re being well advised as you’re thinking about how to deal with the challenges and issues in front of you. I guess that’s where we started when we were looking at the response to what happened in Christchurch. And I was fortunate enough to be sitting in a role in government where I was able to give that advice to a leader who took it and was keen to find ways to preserve human rights and a free, open and secure internet while not ignoring the fact that radicalization of violent extremism, broadcasting of terrorist acts, the use of the internet, the abuse of the internet to promulgate terrorist and violent extremist content was also happening and could not be left unchecked. But that those two things were not a dichotomy that we needed to fall on one side or the other of but we needed to try and bring together. Great. That all sounds good and high-minded in theory. It dovetails neatly with the theme of the IGF around building digital governance together. It reflects 20 years of cooperation. What does that look like in practice if you were an MP or a Member of Parliament? I guess the first thing I’d say there is ensuring that you put human rights at the real core of your work. That you’re looking at rights-based processes that involve the communities you represent in work trying to keep people safe and secure online. The key tools that can be used there are things like human rights, impact assessments, working with communities to ensure that as you’re thinking about responses to problems like terrorist and violent extremist content, you’re not compounding the problems that that content causes. You’re not creating a violation of other rights than the ones that you’re dealing with. The second is making sure that you build in really smart human rights review and sunset clauses for the provisions that you’re putting in place because as technology evolves, so too must our solutions and being able to make sure that regulatory measures are updated is really important. And ensuring that the work that you’re looking to do in this area is really tightly constrained and you don’t have scope creep or mission creep in it. How many times we have seen a piece of legislation well intended aimed at dealing with a specific problem like terrorist or violent extremist content or child sexual abuse online suddenly find a whole pile of things added to it at the last moment as it’s going through its third reading in Parliament and come out looking like, we would say in my part of the world, the proverbial Christmas tree with all manner of things on it that are difficult to deal with. So human rights first. The second, demanding radical transparency. 
Transparency about the way the online environment works is critical to trying to solve problems like terrorist and violent extremist content or child sexual abuse or any other of the things we’re grappling with in an open and, I guess, standardised way. So ensuring that you have platform agnostic transparency reports is important. Ensuring that you embed in the government work you do just as much transparency as is being demanded of tech platforms is also critically important and that’s a big part of the transparency working group that the Christchurch Call has been driving over the last couple of years. Standardised data formats so that a range of entities can have access to that data. Civil society can have inputs and can verify claims. And ensuring that you’ve got independent oversight around transparency mechanisms are all critically important. Probably the most important thing for me, though, both in how we work and I think in legitimising and building sound processes is multi-stakeholder oversight of the work. Making sure that as you, as a parliamentarian, take forward processes and solutions, you’re finding ways to ensure that you bring together tech platforms, government officials, members of law enforcement and civil society groups to oversee that work and have regular input into it is critically important. Creating digital trust caucuses is one way of doing that. Reaching out to counterparts across the floor to try and find ways to work together on digital issues. And meeting regularly with industry technical experts and rights defenders. And I’ll make one really critical point here. When I say technical experts, I mean technical experts. One of the biggest issues we’ve seen over the last five to ten years is sometimes the sidelining of the technical community that Mallory described earlier. Those who’ve built the internet and know how it works. And even if we’re dealing with issues right up at the top of the content layer, those folk have a really important role to play in helping advise how to keep that process safe. Finally, I think digital resilience. You’re not going to be able to legislate your way out of this challenge. One of the reasons we stood up a multi-stakeholder process was precisely that, that regulation in this area, in this field, is fraught, it’s challenging and it’s imperfect. And if you meet someone who tells you that their regulatory process has got this nailed and it won’t need to change, I would suggest you’re dealing with another of those false dichotomies and you need to let your intuition and your alarm bells kick in at that point. So making sure that you’ve got other tools, particularly tools around building resilience in the community, building digital literacy, all of those things are critically important. The challenge I’d just say in there is that there’s a huge focus on building digital resilience that is going to take much longer to deliver than the challenges we’re dealing with with technology. The tech has moved much, much faster than resilience is able to keep up with and you won’t be able to rely on resilience alone. Training in schools is great, but it’s a solution that will kick in in 10, 20, 30 years’ time and we may not have that long to deal with some of the challenges that we’re finding in the online environment right now. Very quickly, because I suspect I’m chewing through a chunk of my available time, what are the alternatives that you have in front of you to legislation as a tool?
So, I think that the principles of transparency, multistakeholderism, technical input and digital resilience need to be applied, I guess, to a range of different alternatives that exist to legislation. There’s a number of ways that legislators can impact on a positive digital environment that go beyond a simple regulation, and they’re often messy, they’re often much more challenging than regulating, or at first seem that way, but long-term they’re probably more sustainable. One of those is voluntary and co-regulatory codes, pulling together mixed groups across industry, across government, across civil society and the tech community to find ways that are a bit more flexible sometimes than regulation, and sometimes those can be a subset of regulatory measures, they can be mandated by regulation. We’ve done this in the Christchurch call, for instance, with the crisis response protocols where we work really, really hard to ensure multistakeholder input into work to respond to online crises, and into building out a toolkit now, as part of the work we have underway at the moment, building out a toolkit to help communities be more resilient after those crises. Technical support funding, in particular, open source solutions and funding ways for the technical community to build out tools for resilience is actually really important. Offering safe harbour data sharing frameworks, for instance, with multistakeholder oversight to try and deal with some of the questions around the way algorithms work while respecting privacy. Again, something we’re working on within the Christchurch call construct, something that was mandated in the call when it was launched. Critically important that happens, critically important also that as legislators you are not the only people overseeing that, that you actually have support from across the multistakeholder community. You can use economic levers. There are many, many economic tools that can be used by legislators to try and ensure that in their own jurisdiction, or working with other jurisdictions, they bring parties together to work constructively on digital solutions. Soft power and diplomacy, you’re here because you’re part of inter-parliamentary groups, those groups have a potential that I think is well more advanced than is currently being delivered upon, and there is plenty of scope to be working across jurisdictions with your peers and colleagues on solutions and on tools that work. And finally, I think capacity building for civil society. If you’re a regulator, if you’re a legislator, it can sometimes feel a bit painful. I’ve seen this advising legislators, the civil society elements, and this can sometimes even seem inconvenient, but they’re critically important. And making sure in the current environment where civil society is finding funding harder and harder to secure, that you support civil society’s role in this work is I think a critically important way of building out a more holistic approach to online safety. And I think in particular, some of the work around supporting civil society groups to build out counter speech or free speech narratives that manage the risk of what is in most jurisdictions illegal content, by providing alternatives to it is also critically important. I guess if I leave one quick snapshot on this, it’s ultimately that the work we’ve done with the Christchurch call has been around trying to show that shared norms can travel way faster and be far more effective and more flexible than any individual statute. 
And if parliaments focus on transparency, on accountability, and on capacity building, and they work hand in hand with the technical, academic, and civil society communities and with industry, I think we can curb terrorist and violent extremist content and radicalisation to violence while keeping the internet open, free, secure, and rights respecting. Thank you. That’s how we truly build digital governance.


Bjorn Ihler: Thank you, Paul. [Applause] Picking up on some of the things you said, I think including the technical community and meaning the technical community is an important one. And with that, also thinking about resilience by design and other elements. That can be inspired by privacy by design, which I think is one key principle you’ve been working on, Mallory, but also something that we should learn from and adopt. And so, throwing it to you, I’m wondering how efforts to combat cybercrime and uphold cybersecurity think about end-to-end encryption, the challenges that come with that, but also the opportunities that come with that, and how to also frame that in the context of user-centric security and privacy tools.


Mallory Knodel: Yeah, thanks for the pointed question, and I definitely appreciate Paul covering all of human rights, and I think encryption protects all human rights, actually, but it does have a specific narrow focus on privacy. We’ve heard from the UN for a long time it also protects free expression and freedom of opinion, freedom of association. But yeah, let me get really more specific than just the overbroad approach. A lot of my comments here are coming from the perspective of civil society. Having worked for nonprofit organizations, civil society, for 15-plus years, I will say and acknowledge that we can often be very strident in our beliefs and in protecting this, but it’s part of a much longer struggle. And so, I recognize the opportunity I have in front of you today, and so I’m giving you, I think, a glimpse into how civil society organizations, human rights defenders tend to approach this issue. And I mean it to be direct, and of course, recognize that governments and states are in particular positions and agencies are complicated, and no one in this room is actually responsible for any one government’s policy. But I really start from the history far beyond the digital age, in which really the exercise of democracy is trying to strike this balance. We know that one of the very difficult things to get right is investigatory powers, crime and punishment, policing, and so on, and how then that infringes on rights. So if we take this as a long, long history, again, before the digital age even comes into view, we know that we need legislative power, legislative checks on this power. We need to keep this in mind at all times. What’s changed now is we have, as well, technical mechanisms to do the same, to create limits, to affect boundaries, and encryption is perhaps the paragon of this, right? It is the one thing, end-to-end encryption is the one thing that really democratizes the ability to set a boundary, the ability to say, you know, this conversation between you and I is private, and I’ve set those terms, and it can’t be disrupted, right? So I think we have to keep this as an imagination, as part of the historical struggle to create some balance here. And so that’s in the tradition that I’m coming at this from. I’m sort of on the side of human rights and trying to correct that power imbalance where it’s happened. And I think in the recent past, we have had a lot of examples of how we’re over-correcting, right? We’re over-correcting for the strength of investigatory powers and policing and cybercrime instead of rights. You know, nearly every month it feels like there’s a new example in the headlines, but effectively we have as well, you know, this lesson we continue to learn, which is that when you intentionally weaken things from a technical perspective, in terms of security and privacy, you actually do create vulnerabilities that can be rather egregious and that can invite even more kinds of, I don’t know, hacking and vulnerabilities and things like that. But let me get concrete. I think the other thing that has happened in this digital age is there’s the belief, and Paul, you actually set me up because you mentioned this, that tech moves faster than policy, right? And that we need, in fact, the policy to somehow catch up and to help rein in the capabilities of technology. Now, I will say, just to point out, that it’s almost always used in service of speaking against democratization of the technology, right?
It’s only people’s use of technology that needs to be reined in, when in fact, we should also be worried about government’s use of the same technology in a way that is exceeding legislative controls and powers and checks on that power. So when you have, for example, mass surveillance, or even targeted surveillance, right, with unchecked power that’s not going through proper channels, it’s unlawful, that is also an example of tech outpacing policy, right? And we tend to not think of it in those terms, but that is absolutely what, from a civil society perspective, is very clear to us. So I want to also mention that I think that when we imagine the digital age versus how this is different from previous tangles between the state and the people, the other issue is scale, right? The scale and the capabilities are now beyond natural limits for what we would consider a personal boundary, right? In the sort of offline, pre-digital world, there was a limit to the number of homes one could walk into and the number of things one could know about a person. We, with the digitalization of all of our lives, that scale is now beyond any sort of natural limit. And the idea that you could take all of that data, if you’re a company, right, this is sort of the business model right now, is you’re sort of harvesting, creating, storing, using all of that data, and that it should also, of course, be accessible to the governments of the jurisdiction where those companies reside as well. That seems like a very obvious thing to follow on. But again, it’s exceeded what our society was built upon in terms of the natural limits of the private realm. And so that’s where I think we have basically just one technology that does try to create that personal boundary, and that technology is encryption. That’s the only one, really. It’s the only thing that’s between me and someone else who would like to listen in or have access to my conversation. And that goes for companies as well. I want to say that for all of those out there that feel, you know, there might be this sort of pro-companies, pro-private sector side of encryption policy, a lot of end users, a lot of human rights defenders, a lot of people who care about and use encryption are also trying to create a boundary between themselves and companies, right? It’s a really important thing, and it’s, especially in the case of end-to-end encryption, that is in the power of the users. So let me move into, like Paul so helpfully gave, you know, very specific jurisdictional approaches to this issue. I’m going to try to do the same, have about four points, and then I’ll end on a question for everyone to sort of think about. So looking at the way we have articulated cybercrime, global cybercrime coordination, some themes… that have really come up that are of importance to civil society and I think that differ by jurisdiction are firstly cyber security research. So one of the ways that you ensure that the internet, digital technologies are secure is you have everyone concerned about the cyber security aspect and it’s a very holistic approach and you need a lot of different folks to use it, to test it, to audit it and so you have to accommodate cyber security research from every conceivable perspective. Civil society, they’re going to be concerned about it and that should not be criminalized. Also private sector and third party private sector as well as the academic community. It’s really crucial to allow for that, to uncover vulnerabilities and to secure everyone. 
I already mentioned the scale of surveillance but I think that the policies that are already in place in virtually every jurisdiction to limit power need to be strengthened during this time of international cooperation on cyber security and cyber crime, not weakened. So if you have these increased capabilities globally you need to make sure that jurisdictionally speaking those are stronger and again I am sure that they exist everywhere. The third thing is, this was already mentioned, but any sort of use of these transnational cooperation capabilities or the use of government hacking, the use of other things, need to be responsible, they need to be transparent, especially when there’s cooperation between governments and the private sector. Because I think that that’s a combination of two very strong forms of power that need additional scrutiny, that need additional oversight. Warrants, right, dual use technologies regulation, making sure that zero day exploits and back doors are not used, because they are very dangerous and, as we’ve seen over and over again, they present cybersecurity insecurities for everyone. And the last thing I’ll just say is related to the UN treaty, and for the countries in the room that have signed it and potentially will be part of that treaty, I think again it’s important to think of the standards and the safeguards that a lot of civil society organizations feel are insufficient, that you jurisdictionally can of course exceed them. The human rights safeguards that are suggested in that treaty and in place, there’s no reason why you cannot go above and beyond to really protect human rights, human rights defenders, journalists, activists in your jurisdiction. So leaving then with a question, I think there is, there does persist a dichotomy unfortunately. Maybe that never gets resolved but I think if we admire the problem from different angles it can help to smooth the edges a bit. I will say I will be here in ten years plus still talking about encryption because I think we will never really quite get to it. You can’t ban it in one place, it will continue to persist in others. So what do we do? I think the question we should be asking is rather are people the ones that we are erecting these treaties and these policies and this technology to protect the people or are we protecting against people? Are we leaning into democracy or are we truly genuinely afraid of it to the point where we feel we must worry about the end user basically or the people as the threat? That’s how I think of this and I think once we sort of try to flip that around and you put people at the center of security, of privacy, of human rights, these edges seem to me to resolve very quickly. But that takes a lot of political will and I think that might not always be the political moment that we’re in but I think we can imagine it and we can imagine it being stronger in the end. So anyway, thanks very much, appreciate it.


Bjorn Ihler: Thanks, Mallory. One of the things I keep reminding the policy people at Meta about is that you effectively have a larger user base than any one country in the world, and so you’re also governing the online experiences of a larger population than any one government in the world. And so it’s interesting to think about the scale of that and the responsibilities that come with that. And so building on that, I’m wondering what effective regulation looks like in terms of striking the balance between combating cybercrime, misinformation, and online misconduct on your platforms while also protecting freedom of expression. I’m also wondering if you can elaborate on how META works to protect freedom of expression while navigating the increased regulatory burden from governments and also from multilateral organizations such as the European Union. There’s a lot to balance there, and so I’m wondering if you can talk a bit about that.


Cagatay Pekyour: Thank you. Of course. Thank you. And thank you so much, Mallory, for mentioning the historic background because that was also a part of my entrance actually for today. This topic is not completely new for us. Like for centuries, political leaders, philosophers, activists have wrestled with the question of how and when governments should place limits on freedom of expression to protect people from harmful content. And increasingly in the online space, privately run companies like META are making these determinations. And on the other hand, a significant number of people are using our platforms to express their political opinions. And in many parts of the world, they actually don’t have any other alternatives to express these opinions or gather. So we are deeply aware of the importance of protecting the speech in our platforms while protecting people from harm. This situation puts companies like META almost like an intermediary between the governments and the users. We are seeing the users, while they want us to reduce the abuse, they also expect us to protect their freedom of expression. And we are hearing from the governments who are not only expecting us to remove illegal content, but also expecting us to take action on all sorts of harmful content. So we are trying to navigate this space for both the users and also for the governments. I will first start with talking about the regulations that we see. In the region that I’m also working for, Africa, Middle East and Turkey, there are at the moment more than 120 voice-limiting regulations. So these are cybercrime laws and also content regulations in the region. If they are enforced over broadly, they may constitute a risk on our users’ freedom of expression, while their original intent is protecting people from harm. And there are 40 other regulations that are in the making in the region. Ninety of these regulations are only bringing liability on the users directly and don’t bring any liability or responsibilities for the platforms like META. But the rest of them actually bring responsibilities and also sanctions on the platforms, which sometimes may lead platforms to even take the safer approach and over-enforce certain policies to avoid some sanctions. In META, to find a balanced approach, we have our community standards, as I’m sure you all know, which set the rules for which speech is allowed on our platforms and which is not. That’s mainly to protect users from harmful content. But also, at its very core, there are principles around freedom of expression. And we have several other policies that actually are designed to make sure that we are protecting our users’ freedom of expression, which is our regulatory compliance policy to begin with. That policy is about how we are responding to governments’ requests on content takedown. And it is based on our human rights commitments and also our commitments as part of the Global Network Initiative. And actually, it mandates our teams to conduct a five-step due diligence, which assesses the request’s legality, legitimacy, necessity, and proportionality, and also the external risks that such takedown requests may pose. We make our decisions on our users’ speech in relation to regulations after going through this process for each request that we receive.
And we regularly publish transparency reports in relation to how we are managing government takedown requests and also how we are responding to governments’ requests on user data, because we are also aware that the user data requests may also sometimes lead to silencing certain communities. These transparency reports not only happen in six-month periods, but also in some significant cases. We publish case studies on our Transparency Center, and we try to provide as much information as possible to make sure civil society and also the affected parties are well-informed about the requests that we receive and the impact on the affected parties is at least explained in a transparent manner. This is a very complex space. We are not only talking about the content itself, but also we know that sometimes it’s the actors who are causing the problematic situation. It can be, for example, a fake account that is causing what we are seeing as a problem. And sometimes it’s the behaviour, so it might be the fake engagement itself, not the content. So our policies also try to address these different types of problematic areas when it comes to the topic of cybercrime and severe harmful content. But also, any regulation that tries to regulate this space, I believe, should also consider the fact that speech norms actually evolve and change over time. Also, the threats that we see online around cybercrime are constantly and repeatedly evolving. As Mallory mentioned, the volume and scalability is another thing that brings more complexity into this situation. If we only rely on notice and takedown processes, it’s definitely not effective to protect people from harmful content, but also it puts companies like Meta into a difficult position from time to time when it comes to protecting people’s freedom of expression as well. As, again, Mallory mentioned, some of the content and these behaviours happen in private spaces, like messaging apps, and sometimes they happen in public spaces, like a public page on our platforms. And how these regulations are treating these different spaces is also quite important. Another thing is how we are treating organic and also paid content, like should the regulation have the same approach for these two different types of content, or should we have a different set of rules for content created by an ordinary user and for content that is backed by advertisements. When we look at the regulations in our region and also in the rest of the world, effective and rights-respecting regulation has certain characteristics which I’d like to highlight before ending my intervention. The first bit is complying with Article 19 of the International Covenant on Civil and Political Rights, the UN treaty. This is very important, how we are balancing protections on fundamental human rights, including privacy and access to information. This should be at the core of any regulation that may want to regulate this space. The second important thing that we see is that it has to have certain mechanisms for proportionality and necessity in limiting speech or certain actions online. So regulations that legislate only for the most harmful forms of content actually have less risk of infringing people’s freedom of expression. This can only happen if such regulation considers the severity and also prevalence of the harmful content in question and also its status in the law.
Also, we believe if the regulations take into account the efforts of the platforms in addressing harmful content, they may have more flexibility, which may allow them to adapt to evolving technology and also evolving threats. Another important thing to avoid limiting people’s freedom of expression is having clear definitions. We are seeing several regulations that have really broad definitions of harmful content, which puts platforms in difficult positions, as only narrowly defined harmful content types would allow us not to expand our enforcement and to protect people’s freedom of expression. One last thing that I believe has quite an importance in this space is having these regulations built around procedural safeguards. As I mentioned, it’s mostly about how these regulations are enforced rather than how they are written. If there are no effective right of defense or remedy for the users in relation to enforcement of these regulations, they can be quite easily enforced in an overbroad manner by the governments. If there are heavy sanctions attached to them, the companies, what I’m trying to say is, may pick the safer side and not take the risk because of those sanctions, and may tend to over-enforce. I think it’s right to expect that they will take this approach. But if we have the legal safeguards and also paths for remedy for the users in relation to enforcement of these regulations, then we may avoid over-enforcement by the companies and also overbroad enforcement that may come from regulators and also the governments. Thank you very much.


Bjorn Ihler: We have a lot of parliamentarians in the audience, also a couple of regulators, I see you. So, Pavel, having been a parliamentarian and now filling the shoes of a regulator in Romania, you’re wearing a couple of hats here. And I think it would be interesting to listen to you and hear what kind of tangible takeaways there might be in terms of good practices for whole-of-government collaboration across both parliaments and regulators, but also what challenges and solutions you are seeing in addressing cybercrime and upholding freedom of expression in legislation and practice. I know Romania has also been having some interesting experiences with the recent elections, et cetera, that might be worth touching upon here. Oh, so you want to extend the panel to three, four hours, Pavel? Sure, we can do a couple of days on this.


Pavel Popescu: Yeah, it’s going to take a lot. How many parliamentarians do we have here today? Can you raise your hands? So, quite many. The most valuable thing that I’ve earned through these years, I consider to be the experience of passing through the events that happened, including recently in my country, because it set an example for countries across the world to see what happens, for example, when a social media platform is weaponized. There are two ways of doing this whenever we go to conferences. I quit going to many conferences, but there are two ways of doing it: speaking out, I mean attacking the problem, or just being diplomatic and not being blunt. I hate the second one, so I’m here to speak bluntly to you, and I think, with all the humbleness and modesty, I can say the things I will say in the next minutes because I have worn both hats. I know the power of being an MP, and sometimes I find it very frustrating that in parliaments, where you actually have the biggest power in the world, because it sits in the power of a parliament, you do not use that power the way you should use it, and you leave things so many times at the level of a social media platform or at the level of a regulator, which, by the way, we regulators are sworn at a lot these days. And as a regulator, I’m against regulation. I want less regulation, I want less regulation in the European Union. I want the private companies to make a profit, but I want the citizens to be safe while the private companies are making a profit using the social media platforms, for example. So I think you are not aware of the power you have. In order to apply that power, we all have to do something which is very complicated: keep pace with what’s happening in the world. Everybody’s saying that in ten years from now the world will look different; let’s wait and see how it’s going to look in three years. In three years from now, we’re not going to recognise the social media platforms and the social media ecosystem because of AI. Even the most advanced of us, who are keeping up with this race, will not be able to tell what’s wrong and right, and we have to prepare for that. We have to prepare our kids for that, and actually we have to enforce regulation which is already in place, because if you go across the world, starting from the United Nations, starting from this beautiful country where I lived for almost seven years, everywhere in the world we have enough regulation in place, but it is not enforced. This is a fact. Let’s take an example of what’s happening these days across the world, and I’m going to touch on the topic of cybercrime. I am the biggest supporter of free speech. I think everybody on this earth, no matter the color of your skin, no matter your religion, no matter the country you come from, has to be allowed to say what they think. People should be able to say what they think without consequences. That’s the most valuable thing that God created for us, the freedom of saying what we believe. And I think this is the root from which we start the whole conversation. But, and this is a big but, in the social media ecosystem, when we discuss freedom of speech, how much do we discuss the actual regulation in place? Let’s take, for example, the Digital Services Act. 
In our institution, as a regulator, we are also responsible for the Digital Services Act, a very complicated regulation from the European Commission, but it’s a regulation which actually addresses a problem that was not solved before by the social media platforms themselves. If we are honest, we would not need that regulation, we would not need extra regulation in place, if you go and read the terms and conditions of each social media platform. So as a regulator, when I go back home, I’m challenged by thousands of independent journalists, people who are trying to say something online, and they say, you know, I was censored and it’s your fault because you’re the regulator. But it is not our fault: we do not moderate content, we do not ask the social media platforms to bring content down, because we definitely do not have legislation in place to censor people. But what we do have in place is legislation about what’s legal and illegal. And my question is, going back to the Romanian elections, how would you handle this kind of case, for example? Somebody involved in a political campaign, let’s say an MP, goes online and says, tonight Romania will be invaded by French soldiers and Russian soldiers. Or a presidential candidate goes online and says, Romania is going to attack Russia from its MK NATO military base. What do you do as a government official? Because that phrase creates panic among millions of people. Instant panic. People are going to the passport offices, trying to renew their passports and saying, I’m leaving Romania tonight, I don’t want to go to war with Russia. And actually nobody wants to go to war with anybody. So when you have these kinds of situations, as a government official, as somebody who has to put regulation in place, you have to take decisions while protecting freedom of speech. Imagine that for the last nine months, the previous government and this government, and our regulator, have been challenged with thousands of cases like this. And if anybody in the world still believes that there are no countries like Russia putting serious money into rigging an election, influencing an election, or using social media platforms to influence elections, you should look at Romania’s case, because we are going to set, I would say, an example of how we handled that. And let’s not even talk about specific countries; it can happen in any country, with any state or non-state adversary. We’ve been through this, we’ve seen this, we handled it, and we managed to come out of it alive, but with a lot of consequences, I would say. And I want to emphasize that the main responsibility in the social media ecosystem today lies with the platforms. It’s a wake-up call for the platforms to understand that their terms and conditions, which are extremely strict, should protect our kids from child pornography, should protect our citizens from cybercrime, should protect all citizens’ freedom of speech, and should protect us all from the way that AI is evolving and being used for cybercrime instantly and constantly today. 
And I cannot accept, as a former MP, as a current regulator, and as a former programmer with a technical background, that with the billions or trillions of dollars of profit a social media platform makes today, it cannot at least install a system, and I will give an example, to eliminate a systemic risk like a cybercrime campaign running in a specific country through some specific companies using AI; and this is an ongoing discussion. Recently we took a very interesting and positive decision in Romania as a regulator, together with our police and the Ministry of Internal Affairs, because in cybercrime we had the problem of spoofing: tens of thousands of Romanian citizens were called from the police number or from the bank number, with the real number appearing on the phone, but of course behind it was not the real number. We took a technical decision with the telecom operators to block calls presenting these types of numbers when they originate outside the country. It’s a complicated scheme that they were using, but the problem was not only there; the problem started from the campaigns we have all seen on social media platforms, for example with the governor of the national bank, or the prime minister, or the president supposedly asking Romanian citizens to invest in a certain company. And I think today we need, as in many cases, the social media platforms to do a better job of mapping and cleaning their platforms in terms of these kinds of new challenges we have online. I cannot accept, and we should not accept, the idea that when you have AI doing something in a negative way, you cannot have AI at that level reversing it. Everything is reversible in IT&C, and as a regulator, and that’s what we try to do at home in our interaction with the social media platforms, I would like to see the platforms being more responsible on these issues. I think responsibility is a must when you use people’s data, and this is a fact. We are all aware that people unfortunately do not care too much about how their data is handled these days, but I think we as regulators, we as government officials, and you as parliamentarians, with the power you have through the vote of the people, should get a better understanding of what’s happening these days, and get better experts and consultants who can come into your offices and teach you what’s actually happening across the social media platforms. We need a lot of responsibility from social media platforms. That’s at least what I feel today. Thank you very much.


Bjorn Ihler: Thank you so much. I think there are a lot of parliamentarians here who can take some of this to heart, and some of you might even have questions. So I want to open the floor for questions for those of you who have them. The gentleman here, can we get a microphone on the floor? There’s someone coming. Here, second row, third person in.


Audience: Thank you for this informative session. I benefited a lot, as a matter of fact. But this is a parliamentary session, is that right? So I expected to hear about best practice in legislation, how legislation addresses these issues. I am a judge, a judge in a criminal circuit, a criminal chamber, so we deal with these cases. We have to decide whether something is hate speech, whether it is terrorism, and we have to strike this balance; we draw the line between what is a crime and what is not a crime. But we need legislation. So I wanted to hear here, for example: in Egypt, we have criminal legislation, cybercrime legislation, but the cybercrime legislation only addresses the crime. It does not address the issue of human rights. What addresses the issue of human rights is the constitution, the Bill of Rights. So this is what I wanted to hear from you: how can legislation itself address these issues? Are we going to protect human rights and safeguard freedoms through the legislation, the criminal legislation, or do we leave it to the general principles of law, or constitutional principles, for example? And we are bound, of course, to implement the principles of the constitution, which take precedence over ordinary legislation. So I wanted to hear this answer. What is the strategy to address this dilemma? Thank you. Thank you very much. We’ll lump together a couple of questions. So I think the lady on the second row here. Thank you, moderator. I’m Ni Ching from Malaysia. I’m the Deputy Minister of Communication, so a parliamentarian as well. The first question I would like to pose would be: shouldn’t the platforms be held responsible? Because I think that was mentioned just a while ago. I definitely think platforms should be held responsible, because platforms are very, very powerful these days. If I’m allowed to use the example given by one of my favorite authors, Yuval Harari, he basically says that the platform, the algorithm, plays the role of the editor these days and decides what type of information is consumed by the user. So in that sense, as editor, you have huge power to decide what type of information the user consumes on a daily basis, and therefore, to me, the platforms should be held responsible. But coming back to the idea that parliament is very powerful, government is very powerful, legislation is very powerful: I actually need to speak on behalf of Malaysia, from the Malaysian perspective. We actually find it very difficult, even though we have the legislation in place, to get cooperation from the platforms. For example, in Malaysia, we do not legalize online gambling. I understand that in some other countries online gambling is perhaps legal, but in Malaysia it’s just not legal. And yet we continue to see posts promoting or related to online gambling, sponsored posts, despite our numerous requests to tell the platforms that this is illegal in Malaysia: can you please ensure no such sponsored posts about online gambling appear on Malaysian users’ timelines? But again, we almost need to repeat the request on a weekly basis, since we have communication on a weekly basis. So that is one example. Second, starting from this year, Malaysia tried to introduce a regulation whereby we said: if you are a platform with more than 8 million users in Malaysia, then please come and join us and get a license from us, so that it will be easier for us to get cooperation from you. 
Eight million users basically means 25% of the Malaysian population. We think that is a very reasonable threshold compared to India, compared to the UK. I think eight million, 25% of our population, is not an unreasonable threshold. But frankly speaking, yes, some of the platform providers did apply for the license; however, some of the larger ones, if I may name them, Meta and Google, have not complied with our regulations to this day. We started the communications last year, and the requirement was supposed to be in place from the 1st of January this year, but these two huge platforms did not comply with this request. And the next question for the government to decide, of course, would be: what can we do? What can we do as the Malaysian government? Yes, we have about 35 million people, but we are not that influential in the international arena; we are not one of the superpowers. So what can the government do? It’s not our intention to ban the platforms. We know that our people get a lot of benefits from the platforms as well. But when we talk about legislation as a super powerful tool, I really have some doubts here. I’m speaking on behalf of Malaysia, from our experience, and therefore: how can we get better cooperation from the platforms? I’m saying this because there are things that, for example, Meta is doing for Singapore which I think are wonderful. For example, nowadays, if you want to put up a sponsored post targeting Singaporeans, Meta has an obligation to verify who the advertiser is, and I think that is a good way to prevent scammers from putting out sponsored posts. But this is only in place in Singapore and not in other countries, and of course my question would be: why not put these requirements in place and practice them throughout the world? Because our citizens in Malaysia also suffer from scammers’ posts. Thank you. So, thank you. The lady at the front here. And then we’ll let the panel answer, and then we’ll hopefully do another round as well. Okay, thank you. I’m a former minister for IT and telecom, and I would just like to ask the representative from ANCOM; we heard a very insightful intervention from the regulator. I want to ask: how can online and offline freedom of expression be different? Our online rights and our offline rights are the same, and every country legislates according to its own community, according to its own citizens, according to its own values. So how can a social media platform sitting in the West, in one country with its own laws and regulations, start determining what the law of a country in the East is? What gives them the right to start deciding the fate of citizens who have their own laws? So my concern is, has been, and remains that the time has come for countries around the world to get together, collectively, according to their own value systems. It’s not about banning the social media platforms; they are compelling us to do it now. The time has come for like-minded countries to come together and start having their own platforms where they can unify and provide information to their citizens according to their respective needs. Just because a certain platform is giving a certain type of information doesn’t mean it is the only possible platform. We have the example of China, and we have other examples where countries’ own platforms are serving citizens’ needs. 
So perhaps instead of struggling to get the social media platforms to discount their commercial interests in favor of the citizens, largely in the East, I think the time has come for the people living in the East to look forward to having their own platforms to provide the services that are required, because the debate has gone on for too long. Even Mr. David Cameron in 2016 had to fight to protect children from paedophiles, and the platforms refused his request to remove the content unless an independent group determined whether or not a paedophile was a paedophile. What kind of attitude is that? So are we becoming hostages to social media platforms, or are we going to rise now as parliamentarians, start governing, and take the rights of our citizens into our own hands? Enough of the freedom of expression debate. There is no violation of freedom of expression when my personal right is infringed because somebody somewhere doesn’t like me and decides to post against me, and the social media platform becomes the final deciding point, irrespective of what the law of my country is. So please, now I’d like to ask Mr. Pavel: what gives you the right to decide that you are not going to listen to what the law of my country is? Thank you.


Bjorn Ihler: Thank you very much. We’ll also take the lady in the white suit at the back, and then we’ll get the panel to respond to this.


Audience: Thank you. Hi, ladies and gentlemen. My name is Gerasim Lora. I’m the president of the Committee on the Investigation of Abuses, Corruption, and Petitions in the Romanian Parliament. So please allow me to say something to my dear colleague, because this kind of disinformation makes things so much worse. We need more liberty, more democracy, and more critical thinking, not more forcing people into this lack of expression. What was happening in Romania was that the actual political parties, like PSD, PNL, and UDMR, were making the disinformation. So we also need to decide who will have this power to cut off the disinformation. Thank you very much.


Bjorn Ihler: Thank you. Thank you all so much for this round of questions. Any takers on the panel to start responding?


Cagatay Pekyour: Thank you. First I would like to start by saying that at Meta we comply with the local laws of the countries that we operate in, as long as they are legally enforceable on us and as long as the requests that we receive from governments are aligned with international human rights principles. There was a question around how we should be regulating this space, given that freedom of expression is most of the time protected at the constitutional level while these regulations are criminal laws, and how we can build a system that protects people from harmful content while also protecting people’s freedom of opinion and expression. Several years ago, back when Meta was still Facebook, we published a white paper called Charting a Way Forward: Online Content Regulation. The white paper contrasted two main approaches: a content-based approach and a system-based approach. The white paper, and the academic research behind it, argued that the content-based approach, regulation that specifies certain rules and procedures for individual pieces of content, is not effective, and also has higher chances of impacting people’s freedom of opinion and expression. On the other hand, a system-based approach, regulation that requires companies to set certain rules and procedures, or to build technical systems to reduce the prevalence of severely harmful content, is far more impactful and effective, and also carries less risk for freedom of expression. This white paper is still available online, and I encourage interested parties to check it; I’m also happy to talk about it afterwards. I also want to comment on the responsibility part. I believe there are many things that companies are already being held accountable and responsible for in different parts of the world. As I mentioned in my original intervention, there are several regulations that bring sanctions, sometimes heavy sanctions, on social media platforms in relation to their compliance with local legislation. That said, when we ask for more responsibility from social media platforms, I think it’s still very important to maintain the intermediary liability principles that were agreed several years ago, to be able to strike the balance that we are discussing today, because if we move to a model without intermediary liability protections, it would definitely impact people’s freedom of expression, and the companies would have no way to protect it. We are talking about millions of pieces of content shared by users every day. Thinking that social media companies should be responsible for each and every one of them right after they are posted is not realistic, nor is it compatible with expecting us to respect people’s freedom of expression.


Bjorn Ihler: Thank you, Cagatay. Any other takers? You want to go?


Mallory Knodel: Yeah, I’d love to. I thought all the questions were really great. I wanted to address, practically, how do you do this? I’ll give you an example. In the U.S., we have the Computer Fraud and Abuse Act. This is a problem because it’s often overused, and it can punish security researchers who are genuinely trying to uncover vulnerabilities and to expose unlawful or risky behavior by companies in the U.S. So if you have these kinds of cybercrime laws, they need to be relaxed in cases where the intent is not to hack someone but genuinely to improve the ecosystem. That’s one very practical point. I want to say generally, to both the Malaysian and Egyptian parliamentarians, that one problem with some legislation we’ve seen is that it gives the platforms more power. I’ll give you an example: age verification. You’re asking companies to now ask everyone to show government ID to use the Internet, and the companies are the ones who hold that data now. That is really antithetical to the right to privacy. That’s just one example, but I think any time you are introducing new legislation, you have to ask: does this actually entrench the power of the platforms? Does it give them more power than they had before? And almost every time it does, right? Because we imagine that we can offload our problems onto platforms, and I think any legislation that does that is really anti-user. This relates to the former minister’s question around how a platform can facilitate the breaking of laws in my country based on the standards and norms in another country. I agree with the idea of plurality. In fact, I didn’t talk about this: the Social Web Foundation, the organization that I’m part of, is using open protocols to proliferate social media platforms, so you don’t just have a handful, you have many. That is our hope. But I can tell you that a future in which we have multiple social media platforms per jurisdiction and so on does not solve content moderation. It makes it much more difficult: much more difficult to sustain, to fund, to support, and it doesn’t make these problems go away. It makes them more complex, which is all the more reason why the advice that human rights defenders and human rights organizations give right now to Meta, Google, and so on is: of course comply with legal requests that are in line with human rights, but if they’re not in line with human rights, they’re effectively illegitimate and illegal, right? So that will continue to be the position of civil society, no matter who owns the platform, no matter how they manage to do content moderation. And I’ll just say one last thing that would really facilitate this move from a centralized approach to social media to a much more decentralized one: create consequences for big tech at the global level, at the UN level. For ten-plus years, civil society has been pushing for, hashtag, a binding treaty, so that you could directly take big tech companies to the ICC or the HRC, depending on their behavior. You can’t do that now. It would be really nice if you could. So if you’ve got folks in Geneva working on the ad hoc committee on the binding treaty, give that a thought. I think that would be very helpful. Thanks.


Bjorn Ihler: Thank you very much. Pavel, over to you. I think Paul wants to come in too. Please, go ahead.


Pavel Popescu: Yeah, I’m going to respond to both ministers, if I’m not wrong. First of all, I want to make a comment regarding what my colleague from the parliament said. At ANCOM in Romania, we do policy, we don’t do politics. The exact examples I gave you are backed up by very consistent evidence: TikTok in March released a public annual transparency document in which the platform itself came in front of the people and said that 27,000 accounts coordinated from Russia were influencing Romania’s elections in favour of a specific candidate and a specific party. Of course, we knew this from November already, and the European Commission has an investigation into it. What I’m trying to say is that regardless of your party, your ideology, or your orientation, every country can suffer from a social media platform being weaponized, and that’s what we do not want as citizens. That’s why we continuously support freedom of speech. And by the way, what I was saying at the beginning is that today, while we are speaking, I don’t think we have a real problem in terms of how people’s content is moderated. I mean, not that there is no problem there, because we’ve seen examples, but I think that phenomenon is much smaller compared to what’s happening with systemic risks, like the examples I gave you: cybercrime and child pornography on social media platforms, which are called systemic risks and which should be solved by the platforms themselves with their own technology. And to answer the Minister, I think we should have a standardised way for the platforms to apply their own rules, along with the legislation which is law across all countries. I don’t think a Romanian citizen is more special, or should be treated as more special, than a Malaysian one or any other citizen across the world. I think we all have the same rights, and we all should be protected in the same way. Let’s take, for example, the fact-checking topic. Today we do not have fact-checking on social media platforms in other parts of the world; we still have it in Europe. I can say as a regulator, and it’s on the record, that we have a real problem: in many cases, and I underline this, the fact-checking is not genuine, and we have people who were censored by somebody doing the fact-checking. I’m saying that openly, and we try to address these problems. But, for example, my biggest fear today, for the next generations, is how we treat, I underline, the systemic risks of the platforms. We had a campaign on TikTok in Romania with children jumping from two meters into the arms of their colleagues, who pulled their arms away, and the children went to hospital. We had 140 people in the hospital. We contacted TikTok, the platform, and said: we want you to solve this in 24 hours and get all the hashtags, technically speaking, with, I forget the name of the campaign, out of TikTok. It took them about three weeks. It was unacceptable. Or the parasite challenge. I mean, I think we live in a world where even the social media platforms themselves do not have the capacity to process so much data. 
I think as human beings we are living in a world where we have reached the point of being unable to handle so much data, and I think we should move as soon as possible to using AI in a positive way to filter this information. I’m saying it again: I cannot accept that the social media platforms do not focus more on using AI to eliminate these risks. For example, if you do a search on my social media account on Meta today, and I mention this because we have a friend from Meta here, I can show you hundreds of scam cases with so-called Romanian companies tricking Romanians into applying, posts which are sponsored, by the way, without any flag showing who is sponsoring behind that account, and I think this should be fixed as soon as possible. And with this I’m going to wrap up. I always take the US as an example. I’m not saying it’s the perfect example, there’s no perfect example in the world, but US democracy is still the best in the world in my opinion, and for one reason: I have personally been in hearings where members of Congress and the Senate brought the CEOs of these social media platforms in front of a committee and asked them very complicated and direct questions. I think you have the power to do the same in your parliaments, and I think we should see this in the European Parliament much more often. And to ask, actually, the social media platforms to be responsible for the safety of our kids. There is a European debate about raising the minimum age for access to social media. We have two initiatives in Romania, while we are speaking here, where my former colleagues are actually proposing that if you are under 16, you will not be able to access a social media platform. Thank you very much.


Bjorn Ihler: Thank you, Pavel. Paul, I’ll hand it to you, last but not least.


Paul Ash: Thanks for that. We’re just about out of time, so some really quick thoughts here, based on what I’m hearing from these questions. If I were sitting in your seat, if I were walking in your shoes as a legislator, there are a few things I’d recommend. The first is to go back and look at Melvin Kranzberg’s laws of technology, the first of which is really important here: technology is neither good nor bad, nor is it neutral. It is not neutral, and it will have impacts in your jurisdictions that you need to understand, study, and find ways to grapple with. The second, and here I’d go to the words of our patron, Dame Jacinda Ardern, when the Christchurch Call was launched: we cannot simply sit back and accept that platforms just exist and that what is said on them is not in any way the responsibility of the place where it is published. They are not just the postman. They are the publisher. And the reason they are the publisher is that the algorithmic curation that privileges some content over others on platforms is not transparent. It influences the way people experience that content, and we need to find ways to grapple with the issues that come out of that. For regulators, there are a few things that strike me as critically important at the moment. One of those is to go right back to the UN: human rights are key. The human rights that have been so hard won and so hard worked for in the UN system over the years are under threat now like they never have been before. And as you think about regulating, as you think about solutions, going back to those is critically important. I think Mallory and I have probably got a conversation over a cuppa at some stage as to whether a UN treaty is going to work well or not, because having worked in cyber for years, I’ve watched the unintended consequences of UN efforts on technology. I’ve watched things like the cybercrime convention, which I think has the potential to be deeply, deeply harmful to all of your constituents if it is not operated well, and I don’t have confidence at the moment that it will be operated well. So as you regulate, always go back to the UN human rights framework and how your measures fit within it. And coming back to that last point: this cannot be a case of all profit and no responsibility. There needs to be a set of multi-stakeholder conversations, and if you are able to, reach back out from the seats you sit in in Parliament to your constituencies, to the technical community, to civil society, and yes, to the platforms, to encourage a multi-stakeholder conversation in response to this challenge, because it’s unprecedented. History tells us that people will behave badly no matter what technology they have in front of them. But the scale, the scope, and the implications of the technologies we’re grappling with right now are such that we have no alternative but to really double down on trying to find multi-stakeholder solutions. I’ll stop there. Thank you.


Bjorn Ihler: Thank you very much, Paul. I’m now the only thing standing between everyone and lunch. Lunch is provided in different areas around the conference, and we will all reconvene for a parliamentary roundtable session in the plenary hall on the ground floor at 2 PM. More information will be shared with the members of Parliament about registering for visits to the Norwegian Parliament later. Let me thank the panel for their contributions. This was really great. All of these are world-leading experts in their own right, and I believe they will be around for the coming days, so for those of you whose questions weren’t answered, I at least can be grabbed in the hallways for conversations, and I’m sure others here can too. Thank you all so much for attending this panel. We’ll see you after lunch. Thank you.


P

Paul Ash

Speech speed

160 words per minute

Speech length

2533 words

Speech time

945 seconds

Beware false dichotomies between legislation and inaction; seek multi-stakeholder solutions that preserve human rights while addressing online harms

Explanation

Paul Ash warns parliamentarians against viewing the response to online harms as a simple choice between heavy regulation or doing nothing. He advocates for finding balanced approaches that bring together multiple stakeholders to address issues like terrorist content while preserving human rights and maintaining a free, open internet.


Evidence

References the Christchurch Call approach which was developed after the 2019 mosque attacks, bringing together governments, tech firms, and civil society to eliminate terrorist content while respecting human rights


Major discussion point

Balancing Online Safety and Human Rights


Topics

Human rights | Legal and regulatory | Cybersecurity


Agreed with

– Bjorn Ihler

Agreed on

Multi-stakeholder approaches are essential for effective digital governance


Put human rights at the core of regulatory work through impact assessments and community involvement

Explanation

Ash emphasizes that legislators should prioritize human rights-based processes when developing responses to online harms. This includes conducting human rights impact assessments and ensuring affected communities are involved in developing solutions to avoid compounding existing problems.


Evidence

Recommends human rights impact assessments, building in smart human rights review and sunset clauses, and ensuring regulatory measures are tightly constrained to prevent scope creep


Major discussion point

Balancing Online Safety and Human Rights


Topics

Human rights | Legal and regulatory


Agreed with

– Mallory Knodel
– Cagatay Pekyour

Agreed on

Human rights must be at the core of digital governance and regulation


Platforms are publishers, not just postmen, due to algorithmic curation that privileges certain content over others

Explanation

Ash argues that social media platforms cannot claim to be neutral intermediaries because their algorithms actively curate and prioritize content, making editorial decisions about what users see. This algorithmic curation makes them publishers with corresponding responsibilities for the content they promote.


Evidence

Quotes former New Zealand Prime Minister Jacinda Ardern’s statement that platforms cannot simply claim they are just postmen when their algorithms influence content visibility


Major discussion point

Platform Responsibility and Accountability


Topics

Legal and regulatory | Sociocultural


Get good technological advice from people who actually understand how the internet works, not just high-level content issues

Explanation

Ash stresses the importance of parliamentarians seeking advice from genuine technical experts who understand the underlying technology and infrastructure of the internet. He warns against sidelining the technical community that built the internet, as their expertise is crucial even when dealing with content-level issues.


Evidence

Notes that one of the biggest issues over the last 5-10 years has been the sidelining of the technical community, and emphasizes the need for technical experts who know how the internet actually works


Major discussion point

Technical Community Integration and Transparency


Topics

Infrastructure | Legal and regulatory


Agreed with

– Mallory Knodel

Agreed on

Technical expertise is crucial for effective policy-making


Demand radical transparency through platform-agnostic reporting and standardized data formats with independent oversight

Explanation

Ash calls for comprehensive transparency requirements that go beyond individual platform reporting to include standardized, comparable data formats. He emphasizes that transparency should apply equally to government actions and platform operations, with independent oversight mechanisms to verify claims and enable civil society input.


Evidence

References the Christchurch Call’s transparency working group efforts and emphasizes the need for civil society to have access to data for verification purposes


Major discussion point

Technical Community Integration and Transparency


Topics

Legal and regulatory | Human rights


Agreed with

– Cagatay Pekyour

Agreed on

Transparency and accountability are fundamental requirements


The Christchurch Call demonstrates that shared norms can travel faster and be more effective than individual statutes

Explanation

Ash presents the Christchurch Call as evidence that international cooperation through shared norms and multi-stakeholder processes can be more agile and effective than traditional legislation. The initiative has grown to include around 130-140 participating entities across governments, tech firms, and civil society organizations.


Evidence

The Christchurch Call now includes 56 governments, nearly 20 tech firms, a dozen partners, and about 50 civil society organizations, demonstrating rapid international adoption


Major discussion point

Multi-stakeholder Governance Approaches


Topics

Legal and regulatory | Cybersecurity


Alternative approaches to legislation include voluntary codes, technical funding, safe harbor frameworks, and capacity building for civil society

Explanation

Ash outlines various non-legislative tools that parliamentarians can use to address online harms, including co-regulatory codes, funding for open-source technical solutions, data-sharing frameworks with multi-stakeholder oversight, and supporting civil society capacity. He argues these approaches can be more flexible and sustainable than pure regulation.


Evidence

Cites examples from the Christchurch Call including crisis response protocols and toolkit development for community resilience, as well as mandated work on algorithm transparency with privacy protections


Major discussion point

Multi-stakeholder Governance Approaches


Topics

Legal and regulatory | Development


Digital resilience and literacy are important but take longer to develop than the pace of technological challenges

Explanation

While Ash supports building digital resilience and literacy in communities, he warns that these solutions will take 10-30 years to fully implement through education systems. Given the rapid pace of technological change and current online threats, resilience-building alone cannot address immediate challenges and must be combined with other approaches.


Evidence

Notes that training in schools is great but represents a solution that will only be effective in 10, 20, or 30 years, while current technological challenges may not allow that much time


Major discussion point

Multi-stakeholder Governance Approaches


Topics

Development | Sociocultural


M

Mallory Knodel

Speech speed

160 words per minute

Speech length

2759 words

Speech time

1030 seconds

End-to-end encryption democratizes the ability to set privacy boundaries and protects fundamental human rights including free expression

Explanation

Knodel argues that end-to-end encryption is the primary technical mechanism that allows individuals to establish personal boundaries in the digital age. She positions encryption as part of the historical struggle for democratic balance between state power and individual rights, emphasizing that it protects not just privacy but also freedom of expression, opinion, and association.


Evidence

References UN recognition that encryption protects free expression and freedom of opinion, and notes that the Global Encryption Coalition was founded in 2020 in response to increasing legislative attacks on encryption in Five Eyes countries


Major discussion point

Balancing Online Safety and Human Rights


Topics

Cybersecurity | Human rights


Agreed with

– Paul Ash
– Cagatay Pekyour

Agreed on

Human rights must be at the core of digital governance and regulation


Technical mechanisms like encryption create limits and boundaries similar to legislative checks on power

Explanation

Knodel frames encryption as a technical solution that serves the same democratic function as legislative checks and balances on government power. She argues that just as democracy requires limits on investigatory powers and policing, encryption provides a technical mechanism to create boundaries and prevent abuse of power in the digital realm.


Evidence

Draws parallels to pre-digital struggles with investigatory powers and notes that encryption democratizes the ability to say ‘this conversation is private’ in a way that can’t be disrupted


Major discussion point

Technical Community Integration and Transparency


Topics

Cybersecurity | Human rights | Legal and regulatory


Cybersecurity research from all perspectives must be accommodated and not criminalized to ensure internet security

Explanation

Knodel emphasizes that comprehensive cybersecurity requires research and testing from diverse perspectives including civil society, private sector, and academia. She warns that criminalizing legitimate security research undermines overall internet security by preventing the discovery and disclosure of vulnerabilities.


Evidence

Cites the U.S. Computer Fraud and Abuse Act as problematic because it can punish security researchers who are genuinely trying to uncover vulnerabilities and improve the ecosystem


Major discussion point

Technical Community Integration and Transparency


Topics

Cybersecurity | Legal and regulatory


Agreed with

– Paul Ash

Agreed on

Technical expertise is crucial for effective policy-making


Companies harvest and store user data at unprecedented scale, exceeding natural limits of personal boundaries

Explanation

Knodel argues that the digitalization of life has created data collection capabilities that far exceed any natural limits that previously existed for personal privacy. She contends that the current business model of harvesting, creating, and storing vast amounts of personal data, combined with government access to this information, has fundamentally altered the balance between private and public realms.


Evidence

Contrasts the digital age with pre-digital limitations, noting there was previously a natural limit to how many homes could be searched or how much could be known about a person, but digital scale has removed these boundaries


Major discussion point

Platform Responsibility and Accountability


Topics

Human rights | Economic


Government surveillance capabilities have exceeded legislative controls, creating power imbalances that need correction

Explanation

Knodel argues that while there’s focus on technology outpacing policy in terms of user capabilities, there’s insufficient attention to how government use of technology has also exceeded legislative controls. She points to mass surveillance and unlawful targeted surveillance as examples where government technological capabilities have outstripped proper oversight and legal frameworks.


Evidence

Provides examples of mass surveillance and targeted surveillance with unchecked power that bypasses proper legal channels, arguing this is also technology outpacing policy but from the government side


Major discussion point

Cybercrime and Information Operations


Topics

Human rights | Legal and regulatory | Cybersecurity


C

Cagatay Pekyour

Speech speed

124 words per minute

Speech length

1989 words

Speech time

961 seconds

META conducts five-step due diligence for government takedown requests assessing legality, legitimacy, necessity, and proportionality

Explanation

Pekyour explains that META has established a systematic process for evaluating government requests to remove content, based on human rights commitments and Global Network Initiative principles. This process requires teams to assess whether requests meet standards of legality, legitimacy, necessity, proportionality, and consider external risks before taking action.


Evidence

References META’s regulatory compliance policy and membership in the Global Network Initiative, and mentions regular transparency reports published every six months plus case studies for significant situations


Major discussion point

Balancing Online Safety and Human Rights


Topics

Human rights | Legal and regulatory


Agreed with

– Paul Ash

Agreed on

Transparency and accountability are fundamental requirements


Effective regulation must comply with international human rights standards and have clear definitions to avoid over-enforcement

Explanation

Pekyour argues that rights-respecting regulation should align with Article 19 of the International Covenant on Civil and Political Rights and focus only on the most harmful content with clear, narrow definitions. He warns that broad definitions of harmful content put platforms in difficult positions and risk expanding enforcement beyond intended scope.


Evidence

Cites Article 19 of the UN International Covenant on Civil and Political Rights, and notes that regulations with broad definitions create risks for freedom of expression while narrow definitions allow better protection


Major discussion point

Balancing Online Safety and Human Rights


Topics

Human rights | Legal and regulatory


Agreed with

– Paul Ash
– Mallory Knodel

Agreed on

Human rights must be at the core of digital governance and regulation


There are over 120 voice-limiting regulations in the Africa, Middle East and Turkey region, with 40 more in development

Explanation

Pekyour provides specific data on the regulatory landscape in META’s AMET region, noting the proliferation of cybercrime laws and content regulations. He explains that while most regulations target users directly, those that impose platform liability can lead to over-enforcement as companies take safer approaches to avoid sanctions.


Evidence

Specific numbers: 120+ existing voice-limiting regulations, 40+ in development, with 90 targeting users directly and the remainder imposing platform responsibilities and sanctions


Major discussion point

Regulatory Challenges and Enforcement


Topics

Legal and regulatory | Human rights


Platforms must be held accountable while maintaining intermediary liability principles to protect freedom of expression

Explanation

Pekyour acknowledges that platforms face accountability measures and sanctions in various jurisdictions, but argues that maintaining intermediary liability protections is crucial for balancing responsibility with free expression. He warns that removing these protections would force companies to over-censor due to the impossibility of reviewing millions of daily posts in real-time.


Evidence

References the millions of content pieces shared daily and argues it’s unrealistic to expect companies to be responsible for each piece immediately upon posting while still respecting freedom of expression


Major discussion point

Platform Responsibility and Accountability


Topics

Legal and regulatory | Human rights


Disagreed with

– Pavel Popescu

Disagreed on

Primary responsibility for content moderation and platform accountability


P

Pavel Popescu

Speech speed

132 words per minute

Speech length

2805 words

Speech time

1272 seconds

Social media platforms should take greater responsibility for eliminating systemic risks like cybercrime and child exploitation using AI technology

Explanation

Popescu argues that platforms with billions in profits should invest more in AI-powered systems to automatically detect and eliminate systemic risks such as coordinated cybercrime campaigns, child exploitation, and dangerous viral challenges. He contends that platforms currently lack the capacity to process the massive amounts of data effectively and should use AI proactively rather than reactively.


Evidence

Provides specific examples including TikTok challenges that led to 140 hospitalizations in Romania, spoofing campaigns using police and bank numbers, and scam advertisements featuring government officials. Notes it took TikTok three weeks to remove dangerous hashtags when it should have taken 24 hours


Major discussion point

Platform Responsibility and Accountability


Topics

Cybersecurity | Legal and regulatory | Sociocultural


Disagreed with

– Cagatay Pekyour

Disagreed on

Primary responsibility for content moderation and platform accountability


Recent Romanian elections demonstrated how social media platforms can be weaponized by foreign actors for electoral interference

Explanation

Popescu describes how Romania’s recent elections were targeted by coordinated foreign influence operations, with TikTok itself acknowledging that 27,000 Russia-coordinated accounts influenced the elections for specific candidates. He uses this as evidence that any country can suffer from social media weaponization regardless of political orientation, emphasizing this as a policy rather than political issue.


Evidence

TikTok’s own transparency document from March acknowledged 27,000 accounts coordinated from Russia influenced Romania’s elections; European Commission has an ongoing investigation; provides examples of disinformation including false claims about French and Russian soldiers invading Romania


Major discussion point

Cybercrime and Information Operations


Topics

Cybersecurity | Legal and regulatory


Existing regulations worldwide are sufficient but not properly enforced; the main problem is lack of implementation

Explanation

Popescu contends that from the UN level down to individual countries, there are already adequate regulations in place to address online harms, but the critical issue is enforcement. He argues that rather than creating new regulations, focus should be on implementing existing laws and holding platforms accountable to their own terms of service.


Evidence

States that if platforms followed their own terms and conditions, additional regulation wouldn’t be needed; notes that current regulations exist but aren’t enforced effectively


Major discussion point

Regulatory Challenges and Enforcement


Topics

Legal and regulatory


Platforms should apply rules consistently across all countries rather than having different standards for different jurisdictions

Explanation

Popescu argues for standardized application of platform policies globally, contending that all citizens deserve equal protection regardless of nationality. He believes platforms should not treat users differently based on their country of origin and should implement the same safety measures universally.


Evidence

Uses fact-checking as an example, noting it exists in Europe but not everywhere, and argues Romanian citizens shouldn’t be treated differently from Malaysian citizens in terms of platform protections


Major discussion point

Regulatory Challenges and Enforcement


Topics

Legal and regulatory | Human rights


Disagreed with

– Audience (Former IT Minister)

Disagreed on

Approach to platform regulation – standardization versus jurisdictional flexibility


B

Bjorn Ihler

Speech speed

131 words per minute

Speech length

1275 words

Speech time

583 seconds

Scale and sophistication of online criminal activities require evolved approaches beyond traditional notice-and-takedown processes

Explanation

Ihler introduces the session by noting that as online criminal activities increase in scope and sophistication, traditional approaches to combating cybercrime must evolve. He emphasizes that current legislative initiatives focusing on misinformation and online misconduct need to balance cybersecurity with fundamental human rights, particularly freedom of expression.


Evidence

References recent legislative initiatives targeting misinformation and notes that laws on content regulation, surveillance, and platform liability can pose serious risks to freedom of expression and access to information


Major discussion point

Cybercrime and Information Operations


Topics

Cybersecurity | Legal and regulatory | Human rights


Agreed with

– Paul Ash

Agreed on

Multi-stakeholder approaches are essential for effective digital governance


A

Audience

Speech speed

154 words per minute

Speech length

1584 words

Speech time

616 seconds

Smaller countries struggle to get platform cooperation despite having reasonable regulatory requirements

Explanation

The Malaysian Deputy Minister describes how Malaysia has difficulty obtaining platform cooperation even with reasonable requirements like licensing for platforms with over 8 million users (25% of population) and removing illegal gambling advertisements. She notes that major platforms like Meta and Google have not complied with licensing requirements despite the reasonable threshold and extended timeline.


Evidence

Specific example of Malaysia’s 8 million user threshold (25% of population) being reasonable compared to other countries; ongoing issues with online gambling ads despite repeated requests; Meta and Google’s non-compliance with licensing requirements that had a deadline of January 1st


Major discussion point

Regulatory Challenges and Enforcement


Topics

Legal and regulatory | Economic


Agreements

Agreement points

Human rights must be at the core of digital governance and regulation

Speakers

– Paul Ash
– Mallory Knodel
– Cagatay Pekyour

Arguments

Put human rights at the core of regulatory work through impact assessments and community involvement


End-to-end encryption democratizes the ability to set privacy boundaries and protects fundamental human rights including free expression


Effective regulation must comply with international human rights standards and have clear definitions to avoid over-enforcement


Summary

All three speakers emphasize that human rights principles, particularly those established by the UN system, must be the foundation for any digital governance approach. They agree that regulatory measures should be assessed for human rights impact and comply with international standards.


Topics

Human rights | Legal and regulatory


Multi-stakeholder approaches are essential for effective digital governance

Speakers

– Paul Ash
– Bjorn Ihler

Arguments

Beware false dichotomies between legislation and inaction; seek multi-stakeholder solutions that preserve human rights while addressing online harms


Scale and sophistication of online criminal activities require evolved approaches beyond traditional notice-and-takedown processes


Summary

Both speakers advocate for collaborative approaches that bring together governments, tech companies, civil society, and technical communities rather than relying solely on traditional regulatory mechanisms.


Topics

Legal and regulatory | Cybersecurity


Technical expertise is crucial for effective policy-making

Speakers

– Paul Ash
– Mallory Knodel

Arguments

Get good technological advice from people who actually understand how the internet works, not just high-level content issues


Cybersecurity research from all perspectives must be accommodated and not criminalized to ensure internet security


Summary

Both speakers stress the importance of including genuine technical experts in policy discussions and protecting the ability of researchers to study and improve internet security systems.


Topics

Infrastructure | Cybersecurity | Legal and regulatory


Transparency and accountability are fundamental requirements

Speakers

– Paul Ash
– Cagatay Pekyour

Arguments

Demand radical transparency through platform-agnostic reporting and standardized data formats with independent oversight


Meta conducts a five-step due diligence process for government takedown requests, assessing legality, legitimacy, necessity, and proportionality


Summary

Both speakers agree that transparency in both government and platform actions is essential, with standardized reporting and clear processes for decision-making that can be independently verified.


Topics

Legal and regulatory | Human rights


Similar viewpoints

Both speakers reject the notion that platforms are neutral intermediaries and argue they must take greater responsibility for content and safety on their platforms, though they approach this from different angles – Ash focuses on algorithmic curation making them publishers, while Popescu emphasizes using technology to address systemic risks.

Speakers

– Paul Ash
– Pavel Popescu

Arguments

Platforms are publishers, not just postmen, due to algorithmic curation that privileges certain content over others


Social media platforms should take greater responsibility for eliminating systemic risks like cybercrime and child exploitation using AI technology


Topics

Legal and regulatory | Cybersecurity


Both speakers identify enforcement and implementation gaps as key problems, though Knodel focuses on government overreach while Popescu emphasizes inadequate enforcement of existing rules against platforms.

Speakers

– Mallory Knodel
– Pavel Popescu

Arguments

Government surveillance capabilities have exceeded legislative controls, creating power imbalances that need correction


Existing regulations worldwide are sufficient but not properly enforced; the main problem is lack of implementation


Topics

Legal and regulatory | Human rights


Both speakers acknowledge the proliferation of regulations globally and the complexity this creates, with Popescu advocating for standardized global application while Pekyour documents the regulatory burden platforms face.

Speakers

– Cagatay Pekyour
– Pavel Popescu

Arguments

There are over 120 voice-limiting regulations in the Africa, Middle East and Turkey region, with 40 more in development


Platforms should apply rules consistently across all countries rather than having different standards for different jurisdictions


Topics

Legal and regulatory


Unexpected consensus

Limitations of purely regulatory approaches

Speakers

– Paul Ash
– Mallory Knodel
– Pavel Popescu

Arguments

Alternative approaches to legislation include voluntary codes, technical funding, safe harbor frameworks, and capacity building for civil society


Technical mechanisms like encryption create limits and boundaries similar to legislative checks on power


Existing regulations worldwide are sufficient but not properly enforced; the main problem is lack of implementation


Explanation

Despite representing different stakeholder perspectives (civil society foundation, human rights advocacy, and government regulation), all three speakers converge on the view that traditional regulation alone is insufficient. This consensus is unexpected given their different institutional positions and suggests a mature understanding of the complexity of digital governance challenges.


Topics

Legal and regulatory | Human rights | Cybersecurity


Scale challenges exceed traditional governance mechanisms

Speakers

– Paul Ash
– Mallory Knodel
– Cagatay Pekyour
– Pavel Popescu

Arguments

Digital resilience and literacy are important but take longer to develop than the pace of technological challenges


Companies harvest and store user data at unprecedented scale, exceeding natural limits of personal boundaries


Platforms must be held accountable while maintaining intermediary liability principles to protect freedom of expression


Social media platforms should take greater responsibility for eliminating systemic risks like cybercrime and child exploitation using AI technology


Explanation

All speakers, despite their different roles and perspectives, acknowledge that the scale of digital challenges has fundamentally changed the governance landscape. This unexpected consensus across civil society, platform, and government representatives suggests a shared recognition that traditional approaches are inadequate for current realities.


Topics

Legal and regulatory | Cybersecurity | Human rights


Overall assessment

Summary

The speakers demonstrate significant consensus on fundamental principles including human rights centrality, need for multi-stakeholder approaches, importance of technical expertise, and transparency requirements. They also share recognition that traditional regulatory approaches are insufficient for current digital challenges.


Consensus level

High level of consensus on principles and problem identification, with moderate consensus on solutions. This suggests the field has matured to the point where stakeholders understand the complexity of digital governance challenges, even if they approach solutions differently. The implications are positive for collaborative policy development, as there appears to be a shared foundation for building more nuanced approaches that balance competing interests while maintaining core human rights principles.


Differences

Different viewpoints

Effectiveness of UN treaties and international legal frameworks for addressing tech company behavior

Speakers

– Paul Ash
– Mallory Knodel

Arguments

I don’t have the confidence at the moment that they will be operated well. So as you regulate, always go back to the UN: human rights are key, and how it fits. Coming back to that last point, this cannot be a case of all profit and no responsibility.


And so for 10 years plus, civil society has been trying to push for a binding treaty (the #BindingTreaty campaign), so that you can directly take big tech companies to the ICC or the HRC, depending on their behavior. You can’t do that now. It would be really nice if you could.


Summary

Paul Ash expresses skepticism about UN cybercrime conventions and their potential for unintended harmful consequences, while Mallory Knodel advocates for binding treaties that would allow direct accountability of tech companies through international bodies like the ICC or the HRC.


Topics

Legal and regulatory | Human rights


Primary responsibility for content moderation and platform accountability

Speakers

– Pavel Popescu
– Cagatay Pekyour

Arguments

Social media platforms should take greater responsibility for eliminating systemic risks like cybercrime and child exploitation using AI technology


Platforms must be held accountable while maintaining intermediary liability principles to protect freedom of expression


Summary

Pavel Popescu demands much stronger proactive responsibility from platforms to eliminate systemic risks using AI, while Cagatay Pekyour emphasizes the need to maintain intermediary liability protections to balance accountability with freedom of expression.


Topics

Legal and regulatory | Cybersecurity | Human rights


Approach to platform regulation – standardization versus jurisdictional flexibility

Speakers

– Pavel Popescu
– Audience (Former IT Minister)

Arguments

Platforms should apply rules consistently across all countries rather than having different standards for different jurisdictions


How can a social media platform sitting in the West, in one country with its own laws and regulations, start determining what the law is in a country in the East? What gives them the right to start deciding the fate of citizens who have their own laws?


Summary

Pavel Popescu advocates for standardized global application of platform rules, while the former IT minister argues that platforms should respect local jurisdictional laws and values rather than imposing Western standards globally.


Topics

Legal and regulatory | Human rights | Sociocultural


Unexpected differences

Role of technical community versus platform responsibility

Speakers

– Paul Ash
– Pavel Popescu

Arguments

Get good technological advice from people who actually understand how the internet works, not just high-level content issues


Social media platforms should take greater responsibility for eliminating systemic risks like cybercrime and child exploitation using AI technology


Explanation

This disagreement is unexpected because both speakers are concerned with technical solutions, but Paul Ash emphasizes involving the broader technical community that built the internet, while Pavel Popescu focuses primarily on platforms using their own AI technology to solve problems. This represents different philosophies about where technical expertise should come from.


Topics

Infrastructure | Cybersecurity | Legal and regulatory


Scope of content regulation enforcement

Speakers

– Pavel Popescu
– Audience (Romanian Parliament Member)

Arguments

Recent Romanian elections demonstrated how social media platforms can be weaponized by foreign actors for electoral interference


We need more liberty, more democracy, and more critical thinking, not more forcing people into this lack of expression. What was happening in Romania was that the actual political parties, like PSD, PNL, and UDMR, made the disinformation.


Explanation

This disagreement is unexpected because both speakers are from Romania and discussing the same electoral events, but they have fundamentally different interpretations of what happened and what the appropriate response should be. This highlights how even direct witnesses to events can have opposing views on the nature of information operations versus legitimate political discourse.


Topics

Cybersecurity | Legal and regulatory | Human rights


Overall assessment

Summary

The main areas of disagreement center on the balance between platform responsibility and user rights, the effectiveness of international versus national regulatory approaches, and whether technical solutions should come from platforms themselves or the broader technical community. There are also fundamental disagreements about the interpretation of recent events like the Romanian elections.


Disagreement level

Moderate to high disagreement with significant implications. While speakers generally agree on broad goals like protecting human rights and addressing online harms, they have substantially different views on implementation methods, regulatory approaches, and the distribution of responsibility between platforms, governments, and technical communities. These disagreements reflect deeper philosophical differences about digital governance that could significantly impact policy development and international cooperation efforts.




Takeaways

Key takeaways

Multi-stakeholder governance involving governments, platforms, civil society, and technical communities is essential for addressing online harms while protecting human rights


Platforms function as publishers rather than neutral conduits due to algorithmic curation, creating responsibility for content amplification


End-to-end encryption serves as a critical technical mechanism for protecting privacy and human rights, democratizing the ability to set personal boundaries


Existing regulations worldwide are often sufficient but poorly enforced – the primary challenge is implementation rather than creating new laws


Foreign interference in elections through social media weaponization represents a clear and present danger requiring immediate technical solutions


Effective regulation must include human rights impact assessments, sunset clauses, transparency requirements, and avoid mission creep


The scale of digital data and interactions has exceeded natural limits and traditional regulatory approaches, requiring AI-powered solutions for systemic risks


Smaller nations face significant challenges in securing platform cooperation despite reasonable regulatory requirements


Digital resilience and literacy are important long-term solutions but cannot address immediate technological challenges


Resolutions and action items

Parliamentarians should establish digital trust caucuses and reach across party lines on digital issues


Governments should demand standardized, platform-agnostic transparency reports with independent oversight


Platforms should implement consistent global standards rather than jurisdiction-specific approaches


Legislators should bring platform CEOs before parliamentary committees for direct questioning, following the US Congressional model


Civil society capacity building should be prioritized through dedicated funding and support


Technical community input must be systematically integrated into policy-making processes


Safe harbor data sharing frameworks with multi-stakeholder oversight should be developed


Open source solutions and technical tools for resilience should receive dedicated funding


Unresolved issues

How to effectively enforce existing regulations when platforms refuse to comply with smaller nations’ requirements


Whether content-based or systems-based regulatory approaches are more effective for protecting both safety and free expression


How to balance platform responsibility with intermediary liability principles


The appropriate role and scope of fact-checking mechanisms given concerns about censorship


How to address the fundamental tension between global platform operations and local legal jurisdictions


Whether age restrictions for social media access (such as proposed 16+ requirements) are appropriate or effective


How to create binding international mechanisms for holding big tech companies accountable


The effectiveness of UN cybercrime conventions and whether they will be operated in ways that protect rather than harm human rights


Suggested compromises

Voluntary and co-regulatory codes that bring together mixed groups across industry, government, and civil society as alternatives to pure regulation


Systems-based regulation requiring companies to establish procedures and technical systems rather than specifying content rules


Economic levers and soft power diplomacy as alternatives to direct regulatory enforcement


Procedural safeguards including effective rights of defense and remedy for users to prevent over-enforcement


Technical solutions that use AI positively to counter AI-generated harms and systemic risks


Relaxed cybercrime laws that accommodate legitimate security research while maintaining protections against malicious activity


Multi-jurisdictional cooperation among like-minded countries to create alternative platforms serving regional needs and values


Thought provoking comments

Beware false dichotomies. Beware folk who say to you, we either have to legislate or we can’t do anything about this problem. Because both of those ways are quite perilous when dealing with technology issues.

Speaker

Paul Ash


Reason

This comment reframes the entire debate by challenging the binary thinking that often dominates policy discussions around technology regulation. It introduces nuance and suggests that effective solutions require moving beyond either/or approaches to find middle ground that preserves both security and rights.


Impact

This set the tone for the entire discussion by establishing that the conversation would focus on balanced, multi-stakeholder approaches rather than polarized positions. It influenced subsequent speakers to present more nuanced views and consider alternative approaches beyond traditional regulation.


Are we erecting these treaties and these policies and this technology to protect the people, or are we protecting against people? Are we leaning into democracy, or are we truly, genuinely afraid of it, to the point where we feel we must treat the end user, basically the people, as the threat?

Speaker

Mallory Knodel


Reason

This philosophical question cuts to the heart of the fundamental tension in digital governance – whether policies are designed to empower citizens or control them. It challenges participants to examine their underlying assumptions about the relationship between state power and individual rights in the digital age.


Impact

This comment shifted the discussion from technical implementation details to fundamental philosophical questions about democracy and power. It prompted other speakers to consider the broader implications of their approaches and influenced the audience questions that followed, with several parliamentarians grappling with these power dynamics.


Today, if we are honest, we would not need extra regulation in place. If you go and read the terms and conditions of each social media platform… The platforms should wake up. It’s a wake-up call for the platforms to understand that their terms and conditions, which are extremely strict, should protect our kids from child pornography and should protect our citizens from cybercrime.

Speaker

Pavel Popescu


Reason

This comment introduces a provocative perspective that the problem isn’t lack of rules but lack of enforcement of existing rules. It challenges both regulators and platforms by suggesting that platforms already have the tools they need but aren’t using them effectively, shifting responsibility back to the private sector.


Impact

This comment created tension in the discussion and prompted strong responses from both the Meta representative and audience members. It moved the conversation from theoretical policy discussions to concrete accountability, leading to heated exchanges about platform responsibility and enforcement mechanisms.


This situation puts companies like Meta in the position of an intermediary between the governments and the users. We are seeing that users, while they want us to reduce the abuse, also expect us to protect their freedom of expression. And we are hearing from governments who expect us not only to remove illegal content but also to take action on all sorts of harmful content.

Speaker

Cagatay Pekyour


Reason

This comment reveals the complex position platforms occupy as quasi-governmental actors making decisions that affect millions of people. It highlights the unprecedented nature of private companies wielding such influence over public discourse and the inherent conflicts in their multiple roles.


Impact

This admission of platforms’ intermediary role sparked significant discussion about platform power and responsibility. It led to pointed questions from parliamentarians about platform accountability and influenced the debate about whether platforms should have such decision-making authority over content.


How can a social media platform sitting in the West, in one country with its own laws and regulations, start determining what the law is in a country in the East? What gives them the right to start deciding the fate of citizens who have their own laws?

Speaker

Former IT Minister (audience)


Reason

This question exposes the fundamental tension between global platforms and national sovereignty, challenging the current system where Western-based companies make content decisions that affect users worldwide based on their home country’s values rather than local laws and cultural norms.


Impact

This question dramatically shifted the discussion toward issues of digital colonialism and sovereignty. It prompted defensive responses from the platform representative and sparked a broader conversation about the need for alternative platforms and governance structures that respect local values and laws.


They are not just the postman. They are the publisher. And the reason they’re the publisher is that the algorithmic curation that privileges some content over others on platforms is not transparent.

Speaker

Paul Ash


Reason

This analogy fundamentally reframes how we should think about platform liability by distinguishing between passive content transmission and active editorial decisions. It challenges the legal fiction that platforms are neutral intermediaries when their algorithms actively shape what users see.


Impact

This comment provided a clear framework for understanding platform responsibility that resonated throughout the remaining discussion. It influenced how other speakers and audience members framed questions about accountability and helped clarify the distinction between different types of platform activities.


Overall assessment

These key comments fundamentally shaped the discussion by moving it beyond technical implementation details to address deeper questions about power, democracy, and sovereignty in the digital age. Paul Ash’s opening comment about false dichotomies established a collaborative tone that encouraged nuanced thinking, while Mallory Knodel’s philosophical challenge about protecting people versus protecting against people forced participants to examine their underlying assumptions. Pavel Popescu’s blunt criticism of platform enforcement created productive tension that led to more honest exchanges about accountability. The audience questions, particularly about digital sovereignty, transformed what could have been a technical policy discussion into a broader examination of how global platforms interact with national governance. Together, these comments elevated the conversation from procedural matters to fundamental questions about the future of democratic governance in a digital world, creating a more substantive and impactful dialogue than typical policy discussions.


Follow-up questions

How can parliamentarians effectively get good technological advice and find people within their jurisdiction who understand technology well?

Speaker

Paul Ash


Explanation

This was identified as a critical need for legislators to make informed decisions about technology issues, but the specific mechanisms for accessing such expertise were not detailed.


How can standardized data formats for transparency reports be implemented across platforms to allow civil society access and verification of claims?

Speaker

Paul Ash


Explanation

Paul mentioned this as important for transparency but didn’t elaborate on the technical implementation or governance mechanisms needed.


How can safe harbor data sharing frameworks with multi-stakeholder oversight be designed to address algorithmic transparency while respecting privacy?

Speaker

Paul Ash


Explanation

This was mentioned as work underway within the Christchurch Call but requires further development of the technical and legal frameworks.


How can jurisdictions strengthen existing policies that limit surveillance power during times of increased international cooperation on cybersecurity and cybercrime?

Speaker

Mallory Knodel


Explanation

Mallory emphasized this need but didn’t provide specific mechanisms for how jurisdictions can exceed minimum standards in international treaties.


How can countries that have signed the UN cybercrime treaty exceed the human rights safeguards suggested in that treaty?

Speaker

Mallory Knodel


Explanation

This was presented as an important opportunity but specific approaches for going above and beyond treaty requirements were not detailed.


How can effective fact-checking mechanisms be implemented globally while avoiding censorship issues?

Speaker

Pavel Popescu


Explanation

Pavel mentioned problems with non-genuine fact-checking and people being censored, indicating a need for better systems, but didn’t propose solutions.


How can AI be effectively used by platforms to eliminate systemic risks like cybercrime campaigns and harmful challenges?

Speaker

Pavel Popescu


Explanation

Pavel emphasized this as urgent but didn’t detail what specific AI implementations would be most effective or how to ensure they’re implemented.


How can smaller countries like Malaysia get better cooperation from large platforms when they don’t comply with local licensing requirements?

Speaker

Teo Nie Ching (Malaysian Deputy Minister)


Explanation

This represents a practical challenge for smaller jurisdictions in enforcing regulations on global platforms, but no solutions were provided.


How can like-minded countries collaborate to develop their own platforms that serve citizens’ needs according to their respective values?

Speaker

Former IT Minister (audience member)


Explanation

This was proposed as an alternative to struggling with existing platforms, but the practical mechanisms for such collaboration were not explored.


How can legislation effectively address the balance between criminal law enforcement and constitutional human rights protections in cybercrime cases?

Speaker

Egyptian Judge (audience member)


Explanation

This represents a fundamental challenge for legal systems but specific legislative strategies were not provided.


How can the transition from centralized to decentralized social media platforms be managed while addressing content moderation challenges?

Speaker

Mallory Knodel


Explanation

Mallory mentioned this transition makes content moderation more complex but didn’t elaborate on management strategies.


How can binding international treaties be developed to create consequences for big tech companies at the global level?

Speaker

Mallory Knodel


Explanation

This was suggested as a solution for accountability but the specific mechanisms for such treaties were not detailed.


Disclaimer: This is not an official session record. DiploAI generates these resources from audiovisual recordings, and they are presented as-is, including potential errors. Due to logistical challenges, such as discrepancies in audio/video or transcripts, names may be misspelled. We strive for accuracy to the best of our ability.