Parliamentary Session 5 Parliamentary Exchange Enhancing Digital Policy Practices
24 Jun 2025 15:30h - 16:30h
Session at a glance
Summary
This discussion focused on parliamentary approaches to regulating harmful online content while balancing digital safety with freedom of expression and human rights. Parliamentarians from Pakistan, Argentina, Nepal, Bulgaria, and South Africa shared their experiences with digital legislation and the challenges of creating effective regulatory frameworks.
Anusha Rahman Ahmad Khan from Pakistan highlighted the urgent need for social media platforms to respond more quickly to content removal requests, particularly regarding AI-generated harassment content targeting women and girls. She emphasized that delayed responses can have devastating consequences, including suicide, and called for platforms to be more culturally sensitive. Franco Metaza from Argentina’s Mercosur Parliament discussed harmful content including “fatphobia” that promotes eating disorders among young girls, and shared how disinformation led to an assassination attempt on a political leader, demonstrating the real-world dangers of fake news.
Yogesh Bhattarai from Nepal stressed the importance of regulating rather than controlling digital platforms while strengthening democratic institutions and maintaining constitutional protections for freedom of expression. Tsvetelina Penkova from Bulgaria outlined the European Union’s comprehensive legislative approach, including the Digital Services Act, Digital Markets Act, and GDPR, emphasizing human-centric digital transformation and the challenges of enforcement across 27 member states.
Ashley Sauls from South Africa discussed how disinformation about his country affected international relations and highlighted the need for balanced approaches that don’t infringe on privacy and human rights. The discussion also addressed youth engagement in policymaking, cybercrime legislation challenges, and the role of private sector companies in content moderation and capacity building. Participants emphasized the need for multi-stakeholder approaches, international cooperation, and the recognition of digital rights as a potential fourth generation of human rights.
Key points
## Major Discussion Points:
– **Platform accountability and content moderation challenges**: Multiple speakers highlighted the struggle with social media platforms’ responsiveness to government requests for harmful content removal, particularly regarding gender-based violence, harassment, and culturally sensitive content. Pakistan’s experience showed platforms treating regulatory requests as optional rather than legally binding.
– **Balancing freedom of expression with protection from harm**: Parliamentarians emphasized the need to protect vulnerable groups (especially women, children, and minorities) from online harassment, disinformation, and harmful content while preserving democratic freedoms and human rights. This tension between regulation and liberty was a central theme across different regions.
– **Legislative frameworks and enforcement challenges**: Speakers shared experiences with cybercrime laws, digital services acts, and content regulation, noting that having laws is insufficient without proper enforcement mechanisms and capacity. The EU’s comprehensive approach (DSA, DMA, GDPR, AI Act) was contrasted with implementation challenges in smaller countries.
– **Youth engagement and digital literacy**: The discussion emphasized involving young people in policymaking processes and the critical need for digital literacy programs to help users identify misinformation, develop critical thinking skills, and navigate online spaces safely.
– **Multi-stakeholder cooperation and capacity building**: Speakers called for enhanced collaboration between governments, civil society, private sector, and international organizations, with particular emphasis on the need for capacity building for parliamentarians and public officials to understand emerging technologies like AI.
## Overall Purpose:
The discussion aimed to facilitate knowledge sharing among parliamentarians from different regions about their experiences with digital governance, content regulation, and creating safer online environments while maintaining democratic principles and human rights protections.
## Overall Tone:
The discussion maintained a collaborative and constructive tone throughout, with speakers sharing both challenges and solutions from their respective contexts. While there were moments of criticism directed at tech platforms and concerns about enforcement gaps, the overall atmosphere remained professional and solution-oriented. The tone became slightly more technical and urgent when discussing specific harms (suicide, harassment, disinformation affecting democracy) but concluded on a forward-looking note emphasizing cooperation and shared responsibility.
Speakers
**Speakers from the provided list:**
– **Anusha Rahman Ahmad Khan** – Former Minister for Technology (Pakistan), worked on cybercrime legislation including the Prevention of Electronic Crimes Act 2016
– **Sorina Teleanu** – Session moderator/chair
– **Franco Metaza** – Parliamentarian from Mercosur (regional parliament of South America covering Brazil, Argentina, Uruguay, Paraguay, and Bolivia)
– **Yogesh Bhattarai** – Member of Parliament representing the Federal Democratic Republic of Nepal
– **Tsvetelina Penkova** – Member of the European Parliament representing Bulgaria
– **Ashley Sauls** – South African parliamentarian
– **Raoul Danniel Abellar Manuel** – Member of the Philippine House of Representatives
– **Bibek Silwal** – Advocate for youth in policy from Nepal
– **Olga Reis** – Private sector representative from Google, covers AI opportunity agenda for emerging markets
– **Anne McCormick** – Private sector representative from Ernst & Young (EY)
– **Amy Mitchell** – Representative from Center for News Technology and Innovation (United States)
– **Audience member** – Honorable representative from the Democratic Republic of Congo, Kinshasa
**Additional speakers:**
None identified beyond the provided speaker list.
Full session report
# Parliamentary Approaches to Digital Governance: Balancing Online Safety with Democratic Freedoms
## Executive Summary
This comprehensive discussion brought together parliamentarians from across the globe to examine the complex challenges of regulating harmful online content whilst preserving fundamental democratic principles. The session, moderated by Sorina Teleanu, featured representatives from Pakistan, Argentina, Nepal, Bulgaria, South Africa, and the Philippines, alongside private sector voices and civil society advocates. The dialogue revealed both shared concerns and divergent approaches to digital governance, with particular emphasis on protecting vulnerable populations, enhancing platform accountability, and fostering international cooperation.
## Key Themes and Regional Perspectives
### Platform Accountability and Cultural Sensitivity
The discussion opened with a powerful intervention from Anusha Rahman Ahmad Khan, Pakistan’s former Minister for Technology, who articulated a fundamental challenge facing governments worldwide. She emphasised that the core issue is not a geopolitical struggle between East and West, but rather “a fight between revenue generation entities versus a revenue curbing request.” This economic framing of platform behaviour resonated throughout the session, highlighting how social media companies prioritise profit over cultural sensitivity and public safety.
Khan explained that “every single post on the social media platform is a revenue generating mechanism,” which creates inherent conflicts with content moderation requests. She shared disturbing examples of platforms’ inadequate responses to government requests for content removal, particularly regarding AI-generated harassment targeting women and girls. She noted that delayed responses can have devastating consequences, including suicide, and called for platforms to demonstrate greater cultural awareness in their content moderation decisions.
Khan also highlighted Pakistan’s innovative approach to AI governance, including an AI-powered Senate chatbot project. She referenced Pakistan’s Prevention of Electronic Crimes Act, which was developed over two years starting in 2014 and enacted in 2016. Her frustration was palpable as she declared: “We are now tired of waiting and I would urge and request all the other parliamentarians to come together to make a joint strategy where we can collectively speak to the social media platforms.”
Franco Metaza from Argentina’s Mercosur Parliament, the regional parliament of South America comprising Brazil, Argentina, Uruguay, Paraguay, and Bolivia, with 100 parliamentarians, reinforced these concerns with specific examples of harmful content. Speaking in Spanish, as he had announced he would, he detailed how platforms promote “racism, xenophobia, homophobia, explicit violence, banalization of the use of drugs, and fatphobia.” He provided particularly disturbing examples of “fatphobia” affecting young girls, noting that 13- and 14-year-old girls in Brazil are seeking aesthetic surgeries due to harmful content promoting unrealistic body standards.
Metaza also shared how disinformation led to an assassination attempt on a political leader, demonstrating the real-world dangers of inadequately moderated content. He offered a compelling metaphor, comparing unregulated social media consumption to “going at full speed with a vehicle without knowing what is in front of us or without having a traffic light,” which helped frame regulation as a safety necessity rather than freedom restriction.
### Legislative Frameworks and Implementation Challenges
The discussion revealed significant variation in legislative approaches across different regions. Tsvetelina Penkova from Bulgaria outlined the European Union’s comprehensive strategy, including the Digital Services Act (DSA), Digital Markets Act (DMA), GDPR, the European Democracy Action Plan, Media Freedom Act, and the emerging AI Act. Despite criticism of the AI Act, she defended it as “probably the best one which protects people” while ensuring “innovation and growth” alongside “protecting citizens’ rights.” She emphasised the EU’s commitment to human-centric digital transformation whilst acknowledging the substantial challenges of enforcement across 27 member states with different legal traditions and capacities.
In contrast, Yogesh Bhattarai from Nepal advocated for a more collaborative approach, arguing that “digital platforms should be regulated, not controlled.” He stressed the importance of cooperation and collaboration rather than strict governmental control, whilst ensuring constitutional compliance with freedom of speech guarantees. Nepal’s approach involves engaging youth through national and internet governance forums in legislative processes. Bhattarai noted Nepal’s linguistic diversity, with “125 languages and 125 castes,” which adds complexity to content moderation challenges.
Ashley Sauls from South Africa offered multilingual greetings and highlighted his country’s multi-faceted legislative response, including the Protection of Personal Information Act, the Cybercrimes Act, and the Films and Publications Act. He emphasised the importance of multi-stakeholder approaches and warned against policies that might infringe on privacy and human rights. Sauls also introduced the concerning concept of “digital apartheid,” highlighting how AI training can perpetuate historical biases and discrimination.
Sauls shared a powerful example of how disinformation about “white minority genocide in South Africa” affected US government decisions and led to the cancellation of a rugby match between Atlanta Secondary School and Liff Burra Grammar School due to safety concerns. He quoted the Minister of Sport’s philosophy that “a child in sport is a child out of court,” emphasising sport’s role in social cohesion. He also noted South Africa’s recent transition from majority government to a “government of national unity.”
The Philippines’ experience, as shared by Raoul Danniel Abellar Manuel from the House of Representatives, provided a cautionary tale about legislative overreach. He criticised the country’s Cybercrime Prevention Act of 2012, particularly its cyber libel provisions that have been misused against journalists and teachers, demonstrating how well-intentioned legislation can be weaponised against legitimate expression.
### International Cooperation and Human Rights Framework
A particularly thought-provoking intervention came from the representative from the Democratic Republic of Congo, who proposed that the United Nations should recognise digital rights as a fourth generation of human rights. Speaking in French, he argued that such recognition would provide a common framework for national legislation, similar to existing human rights generations, and could lead to constitutional incorporation in many countries.
This proposal received immediate support from Tsvetelina Penkova, who acknowledged it could resolve many enforcement challenges currently faced by individual nations. The suggestion elevated the discussion from practical policy implementation to fundamental questions about the nature of rights in the digital age.
### Youth Engagement and Digital Literacy
A significant portion of the discussion focused on the critical role of young people in digital policymaking. Franco Metaza argued that “youth participation should be transversal across all policy-making rather than segregated into youth-only discussions,” advocating for integrated rather than separate consultation processes.
Bibek Silwal, an advocate for youth in policy from Nepal, emphasised that young people serve as “positive catalysts in policy implementation” and should be involved from initial policymaking through public outreach. He highlighted the importance of digital literacy programmes and critical thinking skills development to help users identify misinformation.
Tsvetelina Penkova noted that young people understand the economic implications of poor regulation, recognising that “the digital economy will shrink without proper regulation.” This insight challenged assumptions about youth attitudes towards digital governance, suggesting greater sophistication in their policy preferences than often assumed.
### Private Sector Responsibility and AI Governance
The session included notable contributions from private sector representatives, creating moments of both tension and unexpected consensus. Olga Reis from Google presented current content moderation efforts, citing statistics about video removal from YouTube, with “55% removed before being watched, 27% removed with less than 10 views.” She also mentioned Google’s AI Campus programme, which has already trained “500,000 officials” in AI literacy.
Anne McCormick from Ernst & Young highlighted the private sector’s need for clarity on AI liability frameworks as adoption spreads beyond large technology companies to smaller economic actors. She emphasised the importance of independent oversight and transparency mechanisms throughout the AI lifecycle.
The discussion revealed interesting tensions in platform relationships. Franco Metaza both praised YouTube Kids as a successful model of controlled digital ecosystems for children whilst simultaneously criticising Google for allowing defamatory content in search results. He specifically cited how searching for Cristina Fernández de Kirchner showed “ladrona de la nación argentina” (thief of the Argentine nation) in Google’s knowledge panel, demonstrating inconsistent standards across different Google services.
## Areas of Consensus and Disagreement
### Strong Agreements
The discussion revealed remarkable consensus on several fundamental principles. All speakers agreed that social media platforms need to take greater responsibility for content moderation and harm prevention. There was universal acknowledgement that protecting vulnerable populations, particularly children and women, must be a priority in digital governance frameworks.
Youth engagement emerged as another area of strong agreement, with all speakers supporting meaningful integration of young people into policymaking processes. Similarly, there was consensus on the necessity of multi-stakeholder approaches involving government, civil society, private sector, and international organisations.
### Key Disagreements
Despite broad agreement on principles, significant disagreements emerged regarding implementation approaches. The most notable tension concerned decision-making authority for content removal, with Anusha Rahman Ahmad Khan advocating for stronger government authority whilst Tsvetelina Penkova insisted on judicial oversight to prevent government overreach.
Speakers also disagreed on the appropriate level of regulation, with Yogesh Bhattarai emphasising light-touch regulation focusing on cooperation, whilst Franco Metaza supported stronger parliamentary regulation comparable to traffic laws. These disagreements reflect deeper tensions between national sovereignty and international coordination.
## Unresolved Challenges and Future Directions
### Implementation and Enforcement
The discussion highlighted persistent challenges in translating legislative frameworks into effective enforcement. Multiple speakers noted that having laws is insufficient without proper implementation mechanisms and capacity. Cultural sensitivity in content moderation remains particularly challenging, with platforms making uniform decisions without considering local contexts that could have severe consequences for users.
### Capacity Building and Education
Several speakers emphasised the critical need for capacity building amongst parliamentarians and public officials to understand emerging technologies like AI. Digital literacy emerged as equally important for general populations, with speakers calling for educational campaigns to help users identify misinformation and develop critical thinking skills.
### Economic and Social Justice Considerations
Ashley Sauls’s introduction of the “digital apartheid” concept highlighted how AI systems can perpetuate historical injustices and create new forms of discrimination. This concern extends beyond technical bias to fundamental questions about who benefits from digital transformation and who bears its costs.
## Recommendations and Action Items
### Immediate Actions
Parliamentarians agreed on the need to create joint strategies for collectively addressing social media platforms, recognising that individual national approaches lack sufficient leverage against global technology companies. Educational initiatives emerged as a priority, with speakers calling for campaigns to teach young people to identify fake news and develop critical thinking skills.
### Medium-term Developments
The proposal for UN recognition of digital rights as a fourth generation of human rights represents a significant medium-term objective that could provide clearer frameworks for national legislation. Platform accountability mechanisms need strengthening, with the YouTube Kids model suggested as a template for broader child protection measures.
### Long-term Structural Changes
The discussion pointed towards the need for coordinated international frameworks rather than individual national approaches. The integration of digital rights into constitutional frameworks could provide more robust protection against governmental overreach whilst ensuring consistent protection standards.
## Conclusion
This parliamentary discussion revealed both the urgency and complexity of digital governance challenges facing democracies worldwide. The session’s most valuable contribution was its reframing of digital governance from technical issues to fundamental questions about power, economics, and human rights in the digital age. Anusha Rahman Ahmad Khan’s characterisation of the struggle as one between “revenue generation entities versus revenue curbing requests” identified core tensions that must be addressed.
The proposal for digital rights as a fourth generation of human rights offers a potential framework for achieving balance between competing interests, but implementation will require unprecedented levels of international coordination. As parliamentarians continue to grapple with these challenges, the experiences shared provide valuable insights into both successful approaches and cautionary tales about legislative overreach.
The path forward requires sustained commitment to multi-stakeholder dialogue, international cooperation, and innovative approaches that can balance platform accountability with democratic freedoms whilst protecting the most vulnerable members of society.
Session transcript
Anusha Rahman Ahmad Khan: But there is this responsibility that goes with governments: that digital progress is upholding human dignity, upholding democratic freedom and giving access to everybody. We experienced in Pakistan, for example, that even when we made the law, the social media platforms continued to treat our requests as if they were two kilometres above the ground, not impacted by the law. And they decided to choose what content they were going to remove and what content they were going to keep on their platforms. So there has been, and is, a continuous issue in our country when the regulator sends out a request to remove content. I will give you an example: this content relates to a girl, a university student, who is being harassed by somebody, and they have created content with AI or other means which looks real. By the time that content is removed, the life of that girl is gone. When I was legislating, I had dozens of examples of girls actually jumping off walls, killing themselves, committing suicide and so on. And in my country, even an aspersion on a girl is enough to kill her. So even if they do not die physically, they are dead emotionally. We need to be very careful of the fact that the social media platforms have to be sensitive to the culture in which we are living. And this is the real challenge: how to make the social media platforms sensitive to cultures. For them, it is revenue. Every single post on the social media platform is a revenue generating mechanism. It is not a fight between East and West. It is a fight between revenue generation entities versus a revenue curbing request.
So we need and expect that on this platform today, where the UN convenes through the IGF, we can together use technology platforms without fearing that their content is going to be harmful for our children, for our girls, for our women, for the vulnerable. Believing in technology, and as a former minister for technology, we all believe that we have to explore, and we promise and want to ensure that we are going to use technology to shape the future of legislation and transparency and to bring more efficiency and effectiveness to the way parliamentarians work. In the Senate of Pakistan, for example, advancing the vision of technology adoption, the chairman of the Senate has for the first time taken a concrete step towards developing an AI-powered Senate chatbot. It is a virtual assistant designed to support lawmakers, secretariat staff and citizens with real-time access to legislative data, procedural guidance and multilingual services. This project proposes full-scale design, development and institutional deployment of the Senate chatbot, transforming it from a promising prototype into a high-impact digital parliamentary assistance tool. So we are working with technology at the same time. My concern, and your question, would still take us back to this discussion table: for how long are we going to wait for technology to be used and absorbed positively while it continues to be abused online by vested stakeholders, and how long will social media platforms take to listen to governments and their requests to remove objectionable content and to secure vulnerable and non-vulnerable groups equally online?
And we are now tired of waiting and I would urge and request all the other parliamentarians to come together to make a joint strategy where we can collectively speak to the social media platforms and help our vulnerable citizens in our respective countries, to ensure that their rights are as secure online as they are offline. That is my humble request. Thank you.
Sorina Teleanu: Thank you so much for bringing to the table so many issues. If I may ask a follow-up question, you mentioned you work on a cybercrime law. Was that law passed already in the parliament? Sorry, Ms. Rahman? If I may ask a follow-up question. No, I have to, your voice is actually echoing, yes. Yeah, you have to listen. Yeah. Let me also remove this. So, if I may ask a follow-up question. You mentioned you were working on a cybercrime law. Was it passed in the parliament? Is it approved?
Anusha Rahman Ahmad Khan: Yes, it was made in 2016. I started working on it in 2014. It took me two years, in a collaborative effort, bringing all the parliamentarians on board, listening to civil society, to the NGOs, to the independent groups, to the media, because, as I said, anything that happens on the Internet is perceived as an activity that is going to curb the freedom of expression, which is not the case. It is not about curbing freedom of expression. We believe in it and we uphold it. It is about protecting children, women, girls, and all vulnerable segments, and in this case men and boys are equally vulnerable, because the dignity of a natural person is extremely important to protect. This is what we made the media understand: that we are not targeting electronic media, we are not targeting print media. We are talking about social media, which is full of misinformation, propaganda, and fake news. And we need to protect our citizens from this, because it leads to harassment and other kinds of vulnerable situations, which need to be looked at and criminalized in our law. So, in 2016, we made the cybercrime law. It is called the Prevention of Electronic Crimes Act, and it is a consensus document of the entire 240 million people, represented in parliament by their MPs in both the National Assembly and the Senate.
Sorina Teleanu: Thank you. I’ll get back later to you with one more question, but let me give the floor to Mr. Metaza to share your experiences.
Franco Metaza: Hello. Good afternoon to everybody. I am going to speak in Spanish, which is one of the official languages of my regional parliament. I want to start by thanking the Department of Economic and Social Affairs of the United Nations, the Parliament of Norway, and the Inter-Parliamentary Union (IPU) for organizing this important parliamentary track in the framework of the Forum for Global Internet Governance. My parliament is the Parliament of Mercosur, the regional parliament of South America. It is made up of Brazil, Argentina, Uruguay, Paraguay, and Bolivia, which is completing its internal procedures to become a full member. We are 100 parliamentarians, and at the moment we are having a very, very heated debate in our region on these topics. To answer one of the questions that Sorina asked at the beginning of this panel: the harmful content that we see with great concern in our region, we are starting to list and, in some way, to codify. We are talking about racism, xenophobia, homophobia, explicit violence, the banalization of drug use, and one in particular, since you asked for examples, Sorina. I am going to give an example from a bill that I presented in my parliament regarding something that perhaps does not have an exact translation into English. We are talking about fatphobia. What we are proposing is that there is content on social media, specifically on Instagram and on TikTok, so harmful that it leads young girls, the general population, but we are very concerned about young girls, to behaviors that result in anorexia and bulimia, in what we call eating disorders. In this sense, we are very concerned about what these images generate.
We have done tests: registering on social media as a 13-year-old girl, the content you are bombarded with is images, some real, others fake and made with artificial intelligence, of extremely skinny bodies, impossible to achieve naturally, followed by advice on extreme diets and then advice or advertisements about surgeries. In Brazil, for example, 13- and 14-year-old girls have started going to the doctor on their own to ask about the possibility of aesthetic surgery. We are entering a very, very complex situation in terms of harmful content. And regarding this dichotomy between regulation and freedom of expression, I want to tell you that regulations, when they are adopted in parliaments where all social sectors and all political expressions are represented, will never go against freedom, because the regulations express the will of the majority. I will give you an example. Cars, motor vehicles, began to appear in our societies, in the real world. We had to set a speed limit. We had to prohibit children from driving. And that was not against anyone’s freedom. Well, I think that the permanent scrolling we are all subjected to today is as harmful, or more so, as going at full speed in a vehicle without knowing what is in front of us or without having a traffic light. And finally, I want to pick up something that I heard a lot yesterday and again this morning, and which seems to me a concern we all share. There are many stakeholders here in the auditorium, but it is something I heard especially from parliamentarians: how the other issue, disinformation and fake news, affects our democracy. Democracy that we, as parliamentarians, have the obligation to protect and care for. Look, I am going to give you an example.
In my country, Argentina, fake news began to circulate systematically about one of our leaders, who was president twice and is the main leader of the opposition, Cristina Fernández de Kirchner: accusations of corruption, with many hate messages, and this went viral across the networks. What ended up happening? A person who had consumed so many of these hate messages appeared at the door of her house and shot her in the head. Fortunately, the gun did not fire, but look how far fake news and disinformation can go. Today, she is detained under the current president, amid a great confusion of fake news and disinformation. So we have the obligation, as parliamentarians, to put a stop to this, or at least to temper what social networks are, what an internet without governance is, as the senator said here, in order to protect our democracies. It is not fair that our democracies should be undermined. In this complex moment, when at any time a nuclear bomb could explode, the only thing we have left to safeguard humanity is our democracies. Let us please take care of them. Thank you.
Sorina Teleanu: Thank you also, including for bringing up some of the metaphors you raised. I like the one about having rules for the use of social media, similar to having rules for driving a car on the road. We can unpack that a little later. One curiosity, within the parliament of Mercosur, are you having these kind of debates, and are you looking into doing something collaboratively across the countries in dealing with harmful online content and creating more safe online? Again, not necessarily only in terms of legislation, but also looking at how to build more awareness, more capacity among users themselves, so they can be better prepared to deal with harmful content. Because, I guess, not all the answers are in passing a law and then expecting for it to be applied, but also seeing how you can prepare people to deal with these kind of things. Any reflections?
Franco Metaza: Yes, Sorina, what you ask is important. Today, in the Parliament of Mercosur, if there is something on which we have consensus, it is that companies can do more than what they are doing. They have the budget, they have the money, and they are not making enough of an effort to put a stop to harmful content and fake news. On that we are absolutely in agreement. We believe that is where we have to go. Thank you.
Sorina Teleanu: I think I'm already seeing a common thread about more responsibility for the private sector. I know we have some private sector representatives in the room, and I think we will want to hear from them as well, but more on that later. All right, let us continue, and let's hear from Mr. Bhattarai, please.
Yogesh Bhattarai: Thank you very much, Sorina. Excellencies, distinguished delegates, fellow parliamentarians, ladies and gentlemen, friends from the media. Good afternoon. It is a profound honor to be here today. Representing the Federal Democratic Republic of Nepal, I extend my sincere gratitude to the organizers, especially the UN and the government of Norway, for creating this space where we can share our experiences and our collective wisdom. The topic before us, building a healthy information ecosystem, is one of the defining challenges of our era. This is not only a technical issue; it is a milestone for our ongoing democratic journey. Like many of you, we have witnessed the transformative power of the digital age. The Internet and social media have opened up avenues for expression, connected our diverse communities, and given citizens a powerful platform to engage in the civic life of our nation. It has been a remarkable force for democratization. However, this progress is accompanied by complex challenges. We grapple with the very real harms of disinformation that can tear our social fabric, and the rise of online harassment that seeks to silence vulnerable voices. These are legitimate concerns that every responsible government must address. In Nepal, we are currently in the midst of a profound national conversation about how best to strike the balance between upholding freedom of expression and protecting our citizens from harm. This debate is reflected in the legislative proposals currently under discussion, including the proposed Social Media Bill and the Information Technology Bill. These proposals stem from a genuine desire to create a safe digital environment. As a parliamentarian committed to the universal values of human rights and people-based democracy, I believe we must proceed with the utmost care. 
The Constitution of Nepal guarantees freedom of speech and expression, and Article 19 establishes the right to information and communication as a fundamental right. Parliament will not accept any law that contradicts the provisions of the Constitution. In Nepal, we have the National Information Commission and the Press Council Nepal as independent oversight agencies. I and other MPs have been participating in programs organized by civil society organizations, where there are discussions on the right to information and communication. We are concerned about the negative impact of misinformation and disinformation on society. Everyone should be aware of the possibility that it can divide society by spreading confusion about caste, religion, race, gender, and profession. Misinformation and disinformation are also having an impact on the tensions and wars taking place in different parts of the world today. This has also become a challenge for national security. I am convinced that only civil liberties, human rights, open societies, democratic competition, equal access, and citizen resilience can make the state accountable to its citizens. The challenges brought about by the revolution in the digital sector should strengthen the sovereignty of the nation and the people. It should support world peace and humanity. For this, digital platforms should be regulated, not controlled. Cooperation, collaboration, and solidarity should be strengthened. I am firmly convinced that the most effective and sustainable path forward lies in the empowerment and strengthening of democratic institutions. We believe that only a healthy information ecosystem can make healthy democratic practices strong and accountable. I believe that the Internet and digital platforms will connect people's hearts. They will make life easier. They will bring marginalized communities into the mainstream. Let us reaffirm our collective commitment to this principle. 
Let us share not just our challenges, but our highest aspirations. I am confident that through the collaboration and the shared dedication to human rights, we can build a digital future that is not only safe and orderly, but also open, vibrant, and fundamentally free.
Sorina Teleanu: Thank you so much. I took quite a lot of notes while you were speaking, and I would like to get back to some points, maybe also later. But right now, I like how you said that we need responsible governance, and that states can and should be accountable to their own citizens. I think those are important points to keep in mind when you work on legislation. And because you said that social media has to be regulated but not controlled, my question would be: how are you interacting with technology platforms as you're working on this legislation in the country? Do you have any discussions with them? How is that relationship going?
Yogesh Bhattarai: Yes. Recently, the government submitted the bill on social media and digital platforms, and we discussed it with many stakeholders from different sectors and different organizations, and the government requested suggestions from the different stakeholders. It is ongoing, not yet concluded. So I hope we will make a more effective law on social media and internet access, and especially on cybersecurity as well.
Sorina Teleanu: Thank you. Moving on to Ms. Penkova. Thank you.
Tsvetelina Penkova: Thank you, Sorina. I will touch upon the question about specific legislation, because we heard many examples, but we have the strong belief that Europe is still playing the leading role when we are speaking about digital legislation. Of course, we are still at the stage of implementation, but let's keep in mind that when we speak about EU legislation, we are representing 27 member states. So, if you allow me, I would start by emphasizing some of the key and most important EU legislations, and of course it's not going to surprise many people in the room, as it was mentioned many times. I'm starting with the Digital Services Act, which is the flagship EU legislation when we're speaking about the regulation of the digital space. The DSA is meant to tackle a lot of the issues and problems that were already mentioned throughout today's discussions: protecting minors and vulnerable groups, tackling cyber violence, tackling harmful content and disinformation. So everything, more or less, is part of the content of this very key and important legislative framework of the EU. But, of course, it cannot act on its own, so we need some supporting legislation, and that's where we come to the Digital Markets Act, for instance, which complements the DSA by ensuring fair competition in the digital economy. We believe this is key: the DMA promotes greater choice for consumers and the interoperability of digital service providers. Here, I also have to mention the Data Governance Act and the GDPR. Those are key legislations to reinforce individuals' control over their data, while at the same time promoting trustworthy data sharing. So, we are speaking about protecting human rights in the digital space. 
When I mention the GDPR, I'm sure it is probably one of the most popular pieces of EU legislation, and the most controversial one, but it also needs improvements, updates, and additions. Only last week, for instance, in the EU institutions, we finished the negotiations on the procedural rules for handling cross-border cases. So, when we're speaking about digital legislation, you have to be very pragmatic: the problems we're resolving today will be very different in tomorrow's reality. We have to be very flexible, and that has been extremely challenging for regulators across the world, I would say, because you cannot always foresee the challenges in such a fast-growing and developing segment of the economy as the digital field. I've spoken a lot about human protection and the main legislative framework that the EU provides, but allow me to mention two other key initiatives that we are working on, both focused on media freedom; one of them is still a work in progress. The European Democracy Action Plan is actually a plan, not necessarily legislation. It aims to strengthen media freedom while promoting pluralism. Basically, it combats disinformation in very specific cases that we have been observing quite a lot, especially in the context of elections, for instance, and foreign interference. Those are quite significant global challenges that we see at the moment. And the Media Freedom Act, which is at the moment under negotiation, tries to protect the independence of the media, and transparency in media ownership and advertising as well. 
So, there are many, many specific examples which are trying to resolve the common problems that we are facing across the globe, but if you ask me to summarize the key priorities we have as Members of the European Parliament when we're working on those legislations, I'll focus on four main ones for today's session, although there are many more. The first: we really try to keep to a human-centric digital transformation, that is, a digital transition that protects citizens' rights. The second: combating online hate and disinformation. As I've mentioned, the DSA's main goal is probably to ensure that there are stronger enforcement mechanisms against cyber violence, and we can go into more detail if needed. Third, and it was mentioned many times: digital literacy and resilience. Without tackling this issue, none of those legislations or enforcement efforts would be perceived, accepted, and successful. So the EU strategy at the moment focuses quite significantly on digital education. And last, but not least important, of course, is children's online safety. There are many sessions dedicated to this at the IGF. Tomorrow we're having one with the Youth IGF, and the younger generation really has a very significant role to play in ensuring that this protection is actually targeted at the most vulnerable and at minors. And if you allow me the last 30 seconds: when I'm speaking from the EU perspective, as I said, we have to take into account that we have 27 member states, and each one of them has a very different experience in enforcing those legislations. I'm from a small member state, actually; I'm from Bulgaria. So we're still struggling, actually, to enforce and implement all those legislations that I've listed. 
But we have a very active civil society sector that at the moment, for instance, is launching a lot of campaigns teaching the younger generation to identify fake news and to apply a bit more critical thinking. What I mean by that is basically trying to locate and analyze the sources of the information, where the information is coming from, because we have observed that the younger generation lacks that questioning: they just see the information and they accept it. I wanted to mention that example because, yes, we do face a lot of challenges, and some of those legislations are very, very complicated, but once you have state support, regional governance, and an active civil society, nothing is impossible.
Sorina Teleanu: Thank you so much for raising, I think, two important points. First of all, being the one on enforcement, it’s one thing to have laws in place, but then are authorities at the national level, as you’re saying, empowered to actually put that law into practice? I’m from your neighboring country, Romania, and I’m seeing the same challenges. It’s not easy to put in place all of that. Excellent points on literacy and capacity building and education and building critical thinking in young users, but also all of us. I think we would all benefit from a bit more critical thinking when we interact with digital technologies. If I may add one more thing that you might want to reflect on, how is the AI Act also connecting to all of these when it comes to more safe online environments and more transparency from the side of the private actors, for instance?
Tsvetelina Penkova: I'm sure maybe some people would not agree with me, but I think the AI Act, as it stands in European legislation, is probably the best one which, again, protects people. Of course, we've seen a lot of criticism, but what we want to do is ensure that there is innovation and growth while the first priority remains protecting citizens' rights and ensuring that there is enough time for consumers to understand all the risks and challenges before we put a technology into very wide use. I think this was the protection mechanism and the way of thinking of the European Parliament when we were working on that legislation, and that's why it faced a lot of criticism.
Sorina Teleanu: Thank you. I think we can also come back to that later. All right, let's hear from our final, but not least, speaker, Mr. Ashley Sauls, please.
Ashley Sauls: Thank you very much, Sorina. Esteemed parliamentarians, distinguished guests and fellow stakeholders in the realm of digital governance: being from a country where our constitution protects religious freedom, in line with my faith I want to greet you in the name of our Lord and Saviour, Jesus Christ, and in my First Nation Indigenous Bushman mother tongue, at the brink of extinction, Mwenke Awunneki. It is with great honour that I address you today at this pivotal forum where we converge to deliberate on the future of digital policy practices. As we navigate the complexities of our increasingly interconnected world, South Africa stands as a testament to the transformative potential of digital technologies while simultaneously confronting unique challenges that demand thoughtful and inclusive policy frameworks. A practical example is the recent disinformation about a white minority genocide in South Africa, where an executive decision by the US government was largely made based on online information. The ripple effect of that was also a fuelled narrative about my race, classified as coloured, as a violent and gangster-ridden group. As a result, a local school, Atlantis Secondary School, was to host Loughborough Grammar School from the UK in South Africa for a rugby match this July, but it was cancelled, because parents feared what was said in the White House about our country. Our current Minister of Sport says a child in sport is a child out of court. My fellow parliamentarians from the UK, maybe you can help us convince management and parents to contribute to a different narrative through sport by ensuring that that match actually takes place. I hope somebody is listening to me about that. In South Africa, we recognise the profound impact of the digital landscape on our socio-economic fabric, with over 60% of our population accessing the internet. 
We see an unparalleled opportunity to enhance educational outcomes, stimulate economic growth and promote social inclusion. However, these opportunities are accompanied by significant obstacles, including digital divide issues, cyber security threats and the need for robust regulatory frameworks that protect the rights of all citizens. Thus far, we have enacted the Protection of Personal Information Act, the Cybercrimes Act and the Films and Publications Act, which regulate these platforms. To effectively enhance our digital policy practices, we adopted a multi-stakeholder approach that engages government, civil society, the private sector and the public. This collaborative model is essential for fostering an inclusive digital economy where benefits are equitably distributed and innovation is harnessed to address local challenges. Moreover, as we advocate for robust cyber security measures, we must ensure that they do not infringe upon the rights to privacy and freedom of expression. South Africa is committed to aligning its digital policies with international standards, promoting a balanced approach that safeguards both security and fundamental human rights; as we would say in my mother tongue, goutes moet balance (things must balance). By leveraging our position as a member of the African Union, we aim to encourage a continental dialogue that addresses common challenges and fosters regional cooperation in the realm of digital policy. In conclusion, the South African experience underscores the need for a proactive rather than the current reactive approach to digital governance. As we gather here today, let us reaffirm our commitment to a digital future that is inclusive, secure and respects the rights of all citizens. Together, we can craft policies that not only uplift our respective nations but also contribute to a more equitable global digital landscape that expresses the heart of politics: one that prioritizes people above profits. Eyo. Thank you. Siyabonga kakhulu.
Sorina Teleanu: Thank you as well. Also for highlighting what some of the previous speakers have mentioned, the fact that there can be and there should be a balance between protecting safety and security and also ensuring protection of human rights. And we don’t have to give one up to protect the other and the other way around. So thank you for highlighting that again. You mentioned the African Union, so my follow-up question to you would be, are there any examples of initiatives or discussions or projects being implemented or even put in place right now at the African Union level dealing with these issues that you might want to share with everyone? Sorry, again, I’m doing that. Apologies.
Ashley Sauls: Well, not specific programs. I don’t think it is as aggressive as one would want it to be. But I’d like to rather touch on the IGF approach. We have the South African Internet Governance Forum. And on that level, regionally and also continentally, there’s a lot of programs and initiatives around that. And which, for the first time, I think, because of that approach, we now have us as parliamentarians participating, I think it’s for the first time, that we actually join in the forum. And it is because of those engagements on that level.
Sorina Teleanu: Thank you. I'm glad to have you on board. All right. We're kind of running out of time, and there are interventions to be made from the room. So let's see. If you could introduce yourself, please. Thank you, Sorina.
Bibek Silwal: My name is Bibek Silwal and I'm from Nepal. I'm an advocate for youth in policy. Thank you very much to all of the members of parliament and senators. I think it was very enlightening in terms of the work that has been going on, not just across the region, but across the continents, with very different focus areas and different reasons. My question is regarding the involvement of youth in digital policymaking, or, let's say, just policymaking. In each and every process where youth are involved, I think it amplifies the impact of the policymaking, whether in terms of implementation or in the initial policymaking process. Youth are always a positive catalyst for reaching the last mile. So my question is to all of the parliaments in your regions: how have you been involving youth, and how do you plan, in the days to come, to engage the youth in policymaking, or, let's say, in public outreach and awareness programs? Thank you.
Sorina Teleanu: Thank you. Let’s take the second question as well.
Raoul Danniel Abellar Manuel: Hello to fellow parliamentarians. I'm Raoul from the Philippines, a member of the Philippine House of Representatives. I'd like to ask for your ideas or maybe concrete experiences in combating cybercrime, because in the case of the Philippines, we've had the Cybercrime Prevention Act since 2012. Unfortunately, it contained a cyber libel provision that we already flagged years ago. We foresaw that it could be used by abusive or repressive leaders, and it turned out that it was actually used in recent times to go after journalists, even teachers, who just had opinions that contradicted those of the government of the day. So right now we are discussing a possible review of and amendments to the cybercrime law that we have, and we really are very guarded with that process. So there might be thoughts coming from fellow parliamentarians. Thank you.
Sorina Teleanu: Thank you as well. Shall we take these two questions and then continue? Anything on youth engagement and experiences with cybercrime law? Anyone would like to share?
Franco Metaza: Well, regarding young people, in the case of my country, we have a lot of encouragement for the participation of young people. You can vote from the age of 16 and we are proud to have a large number of young parliamentarians in our two chambers, in the House of Representatives as well. So we understand the participation of young people from a transversal point of view. Every time we make a law, when we listen to all parties, we obviously listen to young people. What we don’t like is to segregate young people. I mean, well, young people talk among young people and solve the problems of young people and the rest of the world follows separate paths. For us, youth must be transversal to political construction in society.
Tsvetelina Penkova: The young people had many key messages, but the ones I remember were these: the digital economy will shrink if we don't regulate it. So the young generation understands that, and they want to be involved, they want to be asked, and they want to be part of the conversation. Children do require protection, but not protection of a kind that limits their rights. An interesting point they brought up, actually, was that bloggers are not restricted in terms of the content they're publishing, so this is something that regulators should pay attention to; they signaled that. The young generation is very much ahead of many of the legislators when it comes to new trends, and they are an active stakeholder, so it's not a matter of whether they want to be involved; it's a matter of us going there, reaching out, and asking them. And on the comment on cybercrime, I'm just going to give an example of how it is dealt with in the DSA, for instance, when we're speaking about cyber violence: illegal content has to be taken down from the platforms without delay once it's detected, and this decision has to be taken by a judge, not by the government. So this is one way to tackle the specific example that was given.
Yogesh Bhattarai: In Nepal, you know, Nepal is a very specific country, because we have so many languages, some 125 languages, and many different castes and ethnic groups, and Nepal lies between China and India, two very big countries, so that issue is very special for us as well. And youth engagement is very important. We have a national Internet Governance Forum, and also the Youth Internet Governance Forum in Nepal, so our parliamentarians engage with them in making laws and in other legislative processes. And the second thing, on the cybercrime issues: we have a cybercrime law, and there is a branch within the police, a cybercrime branch, which investigates cybercrime issues and submits cases to the court, and the court judges the cybercrime cases.
Ashley Sauls: Thank you, Sorina. I think maybe I should start off by saying, let me be honest and transparent: in South Africa, we haven't really had an emphasis on youth involvement as much, but with the shift in our governmental structure nationally, that's beginning to change. Many of you would know that we had a one-party majority government since 1994, and now there is what is called the Government of National Unity, a coalition government, so there are different lines and different approaches coming together to form one government, and this has helped make youth involvement practical. The leaders of the South African IGF, both the chairperson and the deputy chairperson, are young people, and they are here at the conference. And like I said, this is the first time that we as parliamentarians are forming part of this, and that is because of this different approach to government, which now includes the voices of young people. And because we listen, I'm here today. So I think that's a good step in the right direction for South Africa.
Sorina Teleanu: Thank you, everyone, for sharing the experiences. I think there were a few more points, if you would like to get back to the mic before we wrap up the session. There’s a mic next to you.
Audience: Thank you very much, Madam. I am an honourable member from the Democratic Republic of the Congo, Kinshasa. I would like to raise a question, if I may, in French. I would like to raise a somewhat more technical and really intense question, because every time we talk about legislation and human rights, we all know that, while it is true that we are legislators, not just anyone votes on or writes a law. And it has been said here that most parliamentarians are not well equipped. But I would like to recall here the role that the United Nations has played in the recognition of human rights. When the United Nations understood the realities of the world, around 1945-1948, they spoke of civil and political rights and recognized them as the first generation of human rights. Then came the rights linked to the economy and to labour: they created the second generation of human rights, economic and social rights, if we can put it that way. Then, speaking of ecology, the United Nations recognized the challenges of the moment and created the third generation of human rights, the collective rights linked to the environment. But here it is clear that, concretely, the United Nations has not yet recognized digital rights as part of a fourth generation of human rights. Why do I say this? Because when we read the constitutions of our countries, we see that many countries have taken up the work of the United Nations as fundamental rights. It is clear that if we speak of civil and political rights, we know what we are talking about. If we speak of economic rights, we know what we are talking about as well. 
But when it comes to digital rights, we speak of human rights, we speak of legislating, but in reality we have not yet sat down around a table to say that digital rights are indeed part of a fourth generation of human rights. If the United Nations takes that step, you will see that many countries will follow, and perhaps it will even be inscribed in our constitutions and in our laws. Because, as I said earlier, legislating is good, it is an aspiration, but one must have the technique to legislate. And it is clear that our contribution as parliamentarians is that we must have a common element, a common instrument, as we have for the three generations of human rights, for this generation of digital rights. Thank you.
Sorina Teleanu: Thank you. We’ll get back to that. And the second point, please.
Olga Reis: Thank you so much. My name is Olga Reis and I represent the private sector here. I work at Google, and I cover the AI opportunity agenda for the region of emerging markets. First of all, thank you so much for this insightful conversation. I wanted to highlight a couple of points and then maybe react to some of the points that were raised during the panel discussion. We at Google look at technologies such as AI as truly transformative, a once-in-a-generation opportunity, especially for the region of emerging markets. But we also recognize that such technology should be developed and deployed boldly, responsibly, and together with the international community, the public sector, civil society, and our users. And one of the ways I believe this technology should be used in the context of content regulation, because there was a great deal of discussion around content regulation during this session, is that we, as the company that manages the YouTube platform, utilize AI to tackle bad content on our platform. I just pulled out statistics, as we were talking about content moderation: in the first quarter of 2025, so January, February, and March of this year, we took down 8.6 million videos on YouTube that did not comply either with our own policies or with the respective policies of the markets where we operate. And 55% of those 8.6 million videos were removed before they were watched at all, meaning that they were uploaded by content creators but not published, and we detected them automatically. A further 27% of this bad content was removed with fewer than 10 views. That just shows the scale at which we can actually use a technology such as AI to tackle bad content on our platforms. 
However, beyond the content statistics I felt compelled to bring up, I wanted to use this opportunity to talk about the need for companies like Google to work on capacity building for public officials and parliamentarians, including around the use of AI in their work, but also around how AI can contribute to driving economic growth, especially in emerging markets. There are two programs I would like to highlight that we as a company are running. Next week, we are gathering a number of regulators from the MENA region in our London office to talk for three days about regulation and the challenges around it, including on AI, cloud, and content issues. This is something we have been doing for many years already, and we have similar programs around the world, especially in emerging markets. The second program I wanted to highlight is available to everyone and built specifically for public officials. It is called AI Campus, and we built it in cooperation with Apolitical, an NGO based out of the UK. It is designed specifically to upskill public officials on AI issues. We already have seven courses available on different aspects of AI, and 500,000 officials have already gone through this online training. I would encourage and invite all public officials present in this room to make use of this content, and we will make sure that we update and develop it, because the technology does move and evolve very quickly. Thank you so much once again.
Sorina Teleanu: Thank you. And we have one more point. Okay, a few more points, if we can be quick; otherwise I will be taken down here.
Anne McCormick: Hello, I'm Anne McCormick from Ernst & Young, EY. Also private sector, but a different perspective. It's very important not to oversimplify the private sector: small and big enterprises, innovators, but also organizations. We work with clients across very different sectors, across almost all the countries in this room, and we see there is a need for clarity on what reliable AI is. Liability gets transferred to more and more economic actors, small and big, as we adopt, embed, and deploy AI. So I would urge policymakers and legislators to look at the health and dynamism of their economy and consider the different parts of the private sector, not just the large tech companies that are in the headlines, but the similar and in some cases very different needs and interests of the economic players and companies that are adopting and embedding AI and who are increasingly concerned about AI governance. We see company leaders, board members, but also investors and insurance companies asking: how do you know the limits and risks of the AI that you're buying, whatever its brand? Are you potentially liable? How are you going to deploy it with confidence? How are you going to make sure that your employees, your own clients, and your reputation as a company are maintained? So it is important not to over-regulate or to regulate badly, but it is important that there are the right mechanisms to encourage disclosure, encourage transparency rather than black boxes, and encourage accountability through the life cycle of an AI system, with independent oversight, and possibly independent assurance or assessments, so that everybody can use this extraordinary technology with confidence and we get the best out of it. There are multiple economic voices; the private sector has many different facets, and I really want to emphasize this as adoption grows. Thank you very much.
Sorina Teleanu: Thank you. And there was one more point, if we can be very fast, please.
Amy Mitchell: Thank you so much, I’ll be fast. Amy Mitchell with the Center for News, Technology and Innovation in the States. I had a quick follow-up thought and question, if you have a second to respond to it, on the digital information space. It was great to hear the topics of freedom of expression, of thinking about the balance of safeguarding as well as maintaining freedom of expression, and then media freedom specifically, in the new EU initiative that’s being passed. I’m curious about the thinking on how one puts definitional language around those things. As we know, public access to information is vast today, and the range of those producing journalism is moving from the very traditional space to all kinds of different producers, including in some cases citizens themselves, who can be in need of that safeguard and protection of freedom. So I’d be curious, as you’re developing these acts, about the thought around the definitional language, and also about safeguarding on the enforcement side, so that such provisions cannot be misused by later government structures. As we know, governments can change over time inside a country, so it is important to be sure that these laws don’t end up being used in a way that can cause harm. Thank you.
Sorina Teleanu: Thank you. I’m looking at the hosts and asking if we can take one more minute per speaker to try to reflect on these points. Okay. Thank you so much. Let’s try to reflect on some of these issues.
Franco Metaza: [in Spanish] Well, two remarks for the Google executive. What’s your name? Olga. One good and one bad. I’m going to speak in Spanish, sorry. One good and one bad. I think the YouTube Kids experience is a very, very virtuous one, because it creates a different digital ecosystem. Social networks don’t have that. It is as if, in real life, we allowed children to enter a casino. When they enter Instagram, it’s like a child entering a casino or a nightclub. The YouTube Kids experience of creating a distinct ecosystem with a completely controlled algorithm seems very successful to me, and it is something we should be able to demand from the rest of the companies. And now a criticism. I don’t know if you remember, Olga, but in August 2020, in Google Argentina’s knowledge panel, when you searched the name of the vice president at that time, Cristina Fernández de Kirchner, it read “thief of the Argentine nation.” Thief. That appeared in the knowledge panel; there was a lawsuit about it. Well, if this kind of thing happens to a person who is the vice president of the country, at a company as large as Google, imagine the lack of protection for those further down. This is what has shaped common sense in Argentine society, so that today President Milei has dared to put her, the main opposition leader, in prison. Thank you.
Ashley Sauls: Just on the AI element, from an African perspective there should also be a balance between the importance of people’s well-being and profits. My country is known for an apartheid history, and we’ve realized that in AI training there is still the risk of what I would call digital apartheid. This is quite a unique danger for us, and there is no attention given, especially by the private sector, to the importance of this, even in terms of profiling based on historic racial separation. To make it practical: I look around the room, and I’m about the darkest in the room. If AI differentiates even on that level, with darker profiling for someone who looks like me, and if that is now repeated, then we cannot rejoice over a digital future. We should be concerned that that digital future is repeating an ugly history. Thank you.
Sorina Teleanu: Thank you as well
Tsvetelina Penkova: Just very, very briefly on the remark about a fourth generation of rights and the need for a common approach: I absolutely agree, because I actually believe that this is probably going to resolve the enforcement issues. You’ve made a very valid point that it’s very difficult to enforce something which is not defined or not well understood. So, point taken.
Sorina Teleanu: Just to add quickly on the point on digital rights: it’s true we don’t really have a new UN instrument dealing with them, but there are quite a few Human Rights Council resolutions, for instance, which say clearly that the same rights people have offline must also be protected online. So at least we can use that as a starting point. We’ve taken 15 more minutes of everyone’s time. Thank you so much. Thank you to all of you for contributing and for still being in the room. We hope this has been useful as the last session of the parliamentary track. I know something will still happen in this room, so please do not leave, and my colleagues will be telling you a bit more. Thank you so much, and good luck with the rest of the IGF.
Anusha Rahman Ahmad Khan
Speech speed
141 words per minute
Speech length
902 words
Speech time
382 seconds
Social media platforms prioritize revenue over cultural sensitivity and fail to respond adequately to government requests for content removal, leading to serious harm including suicide cases
Explanation
Social media platforms treat government requests for content removal as revenue-curbing measures rather than legitimate regulatory concerns. They decide independently what content to remove or keep, showing insensitivity to local cultural contexts where even minor aspersions can have devastating consequences.
Evidence
Example of a university student being harassed with AI-generated fake content that looked real – by the time content was removed, her life was destroyed. Dozens of examples of girls committing suicide due to online harassment. In Pakistan, even an aspersion on a girl can be enough to kill her emotionally if not physically.
Major discussion point
Regulation of Social Media Platforms and Content Moderation
Topics
Human rights | Sociocultural | Legal and regulatory
Agreed with
– Franco Metaza
– Olga Reis
Agreed on
Social media platforms need to take greater responsibility for content moderation and harm prevention
Disagreed with
– Tsvetelina Penkova
Disagreed on
Decision-making authority for content removal
Pakistan’s Prevention of Electronic Crimes Act (2016) was developed through collaborative effort to protect vulnerable segments while upholding freedom of expression
Explanation
The law took two years to develop through extensive consultation with parliamentarians, civil society, NGOs, independent groups, and media. It specifically targets social media misinformation, propaganda, and fake news rather than traditional media, aiming to protect children, women, girls, and all vulnerable segments including men and boys.
Evidence
Law passed in 2016 after starting work in 2014. Described as a consensus document representing 240 million people through their MPs in both National Assembly and Senate. Focus on protecting natural person dignity and criminalizing harassment leading to vulnerable situations.
Major discussion point
Legislative Frameworks and Cybercrime Laws
Topics
Legal and regulatory | Human rights | Cybersecurity
Parliamentarians should create joint strategies to collectively address social media platforms and protect vulnerable citizens globally
Explanation
Individual government requests to social media platforms are often ignored or inadequately addressed. A collective approach by parliamentarians across countries would have more leverage to ensure that offline rights are equally protected online for both vulnerable and non-vulnerable groups.
Evidence
Personal experience as former technology minister showing that social media platforms continue to ignore government requests and treat them as if they were ‘two kilometers above the ground, not impacted by the law.’
Major discussion point
International Cooperation and Multi-stakeholder Approaches
Topics
Legal and regulatory | Human rights | Sociocultural
Agreed with
– Ashley Sauls
– Olga Reis
– Sorina Teleanu
Agreed on
Multi-stakeholder approaches are necessary for effective digital governance
Franco Metaza
Speech speed
155 words per minute
Speech length
1301 words
Speech time
501 seconds
Companies like Google can do more than they are currently doing to tackle harmful content, as they have the budget and resources but are not making sufficient efforts
Explanation
The Mercosur parliament has reached consensus that technology companies possess sufficient financial resources and technical capabilities to address harmful content and fake news more effectively than their current efforts demonstrate. There is agreement that these companies should increase their commitment to content moderation.
Evidence
Consensus reached within Mercosur parliament (representing Brazil, Argentina, Uruguay, Paraguay, and Bolivia with 100 parliamentarians) that companies have the budget and money but are not making enough efforts.
Major discussion point
Regulation of Social Media Platforms and Content Moderation
Topics
Legal and regulatory | Sociocultural | Economic
Agreed with
– Anusha Rahman Ahmad Khan
– Olga Reis
Agreed on
Social media platforms need to take greater responsibility for content moderation and harm prevention
Content targeting young girls promotes extreme dieting, impossible body standards, and leads to eating disorders, with 13-14 year olds seeking aesthetic surgeries
Explanation
Social media platforms, particularly Instagram and TikTok, bombard young users with harmful content promoting unrealistic body images through AI-generated or real images of extremely thin bodies. This content includes advice for extreme diets and advertisements for surgeries, leading to serious eating disorders among young girls.
Evidence
Tests conducted by registering as a 13-year-old girl on social media showed bombardment with images of extremely skinny bodies (some AI-generated), extreme diet advice, and surgery advertisements. In Brazil, 13-14 year old girls have started consulting doctors independently about aesthetic surgeries.
Major discussion point
Harmful Content and Its Impact on Vulnerable Groups
Topics
Human rights | Sociocultural | Cybersecurity
Disinformation can lead to real-world violence, as seen with assassination attempts on political leaders fueled by fake news and hate messages
Explanation
Systematic circulation of fake news and hate messages on social networks can escalate to physical violence. The spread of disinformation creates an environment where individuals consume so much hateful content that they may act violently against targeted individuals.
Evidence
Example from Argentina where fake news about corruption and hate messages against Cristina Fernández de Kirchner went viral on networks. A person who consumed these hate messages appeared at her house and shot at her head – fortunately the bullet did not fire. She has since been arrested by the current president amid confusion of fake news and disinformation.
Major discussion point
Harmful Content and Its Impact on Vulnerable Groups
Topics
Cybersecurity | Human rights | Sociocultural
Regulation through democratic parliaments representing all social and political expressions will never go against freedom, similar to traffic laws for vehicles
Explanation
When regulations are created in parliaments where all social statements and political expressions are represented, they express the will of the majority and therefore cannot be against freedom. Just as society created speed limits and age restrictions for driving when cars were introduced, similar reasonable regulations are needed for social media.
Evidence
Analogy provided: when motor vehicles appeared in society, speed limits were established and children were prohibited from driving – this was not against anyone’s freedom. Comparison made that permanent scrolling on social media is as harmful or more harmful than driving at full speed without knowing what lies ahead.
Major discussion point
Balancing Freedom of Expression with Safety and Protection
Topics
Legal and regulatory | Human rights | Sociocultural
Disagreed with
– Yogesh Bhattarai
Disagreed on
Level of regulation needed for digital platforms
Youth participation should be transversal across all policy-making rather than segregated into youth-only discussions
Explanation
Rather than creating separate spaces where young people only discuss among themselves and solve youth-specific problems, young people should be integrated across all political construction in society. This transversal approach ensures youth perspectives are included in all policy areas rather than being isolated.
Evidence
Argentina allows voting from age 16 and has a large number of young parliamentarians in both chambers. Every time they make a law and listen to all parties, they include young people in the consultation process.
Major discussion point
Youth Engagement in Digital Policymaking
Topics
Human rights | Legal and regulatory | Sociocultural
Agreed with
– Tsvetelina Penkova
– Yogesh Bhattarai
– Ashley Sauls
– Bibek Silwal
Agreed on
Youth engagement in digital policymaking should be meaningful and integrated
Disagreed with
– Bibek Silwal
Disagreed on
Approach to youth engagement in policymaking
YouTube Kids provides a successful model of controlled digital ecosystem for children that other platforms should emulate
Explanation
YouTube Kids creates a separate digital ecosystem with a completely controlled algorithm specifically designed for children, unlike other social media platforms that allow children into adult-oriented spaces. This approach should be demanded from other companies as it provides appropriate protection for minors.
Evidence
Comparison made that allowing children on regular Instagram is like allowing a child to enter a casino or nightclub, while YouTube Kids provides a virtuous experience with a controlled algorithm creating a distinct ecosystem for children.
Major discussion point
Private Sector Responsibility and AI Governance
Topics
Human rights | Sociocultural | Cybersecurity
Olga Reis
Speech speed
141 words per minute
Speech length
592 words
Speech time
250 seconds
AI technology is being used effectively for content moderation, with 8.6 million videos removed from YouTube in Q1 2025, 55% before being viewed
Explanation
Google utilizes AI technology to automatically detect and remove content that violates policies before it can cause harm. The majority of problematic content is identified and removed before users can view it, demonstrating the effectiveness of AI-powered content moderation systems.
Evidence
In Q1 2025 (January-March), 8.6 million videos were removed from YouTube for policy violations. 55% were removed before being watched at all, and an additional 27% were removed with less than 10 views, showing early detection capabilities.
Major discussion point
Regulation of Social Media Platforms and Content Moderation
Topics
Cybersecurity | Legal and regulatory | Sociocultural
Agreed with
– Anusha Rahman Ahmad Khan
– Franco Metaza
Agreed on
Social media platforms need to take greater responsibility for content moderation and harm prevention
AI development should be bold, responsible, and collaborative with international community, public sector, and civil society
Explanation
AI represents a transformative once-in-a-generation opportunity, especially for emerging markets, but must be developed and deployed through collaboration with multiple stakeholders. This approach ensures that the technology benefits society while addressing potential risks and concerns.
Evidence
Google’s approach to AI development emphasizes working together with international community, public sector, civil society, and users. Specific mention of AI as transformative technology with particular opportunities for emerging markets.
Major discussion point
Private Sector Responsibility and AI Governance
Topics
Development | Legal and regulatory | Economic
Agreed with
– Anusha Rahman Ahmad Khan
– Ashley Sauls
– Sorina Teleanu
Agreed on
Multi-stakeholder approaches are necessary for effective digital governance
Yogesh Bhattarai
Speech speed
108 words per minute
Speech length
831 words
Speech time
461 seconds
Digital platforms should be regulated but not controlled, requiring cooperation and collaboration rather than strict control
Explanation
Nepal’s approach emphasizes that digital platforms need regulatory frameworks that provide guidance and boundaries without imposing excessive control that could stifle innovation or freedom. The focus should be on collaborative governance that strengthens democratic institutions while ensuring platforms serve public interests.
Evidence
Nepal’s Constitution guarantees freedom of speech and expression, with Article 19 establishing right to information and communication as fundamental rights. Parliament will not accept any law that contradicts constitutional provisions. Nepal has National Information Commission and Press Council Nepal as independent oversight agencies.
Major discussion point
Regulation of Social Media Platforms and Content Moderation
Topics
Legal and regulatory | Human rights | Infrastructure
Agreed with
– Tsvetelina Penkova
– Ashley Sauls
– Sorina Teleanu
Agreed on
Balance between freedom of expression and safety protection is essential and achievable
Disagreed with
– Franco Metaza
Disagreed on
Level of regulation needed for digital platforms
Nepal is currently discussing Social Media Bill and Information Technology Bill while ensuring constitutional compliance with freedom of speech
Explanation
Nepal is developing legislation to address digital challenges while maintaining strict adherence to constitutional protections for freedom of expression and communication rights. The legislative process involves extensive stakeholder consultation to ensure balanced approaches that protect both safety and rights.
Evidence
Social Media Bill and Information Technology Bill currently under discussion. Government has requested suggestions from different stakeholders. Nepal has National Information Commission and Press Council Nepal as independent oversight agencies. MPs participate in programs organized by Civil Society Organizations discussing right to information and communication.
Major discussion point
Legislative Frameworks and Cybercrime Laws
Topics
Legal and regulatory | Human rights | Sociocultural
Constitutional guarantees of freedom of speech must be upheld while addressing legitimate concerns about harmful content
Explanation
Nepal’s legislative approach prioritizes constitutional protections for freedom of speech and expression while acknowledging the need to address misinformation, disinformation, and content that can divide society along caste, religious, racial, and gender lines. The challenge is creating effective governance without compromising fundamental rights.
Evidence
Constitution of Nepal guarantees freedom of speech and expression with Article 19 establishing right to information and communication as fundamental rights. Concerns identified about misinformation and disinformation spreading confusion about caste, religious, racism, gender, and professions, potentially dividing society and impacting national security.
Major discussion point
Balancing Freedom of Expression with Safety and Protection
Topics
Human rights | Legal and regulatory | Sociocultural
Nepal engages youth through national and youth internet governance forums in legislative processes
Explanation
Nepal has established both national and youth-specific internet governance forums that provide platforms for young people to participate in policy discussions and legislative processes. Parliamentarians actively engage with these forums to ensure youth perspectives are incorporated into law-making.
Evidence
Nepal has a national internet governance forum and a youth internet governance forum. Parliamentarians are engaged with these forums to make laws and other legislative processes. Nepal is described as having 125 languages and 125 different castes, making youth engagement particularly important for inclusive policy-making.
Major discussion point
Youth Engagement in Digital Policymaking
Topics
Human rights | Legal and regulatory | Development
Agreed with
– Franco Metaza
– Tsvetelina Penkova
– Ashley Sauls
– Bibek Silwal
Agreed on
Youth engagement in digital policymaking should be meaningful and integrated
Tsvetelina Penkova
Speech speed
137 words per minute
Speech length
1437 words
Speech time
628 seconds
The Digital Services Act serves as flagship EU legislation tackling harmful content, disinformation, and protecting minors and vulnerable groups
Explanation
The DSA represents the EU’s comprehensive approach to regulating digital spaces, addressing multiple challenges including protection of minors and vulnerable groups, cyber violence, harmful content, and disinformation. It serves as the cornerstone of EU digital regulation covering 27 member states.
Evidence
DSA described as flagship EU legislation representing 27 member states. Specifically tackles protecting minors and vulnerable groups, cyber violence, harmful content and disinformation – issues mentioned throughout the day’s discussions.
Major discussion point
Regulation of Social Media Platforms and Content Moderation
Topics
Legal and regulatory | Human rights | Sociocultural
EU has comprehensive digital legislation including DSA, DMA, GDPR, and AI Act working together to create protective frameworks
Explanation
The EU has developed an interconnected system of digital laws where each piece of legislation complements others to create comprehensive protection. The Digital Markets Act ensures fair competition, GDPR protects data rights, and the AI Act provides safeguards for AI development, all working alongside the DSA.
Evidence
Digital Markets Act promotes greater choice for consumers and interoperability. GDPR reinforces individuals’ control over their data while promoting trustworthy data sharing. AI Act described as protecting people and ensuring enough time for consumers to understand risks before wide technology deployment. European Democracy Action Plan and Media Freedom Act address media freedom and pluralism.
Major discussion point
Legislative Frameworks and Cybercrime Laws
Topics
Legal and regulatory | Human rights | Economic
Digital transition must be human-centric while protecting citizens’ rights, requiring balance between innovation and protection
Explanation
The EU’s approach prioritizes human-centric digital transformation that protects citizens’ rights while allowing for innovation and growth. This involves ensuring that technological advancement serves human needs rather than compromising fundamental rights and protections.
Evidence
Four key priorities identified: human-centric digital transformation, combating online hate and disinformation, digital literacy and resilience, and children online safety. EU strategy focuses significantly on digital education as essential for successful legislation and enforcement.
Major discussion point
Balancing Freedom of Expression with Safety and Protection
Topics
Human rights | Development | Legal and regulatory
Agreed with
– Yogesh Bhattarai
– Ashley Sauls
– Sorina Teleanu
Agreed on
Balance between freedom of expression and safety protection is essential and achievable
Young people understand that the digital economy will shrink without proper regulation and want to be actively involved in policy conversations
Explanation
Youth recognize that lack of appropriate regulation will harm the digital economy and actively seek participation in policy-making processes. They bring valuable insights about new trends and are ahead of many legislators in understanding digital developments, making their involvement essential rather than optional.
Evidence
Key messages from youth consultations included that the digital economy will shrink without regulation, children require protection without limiting rights, and bloggers are not restricted in content publishing. Young generation is described as being ahead of legislators on new trends and wanting to be part of conversations.
Major discussion point
Youth Engagement in Digital Policymaking
Topics
Economic | Human rights | Development
Agreed with
– Franco Metaza
– Yogesh Bhattarai
– Ashley Sauls
– Bibek Silwal
Agreed on
Youth engagement in digital policymaking should be meaningful and integrated
Cyber violence decisions should be made by judges rather than governments to prevent abuse of regulatory power
Explanation
To prevent government overreach and protect against potential abuse of cybercrime laws, the EU framework requires that decisions about illegal content removal be made by judicial authorities rather than government officials. This provides an important check on executive power and protects democratic freedoms.
Evidence
In the DSA framework, illegal content must be taken down without delay once detected, but this decision has to be taken by a judge, not by the government. This addresses concerns about cybercrime laws being misused against journalists and opposition voices.
Major discussion point
Balancing Freedom of Expression with Safety and Protection
Topics
Legal and regulatory | Human rights | Cybersecurity
Disagreed with
– Anusha Rahman Ahmad Khan
Disagreed on
Decision-making authority for content removal
Ashley Sauls
Speech speed
151 words per minute
Speech length
1066 words
Speech time
423 seconds
South Africa has enacted multiple acts including the Protection of Personal Information Act, Cybercrimes Act, and Films and Publications Act
Explanation
South Africa has developed a comprehensive legal framework to address digital governance challenges through multiple pieces of legislation that regulate different aspects of digital activity. These laws work together to provide protection for personal information, address cybercrime, and regulate digital publications.
Evidence
Specific mention of the Protection of Personal Information Act, Cybercrimes Act, and Films and Publications Act as enacted legislation regulating digital platforms and activities in South Africa.
Major discussion point
Legislative Frameworks and Cybercrime Laws
Topics
Legal and regulatory | Human rights | Cybersecurity
Disinformation about South Africa led to international consequences, including cancelled educational exchanges and reinforced negative stereotypes
Explanation
False information spread online about South Africa, including claims about white minority genocide and stereotypes about racial groups, influenced international decisions and relationships. This demonstrates how disinformation can have real-world diplomatic and social consequences beyond national borders.
Evidence
US executive decision was largely based on online disinformation about white minority genocide in South Africa. This fueled narratives about the ‘coloured’ racial group as violent and gangster-ridden. A rugby match between Atlanta Secondary School and Liff Burra Grammar School from the UK was cancelled because parents feared for safety based on what was said in the White House about South Africa.
Major discussion point
Harmful Content and Its Impact on Vulnerable Groups
Topics
Sociocultural | Human rights | Cybersecurity
Multi-stakeholder approach engaging government, civil society, private sector, and public is essential for inclusive digital economy
Explanation
South Africa recognizes that effective digital governance requires collaboration between all sectors of society rather than top-down government regulation alone. This collaborative model ensures that benefits of digital transformation are equitably distributed and innovation addresses local challenges.
Evidence
South Africa has adopted a multi-stakeholder approach engaging government, civil society, private sector, and public. This collaborative model is described as essential for fostering inclusive digital economy where benefits are equitably distributed and innovation addresses local challenges.
Major discussion point
International Cooperation and Multi-stakeholder Approaches
Topics
Development | Economic | Legal and regulatory
Agreed with
– Anusha Rahman Ahmad Khan
– Olga Reis
– Sorina Teleanu
Agreed on
Multi-stakeholder approaches are necessary for effective digital governance
Policies must safeguard both security and fundamental human rights without infringing on privacy and freedom of expression
Explanation
South Africa’s approach to digital governance emphasizes that security measures and human rights protection are not mutually exclusive. The country is committed to creating policies that enhance cybersecurity while maintaining strong protections for privacy and freedom of expression.
Evidence
South Africa is committed to aligning digital policies with international standards, promoting balanced approach that safeguards both security and fundamental human rights. Emphasis on proactive rather than reactive approach to digital governance that is inclusive, secure and respects rights of all citizens.
Major discussion point
Balancing Freedom of Expression with Safety and Protection
Topics
Human rights | Legal and regulatory | Cybersecurity
Agreed with
– Yogesh Bhattarai
– Tsvetelina Penkova
– Sorina Teleanu
Agreed on
Balance between freedom of expression and safety protection is essential and achievable
South Africa is beginning to emphasize youth involvement more with the new government of national unity structure
Explanation
While South Africa previously had limited youth involvement in digital policy, the shift from single-party majority government to a coalition government of national unity has created opportunities for different approaches that include youth voices. This change is already showing practical results in internet governance leadership.
Evidence
South Africa had one party majority government since 1994, now has government of national unity (coalition government) with different approaches coming together. Both the chairperson and deputy chairperson of South African IGF are young people present at the conference. This is the first time parliamentarians are participating in IGF because of this different approach that includes youth voices.
Major discussion point
Youth Engagement in Digital Policymaking
Topics
Human rights | Development | Legal and regulatory
Agreed with
– Franco Metaza
– Tsvetelina Penkova
– Yogesh Bhattarai
– Bibek Silwal
Agreed on
Youth engagement in digital policymaking should be meaningful and integrated
There should be balance between people’s well-being and profits, with attention to preventing digital apartheid and racial profiling
Explanation
From an African perspective, AI development must consider the risk of perpetuating historical discrimination through digital means. South Africa’s apartheid history makes it particularly sensitive to the possibility that AI training could embed racial biases that create new forms of digital discrimination.
Evidence
South Africa’s apartheid history creates unique concerns about AI training containing risks of digital apartheid. Example given of AI potentially profiling based on historic racial separation, with concern that darker-skinned individuals might face discriminatory profiling. Warning that if AI repeats ugly history, we cannot rejoice for digital future.
Major discussion point
Private Sector Responsibility and AI Governance
Topics
Human rights | Development | Sociocultural
Raoul Danniel Abellar Manuel
Speech speed
163 words per minute
Speech length
139 words
Speech time
51 seconds
The Philippines’ Cybercrime Prevention Act (2012) contains problematic cyber libel provisions that have been misused against journalists and teachers
Explanation
The Philippines’ cybercrime law includes cyber libel provisions that were anticipated to be problematic and have indeed been abused by repressive leaders to target journalists and teachers who express opinions contradicting the government. This demonstrates how cybercrime laws can be weaponized against legitimate expression.
Evidence
Cybercrime Prevention Act passed in 2012 with cyber libel provision that was flagged years ago as potentially problematic. It has been used by abusive or repressive leaders to go after journalists and even teachers who had opinions contradicting the government. Philippines is now discussing possible review and amendments to the law.
Major discussion point
Legislative Frameworks and Cybercrime Laws
Topics
Legal and regulatory | Human rights | Cybersecurity
Bibek Silwal
Speech speed
222 words per minute
Speech length
185 words
Speech time
50 seconds
Youth serve as positive catalysts in policy implementation and should be involved from initial policymaking through public outreach
Explanation
Youth involvement amplifies the impact of policymaking processes, whether in initial policy development or implementation phases. Young people serve as effective bridges to reach end-mile communities and enhance the overall effectiveness of policy initiatives through their engagement and outreach capabilities.
Evidence
Every process where youth are involved amplifies the impact of policymaking, whether in implementation or initial policymaking process. Youth are described as positive catalysts for reaching out to the end mile and enhancing policy effectiveness.
Major discussion point
Youth Engagement in Digital Policymaking
Topics
Development | Human rights | Legal and regulatory
Agreed with
– Franco Metaza
– Tsvetelina Penkova
– Yogesh Bhattarai
– Ashley Sauls
Agreed on
Youth engagement in digital policymaking should be meaningful and integrated
Disagreed with
– Franco Metaza
Disagreed on
Approach to youth engagement in policymaking
Audience
Speech speed
158 words per minute
Speech length
442 words
Speech time
166 seconds
UN should recognize digital rights as the fourth generation of human rights to provide common framework for legislation
Explanation
The UN has historically recognized three generations of human rights (civil and political; economic and social; collective environmental rights) which have been incorporated into national constitutions. Digital rights should be formally recognized as a fourth generation to provide legislators with common technical frameworks and instruments for creating effective digital legislation.
Evidence
UN recognized civil and political rights as first generation (1945-1948), economic and social rights as second generation, and collective environmental rights as third generation. Many countries have incorporated these into their constitutions based on UN frameworks. Digital rights lack this formal recognition, making it difficult for legislators to have common technical instruments for lawmaking.
Major discussion point
International Cooperation and Multi-stakeholder Approaches
Topics
Human rights | Legal and regulatory | Development
Anne McCormick
Speech speed
141 words per minute
Speech length
318 words
Speech time
135 seconds
Private sector needs clarity on reliable AI and liability frameworks as AI adoption spreads across different economic actors
Explanation
As AI technology becomes embedded across various sectors and company sizes, there is growing concern among business leaders, board members, investors, and insurance companies about AI governance, liability, and risk management. Small and large enterprises alike need clear frameworks to deploy AI with confidence while managing potential risks to their operations and reputation.
Evidence
Ernst & Young works with clients across different sectors and countries, observing need for clarity on reliable AI. Company leaders, board members, investors and insurance companies are asking about AI limits, risks, and potential liability. Concerns about deploying AI while maintaining employee safety, client relationships, and company reputation.
Major discussion point
Private Sector Responsibility and AI Governance
Topics
Economic | Legal and regulatory | Development
Independent oversight and transparency mechanisms are needed to ensure accountability throughout AI lifecycle
Explanation
Rather than over-regulation or under-regulation, there should be appropriate mechanisms that encourage disclosure, transparency, and accountability for AI systems throughout their development and deployment lifecycle. This includes independent oversight and assessment to ensure all economic actors can use AI technology with confidence.
Evidence
Emphasis on not over-regulating or under-regulating, but having right mechanisms to encourage disclosure and transparency rather than black box approaches. Need for independent oversight, possibly independent assurance or assessments, so everyone can use AI technology with confidence and get the best outcomes.
Major discussion point
Private Sector Responsibility and AI Governance
Topics
Legal and regulatory | Economic | Development
Amy Mitchell
Speech speed
173 words per minute
Speech length
223 words
Speech time
77 seconds
Sorina Teleanu
Speech speed
224 words per minute
Speech length
1102 words
Speech time
294 seconds
Enforcement of digital laws is challenging and requires empowered national authorities to put legislation into practice
Explanation
Having laws in place is only the first step – the real challenge lies in ensuring that national authorities have the capacity and resources to actually implement and enforce these digital regulations effectively. This is a common challenge across countries, including in the EU region.
Evidence
Reference to challenges in Romania and Bulgaria where it’s not easy to put digital legislation into practice, despite having comprehensive EU frameworks like DSA, DMA, and GDPR in place.
Major discussion point
Legislative Frameworks and Cybercrime Laws
Topics
Legal and regulatory | Development
Critical thinking and digital literacy education should benefit all users, not just young people
Explanation
While there is significant focus on educating young users about digital technologies, all users across age groups would benefit from improved critical thinking skills when interacting with digital platforms and content. Digital literacy should be a universal priority.
Evidence
Acknowledgment of excellent points on literacy, capacity building, and education for building critical thinking in young users, with the extension that ‘we would all benefit from a bit more critical thinking when we interact with digital technologies.’
Major discussion point
Youth Engagement in Digital Policymaking
Topics
Development | Human rights | Sociocultural
Safety and security protection can coexist with human rights protection without requiring trade-offs
Explanation
There is a false dichotomy in thinking that protecting safety and security requires giving up human rights protections, or vice versa. Effective digital governance should achieve both objectives simultaneously through balanced approaches.
Evidence
Highlighting speaker comments about finding balance between protecting safety and security while ensuring protection of human rights, emphasizing ‘we don’t have to give one up to protect the other and the other way around.’
Major discussion point
Balancing Freedom of Expression with Safety and Protection
Topics
Human rights | Legal and regulatory | Cybersecurity
Agreed with
– Yogesh Bhattarai
– Tsvetelina Penkova
– Ashley Sauls
Agreed on
Balance between freedom of expression and safety protection is essential and achievable
Technology platforms should engage in discussions with legislators during the law-making process
Explanation
As countries develop digital legislation, it’s important to have dialogue and engagement with technology platforms to ensure that regulations are practical and effective. This interaction helps create more informed and implementable laws.
Evidence
Question posed to speakers about ‘how are you interacting with technology platforms as you’re working on this legislation in the country? Do you have any discussions with them, how is the relation being?’
Major discussion point
International Cooperation and Multi-stakeholder Approaches
Topics
Legal and regulatory | Economic
Agreed with
– Anusha Rahman Ahmad Khan
– Ashley Sauls
– Olga Reis
Agreed on
Multi-stakeholder approaches are necessary for effective digital governance
Agreements
Agreement points
Social media platforms need to take greater responsibility for content moderation and harm prevention
Speakers
– Anusha Rahman Ahmad Khan
– Franco Metaza
– Olga Reis
Arguments
Social media platforms prioritize revenue over cultural sensitivity and fail to respond adequately to government requests for content removal, leading to serious harm including suicide cases
Companies like Google can do more than they are currently doing to tackle harmful content, as they have the budget and resources but are not making sufficient efforts
AI technology is being used effectively for content moderation, with 8.6 million videos removed from YouTube in Q1 2025, 55% before being viewed
Summary
All speakers agree that social media platforms have both the capability and responsibility to do more in addressing harmful content, though they approach it from different perspectives – regulatory demands, parliamentary consensus, and industry acknowledgment of current efforts.
Topics
Legal and regulatory | Cybersecurity | Sociocultural
Balance between freedom of expression and safety protection is essential and achievable
Speakers
– Yogesh Bhattarai
– Tsvetelina Penkova
– Ashley Sauls
– Sorina Teleanu
Arguments
Digital platforms should be regulated but not controlled, requiring cooperation and collaboration rather than strict control
Digital transition must be human-centric while protecting citizens’ rights, requiring balance between innovation and protection
Policies must safeguard both security and fundamental human rights without infringing on privacy and freedom of expression
Safety and security protection can coexist with human rights protection without requiring trade-offs
Summary
Speakers consistently emphasize that protecting safety and security does not require sacrificing freedom of expression or human rights, and that balanced regulatory approaches can achieve both objectives simultaneously.
Topics
Human rights | Legal and regulatory | Cybersecurity
Youth engagement in digital policymaking should be meaningful and integrated
Speakers
– Franco Metaza
– Tsvetelina Penkova
– Yogesh Bhattarai
– Ashley Sauls
– Bibek Silwal
Arguments
Youth participation should be transversal across all policy-making rather than segregated into youth-only discussions
Young people understand that the digital economy will shrink without proper regulation and want to be actively involved in policy conversations
Nepal engages youth through national and youth internet governance forums in legislative processes
South Africa is beginning to emphasize youth involvement more with the new government of national unity structure
Youth serve as positive catalysts in policy implementation and should be involved from initial policymaking through public outreach
Summary
All speakers agree that youth should be meaningfully integrated into digital policymaking processes rather than marginalized, recognizing their unique insights and catalytic role in policy implementation.
Topics
Human rights | Development | Legal and regulatory
Multi-stakeholder approaches are necessary for effective digital governance
Speakers
– Anusha Rahman Ahmad Khan
– Ashley Sauls
– Olga Reis
– Sorina Teleanu
Arguments
Parliamentarians should create joint strategies to collectively address social media platforms and protect vulnerable citizens globally
Multi-stakeholder approach engaging government, civil society, private sector, and public is essential for inclusive digital economy
AI development should be bold, responsible, and collaborative with international community, public sector, and civil society
Technology platforms should engage in discussions with legislators during the law-making process
Summary
Speakers consistently advocate for collaborative approaches involving multiple stakeholders rather than unilateral action by any single actor, recognizing the complexity of digital governance challenges.
Topics
Legal and regulatory | Development | Economic
Similar viewpoints
Both speakers from developing countries (Pakistan and Argentina) share concerns about social media platforms’ inadequate response to harmful content that leads to real-world violence and harm, particularly affecting vulnerable populations.
Speakers
– Anusha Rahman Ahmad Khan
– Franco Metaza
Arguments
Social media platforms prioritize revenue over cultural sensitivity and fail to respond adequately to government requests for content removal, leading to serious harm including suicide cases
Disinformation can lead to real-world violence, as seen with assassination attempts on political leaders fueled by fake news and hate messages
Topics
Cybersecurity | Human rights | Sociocultural
Both speakers recognize the risk of cybercrime laws being abused by governments and emphasize the need for judicial oversight to prevent misuse against legitimate expression and press freedom.
Speakers
– Tsvetelina Penkova
– Raoul Danniel Abellar Manuel
Arguments
Cyber violence decisions should be made by judges rather than governments to prevent abuse of regulatory power
The Philippines’ Cybercrime Prevention Act (2012) contains problematic cyber libel provisions that have been misused against journalists and teachers
Topics
Legal and regulatory | Human rights | Cybersecurity
Both speakers emphasize the need for responsible AI development that considers broader societal impacts beyond profit motives, with attention to equity and clear governance frameworks.
Speakers
– Ashley Sauls
– Anne McCormick
Arguments
There should be balance between people’s well-being and profits, with attention to preventing digital apartheid and racial profiling
Private sector needs clarity on reliable AI and liability frameworks as AI adoption spreads across different economic actors
Topics
Human rights | Economic | Development
Unexpected consensus
Private sector acknowledgment of need for greater responsibility
Speakers
– Franco Metaza
– Olga Reis
Arguments
Companies like Google can do more than they are currently doing to tackle harmful content, as they have the budget and resources but are not making sufficient efforts
AI technology is being used effectively for content moderation, with 8.6 million videos removed from YouTube in Q1 2025, 55% before being viewed
Explanation
It’s unexpected to see a Google representative (Olga Reis) essentially agreeing with parliamentary criticism by acknowledging current efforts while implicitly accepting that more can be done, rather than defending current practices as sufficient.
Topics
Legal and regulatory | Cybersecurity | Economic
Recognition of digital rights as fundamental human rights requiring formal framework
Speakers
– Audience
– Tsvetelina Penkova
Arguments
UN should recognize digital rights as the fourth generation of human rights to provide common framework for legislation
Digital transition must be human-centric while protecting citizens’ rights, requiring balance between innovation and protection
Explanation
The consensus between a civil society representative calling for formal UN recognition of digital rights and an EU parliamentarian’s human-centric approach suggests growing recognition that digital rights need formal international frameworks similar to other human rights generations.
Topics
Human rights | Legal and regulatory | Development
Overall assessment
Summary
The discussion revealed strong consensus on several key areas: the need for greater platform responsibility, balanced approaches to regulation that protect both safety and freedom, meaningful youth engagement, and multi-stakeholder governance. Speakers consistently emphasized human-centric approaches to digital governance.
Consensus level
High level of consensus on fundamental principles, with differences mainly in implementation approaches rather than core objectives. This suggests potential for collaborative international action on digital governance frameworks, particularly around platform accountability, youth engagement, and balanced regulatory approaches that protect both safety and human rights.
Differences
Different viewpoints
Approach to youth engagement in policymaking
Speakers
– Franco Metaza
– Bibek Silwal
Arguments
Youth participation should be transversal across all policy-making rather than segregated into youth-only discussions
Youth serve as positive catalysts in policy implementation and should be involved from initial policymaking through public outreach
Summary
Franco Metaza argues against segregating youth into separate discussions, preferring transversal integration across all policy areas. Bibek Silwal advocates for dedicated youth involvement and specialized engagement processes.
Topics
Human rights | Legal and regulatory | Development
Level of regulation needed for digital platforms
Speakers
– Yogesh Bhattarai
– Franco Metaza
Arguments
Digital platforms should be regulated but not controlled, requiring cooperation and collaboration rather than strict control
Regulation through democratic parliaments representing all social and political expressions will never go against freedom, similar to traffic laws for vehicles
Summary
Bhattarai emphasizes light-touch regulation focusing on cooperation, while Metaza supports stronger parliamentary regulation comparing it to necessary traffic laws.
Topics
Legal and regulatory | Human rights
Decision-making authority for content removal
Speakers
– Anusha Rahman Ahmad Khan
– Tsvetelina Penkova
Arguments
Social media platforms prioritize revenue over cultural sensitivity and fail to respond adequately to government requests for content removal, leading to serious harm including suicide cases
Cyber violence decisions should be made by judges rather than governments to prevent abuse of regulatory power
Summary
Khan advocates for stronger government authority over content removal decisions, while Penkova insists judicial oversight is necessary to prevent government overreach.
Topics
Legal and regulatory | Human rights | Cybersecurity
Unexpected differences
Private sector engagement and criticism
Speakers
– Franco Metaza
– Olga Reis
Arguments
YouTube Kids provides a successful model of controlled digital ecosystem for children that other platforms should emulate
AI technology is being used effectively for content moderation, with 8.6 million videos removed from YouTube in Q1 2025, 55% before being viewed
Explanation
Unexpectedly, Metaza praised Google’s YouTube Kids as a virtuous model while simultaneously criticizing Google for allowing defamatory content about Argentine political leaders in search results. This shows the complex relationship between acknowledging good practices and holding companies accountable for failures.
Topics
Legal and regulatory | Sociocultural | Human rights
Constitutional vs. practical approaches to digital rights
Speakers
– Yogesh Bhattarai
– Audience
Arguments
Constitutional guarantees of freedom of speech must be upheld while addressing legitimate concerns about harmful content
UN should recognize digital rights as the fourth generation of human rights to provide common framework for legislation
Explanation
While both support strong digital rights protection, they disagree on whether existing constitutional frameworks are sufficient (Bhattarai) or whether new international frameworks are needed (Audience). This represents a fundamental disagreement about legal foundations for digital governance.
Topics
Human rights | Legal and regulatory | Development
Overall assessment
Summary
The main areas of disagreement center on regulatory approaches (light-touch cooperation vs. stronger parliamentary control), decision-making authority (government vs. judicial oversight), youth engagement methods (integrated vs. specialized), and legal frameworks (existing constitutional vs. new international instruments).
Disagreement level
Moderate disagreement level with significant implications. While speakers share common goals of protecting vulnerable groups and balancing rights with safety, their different approaches could lead to incompatible policy frameworks. The disagreements reflect deeper tensions between national sovereignty and international coordination, government authority and judicial independence, and regulatory approaches across different legal and cultural contexts.
Partial agreements
Partial agreements
Similar viewpoints
Both speakers from developing countries (Pakistan and Argentina) share concerns about social media platforms’ inadequate response to harmful content that leads to real-world violence and harm, particularly affecting vulnerable populations.
Speakers
– Anusha Rahman Ahmad Khan
– Franco Metaza
Arguments
Social media platforms prioritize revenue over cultural sensitivity and fail to respond adequately to government requests for content removal, leading to serious harm including suicide cases
Disinformation can lead to real-world violence, as seen with assassination attempts on political leaders fueled by fake news and hate messages
Topics
Cybersecurity | Human rights | Sociocultural
Both speakers recognize the risk of cybercrime laws being abused by governments and emphasize the need for judicial oversight to prevent misuse against legitimate expression and press freedom.
Speakers
– Tsvetelina Penkova
– Raoul Danniel Abellar Manuel
Arguments
Cyber violence decisions should be made by judges rather than governments to prevent abuse of regulatory power
The Philippines’ Cybercrime Prevention Act (2012) contains problematic cyber libel provisions that have been misused against journalists and teachers
Topics
Legal and regulatory | Human rights | Cybersecurity
Both speakers emphasize the need for responsible AI development that considers broader societal impacts beyond profit motives, with attention to equity and clear governance frameworks.
Speakers
– Ashley Sauls
– Anne McCormick
Arguments
There should be balance between people’s well-being and profits, with attention to preventing digital apartheid and racial profiling
Private sector needs clarity on reliable AI and liability frameworks as AI adoption spreads across different economic actors
Topics
Human rights | Economic | Development
Takeaways
Key takeaways
Social media platforms prioritize revenue over cultural sensitivity and public safety, often failing to respond adequately to government requests for harmful content removal
Effective content moderation requires a balance between protecting vulnerable groups and preserving freedom of expression, with decisions ideally made by judicial rather than governmental authorities
Legislative frameworks must be developed through multi-stakeholder collaboration including parliamentarians, civil society, NGOs, and media to ensure comprehensive protection while upholding democratic values
Youth engagement in digital policymaking should be transversal across all policy areas rather than segregated, as young people understand digital challenges and want active involvement in solutions
International cooperation and joint parliamentary strategies are essential for addressing global digital challenges that transcend national boundaries
AI technology shows promise for automated content moderation but requires responsible development with attention to preventing digital discrimination and ensuring transparency
Digital rights may need formal recognition as a fourth generation of human rights to provide a common international framework for legislation
Capacity building for parliamentarians and public officials is crucial for effective digital governance and understanding emerging technologies
Resolutions and action items
Parliamentarians should create joint strategies to collectively address social media platforms and protect vulnerable citizens globally
UN should consider recognizing digital rights as the fourth generation of human rights to provide common legislative framework
Private sector should increase efforts and investment in tackling harmful content despite having adequate resources
Governments should engage with technology platforms during legislation development processes
Educational institutions and civil society should launch campaigns teaching youth to identify fake news and develop critical thinking skills
Capacity building programs for public officials should be expanded, including Google’s AI Campus training program
YouTube Kids model of controlled digital ecosystem should be adopted by other social media platforms for child protection
Unresolved issues
How to make social media platforms more culturally sensitive and responsive to local government requests for content removal
Enforcement challenges at national levels for implementing comprehensive digital legislation frameworks
Definitional language around media freedom and journalism in the digital age as content creators diversify beyond traditional media
Prevention of legislative abuse by future governments that might use cybercrime laws to suppress opposition voices
Addressing digital apartheid and racial profiling risks in AI training and deployment
Balancing innovation and economic growth with necessary protective regulations
Establishing liability frameworks for AI adoption across different economic sectors beyond large tech companies
Creating effective mechanisms for cross-border enforcement of digital rights and content moderation
Suggested compromises
Digital platforms should be regulated but not controlled, emphasizing cooperation and collaboration over strict governmental control
Cybercrime legislation should focus on protecting vulnerable groups while ensuring judicial rather than governmental oversight of content decisions
AI development should proceed boldly but responsibly through collaboration between private sector, government, civil society and international community
Content moderation should combine automated AI systems with human oversight to balance efficiency with cultural sensitivity
Legislative frameworks should align with international standards while addressing local cultural and social contexts
Private sector should engage in capacity building and transparency initiatives while maintaining innovation and competitive dynamics
Thought provoking comments
It’s not a fight between East or the West. It’s a fight between revenue generation entities versus a revenue curbing request.
Speaker
Anusha Rahman Ahmad Khan
Reason
This comment reframes the entire debate about content moderation from a geopolitical or cultural clash to an economic one. It cuts through diplomatic language to identify the core tension: platforms prioritize profit over cultural sensitivity and user safety. This insight is particularly powerful because it moves beyond abstract discussions of rights to concrete economic incentives.
Impact
This comment established a recurring theme throughout the discussion about private sector responsibility. Multiple subsequent speakers referenced the need for platforms to do more, and it influenced Franco Metaza’s later assertion that ‘companies can do more than what they are doing’ and his criticism of revenue-driven content decisions.
I think that today the permanent scrolling that we are all subjected to is as much or more harmful as going at full speed with a vehicle without knowing what is in front of us or without having a traffic light.
Speaker
Franco Metaza
Reason
This metaphor brilliantly captures the unregulated nature of social media consumption and its potential dangers. By comparing social media scrolling to reckless driving, it makes the abstract concept of digital harm tangible and relatable, while also providing a framework for understanding why regulation isn’t about restricting freedom but ensuring safety.
Impact
This metaphor was specifically noted by the moderator and became a reference point for discussing the legitimacy of digital regulation. It helped shift the conversation from whether regulation is needed to how it should be implemented, making the case that just as we accept traffic rules for safety, we should accept digital rules.
Digital platforms should be regulated, not controlled. Cooperation, collaboration, and solidarity should be strengthened.
Speaker
Yogesh Bhattarai
Reason
This distinction between regulation and control is crucial in the digital governance debate. It acknowledges the need for oversight while respecting democratic principles and avoiding authoritarian overreach. The comment provides a nuanced middle ground between laissez-faire and heavy-handed government intervention.
Impact
This comment influenced the moderator’s follow-up questions about how countries interact with tech platforms and helped establish a framework for discussing responsible governance approaches. It contributed to the overall theme of finding balance between protection and freedom.
We are now tired of waiting and I would urge and request all the other parliamentarians to come together to make a joint strategy where we can collectively speak to the social media platforms.
Speaker
Anusha Rahman Ahmad Khan
Reason
This call for collective action represents a shift from individual national approaches to coordinated international pressure on tech platforms. It recognizes that platforms operate globally while governments act locally, creating an inherent power imbalance that can only be addressed through cooperation.
Impact
This comment sparked discussion about regional cooperation, with Franco Metaza confirming consensus in Mercosur about platform responsibility, and influenced later questions about African Union initiatives. It helped establish the theme of multilateral approaches to digital governance.
If the United Nations takes this step, you will notice that many countries will follow suit, and perhaps it will even be enshrined in our constitutions and laws… we need a common element, a common instrument, as we have with the three generations of human rights, and with a generation of digital rights. [translated from French]
Speaker
Honorable member from the Democratic Republic of Congo
Reason
This intervention fundamentally challenges the current human rights framework by proposing digital rights as a fourth generation of human rights. It’s intellectually rigorous, drawing on the historical evolution of rights recognition, and identifies a systemic gap in how we conceptualize digital governance within established human rights frameworks.
Impact
This comment prompted immediate agreement from Tsvetelina Penkova, who acknowledged it would resolve enforcement issues. It elevated the discussion from practical policy implementation to fundamental questions about the nature of rights in the digital age, representing one of the most conceptually ambitious contributions to the session.
There should be a balance also around the importance of the well-being of people and profits… we’ve realized that in the AI training that there is still the presence of a risk of what I would call digital apartheid.
Speaker
Ashley Sauls
Reason
This comment introduces the concept of ‘digital apartheid’ and connects AI bias to historical injustices, making the discussion more concrete and urgent. It challenges the tech industry’s narrative of progress by highlighting how AI systems can perpetuate and amplify existing inequalities, particularly affecting marginalized communities.
Impact
This was one of the final substantive comments and served as a powerful counterpoint to the earlier private sector presentation about AI benefits. It grounded the abstract discussion of AI governance in lived experience and historical context, emphasizing that technological advancement without equity considerations can reproduce historical injustices.
Overall assessment
These key comments fundamentally shaped the discussion by moving it beyond surface-level policy debates to deeper structural questions. The session evolved from individual country experiences to systemic analysis of power dynamics between governments and platforms, the need for international cooperation, and the fundamental question of how to conceptualize rights in the digital age. The economic framing of platform behavior, the traffic regulation metaphor, and the digital apartheid concept provided concrete ways to understand abstract policy challenges. The call for collective action and the proposal for a fourth generation of human rights elevated the discussion to consider both practical coordination mechanisms and foundational legal frameworks. Together, these comments created a progression from problem identification to systemic analysis to potential solutions, while maintaining focus on protecting vulnerable populations and democratic values.
Follow-up questions
How long will social media platforms continue to delay responding to governments' requests to remove objectionable content and protect vulnerable groups online?
Speaker
Anusha Rahman Ahmad Khan
Explanation
This addresses the ongoing challenge of platform responsiveness to government content removal requests, particularly for protecting vulnerable populations like women and children from harassment and harmful content.
How can social media platforms be made more sensitive to different cultural contexts when making content moderation decisions?
Speaker
Anusha Rahman Ahmad Khan
Explanation
This highlights the need for culturally aware content moderation, as platforms currently make uniform decisions without considering local cultural sensitivities, which can have severe consequences for users.
How can parliamentarians develop a joint strategy to collectively speak to social media platforms about protecting vulnerable citizens?
Speaker
Anusha Rahman Ahmad Khan
Explanation
This suggests the need for coordinated international parliamentary action to increase leverage when dealing with global technology platforms on content moderation issues.
Are there collaborative efforts across Mercosur countries to deal with harmful online content through awareness and capacity building, not just legislation?
Speaker
Sorina Teleanu
Explanation
This explores whether regional parliamentary bodies are taking comprehensive approaches beyond just legal frameworks to address online harms through user education and preparedness.
How are countries interacting with technology platforms while working on digital legislation?
Speaker
Sorina Teleanu
Explanation
This addresses the important process question of stakeholder engagement and dialogue between governments and platforms during the legislative development process.
How is the AI Act connecting to creating safer online environments and increasing transparency from private actors?
Speaker
Sorina Teleanu
Explanation
This explores the intersection between AI regulation and content safety, particularly regarding platform transparency obligations and automated content moderation systems.
Are there examples of African Union-level initiatives dealing with digital governance and online safety issues?
Speaker
Sorina Teleanu
Explanation
This seeks to understand regional cooperation mechanisms in Africa for addressing digital policy challenges at a continental level.
How can youth be more effectively involved in digital policymaking processes across different regions?
Speaker
Bibek Silwal
Explanation
This addresses the need for meaningful youth participation in policy development, recognizing that young people are both primary users of digital technologies and key stakeholders in implementation.
What are concrete experiences and best practices for combating cybercrime while avoiding abuse of cybercrime laws by repressive governments?
Speaker
Raoul Danniel Abellar Manuel
Explanation
This addresses the critical balance between effective cybercrime legislation and preventing authoritarian misuse of such laws to suppress dissent and free expression.
Should the UN recognize digital rights as a fourth generation of human rights to provide a clearer framework for national legislation?
Speaker
Representative from Democratic Republic of Congo
Explanation
This proposes a systematic approach to digital rights recognition that could provide clearer guidance for national constitutional and legal frameworks worldwide.
How should definitional language around journalism and media freedom be developed in digital legislation to account for diverse content producers?
Speaker
Amy Mitchell
Explanation
This addresses the challenge of defining protected journalistic activity in an era where content production has expanded beyond traditional media to include citizen journalists and diverse digital creators.
How can enforcement mechanisms be designed to prevent future governments from misusing digital legislation for authoritarian purposes?
Speaker
Amy Mitchell
Explanation
This addresses the need for robust institutional safeguards that can withstand changes in government and prevent the weaponization of digital laws against civil society.
How can AI training and deployment address risks of digital apartheid and historical bias, particularly affecting marginalized communities?
Speaker
Ashley Sauls
Explanation
This highlights the critical need to address how AI systems may perpetuate or amplify existing social inequalities and discrimination, particularly in post-apartheid contexts.
Disclaimer: This is not an official session record. DiploAI generates these resources from audiovisual recordings, and they are presented as-is, including potential errors. Due to logistical challenges, such as discrepancies in audio/video or transcripts, names may be misspelled. We strive for accuracy to the best of our ability.