WS #395 Applying International Law Principles in the Digital Space
24 Jun 2025 14:45h - 15:45h
Session at a glance
Summary
This workshop session explored how existing international law frameworks can be applied to protect human rights in digital spaces, examining the intersection of international human rights law, international humanitarian law, and international criminal law. The discussion was moderated by Sanhawan Srisod and featured panelists from various organizations working on digital rights and international law.
Chantal Joris from Article 19 opened by explaining that while there is broad consensus that international law applies to cyberspace, the practical implementation remains controversial and fragmented. She highlighted how different legal frameworks—human rights law, humanitarian law, and criminal law—each address similar issues like incitement to violence and hate speech, but often operate separately without clear coordination. This fragmentation creates protection gaps for affected communities who may not receive adequate remedies regardless of which legal framework technically applies.
Tiesa Meccrewfy from 7amleh provided a concrete example from Gaza, describing how tech companies have failed to meet their obligations under the UN Guiding Principles on Business and Human Rights. She documented systematic censorship of Palestinian voices on social media platforms, disproportionate content moderation, and the role of online platforms in potentially facilitating genocide through hate speech and dehumanization campaigns.
Nieves Molina from the Danish Institute for Human Rights emphasized the growing accountability gap in digital spaces, noting that crimes enabled by digital technology are often treated with “exceptionalism” rather than being addressed through existing international law frameworks. She pointed to the blurred relationship between states and corporations as creating difficulties in determining responsibility levels, and suggested that fragmentation enables forum shopping and avenues for impunity.
Francisco Brito Cruz, participating online from Brazil, discussed the challenge of translating corporate responsibility principles into practical platform accountability. He emphasized that both action and inaction by platforms can produce human rights violations, and highlighted Brazil’s recent experience with blocking Platform X as an example of how complex these accountability questions become in practice. He stressed the importance of building methodologies for human rights due diligence that include transparency, monitoring, and proper expertise.
Mikiko Otani, former chair of the UN Committee on the Rights of the Child, brought attention to children’s specific vulnerabilities online while emphasizing that children want to use digital spaces safely rather than be excluded from them entirely. She highlighted the Committee’s 2021 General Comment requiring states and businesses to integrate child rights impact assessments into digital product design and emphasized the importance of hearing directly from children about their experiences and needs.
The discussion revealed several key challenges: the fragmentation of international law creates protection gaps and accountability loopholes; the close relationship between states and corporations complicates responsibility attribution; existing legal frameworks may be sufficient but lack proper implementation and enforcement mechanisms; and there is tension between protecting rights and avoiding overregulation that could harm freedom of expression. Participants concluded that while new legislation may not be the answer, better coordination between existing legal frameworks and stronger implementation of current obligations is essential for protecting human rights in digital spaces.
Key points
## Major Discussion Points:
– **Fragmentation of International Law in Digital Spaces**: The discussion highlighted how different bodies of international law (human rights law, humanitarian law, criminal law) operate separately when addressing digital issues, creating protection gaps and accountability challenges. While there’s consensus that international law applies online, the practical implementation remains fragmented and uncoordinated.
– **Platform Accountability and Corporate Responsibility**: Panelists examined the challenges of holding tech companies accountable under international frameworks like the UN Guiding Principles on Business and Human Rights, particularly regarding content moderation, censorship, and their role in enabling human rights violations during conflicts like in Gaza.
– **State-Corporate Collaboration and Double Standards**: The discussion addressed how states and platforms often work together in ways that can violate human rights, such as government requests for content takedowns that silence dissent while allowing harmful content to proliferate, creating a blurred line of responsibility.
– **Protection of Vulnerable Groups, Especially Children**: The conversation emphasized the disproportionate impact of digital harms on specific populations, particularly children, and the need for comprehensive approaches that consider evolving capacities and the seamless nature of online/offline experiences for young users.
– **Gaps in Access to Justice and Remedies**: Panelists discussed the growing accountability gap for victims of digital human rights violations, noting that crimes enabled by digital technology are often treated with “exceptionalism” rather than being addressed through existing international legal frameworks.
## Overall Purpose:
The workshop aimed to explore how existing international law can provide a regulatory framework to protect individuals and communities from human rights abuses in digital spaces. The session was part of a broader Digital Democracy Initiative project to clarify how international law should be progressively interpreted to address ambiguities and contradictions in the digital context.
## Overall Tone:
The discussion maintained a serious, academic tone throughout, with participants demonstrating deep expertise and concern about the challenges presented. The tone was collaborative and constructive, with panelists building on each other’s points rather than disagreeing. There was an underlying sense of urgency about addressing these issues, particularly when discussing real-world examples like the situation in Gaza and conflicts in India-Pakistan. The conversation remained professional and solution-oriented, even when addressing complex and sensitive topics involving state accountability and corporate responsibility.
Speakers
**Speakers from the provided list:**
– **Sanhawan Srisod** – Moderator of the session, Senior legal advisor at the International Commission of Jurists
– **Chantal Joris** – Senior legal advisor at Article 19
– **Tiesa Meccrewfy** – EU advocacy officer of 7amleh (Palestinian digital rights group)
– **Francisco Brito Cruz** – Law professor at Fundação Getúlio Vargas in Brazil, consultant for the OHCHR B-Tech project (formerly Executive Director of Internet Lab)
– **Nieves Molina** – Chief advisor of tech business and human rights at the Danish Institute for Human Rights
– **Mikiko Otani** – Former chair of the UN Committee on the Rights of the Child and ICJ commissioner
– **Audience** – Various audience members asking questions during the Q&A session
**Additional speakers:**
– **Nadim Nashif** – Founder and director of 7amleh (Palestinian digital rights group) – originally scheduled panelist, unable to travel due to the ongoing war between Israel and Iran
Full session report
# International Law in Digital Spaces: Addressing Fragmentation and Accountability Challenges
## Executive Summary
This workshop session, moderated by Sanhawan Srisod from the International Commission of Jurists, examined the critical intersection of international law and digital rights protection. The discussion brought together legal experts, digital rights advocates, and human rights practitioners to explore how existing international legal frameworks can be applied to protect individuals and communities from human rights abuses in digital spaces. The session was part of a broader Digital Democracy Initiative project co-implemented by ICJ and the Danish Institute for Human Rights, aimed at clarifying how international law should be progressively interpreted to address ambiguities and contradictions in the digital context.
The conversation revealed ongoing challenges in translating legal principles into effective protection mechanisms, particularly given the rapid pace of technological advancement and the complex relationships between states and corporations in digital governance. Panelists identified significant gaps between theoretical legal frameworks and practical implementation, with particular attention to fragmentation across different bodies of international law.
## Opening Framework and Context
Moderator Sanhawan Srisod opened the session by explaining that the workshop was part of the Digital Democracy Initiative, a project co-implemented by ICJ and the Danish Institute for Human Rights. She noted that Nadim Nashif from 7amleh was originally scheduled to participate but could not travel due to the Israel-Iran war, and was replaced by Tiesa Meccrewfy from the same organization.
The session followed a structured format with 25-50 minutes of presentations from five panelists, followed by 25 minutes of Q&A and a brief wrap-up. Participants included both in-person attendees and online participants, with Francisco Brito Cruz joining remotely from São Paulo.
## Fragmentation of International Legal Frameworks
Chantal Joris from Article 19 established the foundational framework for the discussion by addressing the fragmentation challenge in applying international law to digital spaces. She explained that while there is consensus that international law applies to cyberspace, different bodies of international law—including international human rights law, international humanitarian law, and international criminal law—each address similar issues such as incitement to violence and hate speech, but operate separately without clear coordination.
Joris emphasized that international human rights law has developed the most advanced understanding of digital application, but noted the challenge that technology advances much more quickly than the ability of international bodies and domestic parliaments to create appropriate rules. She observed that we are “not in the golden age of treaty making” and must therefore rely on existing international law rules and their interpretation.
From the perspective of impacted communities, Joris noted, “it is not that relevant whether it’s humanitarian law or human rights law and which one is like specialis.” The academic distinctions between legal frameworks become less meaningful when victims cannot access adequate remedies regardless of which legal framework technically applies to their situation.
## Palestinian Digital Rights and Platform Accountability
Tiesa Meccrewfy from 7amleh, a Palestinian digital rights organization, provided concrete examples of systematic censorship of Palestinian voices on social media platforms. She referenced a report published by 7amleh on “digital rights, genocide and big tech accountability in Gaza,” documenting discriminatory content moderation policies that suppress Palestinian narratives while allowing harmful content to proliferate.
Meccrewfy highlighted how government requests for content takedowns complicate transparency and introduce bias into content moderation processes. She described how online platforms can play a role in potentially facilitating serious human rights violations through hate speech and dehumanization campaigns during conflicts, arguing that tech companies have clear obligations under the UN Guiding Principles on Business and Human Rights that are not being effectively implemented.
## State-Corporate Relationships and Accountability Gaps
Nieves Molina from the Danish Institute for Human Rights addressed the growing accountability gap in digital spaces, noting that crimes enabled by digital technology are often treated with “exceptionalism” rather than being addressed through existing international law frameworks. She identified the blurred relationship between states and corporations as creating difficulties in determining responsibility levels.
Molina highlighted that this close relationship creates a “blurred reality regarding levels of responsibility,” making it challenging to attribute accountability for digital harms. She suggested that fragmentation enables forum shopping and creates avenues for impunity, allowing both state and corporate actors to evade responsibility.
Significantly, Molina raised the possibility that international law itself may have become a victim of misinformation campaigns, observing that international law and human rights content may have been targeted by campaigns that relativize international law and question its usefulness. She also noted that some scholars are discussing expanding international criminal responsibility to include legal personalities, potentially bringing corporations under frameworks like the ICC-Rome Statute.
## Corporate Responsibility and Platform Governance
Francisco Brito Cruz, participating online from São Paulo, disclosed that he is no longer Executive Director of Internet Lab but is now a law professor and a consultant for the B-Tech project on business and human rights in technology within the Office of the High Commissioner for Human Rights. He emphasized a crucial paradox in digital governance: “inaction can produce violations, but action can produce violations as well.”
Brito Cruz stressed the importance of building specific methodologies for human rights due diligence that go beyond general principles, arguing for transparency, monitoring, and proper expertise in these processes. He pointed to Brazil’s recent blocking of the platform X and the arrest of Telegram’s CEO in Paris as examples of how complex these accountability questions become in practice, demonstrating the need for stakeholder engagement, participation, and transparency tools in human rights due diligence processes.
## Children’s Rights in Digital Spaces
Mikiko Otani, former chair of the UN Committee on the Rights of the Child, brought attention to children’s specific vulnerabilities online while emphasizing that children want to use digital spaces safely rather than be excluded from them entirely. She cited a 2017 UNICEF statistic that one in three internet users worldwide is a child, making the digital environment crucial for children’s rights realization.
Otani explained that the Committee’s 2021 General Comment, adopted after a two-year consultation process that included hearing directly from children, requires states and businesses to integrate child rights impact assessments into digital product design, development, and operation. She challenged traditional protective approaches by noting that children themselves reject overprotection in favor of safe access.
Crucially, Otani observed that for children, “offline and online is seamless. It’s not so easy to differentiate what is the online and offline for the children.” Children, she reported, “want to use the online space safely” rather than being totally protected or excluded from digital spaces.
## Audience Questions and Discussion
The Q&A session included several specific questions that highlighted practical challenges:
**India-Pakistan Example**: Ulvia from London Story described how during tensions between India and Pakistan, platforms amplified violent content while 8,000 accounts with alternative narratives were taken down, illustrating how states can use content moderation to silence dissent while failing to stop harmful content.
**Domestic vs International Law Coordination**: An audience member asked about coordination between domestic and international legal frameworks, highlighting the complexity of multi-level governance in digital spaces.
**Separate Digital Rights Legislation**: Ana Galate from UC Berkeley questioned whether separate comprehensive legislation for digital rights is needed or if existing international law frameworks are sufficient with better implementation.
**Binding Frameworks**: Christian Fazili from DRC asked about the need for new binding frameworks and due diligence obligations, reflecting ongoing debates about whether current voluntary approaches are adequate.
## Key Challenges Identified
The discussion revealed several persistent challenges:
**Legal Fragmentation**: Different bodies of international law address similar digital issues separately, creating protection gaps and coordination difficulties.
**Technology-Law Gap**: Technology advances faster than legal frameworks can adapt, creating ongoing governance challenges.
**Attribution Problems**: The blurred relationship between states and corporations makes it difficult to determine responsibility for digital harms.
**Implementation Gaps**: While legal principles exist, translating them into effective protection mechanisms remains challenging.
**Remedies and Access**: Ensuring access to justice for victims of digital rights violations across fragmented legal systems remains problematic.
## Emerging Approaches and Suggestions
Panelists suggested several potential approaches:
**Focus on Existing Law**: Rather than creating entirely new treaty frameworks, emphasis should be placed on operationalizing and clarifying existing international law.
**Iterative Approach**: Adopting an approach of “intervention and testing” for digital legislation, acknowledging that some regulatory innovations may require adjustment.
**UN Guiding Principles**: Using the UN Guiding Principles on Business and Human Rights as an operative layer to navigate tensions between different international law frameworks.
**Transparency Requirements**: Developing transparency requirements for both state and corporate actions in digital spaces to enable proper scrutiny.
**Inclusive Consultation**: Ensuring consultation processes include diverse voices, including children, marginalized communities, and regional perspectives.
## Conclusion
The workshop session highlighted both the complexity and urgency of applying international law to digital spaces. While existing international legal frameworks provide a foundation for protecting human rights online, practical implementation faces significant challenges due to fragmentation, rapid technological change, and complex state-corporate relationships.
The discussion demonstrated that current approaches to international law in digital spaces face implementation gaps that affect real people and communities. The panelists’ insights suggest that while new legislation may not be the primary solution, better coordination between existing legal frameworks and stronger implementation of current obligations is essential.
The conversation emphasized that protecting human rights in digital spaces requires moving beyond technical legal analysis to acknowledge the human impact of these challenges. The path forward requires not only legal innovation but also political will to address the accountability gaps that currently allow both state and corporate actors to evade responsibility for digital human rights violations.
The session concluded with recognition that effective digital governance requires concrete mechanisms for accountability that can keep pace with technological advancement while preserving fundamental rights and freedoms, and that this work must center the voices and experiences of affected communities.
Session transcript
Sanhawan Srisod: Good afternoon. For those who join us here both in person and online, welcome to the workshop session Applying International Law Principles in the Digital Space. My name is Sanhawan Srisod, I’m a senior legal advisor at the International Commission of Jurists, and I’m going to moderate the session today. So let me begin with the purpose of today’s session. The title is Applying International Law Principles in the Digital Space, and what we’re going to discuss today is how existing international law provides a regulatory framework to protect individuals and communities from human rights abuses in the digital space. We’re also going to discuss how different bodies of law interact in this increasingly complex digital world. We’ll also touch upon state obligations, corporate responsibility, and evolving directions of accountability for both state and corporate actors under international law, and discuss recent developments and remaining gaps as well. This discussion is also part of a broader project co-implemented by the International Commission of Jurists and the Danish Institute for Human Rights, who is our co-partner today, under the Digital Democracy Initiative. Under this initiative, we, together with a group of experts, some of whom are also among the panelists today, drafted a set of principles that seek to clarify how international law should be progressively interpreted and to address ambiguities and contradictions where they exist. As for the format of the session, we’ll spend the first 25 to 50 minutes on presentations by panelists, then another 25 minutes for Q&A and comments in interactive dialogue with participants, and we’ll conclude the session with brief wrap-up remarks from panelists. So let me begin with brief introductions of the panelists today.
Before introducing all the panelists, I would like to acknowledge that we were originally to have with us Nadim Nashif, the founder and director of 7amleh, a Palestinian digital rights group. He was on the list of panelists but is unfortunately unable to travel today due to the ongoing war between Israel and Iran. However, we have his colleague with us today, Tiesa Meccrewfy; she’s an EU advocacy officer at 7amleh. Another panelist who joins us online is Francisco Brito Cruz; he’s a law professor at Fundação Getúlio Vargas in Brazil, also executive director of Internet Lab and consultant on the OHCHR B-Tech project. Another panelist with us here in the room is Chantal Joris, a senior legal advisor at Article 19, and next to her is Nieves Molina, chief advisor on tech, business and human rights at the Danish Institute for Human Rights. And last on that side is Mikiko Otani, the former chair of the UN Committee on the Rights of the Child and also an ICJ commissioner. So let’s begin the discussion. For the first issue I would like to begin with Chantal. The topic we are discussing today is international law in the digital space. Of course, this topic begins with the concept that international law applies offline as well as online; there’s a certain consensus on this. But in reality, when we talk about international law, people ask which one we are talking about: are you talking about international human rights law, international humanitarian law, international criminal law, everything, or the framework on corporations? So probably you can help us set the scene. How do these frameworks interact, do they even interact with each other, is there any harmonization across this patchwork of international law, and are there any gaps that exist or remain? The floor is yours.
Chantal Joris: Thank you very much, and thanks everyone for joining us. Yeah, as you said, there is broad consensus that international law applies to cyberspace and that cyberspace is not a lawless space, but what that means in practice is still very much subject to discussion, and to controversial discussions, for example, to what extent cyber conduct might constitute an illegal use of force or a violation of the principle of non-intervention. These questions are being discussed within different initiatives. Under international humanitarian law, for example, the ICRC has been working extensively on understanding how those norms, in particular the binding rules that were established many decades ago, can still be interpreted in a manner that they continue to be relevant in cyberspace. I would say international human rights law has probably the most advanced understanding of how it applies; as you mentioned, the same rights are understood to apply offline as much as they apply online. There are extensive reports, for example, by mandate holders and by the Human Rights Committee, explaining what it means, for example, to be able to enjoy freedom of expression online, the role of online platforms and so on. International criminal law is also catching up to the realities of how cyber conduct can contribute to or even constitute atrocity crimes. Tomorrow, for those who will be here, there will be another session discussing the initiative of the International Criminal Court’s Office of the Prosecutor to establish a policy on how cyber-enabled crimes might fall under the scope of the Rome Statute. Of course, there are also the well-known Tallinn Manual and the Oxford Statement. So, there are a lot of initiatives recognizing that it is important for international law to remain relevant and that we also understand how it applies in cyberspace.
And certain questions, I would say, are in any case controversial questions under international law, and they also manifest in cyberspace, one example being the extraterritorial application of human rights. Information operations by a certain state, whether they target the home population or the population in a foreign country, does that shape the obligations that they have? Does it mean that there could be a protection gap when it comes to the rights holders who might be impacted by certain operations? So, what I would say is that those initiatives are very relevant, but I think there could be more effort to avoid the fragmentation of the responses to how international law applies to cyberspace, and to harmonize it more, to avoid a protection gap. For example, coming back to a potential, say, disinformation campaign that is inciting and dehumanizing and targets people in another country. Potentially, you could have the prohibition of propaganda for war that applies under Article 20 of the ICCPR, or the prohibition of hate speech under Article 20, Paragraph 2 of the ICCPR. Certain of those operations could also be prohibited under international humanitarian law, for example, the obligation to respect international humanitarian law. It could potentially be a violation of the prohibition of direct and public incitement to genocide. So we have all these different frameworks that have something to say about this type of content, this type of state conduct, if that’s what we want to focus on. But how they interrelate is not always so established. And from the perspective of the impacted communities and the rights holders who might be harmed by those types of information operations, it is not that relevant whether it’s humanitarian law or human rights law and which one is lex specialis. And those can make for very interesting legal discussions.
But I think it is important that we find a way to, again, operationalize it, clarify it and make clear what each actor’s obligations are and what, again, the rights of the impacted communities really are with respect to all these legal frameworks.
Sanhawan Srisod: Thank you so much. I think you point out very important issues. One of the most concerning issues about the fragmentation of law is indeed that, for example, when we talk about incitement to hatred or violence or propaganda for war, it is there in almost every body of law, in ICL, in IHL, as well as under the human rights framework, but it has been discussed and interpreted separately. In international human rights law, we talk about the Rabat Plan of Action, in which we try to interpret these provisions, but without taking IHL or ICL into consideration at all. So I want to link this to our next speaker, Tiesa, because you have been working in a context in which these three bodies of law have collided and interacted, including on the issues that Chantal gave as examples, propaganda for war and incitement to hatred or violence. I understand that at the end of last year, 7amleh also published a report about digital rights, genocide, and big tech accountability in Gaza, in which you record that there is a rise in online hate speech and dehumanization targeting Palestinians. So probably you can explain this a little bit, and provide your first-hand observations on how this law operates in practice, and whether you see gaps when it comes to the fragmentation of the law in practice, especially in the context of Gaza.
Tiesa Meccrewfy: Thank you so much. Non-state actors, including big tech companies, have obligations under international frameworks, such as the UN Guiding Principles on Business and Human Rights to respect and protect digital and human rights. These principles mandate that businesses must conduct due diligence to identify, prevent, mitigate, and account for how they address their impacts on human rights. The right to access the Internet, freedom of expression, freedom of opinion, and privacy are all essential for individuals to share their experiences, seek justice, and advocate for their rights. Violations of these rights during such critical times, such as what’s happening in the Gaza Strip right now, not only silence marginalized voices, but also hinder efforts to address and prevent atrocities. So, in the context of the war on Gaza, the protection of digital rights is paramount. Tech companies and online platforms play a real critical role in documenting human rights abuses, sharing information, and mobilizing support as well. Systematic censorship and discriminatory content moderation policies by these platforms, as seen in the suppression of Palestinian voices, undermine these digital rights. And the disproportionate over-moderation leads to restrictions limiting the reach of Palestinian content at the international level. In some cases, it can completely suspend users. Palestinian and international news outlets, as well as journalists, have all experienced content takedowns and account restrictions on Instagram and Facebook specifically. Another contributing factor to censorship is obviously government requests for content takedowns on social media platforms, which complicates the issue of transparency and bias in content moderation. The United Nations Committee on the Elimination of Racial Discrimination already expressed serious concern about the sharp increase in racist hate speech and dehumanization directed at Palestinians since October 7th. 
particularly on the internet and in social media. And the ICJ order on the plausibility of genocide highlights the gravity of the situation, as they are considering in this case the documented use of online platforms to incite genocide against Palestinians in Gaza. All social media services in Israel and Palestine need to prioritize a comprehensive approach that really mainstreams and safeguards human rights and addresses the root causes of discrimination against the community and its narratives, in full transparency and in line with the United Nations Guiding Principles on Business and Human Rights. And to finish, because there is an intersection of digital rights and genocide in Gaza, we should highlight the urgent need for robust protections and accountability to ensure that digital spaces remain open and equitable for all. Thank you.
Sanhawan Srisod: Thank you so much, Tiesa. In the context of Gaza, as you have already listed, there are many studies and statements made by UN Special Rapporteurs, UN mechanisms, and others on the use of platforms to commit prohibited acts that may amount to crimes under international law, including genocide, and also on the protection gaps, which have not yet been closed. This touches on another layer, because in the situation of Gaza it is not just about protection; there is also an issue of accountability, if we talk about the possibility of crimes under international law being committed. So I would like to move to the next speaker to touch upon the issue of accountability when it comes to human rights violations or abuses committed in digital spaces, including those that may amount to crimes under international law. Nieves, perhaps you can talk about the accountability framework, as well as the challenges in accessing justice and remedies for victims of human rights violations or abuses committed online, and for those who have to navigate fragmented legal frameworks as well.
Nieves Molina: Thank you. Well, this is quite an impressive room. Building on what my colleagues here have been talking about, I would like to talk about the growing gap in accountability and, as a consequence, the growing gap in victims' access to remedies. International law has a well-established framework of obligations in relation to access to reparations, access to justice, prevention of impunity, procedural remedies, compensation, and rehabilitation, which includes taking all possible measures so that violations do not occur again. Yet we see that crimes, violations, or wrongdoings enabled by digital or cyber technology are treated with a level of exceptionalism. So there is a paradox: while most legal scholars think that international law has most of the principles that would help us provide remedies or regulation, there is a certain exceptionalism in producing new legislation and attempting to create new laws, which take time to create and at the same time create a danger of fragmentation of the law. From my point of view, there are two or three things, among others, that appear as barriers, and I want to talk about them in these five minutes. One of them is this idea of the fragmentation of the law: the idea that a specialist in environmental law, a specialist in IHL, in ICL, in cyber law, all seem to be operating separately and uncoordinated, not knowing what the other specialties are doing. That creates a lack of certainty; it enables forum shopping, where different actors seek the most favourable regulation for their conduct; and it creates difficulties for cooperation. Ultimately, the result is avenues for impunity.
The human rights system is a coherent system where all human rights are interrelated, we say, but there has been an attempt to analyse right by right, when, from my point of view, there is no right that is not affected by digital, cyber, and new technologies as we advance at quite a fast pace. Finally, the close relationship between states and corporations has created a blurred reality in which it is difficult to know or define the levels of responsibility that different actors have in a given situation. A number of situations have also raised questions where international crimes have been committed with the help or assistance of cyber or digital technologies. Some scholars are starting to call for the international criminal responsibility of companies as well, and there are a number of initiatives on whether it would be possible to expand the Rome Statute of the ICC to include legal persons too. Eventually, for us, there are two ideas that I would like to put forward. One is: do we have all the tools in international law that we require? After all, law is by definition a reaction to social changes, and human rights are the safeguards that we put in place at every turn of the advancement of societies. The second question I would like to put to you is whether, in the middle of all this information overload, international law and the content of human rights have also been victims of, or targeted by, misinformation campaigns that relativize international law and question its usefulness, creating gaps of accountability and gaps of regulation. I want to stop there so that we can have time for interaction.
Sanhawan Srisod: Thank you so much, Nieves. I think you made a very important point about the fact that there is still a blurred line between states and corporations when it comes to who shall be held accountable for certain actions. It seems the current international law framework has still not caught up with that challenge, though there is an ongoing effort to address it. So, moving next to Francisco, who can probably fill in the gaps on the corporate accountability side. Francisco, under international law, corporations have due diligence obligations, and they are also expected to be held accountable when certain conduct is committed on their platforms, for example. But in reality, we have not seen many examples of platforms being held accountable when there is a suspicion of involvement in certain actions. So perhaps you can shed light on these issues: what is the status of international law now when it comes to platform accountability? At the same time, under international law, whatever we do, we have to avoid disproportionate restrictions on the human rights of online users. How can we strike a balance between accountability and the protection of rights? And perhaps you can give examples from the recent rulings in Brazil on this issue as well.
Francisco Brito Cruz: Thank you. I hope you are all listening to me. Hello from Sao Paulo; I wish I could be with all of you in Norway. I'm happy that my colleagues laid the ground first in terms of international law and these discussions. Just a disclosure: I'm not the Executive Director of Internet Lab anymore. Now I am a law professor and an expert consultant for the B-Tech project within the Office of the High Commissioner for Human Rights at the United Nations. In these five minutes I will deal with three questions. The first is that, as our moderator was suggesting, we seem to have some tensions within international human rights law. We have prohibitions of a number of different discourses, incitement for example, but also protections for freedom of expression and the need to strike a balance. And we also have tools like the United Nations Guiding Principles on Business and Human Rights. So the first question is how to see this: are these tensions, or are these layers? I think it is interesting to see how the Guiding Principles can act as a more operative layer here. Maybe a good way to start this discussion is the assertion that inaction can produce violations, but action can produce violations as well, along with the statement that corporations can participate in violations: both corporate power and state actors can participate in violations. This makes things difficult to navigate, but I think the Guiding Principles can at least provide us a toolbox of different approaches to how we see corporate responsibility. The second question is: can we turn corporate responsibility into platform accountability? I think this is the main challenge when we look at the Guiding Principles. What will make this work is not only asserting that the principles and human rights law are valid.
That is very important, but, as the Guiding Principles show us, we also need to build method around it: thinking about human rights due diligence as not just any form of due diligence but a specific one, we need to embed method, we need to embed transparency, and we need to raise the bar. On that, I would like to mention the resources of the B-Tech project. B-Tech is a project within the Office of the High Commissioner that is trying to make this translation: what does it mean to take all of the principles and human rights found in different documents and tools of international human rights law, and how can we build method to bring companies into this kind of compliance? In terms of AI, for example, we have a different set of tools to apply, combining not only transparency but also building expertise in red teaming, for example, and expertise in content regulation. That puts us in a position where we need not only to point to methods, but also to think of ways to monitor how they are being deployed and what results we are getting over time. And to end my first contribution here: landing this discussion in context is very difficult. In Brazil, for example, we are seeing the judiciary trying to build a platform accountability field. It's important to say, as a Brazilian, that we have a judiciary with a number of peculiarities that can be very proactive. The judiciary is trying to play an important role after an attempted coup, and after challenges from tech sector leadership defying the rule of law. And there we are seeing how difficult it is to build this field, to build an idea of platform accountability and even human rights due diligence, without, for example, state capacities or a regulator.
But of course, as I said at the beginning of my contribution, not only can inaction produce violations; action can produce violations as well. So taking those steps is very difficult. And I just want to share one thing to make us think about how important the discussion about incitement is, as my colleagues commented before me. We had an interesting episode: the blocking of the platform X in Brazil. We saw how much this was spoken about on the international stage, and how much disinformation was said to be the key motive for the Supreme Court to block the platform in Brazil. And this is not true: the core case that the Supreme Court made was an incitement case against a law enforcement officer. So advancing on this translation is really, really important, not only for setting standards for different contexts, but also for grounding in those contexts the capacity we need to prevent violations, not only violations from state power, but also violations that can be facilitated by the private sector. I'll leave it here, and I'm anxious to hear all of you in our interactions.
Sanhawan Srisod: Thank you so much, Francisco. So we have the last panelist with us, and indeed this is one of the topics we haven't yet fully covered in our interventions today: the disproportionate impact of online harms on specific groups, especially children, whose lives are increasingly shaped by the digital environment. So Mikiko, under international law there is clear recognition of children's rights online. However, are there any specific obligations imposed on states and corporations when it comes to the protection of children in online spaces? Perhaps you can share those obligations with us, and also how the protection of this group could be strengthened. The floor is yours.
Mikiko Otani: Thank you very much. In the five minutes I have, I'd like to bring in the perspective of child rights; there are many other groups of persons who need special attention, but in my case, children. In 2017, a UNICEF report said that worldwide, one in three internet users is a child. That is the reality we learned in 2017, and when I joined the Committee on the Rights of the Child, actually in 2017, we started learning more and more how important the digital environment is to children. What we learned is that the digital is really part of children's daily lives. Of course there is still some digital divide; however, children are living with the digital, and children's rights are impacted, positively and negatively, I have to emphasize. We very often emphasize the negative impact of the digital on children's rights; however, children's rights are also promoted and enhanced by the digital. This is the reality. So in 2019, the Committee on the Rights of the Child decided that we needed to work on children's rights in relation to the digital environment. The Convention on the Rights of the Child was adopted by the General Assembly in 1989, and we were convinced by this reality that the Convention needs to be read, understood, applied, and implemented in relation to the digital environment, because this is the life children are living nowadays. We cannot ignore this reality, and we have to show how the Convention on the Rights of the Child is relevant to children's rights. But I also want to emphasize something I learned through the process of the Committee's work to develop its general comment on children's rights in relation to the digital environment. The drafting took two years because we had a wide public consultation, including consultation with children. So what I'm going to share with you is what I learned from the children.
Children are living in a digital space, but online and offline are seamless; it is not easy for children to differentiate what is online and what is offline. Secondly, almost all children's rights under the Convention are impacted by the digital environment. Of course freedom of expression and privacy, but also many other rights, like the right to play, education, and health. In particular these days, mental health is such a serious issue for children's rights in relation to the digital, and also how to develop personal relationships with others, starting with the relationship between children and parents. All those rights are impacted. I also have to emphasize the important role of parents. We talk a lot about privacy, for example, but if parents are not aware of how they are actually exposing their children to risk online, then children's rights are not protected. Those are what I learned. But the most important message from the children is that they want to use the digital. They told the Committee that they don't want to be totally protected or excluded from the digital space; they want to use the online space safely. This is a very strong message from the children. So, from the general comment the Committee adopted in 2021 after the two-year consultation, I'd like to bring up three issues. First, the Committee said that states and businesses should integrate child rights impact assessments. And I'd like to emphasize one thing: the Committee said that privacy and safety in relation to the design, engineering, development, operation, distribution, and marketing of products and services is very important to protecting children's safety and privacy in the digital world.
Second, it is very important to understand children's rights comprehensively if we want to address children's rights in the digital space. In particular, children are all persons under 18 years old, so you can imagine how differently the digital space impacts younger children and adolescents; the evolving capacities of the child is a very important concept. Third, remedies. For children, what does access to justice and remedies mean for online harms? We need to integrate those perspectives, and to do that, we need to hear from children: their lens, their experiences, and what they think about how to protect themselves is very key. Thank you very much.
Sanhawan Srisod: Thank you so much, Mikiko. Next, I would like to invite all participants to ask questions and share perspectives. After five speakers, most of what we have shared are challenges due to the fragmentation of international law, especially in terms of providing protection to individuals and communities, and in terms of accountability. Of course there are still gaps, but there are also ongoing global efforts to address them. If anyone would like to share or ask any questions, please.
Audience: Hello, my name is Ulvia. I'm from The London Story. We recently published a report called Escalate, in which we describe how, during the India-Pakistan tensions and military conflict in April-May 2025, platforms like X and Meta amplified violent and hateful content while the Indian government pushed for takedowns of critical voices like journalists and human rights defenders. Around 8,000 accounts that voiced alternative narratives were taken down. For us, this raises a serious issue: states using content moderation to silence dissent while not stopping harmful content that can fuel violence. In this situation, the population is left more vulnerable online, especially during conflict. So my question would be: when it is the state itself contributing to the erosion of online civic space, who is responsible for protecting civilians in these digital spaces? And how should international law respond to these double standards, where both the states and the platforms fail to act in the public interest? I would just like to hear your perspective on this, because we also work on this issue. Thank you so much. Are you going to take questions in turn?
Sanhawan Srisod: I think we probably can take one more question and then we answer.
Audience: Yeah, sounds good. My question relates to what was just mentioned about the role of the state. We've heard from the panel the idea of international law placing duties and responsibilities, in terms of due diligence, on corporations. And this is the language, or the idea, that we are increasingly seeing in domestic legislation as well: platform regulation also requires companies to have some kind of due diligence, to assess risks, with risk mitigation and risk identification, or a duty of care. The UK Online Safety Act, for example, embodies this concept. My question is: how do you see the two of them fitting together, or not? Are international law and these domestic efforts to hold platforms to account friends or foes? How can we make them friends and not foes? Because you also hear from some voices, and I think some of the previous comments went in that direction, that states can also violate international human rights law. So how can regulation, both international and domestic, drive change in accountability in the same direction and not be conflicting? That's my question. Thank you so much.
Sanhawan Srisod: I think we take the first round probably for questions.
Audience: My audio device isn't working well, so I'm not sure if I'm being heard right now. Okay, great. Hi, I'm Ana Galate. I represent UC Berkeley. I work as a public interest cybersecurity researcher and practitioner, and I have a background in software development, psychology, philosophy, and so forth. First of all, thank you for this overview of international law and its application. During our time here, the focus has largely been on how current international law, human rights and criminal, could be applied or amended to better protect rights in the digital sphere. But the nuances are far more complex, given how intertwined and interdisciplinary our digital selves' journeys truly are: complex domains such as political infrastructure, behavioral psychology, economic power, and tech design are all purposely interwoven into digital users' daily systems. My question: are there any efforts underway toward separate legislation for our digital selves, something far more holistic that protects, for example, cognition, mind, and our thoughts? New technologies are deployed day by day at an exponential rate, attempting to scrape and source by any means possible to better build our digital profiles and use us as pawns. So I'm just wondering if there are any separate efforts, given all that. Thank you.
Sanhawan Srisod: Can I proceed? Yes, please.
Audience: Okay, thank you. My name is Christian Fazili. I'm from the Democratic Republic of Congo, and I work as a lecturer and researcher at the University of Goma. I'm also a civil magistrate. Thank you for the wonderful presentations. I have a few questions related to this topic. First, beyond voluntary commitments, what binding framework would ensure equitable digital governance, such as a global treaty on tech accountability? Another, related to due diligence: to what extent are states obligated to prevent malicious cyber activity originating from their territory? And the last one: how can this be enforced without infringing sovereignty? Thank you very much.
Sanhawan Srisod: Thank you so much for all the questions. We have a question on who is going to protect civilians, and how to respond to double standards, when the state itself contributes to the erosion of online civic space. Who wants to respond? Chantal, do you want to respond to the first question? We can give the second question to Francisco as well, and perhaps you can answer the third question. Okay, the first question first.
Chantal Joris: Let me perhaps make some overarching observations on a number of the questions. One of the challenges is that there are massive challenges as to conduct in the digital space: digital harms, new digital harms, technologies evolving fast. How can we keep up? We know that those advancements are often much quicker than the ability of international bodies, and of domestic parliaments as well, to come up with good rules. So one of the problems we have right now is that I don't think we're exactly in the golden age of treaty-making. That means we will often try to rely on existing international law rules and see how we can operate with them, how we can interpret them so they remain relevant, because it seems reasonably unlikely that there will be a new Geneva Convention adopted, ratified, and widely implemented anytime soon. The second challenge is that, while I understand the quest to regulate some of these issues, the devil is in the detail. Following up on what Francisco and the other panelists said, it's a very complex space, and, as has been mentioned, inaction can be problematic and action can also be problematic. There was a question about a global treaty on digital accountability, or again lawmaking to address some of these technologies; some of the legislation we see is very reactive and not based on human rights. So you have this disconnect between the international law obligations of states, particularly human rights obligations, and what they do domestically: they don't bring domestic legislation in line with human rights, and they don't legislate based on proper expertise, so it's not necessarily good lawmaking. Again, you might have to bring in a psychologist, and you want to hear from children's rights groups, when you regulate those
issues. So the response can sometimes be almost as problematic as the lack of response. Perhaps also one point on the role of online platforms in armed conflicts, such as India-Pakistan: as Nieves mentioned, platforms often operate under, or respond to, government demands, so it's not that states do one thing and platforms do another. More and more, depending on the government and its power, there is very close cooperation, which engages both states' obligations and companies' obligations. And there is a big responsibility for us to understand whether the right measures have been taken. I'll wrap up. The devil is in the details with these measures, be it, as Francisco mentioned, the blocking of X in Brazil or the arrest of the Telegram CEO in Paris. We really need the details, to understand why exactly it has been done. There needs to be transparency as to the real reason, so we can properly scrutinise and assess whether those measures are in line with international law. I tried to connect a few things.
Nieves Molina: There are a couple of interesting points. The issue of state actors' behaviour on platforms is one that needs to be addressed. And I agree with Chantal: I'm not sure it's the golden age of new legislation.
Chantal Joris: It’s not.
Nieves Molina: It’s not.
Chantal Joris: No, no, no.
Nieves Molina: That's what I said, and I agree: it's not the golden age of legislation, because at the international level, but also at the national level, we have a couple of projects where we compare national legislation with international obligations, and every single time we face this reaction of legislating more, what we get is more repressive legislation. That's one thing. The other thing is that legislation in relation to incitement, to justification of war, to justification of crimes, already exists; we have examples of that, such as the Media Trial and, before that, Nuremberg. So the concepts in law might exist; it is the way we implement them, and the willingness to enforce them, where I feel we are lagging behind. I also want to address the issue the second question brought up, about how far technology is advancing. As a friend of mine wrote, the issue of freedom of thought now takes on a different connotation, because technology may advance to the point that what you think becomes reachable even if you don't announce it. As technology advances, we will have to decide whether the legislation we have covers it. But what I would warn against is assuming we have a real vacuum, when what we may have instead is an unwillingness to enforce the international obligations that exist. I think I'll leave it there.
Sanhawan Srisod: Yes, and on the second question, Francisco, perhaps you can help answer on domestic legislation for platform regulation, and also on whether, beyond voluntary commitments, there are any binding frameworks, especially on tech accountability and due diligence.
Francisco Brito Cruz: I'll make some comments. Are you listening? Yes? Great. So the question is whether platform regulation for accountability, and even international human rights law, are friends or foes. I think they can be both, depending on the situation. As Chantal was saying, action and inaction can both be harmful, depending on what's happening, and there is no recipe for this. But maybe a few points should be made. First, with or without legislation, with or without some kind of regulatory framework, I believe that approaches looking to the United Nations Guiding Principles on Business and Human Rights can be fruitful, at least in laying the ground for interventions that help us know more about what's happening, know more about what is working or not, and evolve toward the right way to intervene or to demand corporate responsibility in a very concrete way. But all of these are baby steps, right? It's not something you can sketch out from one day to the next. And all of that should happen with stakeholder engagement, participation, and transparency tools for civil society; that is part of a human rights due diligence process which can also ensure that everyone is participating and on the same page. Also, with legislation, when we see regulatory frameworks, it is key to monitor how the legislation is performing: setting out the checks and balances for state action and state power, and recognizing that there is no finish line, only intervention and testing, and more intervention and testing. Some things can go wrong, and we need to acknowledge that; some legislative innovations can go wrong as well. I don't see this as a cause for paralysis; I see it as a way of learning what's working or not.
And to end: remedies are also key, right? Not only remedies regarding corporate power and corporate decisions, but also remedies regarding state decisions. To really end, just a question here, and I would love to exchange more on it. We're talking a lot about sovereignty and how important it can be for ensuring corporate responsibility. But an important question should be asked: sovereignty can be a path to many things, right? The rule of law can be a path to many things. I think we, as human rights defenders and people who believe in human rights and international law, should see sovereignty and the rule of law in all their complexity, to understand the ways they can also serve as a path for the protection of human rights. This is more a question than an answer, and I thank you for it.
Sanhawan Srisod: Thank you so much, Francisco. Probably a last question, as I think we are running out of time. The last one I will ask Mikiko, because there was a question about whether there are any efforts on separate legislation that try to address these issues holistically, especially as technology advances day by day. Perhaps you can talk about it from the child rights perspective as well, and update us a little on the efforts we are engaging in or involved in.
Mikiko Otani: Thank you so much. The first question for the panel was about what international law means: international criminal law, international human rights law, international humanitarian law. I then brought in the child rights perspective, and I emphasised that children are not the only group requiring specific attention. And I heard from Nieves about the risk of fragmentation, so how can we holistically approach all these issues? By sharing the children's perspective, I don't mean to again go into a silo or a fragmented approach. I think we really need to bring in various perspectives and diverse voices when we talk about this big question of how international law applies and should play a role in the digital space. There is no single answer; however, a consultation process, hearing from various voices, not only different groups but also regionally, because there are many different initiatives and challenges, is the way we should go. It's not so much an answer, but I think it is the way forward. Thank you.
Sanhawan Srisod: Thank you so much. I think our time is up, so thank you all for your time, and I hope today’s session was helpful for you. This is just the beginning of the conversation on this topic, and we hope to engage more with all of you on it in the coming years. Thank you so much.
Chantal Joris
Speech speed
137 words per minute
Speech length
1225 words
Speech time
533 seconds
International law applies to cyberspace but practical implementation remains controversial
Explanation
While there is broad consensus that international law applies to cyberspace and that it is not a lawless space, what this means in practice remains contested and subject to ongoing debate. Questions about whether cyber conduct constitutes an illegal use of force or a violation of the principle of non-intervention are being actively discussed.
Evidence
Examples include discussions about whether cyber conduct might constitute an illegal use of force or a violation of the principle of non-intervention
Major discussion point
Application of International Law in Digital Spaces
Topics
Legal and regulatory | Cybersecurity
Agreed with
– Nieves Molina
– Francisco Brito Cruz
Agreed on
International law applies to cyberspace but implementation remains challenging
International human rights law has the most advanced understanding of digital application
Explanation
Among different bodies of international law, human rights law has developed the most comprehensive framework for understanding how legal principles apply in digital spaces. The same rights that apply offline are understood to apply online as well.
Evidence
Extensive reports by mandate holders and the Human Rights Committee explain what it means to enjoy freedom of expression online and the role of online platforms
Major discussion point
Application of International Law in Digital Spaces
Topics
Human rights | Legal and regulatory
Different legal frameworks (IHL, ICL, human rights law) address similar issues separately, creating fragmentation
Explanation
Various international legal frameworks including international humanitarian law, international criminal law, and human rights law all have something to say about similar digital conduct, but they operate separately without clear coordination. This creates confusion about how these frameworks interrelate and which takes precedence.
Evidence
Example of disinformation campaigns that could potentially violate Article 20 of ICCPR (propaganda for war/hate speech), international humanitarian law obligations, or constitute incitement to genocide
Major discussion point
Application of International Law in Digital Spaces
Topics
Legal and regulatory | Human rights | Cybersecurity
Agreed with
– Nieves Molina
– Sanhawan Srisod
Agreed on
Fragmentation of legal frameworks creates protection gaps and accountability challenges
Technology advances much quicker than ability of international bodies and domestic parliaments to create good rules
Explanation
One of the major challenges in digital governance is that technological advancements occur at a much faster pace than the ability of international organizations and domestic legislative bodies to develop appropriate regulatory responses. This creates a gap between technological capabilities and legal frameworks.
Major discussion point
Challenges in Digital Governance and Regulation
Topics
Legal and regulatory | Development
Agreed with
– Nieves Molina
– Audience
Agreed on
Technology advances faster than legal frameworks can adapt
Not in the golden age of treaty making, so must rely on existing international law rules and interpretation
Explanation
Given the current international political climate, it seems unlikely that new comprehensive international treaties will be adopted and widely ratified soon. Therefore, the focus must be on interpreting and applying existing international law rules to remain relevant in the digital age.
Evidence
Mentions that it seems reasonably unlikely that there will be a new Geneva Convention adopted, ratified, and widely implemented anytime soon
Major discussion point
Challenges in Digital Governance and Regulation
Topics
Legal and regulatory | Human rights
Disagreed with
– Nieves Molina
– Francisco Brito Cruz
– Audience
Disagreed on
Approach to new legislation vs. existing law interpretation
Reactive legislation often not based on human rights creates disconnect between international obligations and domestic implementation
Explanation
Much of the current digital legislation is reactive rather than proactive and often fails to incorporate human rights principles. This creates a disconnect between states’ international human rights obligations and their domestic legal frameworks, resulting in poor lawmaking that lacks proper expertise.
Evidence
Notes that domestic legislation often doesn’t align with human rights obligations and isn’t based on proper expertise, mentioning the need to involve psychologists and children’s rights groups when regulating digital issues
Major discussion point
Challenges in Digital Governance and Regulation
Topics
Legal and regulatory | Human rights
Disagreed with
– Nieves Molina
– Francisco Brito Cruz
– Audience
Disagreed on
Approach to new legislation vs. existing law interpretation
Nieves Molina
Speech speed
111 words per minute
Speech length
931 words
Speech time
499 seconds
Need for harmonization to avoid protection gaps and forum shopping
Explanation
The fragmentation of legal approaches creates situations where different actors can seek the most favorable regulation for their conduct, leading to forum shopping. This fragmentation also creates lack of certainty and difficulties for cooperation, ultimately resulting in avenues for impunity.
Evidence
Mentions that fragmentation creates lack of certainty, enables forum shopping where different actors seek the most favorable regulation, and creates difficulties for cooperation
Major discussion point
Application of International Law in Digital Spaces
Topics
Legal and regulatory | Human rights
Agreed with
– Chantal Joris
– Sanhawan Srisod
Agreed on
Fragmentation of legal frameworks creates protection gaps and accountability challenges
Law is by definition a reaction to social changes, and human rights are safeguards for societal advancement
Explanation
Legal frameworks naturally evolve in response to social changes and technological developments. Human rights serve as essential safeguards that must be maintained and adapted as societies advance and face new challenges.
Major discussion point
Application of International Law in Digital Spaces
Topics
Human rights | Legal and regulatory
Close relationship between state and corporations creates blurred reality regarding levels of responsibility
Explanation
The increasingly close relationship between state actors and corporate entities has created a complex situation where it becomes difficult to clearly define and distinguish the levels of responsibility that different actors have in given situations. This blurring of lines complicates accountability mechanisms.
Major discussion point
Corporate Accountability and Platform Responsibility
Topics
Legal and regulatory | Economic
Some scholars are calling for international criminal responsibility of companies and expanding the ICC’s Rome Statute to include legal personalities
Explanation
In situations where international crimes have been committed with the assistance of cyber or digital technologies, some legal scholars are advocating for extending international criminal responsibility to corporate entities. This would involve expanding the International Criminal Court’s Rome Statute to include legal personalities beyond individuals.
Major discussion point
Corporate Accountability and Platform Responsibility
Topics
Legal and regulatory | Cybersecurity
Growing gap in accountability and victims’ access to remedies for digital crimes
Explanation
Despite well-established international legal frameworks for access to reparations, justice, and remedies, there is an increasing gap in accountability for digital crimes. Victims of digital violations face significant challenges in accessing justice and obtaining appropriate remedies.
Evidence
Notes that international law has well-established frameworks for access to reparations, justice, prevention of impunity, procedural remedies, compensation, and rehabilitation
Major discussion point
Accountability Gaps and Access to Justice
Topics
Legal and regulatory | Human rights
Crimes enabled by digital technology are treated with exceptionalism despite existing legal frameworks
Explanation
There is a paradoxical situation where crimes committed or enabled by digital or cyber technology are treated as exceptional cases requiring new laws, even though existing international legal frameworks contain most of the principles needed for regulation. This exceptionalism creates delays and fragmentation.
Evidence
Mentions the paradox where legal scholars believe international law has most principles needed for regulation, yet there’s exceptionalism in creating new legislation
Major discussion point
Accountability Gaps and Access to Justice
Topics
Legal and regulatory | Cybersecurity
Agreed with
– Chantal Joris
– Francisco Brito Cruz
Agreed on
International law applies to cyberspace but implementation remains challenging
Disagreed with
– Chantal Joris
– Francisco Brito Cruz
– Audience
Disagreed on
Approach to new legislation vs. existing law interpretation
Fragmentation creates lack of certainty, enables forum shopping, and creates avenues for impunity
Explanation
The fragmented approach to digital governance, where different legal specialties operate separately without coordination, creates legal uncertainty. This allows different actors to seek the most favorable regulatory environment for their conduct and ultimately creates opportunities for avoiding accountability.
Evidence
Mentions that specialists in environmental law, IHL, ICL, and cyber law seem to operate separately without coordination
Major discussion point
Accountability Gaps and Access to Justice
Topics
Legal and regulatory | Human rights
Technology may advance to the point where freedom of thought takes different connotation as thoughts become reachable
Explanation
As technology continues to advance rapidly, there may come a point where even human thoughts become accessible through technological means, even when they are never expressed. This would fundamentally change the concept of freedom of thought and require new legal considerations.
Major discussion point
Challenges in Digital Governance and Regulation
Topics
Human rights | Cybersecurity
Agreed with
– Chantal Joris
– Audience
Agreed on
Technology advances faster than legal frameworks can adapt
Tiesa McCruff
Speech speed
131 words per minute
Speech length
417 words
Speech time
189 seconds
Tech companies have obligations under UN Guiding Principles on Business and Human Rights to conduct due diligence
Explanation
Non-state actors, including big tech companies, have clear obligations under international frameworks such as the UN Guiding Principles on Business and Human Rights. These principles mandate that businesses must conduct due diligence to identify, prevent, mitigate, and account for how they address their impacts on human rights.
Major discussion point
Corporate Accountability and Platform Responsibility
Topics
Human rights | Legal and regulatory
Agreed with
– Francisco Brito Cruz
– Audience
Agreed on
Corporate accountability requires moving beyond voluntary commitments to concrete methodologies
Systematic censorship and discriminatory content moderation policies suppress Palestinian voices and undermine digital rights
Explanation
Tech companies and online platforms engage in systematic censorship and discriminatory content moderation that specifically targets and suppresses Palestinian voices. This disproportionate over-moderation limits the international reach of Palestinian content and can result in outright account suspensions.
Evidence
Palestinian and international news outlets, as well as journalists, have experienced content takedowns and account restrictions on Instagram and Facebook specifically
Major discussion point
Digital Rights Violations and Censorship
Topics
Human rights | Sociocultural
Government requests for content takedowns complicate transparency and bias in content moderation
Explanation
Government requests for content takedowns on social media platforms add another layer of complexity to the issue of censorship and bias in content moderation. These requests compromise transparency and can contribute to discriminatory enforcement of platform policies.
Major discussion point
Digital Rights Violations and Censorship
Topics
Human rights | Legal and regulatory
Francisco Brito Cruz
Speech speed
119 words per minute
Speech length
1382 words
Speech time
696 seconds
Need to translate corporate responsibility into platform accountability through specific methodologies
Explanation
The main challenge with the UN Guiding Principles on Business and Human Rights is moving beyond simply asserting that principles and human rights law are valid to actually building concrete methods for implementation. This requires developing specific methodologies that can effectively translate corporate responsibility into measurable platform accountability.
Evidence
Mentions the BTEC project within the Office of the High Commissioner that is trying to make this translation from principles to practical methods
Major discussion point
Corporate Accountability and Platform Responsibility
Topics
Human rights | Legal and regulatory
Agreed with
– Tiesa McCruff
– Audience
Agreed on
Corporate accountability requires moving beyond voluntary commitments to concrete methodologies
Human rights due diligence requires specific methods, transparency, and raising the bar beyond general due diligence
Explanation
Effective human rights due diligence is not just any form of due diligence but requires specific methodologies, enhanced transparency, and higher standards. This includes building expertise in areas like red teaming for AI and content regulation, along with methods to monitor deployment and results over time.
Evidence
Examples include building expertise on red teaming for AI and content regulation, with methods to monitor how they are being deployed and their results
Major discussion point
Corporate Accountability and Platform Responsibility
Topics
Human rights | Cybersecurity
Building platform accountability is difficult without state capacities or proper regulation
Explanation
The experience in Brazil shows how challenging it is to establish platform accountability and human rights due diligence without adequate state capacities or proper regulatory frameworks. Even proactive judicial systems face significant difficulties in creating effective accountability mechanisms.
Evidence
Brazil’s judiciary trying to build platform accountability after an attempted coup and challenges from tech sector leadership defying rule of law
Major discussion point
Accountability Gaps and Access to Justice
Topics
Legal and regulatory | Economic
Both inaction and action can produce violations, making navigation complex
Explanation
The complexity of digital governance lies in the fact that both failing to act and taking action can result in human rights violations. This creates a challenging environment where decision-makers must carefully balance interventions, as even well-intentioned actions can have negative consequences.
Evidence
Example of the blocking of Platform X in Brazil, where the core case was actually an incitement case against a law enforcement officer, not just misinformation as internationally reported
Major discussion point
Accountability Gaps and Access to Justice
Topics
Human rights | Legal and regulatory
Agreed with
– Chantal Joris
– Nieves Molina
Agreed on
International law applies to cyberspace but implementation remains challenging
Need for stakeholder engagement, participation, and transparency tools in human rights due diligence processes
Explanation
Effective human rights due diligence processes must include meaningful stakeholder engagement, broad participation from affected communities, and robust transparency tools for civil society. This participatory approach is essential to ensure that everyone is involved and aligned in the process of corporate accountability.
Major discussion point
Challenges in Digital Governance and Regulation
Topics
Human rights | Legal and regulatory
Mikiko Otani
Speech speed
137 words per minute
Speech length
956 words
Speech time
418 seconds
One in three internet users worldwide is a child, making digital environment crucial for children’s rights
Explanation
According to a 2017 UNICEF report, children represent a significant portion of internet users globally, with one in three users being under 18. This reality makes the digital environment a crucial space for children’s rights protection and promotion.
Evidence
2017 UNICEF report stating that worldwide one in three internet users is a child
Major discussion point
Children’s Rights in Digital Environment
Topics
Human rights | Sociocultural
Digital impacts children’s rights both positively and negatively across all rights under the Convention
Explanation
The digital environment affects virtually all children’s rights under the Convention on the Rights of the Child, not just obvious ones like freedom of expression and privacy. Rights such as play, education, health (particularly mental health), and the development of personal relationships are all significantly impacted by digital technologies.
Evidence
Examples include right to play, education, health, mental health issues, and development of relationships including children-parent relationships
Major discussion point
Children’s Rights in Digital Environment
Topics
Human rights | Sociocultural
Children want to use digital spaces safely rather than be totally protected or excluded
Explanation
Through consultations with children during the development of the General Comment, the Committee learned that children do not want to be completely protected from or excluded from digital spaces. Instead, they want to be able to use online spaces safely while maintaining access to digital opportunities.
Evidence
Strong message from children during the Committee’s consultation process for the General Comment on children’s rights in digital environment
Major discussion point
Children’s Rights in Digital Environment
Topics
Human rights | Cybersecurity
States and businesses should integrate child rights impact assessment in design and development of digital products
Explanation
The Committee’s General Comment adopted in 2021 emphasizes that both states and businesses must integrate child rights impact assessments into their processes. This is particularly important in the design, engineering, development, operation, distribution, and marketing of digital products and services to protect children’s safety and privacy.
Evidence
Committee’s General Comment adopted in 2021 after two-year consultation process
Major discussion point
Children’s Rights in Digital Environment
Topics
Human rights | Legal and regulatory
Need to understand children’s rights comprehensively considering evolving capacities from younger ages to adolescents
Explanation
Protecting children’s rights in digital spaces requires a comprehensive understanding that accounts for the wide age range of children (all persons under 18) and their evolving capacities. The impact of digital spaces varies significantly between younger children and adolescents, requiring nuanced approaches.
Evidence
Concept of evolving capacities is emphasized as very important, considering children are all persons under 18 years old
Major discussion point
Children’s Rights in Digital Environment
Topics
Human rights | Development
Need to bring various perspectives and diverse voices when discussing international law’s role in digital space
Explanation
To avoid fragmentation and silo approaches in digital governance, it’s essential to incorporate various perspectives and diverse voices from different groups and regions. This includes not only different vulnerable groups but also regional perspectives, as there are different initiatives and challenges across different contexts.
Major discussion point
Holistic Approaches and Future Directions
Topics
Human rights | Legal and regulatory
Audience
Speech speed
132 words per minute
Speech length
751 words
Speech time
340 seconds
States use content moderation to silence dissent while failing to stop harmful content that fuels violence
Explanation
During the India-Pakistan tensions and military conflict, there was a clear double standard where platforms took down accounts that provided alternative narratives (including journalists and human rights defenders) while amplifying violent and hateful content. This demonstrates how states can manipulate content moderation to suppress dissent while allowing harmful content to proliferate.
Evidence
During India-Pakistan tensions in April-May 2025, around 8,000 accounts that verbalized alternative narratives were taken down while platforms like X and Meta amplified violent and hateful content
Major discussion point
Digital Rights Violations and Censorship
Topics
Human rights | Cybersecurity
Digital users’ daily systems involve complex domains like political infrastructure, behavioral psychology, and economic power
Explanation
The digital environment is far more complex than current legal frameworks account for, involving intricate interconnections between political infrastructure, behavioral psychology, economic power structures, and technical design. These domains are purposely interwoven to influence digital users’ daily experiences and decision-making processes.
Major discussion point
Holistic Approaches and Future Directions
Topics
Sociocultural | Economic
Question whether separate legislation for digital selves is needed given exponential rate of new technology deployment
Explanation
Given the exponential rate at which new technologies are being deployed to scrape and source data to build digital profiles and manipulate users, there’s a question about whether entirely separate legislation specifically for digital selves might be needed. This would be more holistic legislation that protects aspects like cognition, mind, and thoughts from technological manipulation.
Evidence
New technology deployed daily at exponential rate attempting to scrape and source data to build digital profiles and use users as pawns
Major discussion point
Holistic Approaches and Future Directions
Topics
Human rights | Legal and regulatory
Agreed with
– Chantal Joris
– Nieves Molina
Agreed on
Technology advances faster than legal frameworks can adapt
Need for binding frameworks like global treaty on tech accountability beyond voluntary commitments
Explanation
There is a need to move beyond voluntary commitments to establish binding international frameworks that would ensure equitable digital governance. This could include mechanisms like a global treaty on tech accountability that would create enforceable obligations rather than relying solely on voluntary corporate commitments.
Major discussion point
Holistic Approaches and Future Directions
Topics
Legal and regulatory | Human rights
Agreed with
– Tiesa McCruff
– Francisco Brito Cruz
Agreed on
Corporate accountability requires moving beyond voluntary commitments to concrete methodologies
Disagreed with
– Chantal Joris
– Nieves Molina
– Francisco Brito Cruz
Disagreed on
Approach to new legislation vs. existing law interpretation
Sanhawan Srisod
Speech speed
144 words per minute
Speech length
1888 words
Speech time
781 seconds
International law framework faces fragmentation challenges with different bodies addressing similar issues separately
Explanation
The moderator highlights that when discussing international law in digital spaces, there are multiple frameworks (international human rights law, international humanitarian law, international criminal law, corporate frameworks) that may address similar issues but lack harmonization. This creates confusion about which framework applies and how they interact with each other.
Evidence
Examples include incitement to hatred and violence or propaganda for war being addressed in ICL, IHL, and human rights frameworks but interpreted separately, such as the Rabat Plan of Action interpreting these provisions without considering IHL or ICL
Major discussion point
Application of International Law in Digital Spaces
Topics
Legal and regulatory | Human rights
Agreed with
– Chantal Joris
– Nieves Molina
Agreed on
Fragmentation of legal frameworks creates protection gaps and accountability challenges
Digital rights violations during conflicts require intersection of multiple legal frameworks
Explanation
In contexts like Gaza, three bodies of law (human rights law, international humanitarian law, and international criminal law) collide and interact, particularly around propaganda for war and incitement to hatred and violence. This demonstrates the practical need for coordinated legal approaches in conflict situations involving digital spaces.
Evidence
Reference to Hamle’s report on digital rights, genocide and big tech accountability in Gaza documenting rise in online hate speech and dehumanization targeting Palestinians
Major discussion point
Digital Rights Violations and Censorship
Topics
Human rights | Legal and regulatory | Cybersecurity
Accountability gaps exist between state and corporate responsibility in digital spaces
Explanation
The moderator identifies that current international law frameworks have not adequately addressed the challenge of determining accountability when there are blurred lines between state and corporate actions. This creates situations where it’s unclear who should be held responsible for certain digital violations or abuses.
Major discussion point
Corporate Accountability and Platform Responsibility
Topics
Legal and regulatory | Human rights
Children face disproportionate impact from online harm requiring specific legal protections
Explanation
The moderator emphasizes that children represent a specific group whose lives are increasingly shaped by the digital environment and who face disproportionate impacts from online harm. This requires examination of specific obligations imposed on states and corporations for protecting children in online spaces.
Evidence
Recognition that children’s rights online are clearly established under international law but implementation of specific obligations remains challenging
Major discussion point
Children’s Rights in Digital Environment
Topics
Human rights | Cybersecurity
Platform accountability challenges arise in balancing corporate responsibility with user rights protection
Explanation
The moderator highlights the complex challenge of holding platforms accountable for their role in facilitating harmful content while simultaneously avoiding disproportionate restrictions on human rights of online users. This requires striking a careful balance between accountability measures and rights protection.
Evidence
Reference to recent rulings in Brazil as examples of attempts to address platform accountability
Major discussion point
Corporate Accountability and Platform Responsibility
Topics
Human rights | Legal and regulatory
Agreements
Agreement points
International law applies to cyberspace but implementation remains challenging
Speakers
– Chantal Joris
– Nieves Molina
– Francisco Brito Cruz
Arguments
International law applies to cyberspace but practical implementation remains controversial
Crimes enabled by digital technology are treated with exceptionalism despite existing legal frameworks
Both inaction and action can produce violations, making navigation complex
Summary
All speakers agree that while international law clearly applies to digital spaces, translating this into practical implementation faces significant challenges due to the complexity of digital governance and the rapid pace of technological change.
Topics
Legal and regulatory | Human rights | Cybersecurity
Fragmentation of legal frameworks creates protection gaps and accountability challenges
Speakers
– Chantal Joris
– Nieves Molina
– Sanhawan Srisod
Arguments
Different legal frameworks (IHL, ICL, human rights law) address similar issues separately, creating fragmentation
Need for harmonization to avoid protection gaps and forum shopping
International law framework faces fragmentation challenges with different bodies addressing similar issues separately
Summary
There is strong consensus that the current fragmented approach to international law in digital spaces, where different legal specialties operate separately, creates significant gaps in protection and accountability mechanisms.
Topics
Legal and regulatory | Human rights
Corporate accountability requires moving beyond voluntary commitments to concrete methodologies
Speakers
– Tiesa McCruff
– Francisco Brito Cruz
– Audience
Arguments
Tech companies have obligations under UN Guiding Principles on Business and Human Rights to conduct due diligence
Need to translate corporate responsibility into platform accountability through specific methodologies
Need for binding frameworks like global treaty on tech accountability beyond voluntary commitments
Summary
Speakers agree that while corporate obligations exist under current frameworks like the UN Guiding Principles, there is a critical need to develop concrete methodologies and potentially binding frameworks to ensure effective platform accountability.
Topics
Human rights | Legal and regulatory
Technology advances faster than legal frameworks can adapt
Speakers
– Chantal Joris
– Nieves Molina
– Audience
Arguments
Technology advances much quicker than ability of international bodies and domestic parliaments to create good rules
Technology may advance to the point where freedom of thought takes different connotation as thoughts become reachable
Question whether separate legislation for digital selves is needed given exponential rate of new technology deployment
Summary
There is consensus that the rapid pace of technological advancement significantly outpaces the ability of legal and regulatory systems to develop appropriate responses, creating ongoing challenges for digital governance.
Topics
Legal and regulatory | Human rights | Cybersecurity
Similar viewpoints
Both speakers recognize that current political realities make new comprehensive international treaties unlikely, requiring focus on interpreting existing law while acknowledging the practical difficulties of implementation without adequate institutional capacity.
Speakers
– Chantal Joris
– Francisco Brito Cruz
Arguments
Not in the golden age of treaty making, so must rely on existing international law rules and interpretation
Building platform accountability is difficult without state capacities or proper regulation
Topics
Legal and regulatory | Human rights
Both highlight how content moderation is being weaponized to suppress marginalized voices and dissent while allowing harmful content to proliferate, demonstrating systematic bias in platform governance.
Speakers
– Tiesa McCruff
– Audience
Arguments
Systematic censorship and discriminatory content moderation policies suppress Palestinian voices and undermine digital rights
States use content moderation to silence dissent while failing to stop harmful content that fuels violence
Topics
Human rights | Cybersecurity
Both speakers emphasize the complex interconnection between state and corporate actors in digital spaces and the need for more transparent, participatory approaches to accountability that involve multiple stakeholders.
Speakers
– Nieves Molina
– Francisco Brito Cruz
Arguments
Close relationship between state and corporations creates blurred reality regarding levels of responsibility
Need for stakeholder engagement, participation, and transparency tools in human rights due diligence processes
Topics
Legal and regulatory | Human rights
Unexpected consensus
Children’s agency in digital spaces should be respected, not overridden by total protection
Speakers
– Mikiko Otani
– Audience
Arguments
Children want to use digital spaces safely rather than be totally protected or excluded
The systems digital users navigate daily involve complex domains such as political infrastructure, behavioral psychology, and economic power
Explanation
There is unexpected consensus that rather than completely protecting children from digital spaces, the focus should be on enabling safe participation. This challenges traditional protective approaches and recognizes children’s agency while acknowledging the complex manipulative systems they navigate.
Topics
Human rights | Sociocultural
International criminal law may need to expand to include corporate entities
Speakers
– Nieves Molina
– Sanhawan Srisod
Arguments
Some scholars are calling for international criminal responsibility of companies and for expanding the ICC’s Rome Statute to cover legal persons
Accountability gaps exist between state and corporate responsibility in digital spaces
Explanation
There is emerging consensus on the potentially radical idea of extending international criminal responsibility to corporations, particularly in cases involving digitally enabled atrocity crimes. This represents a significant departure from traditional individual-focused international criminal law.
Topics
Legal and regulatory | Cybersecurity
Overall assessment
Summary
The speakers demonstrate strong consensus on the fundamental challenges facing international law in digital spaces: fragmentation of legal frameworks, the gap between technological advancement and legal adaptation, the need for concrete corporate accountability mechanisms, and the complexity of balancing protection with rights preservation. There is also agreement on the inadequacy of current voluntary approaches and the need for more systematic, coordinated responses.
Consensus level
High level of consensus on problem identification and challenges, with emerging agreement on some innovative solutions like expanding international criminal law to corporations and respecting children’s agency in digital spaces. The consensus suggests a mature understanding of the issues but highlights the urgent need for coordinated international action to address the identified gaps and fragmentation in digital governance.
Differences
Different viewpoints
Approach to new legislation vs. existing law interpretation
Speakers
– Chantal Joris
– Nieves Molina
– Francisco Brito Cruz
– Audience
Arguments
Not in the golden age of treaty making, so must rely on existing international law rules and interpretation
Reactive legislation that is not grounded in human rights creates a disconnect between international obligations and domestic implementation
Crimes enabled by digital technology are treated with exceptionalism despite existing legal frameworks
Need for binding frameworks like global treaty on tech accountability beyond voluntary commitments
Summary
Speakers disagreed on whether to focus on interpreting existing international law or creating new binding frameworks. Chantal and Nieves emphasized working with existing laws due to challenges in creating new treaties, while audience members called for new binding frameworks like global treaties on tech accountability.
Topics
Legal and regulatory | Human rights
Unexpected differences
Role of state capacity in platform accountability
Speakers
– Francisco Brito Cruz
– Chantal Joris
Arguments
Building platform accountability is difficult without state capacities or proper regulation
Reactive legislation that is not grounded in human rights creates a disconnect between international obligations and domestic implementation
Explanation
While both speakers acknowledged challenges in platform accountability, Francisco emphasized the necessity of state capacity and regulation (citing Brazil’s experience), while Chantal warned against reactive state legislation that violates human rights. This created an unexpected tension between the need for state intervention and concerns about state overreach.
Topics
Legal and regulatory | Economic | Human rights
Overall assessment
Summary
The main areas of disagreement centered on approaches to legal frameworks (new vs. existing), the role of state intervention in platform accountability, and the balance between protection and access in digital rights. However, there was broad consensus on core problems: fragmentation of legal approaches, accountability gaps, and the complexity of digital governance.
Disagreement level
The level of disagreement was moderate and primarily methodological rather than fundamental. Speakers shared common goals of protecting human rights and ensuring accountability in digital spaces, but differed on implementation strategies. This suggests that while there are different approaches being pursued, there is potential for convergence around shared principles and coordinated efforts.
Takeaways
Key takeaways
International law applies to cyberspace but practical implementation remains fragmented across different legal frameworks (international human rights law, international humanitarian law, international criminal law)
There is broad consensus that existing international law provides a regulatory framework for digital spaces, but harmonization between different bodies of law is lacking, creating protection gaps
Corporate accountability under UN Guiding Principles on Business and Human Rights exists but translating this into effective platform accountability requires specific methodologies, transparency, and stronger enforcement mechanisms
Digital rights violations disproportionately affect marginalized communities, with systematic censorship and discriminatory content moderation suppressing voices during conflicts
Children’s rights in digital environments require comprehensive protection considering their evolving capacities, with emphasis on safe access rather than exclusion from digital spaces
Technology advances faster than legal frameworks can adapt, creating accountability gaps where both state and corporate actors can evade responsibility
The blurred relationship between states and corporations in digital governance complicates attribution of responsibility and enables forum shopping for favorable regulations
Resolutions and action items
Need for more coordinated efforts to avoid fragmentation of international law responses in cyberspace
States and businesses should integrate child rights impact assessment in design, development, and operation of digital products and services
Requirement for human rights due diligence processes that include stakeholder engagement, participation, and transparency tools for civil society
Development of specific methodologies to translate corporate responsibility principles into operational platform accountability measures
Need for monitoring and evaluation mechanisms to assess how digital legislation and regulations are performing in practice
Unresolved issues
How to effectively harmonize different bodies of international law (human rights, humanitarian, criminal) when they address similar digital issues separately
Who is responsible for protecting civilians in digital spaces when states themselves contribute to erosion of online civic space
How to balance platform accountability with protection of user rights without creating disproportionate restrictions
Whether separate comprehensive legislation for digital rights is needed or if existing international law frameworks are sufficient
How to enforce state obligations to prevent malicious cyber activity from their territory without infringing sovereignty
What binding frameworks beyond voluntary commitments could ensure equitable digital governance
How to address the exponential rate of new technology deployment that attempts to access and profile users’ thoughts and cognition
How to ensure remedies and access to justice for victims of digital rights violations across fragmented legal systems
Suggested compromises
Focus on operationalizing and clarifying existing international law rather than creating entirely new treaty frameworks, given the current challenges in international treaty-making
Adopt iterative approach of ‘intervention and testing’ for digital legislation, acknowledging that some regulatory innovations may fail and require adjustment
Use UN Guiding Principles on Business and Human Rights as an operative layer to navigate tensions between different international law frameworks
Emphasize consultation processes and hearing diverse voices (including children, marginalized communities, regional perspectives) when developing digital governance approaches
Balance children’s desire for safe digital access rather than complete protection or exclusion from digital spaces
Develop transparency requirements for both state and corporate actions in digital spaces to enable proper scrutiny of measures taken
Thought provoking comments
Those initiatives are very relevant, but I think there could be more effort probably to avoid the fragmentation of the responses to how international law applies to cyberspace and to make it more, to harmonize it more, to avoid a protection gap… from the perspective of the impacted communities and the rights holders who might be harmed by those types of information operations, it is not that relevant whether it’s humanitarian law or human rights law and which one is lex specialis.
Speaker
Chantal Joris
Reason
This comment is deeply insightful because it shifts the focus from academic legal distinctions to the practical reality faced by victims. It highlights a fundamental problem in international law – that legal fragmentation creates protection gaps for those who need help most. The observation that victims don’t care about which legal framework applies, they just need protection, is both profound and practical.
Impact
This comment established the central theme of the entire discussion – fragmentation as a barrier to justice. It set up the framework that subsequent speakers built upon, with each panelist addressing different aspects of this fragmentation problem. It moved the conversation from theoretical legal analysis to victim-centered practical concerns.
There is this paradox: while most legal scholars think that international law has most of the principles that would help us to provide remedies or to provide a regulation, there is a certain exceptionalism in producing new legislation and attempting to create new laws, which take time to create but at the same time carry a danger of fragmentation of the law.
Speaker
Nieves Molina
Reason
This observation reveals a critical paradox in how the international community responds to digital challenges. It’s thought-provoking because it suggests that the very attempt to solve problems through new legislation may be creating bigger problems through fragmentation. The concept of ‘exceptionalism’ in treating digital crimes differently is particularly insightful.
Impact
This comment deepened the discussion by introducing the paradox that efforts to create solutions might be creating new problems. It influenced the conversation by making participants question whether new legislation is always the answer, and it connected to later discussions about the challenges of treaty-making in the current global context.
Children are living in a digital world where online and offline are seamless. It’s not so easy to differentiate what is online and offline for the children… children want to use the digital. So they claim to us, the committee, that they don’t want to be totally protected or excluded from the digital space, but they want to use the online space safely.
Speaker
Mikiko Otani
Reason
This comment is profoundly insightful because it challenges the traditional binary thinking about online/offline spaces and protection/access. The revelation that children themselves reject overprotection in favor of safe access represents a sophisticated understanding of digital rights that many adults lack. It reframes the entire approach to digital protection.
Impact
This comment shifted the discussion from a paternalistic approach to digital protection to one that recognizes agency and voice of affected communities. It influenced how other participants thought about balancing protection with access, and it reinforced the theme that those affected by digital harms should be centered in policy discussions.
Inaction can produce violations, but action can produce violations as well.
Speaker
Francisco Brito Cruz
Reason
This seemingly simple observation captures one of the most complex challenges in digital governance. It’s thought-provoking because it acknowledges that there are no easy solutions – both regulating and not regulating can cause harm. This paradox is at the heart of many digital policy dilemmas.
Impact
This comment introduced a crucial nuance to the discussion about accountability and regulation. It prevented the conversation from becoming overly simplistic about solutions and forced participants to grapple with the complexity of digital governance. It influenced later discussions about the need for careful, monitored approaches to regulation.
In the middle of all this information overload, whether or not international law and the content of human rights have also been a victim of, or been targeted by, misinformation campaigns relativizing international law and questioning its usefulness.
Speaker
Nieves Molina
Reason
This is a meta-level insight that’s particularly thought-provoking because it suggests that the very framework being discussed (international law) may itself be under attack through digital means. It raises the possibility that misinformation campaigns are deliberately undermining faith in international legal frameworks, creating a recursive problem.
Impact
This comment added a new dimension to the discussion by suggesting that the challenges aren’t just about applying international law to digital spaces, but about protecting international law itself from digital attacks. It introduced the concept that the erosion of trust in international law might be a deliberate strategy, adding urgency to the discussion.
Overall assessment
These key comments fundamentally shaped the discussion by establishing fragmentation as the central challenge, introducing crucial paradoxes about regulation and protection, and centering the voices of affected communities. The conversation evolved from a technical legal discussion to a more nuanced exploration of the tensions between protection and access, action and inaction, and the need for victim-centered approaches. The comments collectively moved the discussion away from seeking simple solutions toward acknowledging complexity and the need for holistic, coordinated responses. Most importantly, they established that the current fragmented approach to international law in digital spaces is failing those it’s meant to protect, creating a compelling case for more integrated, community-centered approaches to digital governance.
Follow-up questions
How can different bodies of international law (human rights law, humanitarian law, criminal law) be better harmonized to avoid fragmentation and protection gaps in digital spaces?
Speaker
Chantal Joris
Explanation
This addresses the core challenge of legal fragmentation where the same conduct (like incitement to violence) is covered by multiple legal frameworks but interpreted separately, creating confusion and potential gaps in protection
How do we operationalize and clarify what each actor’s obligations are across different legal frameworks when dealing with digital harms?
Speaker
Chantal Joris
Explanation
There’s a need for practical guidance on how to apply overlapping legal obligations from different international law frameworks in real-world digital scenarios
How can we build effective methods for human rights due diligence that go beyond general principles to create specific, measurable corporate accountability?
Speaker
Francisco Brito Cruz
Explanation
While principles exist, there’s a gap in translating them into concrete methods that can be monitored and enforced against tech companies
Should international criminal responsibility be expanded to include legal persons (corporations) under frameworks like the ICC’s Rome Statute?
Speaker
Nieves Molina
Explanation
Given the blurred lines between state and corporate responsibility in digital harms, there’s growing discussion about whether companies should face international criminal liability
Has international law itself become a victim of misinformation campaigns that relativize its importance and create accountability gaps?
Speaker
Nieves Molina
Explanation
This explores whether disinformation efforts are deliberately undermining trust in international legal frameworks to create spaces for impunity
When states themselves contribute to online civic space erosion, who is responsible for protecting civilians in digital spaces?
Speaker
Audience member (Ulvia)
Explanation
This addresses the accountability vacuum when the primary duty-bearer (the state) is itself the violator of digital rights
How can international law and domestic platform regulation work together rather than conflict, especially when states can also violate human rights?
Speaker
Audience member
Explanation
There’s tension between international obligations and domestic regulatory approaches that need to be resolved for effective platform accountability
Are separate legislative frameworks needed for digital rights that holistically address the interdisciplinary nature of digital harms, including protection of cognition and thought?
Speaker
Audience member (Ana Galate)
Explanation
Current frameworks may be inadequate for emerging technologies that can access and manipulate human thoughts and cognitive processes
What binding frameworks beyond voluntary commitments could ensure equitable digital governance, such as a global treaty on tech accountability?
Speaker
Audience member (Christian Fazili)
Explanation
There’s a question about whether voluntary corporate responsibility frameworks are sufficient or if binding international treaties are needed
To what extent are states obligated to prevent malicious cyber activity originating from their territory, and how can this be enforced without infringing sovereignty?
Speaker
Audience member (Christian Fazili)
Explanation
This addresses the balance between state sovereignty and international obligations to prevent cross-border digital harms
How can sovereignty and rule of law serve as paths for human rights protection rather than barriers in the digital context?
Speaker
Francisco Brito Cruz
Explanation
This explores how traditional concepts of sovereignty can be reframed to support rather than hinder digital rights protection
How can diverse voices and regional perspectives be systematically integrated into international law development for digital spaces to avoid fragmented approaches?
Speaker
Mikiko Otani
Explanation
There’s a need for inclusive processes that bring together different stakeholder groups and regional experiences in developing digital governance frameworks
Disclaimer: This is not an official session record. DiploAI generates these resources from audiovisual recordings, and they are presented as-is, including potential errors. Due to logistical challenges, such as discrepancies in audio/video or transcripts, names may be misspelled. We strive for accuracy to the best of our ability.