Day 0 Event #255 Update Required: Fixing the Tech Sector's Role in Conflict
23 Jun 2025 14:45h - 15:45h
Session at a glance
Summary
This discussion, titled “Update Required,” focused on ensuring tech companies respect international humanitarian law and evolving standards regarding private sector roles in armed conflicts. The panel featured experts Marwa Fatafta from Access Now, Chantal Joris from Article 19, and Kiran Aziz from KLP, a Norwegian pension fund, discussing corporate accountability in conflict zones.
Fatafta emphasized that tech companies are never neutral actors in armed conflicts, outlining three ways they contribute to harm: directly causing human rights violations through censorship, providing technological assistance to military forces, and mirroring state policies of discrimination. She cited examples from Gaza, including Google and Amazon’s Project Nimbus providing cloud services to the Israeli military, and Microsoft supplying engineering services to Israeli defense units. Despite civil society pressure, she noted no meaningful positive changes in corporate behavior, with companies increasingly dropping voluntary commitments against military AI development.
Joris explained the legal framework, noting that both international humanitarian law and human rights law apply during conflicts, with enforcement primarily through international criminal law and domestic courts. She highlighted challenges in attribution and evidence gathering, particularly as tech companies become more integrated with military operations. The discussion revealed that corporate executives could theoretically face liability under international criminal law, though few precedents exist.
Aziz described investor perspectives, explaining how institutional investors rely on public information and civil society reports to assess risks. She noted the extreme difficulty in engaging tech companies compared to traditional sectors, leading to exclusions from investment portfolios when companies fail to respond to human rights concerns. The panel concluded that stronger government regulation, transparency requirements, and strategic litigation are essential for meaningful corporate accountability in the tech sector.
Key points
## Major Discussion Points:
– **Tech companies’ direct involvement in armed conflicts**: Discussion of how technology companies are not neutral actors but actively contribute to conflicts through providing cloud computing services, AI tools, facial recognition technologies, and other services to military forces, with specific examples from the Gaza conflict including Google’s Project Nimbus and Microsoft’s engineering services to Israeli military units.
– **Legal frameworks and enforcement challenges**: Examination of how international humanitarian law (IHL) and human rights law apply to tech companies, the difficulties in establishing corporate accountability under current legal systems, and the potential for strategic litigation through domestic courts, international criminal law, and investor pressure.
– **Corporate transparency and due diligence failures**: Analysis of tech companies’ extremely low response rates to civil society inquiries (4% compared to 26% for companies in Russia/Ukraine conflicts), their refusal to conduct meaningful human rights due diligence, and their lack of transparency about operations in conflict zones.
– **Evidence requirements for accountability**: Discussion of what types of evidence are needed to hold tech companies accountable, including impact stories, corporate relationship mapping, government contract transparency, and the burden of proof challenges in different legal contexts.
– **Increasing militarization of civilian tech**: Concern about the trend of tech companies dropping voluntary commitments against military applications, forming partnerships with defense contractors, and executives joining military units, blurring the lines between civilian technology and military operations.
## Overall Purpose:
The discussion aimed to explore avenues for ensuring tech companies respect international humanitarian law and to develop strategies for corporate accountability in the technology sector’s role in armed conflicts. The session sought to identify enforcement mechanisms, evidence requirements, and collaborative approaches between civil society, investors, and legal systems to address the largely unchecked influence of tech companies in conflict situations.
## Overall Tone:
The discussion maintained a serious, urgent, and somewhat frustrated tone throughout. Speakers expressed deep concern about the lack of corporate accountability and transparency, with particular frustration about companies’ unwillingness to engage meaningfully with civil society. The tone was analytical and solution-oriented, with participants sharing expertise and brainstorming practical approaches, but there was an underlying sense of urgency given the ongoing conflicts and the increasing integration of technology into warfare. The atmosphere was collaborative among panelists and audience members, united in their concern about the current state of corporate responsibility in the tech sector.
Speakers
– **Meredith Veit**: Session moderator, leads the discussion on tech companies and international humanitarian law
– **Marwa Fatafta**: From Access Now, leads policy and advocacy work on digital rights in the Middle East and North Africa, has written extensively on digital occupation in Palestine and focuses on the role of new technologies in armed conflicts
– **Kiran Aziz**: Representative from KLP (Norwegian pension fund), works on investor engagement and exclusion policies related to human rights violations
– **Chantal Joris**: From Article 19, senior legal officer focusing on platform regulation and freedom of expression
– **Audience**: Multiple audience members who asked questions during the Q&A session
**Additional speakers:**
– **Dr. Pichamon Yeophantong**: Mentioned multiple times in transcript but role/expertise not clearly defined
– **Phillipe Stoll**: Mentioned multiple times in transcript but role/expertise not clearly defined
– **Jalal Abukhater**: Mentioned in transcript but role/expertise not clearly defined
– **Anriette Esterhuysen**: Audience member, works with the Association for Progressive Communications
– **Monika Ermert**: Audience member, reporter
– **Audrey Moklay**: Audience member, from Open Mic
– **Sadhana**: Audience member who asked a question about the Genocide Convention
Full session report
# Update Required: Ensuring Tech Companies Respect International Humanitarian Law
## Executive Summary
The panel discussion “Update Required” examined the critical issue of ensuring technology companies comply with international humanitarian law in armed conflicts. Moderated by Meredith Veit, the session brought together experts including Marwa Fatafta from Access Now, who leads policy and advocacy work on digital rights in the Middle East and North Africa; Chantal Joris from Article 19, focusing on platform regulation and freedom of expression; and Kiran Aziz from KLP, a Norwegian pension fund, who works on investor engagement regarding human rights violations.
The discussion revealed that technology companies are increasingly active participants in armed conflicts rather than neutral service providers. A particularly striking finding was the dramatically low response rate from tech companies to accountability inquiries—only 4% compared to 26% for similar outreach regarding the Russia-Ukraine conflict. The panel explored multiple avenues for accountability, from legal frameworks to investor pressure, while acknowledging significant barriers including corporate opacity and government protection of domestic tech companies.
*Note: This summary is based on a transcript with significant technical issues and garbled sections, particularly affecting the complete capture of some speakers’ contributions.*
## The Challenge of Tech Company Neutrality
Marwa Fatafta fundamentally challenged the notion of tech company neutrality, stating: “Tech companies are never neutral actors in situations of armed conflict. They exacerbate the dynamics of the conflict and sometimes even drive them or fuel them, particularly in contexts where there are asymmetries of power between warring parties.”
Fatafta outlined three primary ways tech companies contribute to harm in conflict zones:
**Direct Human Rights Violations**: Companies engage in systematic censorship and content removal that mirrors state policies of discrimination. In the Palestine context, this includes widespread removal of Palestine-related content and suppression of documentation of human rights violations.
**Technological Assistance to Military Forces**: Fatafta provided specific examples from the Gaza conflict, including Google and Amazon’s Project Nimbus, described as a $1.2 billion contract providing cloud computing services to the Israeli military, and Microsoft’s provision of 19,000 hours of engineering and consultancy services to Israeli defense units, including Unit 8200 and Unit 9900.
**Mirroring State Policies**: Technology companies replicate discriminatory state policies through their service provision, including differential geographic service availability and varying representations of occupied territories.
## The Militarization Trend
A concerning development highlighted by Fatafta was the increasing militarization of civilian technology companies: “There’s a surge in increasing militarization of civilian tech… both Google and OpenAI have both quietly dropped their voluntary commitments earlier this year not to build AI for military use or surveillance purposes.”
She also noted the direct integration of tech executives into military structures: “Senior executives from high-tech firms, specifically Meta, OpenAI and Palantir, are joining the US Army Reserve at a new unit called Executive Innovation Corp.”
This trend represents a fundamental shift in the technology sector’s relationship with military operations, moving from maintaining ethical boundaries to actively pursuing defense partnerships.
## Legal Framework Perspectives
Chantal Joris provided legal context on how international humanitarian law applies to technology companies, though the transcript quality limits the complete capture of her contributions. She discussed the potential for corporate executives to face liability under international criminal law, noting that “corporate executives, in theory, under the very, very high thresholds that are under the Rome Statute could be liable under international criminal law.”
Joris emphasized the importance of government transparency, noting that many service contracts fall under national security exemptions, limiting access to crucial evidence needed for accountability efforts.
## The Accountability Gap
Perhaps the most striking revelation was the extent of corporate resistance to accountability measures. As noted in the opening remarks, there was “an astonishingly low 4% response rate from companies,” which was described as “unprecedented,” particularly when compared to the 26% response rate for similar outreach regarding tech companies operating in Russia and Ukraine.
Fatafta noted that even when companies claim to conduct human rights due diligence, these processes are fundamentally flawed: “Even when companies claim to conduct audits, they lack insight into how their technologies are used, making due diligence ineffective.”
## Investor Perspectives
Kiran Aziz provided insights into how institutional investors approach tech company accountability. She explained that “institutional investors rely on long-term perspectives that incorporate material risks including human rights violations as financial risks.”
However, investors face significant challenges due to corporate opacity: “Investors depend heavily on civil society reports and public domain information since companies provide inadequate reporting on human rights impacts.”
Aziz noted that “exclusion of companies from investment portfolios can be effective when done transparently with public documentation of reasons,” but emphasized that “tech companies are increasingly difficult to engage with compared to traditional sectors, often only referencing policies without discussing concrete matters.”
## Government Protection and Political Barriers
The discussion revealed how government policies actively shield tech companies from accountability measures. Fatafta highlighted the political dimensions: “The Trump administration is taking an extremely protectionist approach to their tech sector… they will not grant visas to foreign officials who have mandated quote-unquote censorship by these companies.”
This government protection creates significant barriers to international accountability efforts and reflects the strategic importance of tech companies to national competitiveness.
## Evidence and Documentation Challenges
The discussion emphasized the critical importance of evidence gathering for accountability efforts. Audience members stressed the need for:
– **Impact stories** showing how specific corporate actions led to concrete human rights violations
– **Corporate relationship mapping** to understand broader patterns of partnerships
– **Hard evidence** including contracts and internal communications for litigation
– **Risk assessment documentation** for investor engagement
The challenge is that different accountability mechanisms require different types of evidence, but corporate opacity makes gathering any form of evidence extremely difficult.
## Audience Engagement and Practical Concerns
Significant portions of the discussion involved audience questions and responses, reflecting concerns about:
– The effectiveness of different accountability mechanisms
– The role of documentation in building cases against tech companies
– Strategies for overcoming corporate resistance to engagement
– The adequacy of current legal frameworks for addressing tech sector challenges
## Areas of Consensus
Despite different professional backgrounds, the panelists demonstrated consensus on several critical issues:
– Tech companies are not neutral actors in conflicts
– Voluntary corporate responsibility frameworks have failed
– Corporate transparency is inadequate, with unprecedented resistance to accountability
– Current due diligence frameworks are insufficient for the tech sector
– The militarization trend is deeply concerning
## Unresolved Challenges
The discussion highlighted several unresolved questions:
– How to effectively regulate US-based tech companies given government protectionism
– What burden of proof standards should apply to corporate due diligence
– How to address attribution challenges when tech executives integrate into military structures
– How to access information protected under national security classifications
## Conclusion
The “Update Required” discussion revealed the significant challenges in holding technology companies accountable for their roles in armed conflicts. The combination of corporate resistance, government protection, and inadequate legal frameworks creates substantial barriers to accountability.
The speakers’ consensus suggests that incremental reforms are insufficient and that systemic change is required, including new legal frameworks specifically designed for the tech sector and coordinated international action. The path forward requires acknowledging that tech companies are active conflict participants and developing appropriate accountability mechanisms.
The discussion’s title proves apt—an update is indeed required not just for tech companies’ practices, but for the entire ecosystem of accountability mechanisms needed to address the unprecedented challenges posed by the technology sector’s role in armed conflicts.
*This summary reflects the content available in the provided transcript, which contained significant technical issues and incomplete sections that may have affected the complete capture of all speakers’ contributions.*
Session transcript
Meredith Veit: Welcome to the session, Update Required. We’re going to discuss avenues for ensuring that tech companies respect international humanitarian law, as well as evolving international norms and standards regarding the role of private sector actors in conflict. I have three fantastic experts here that are going to help guide us through the discussion today. Get prepared; this is going to be quite active. We have mics here on either side of the room, so we expect a lot of audience participation, given all the expertise out there and online as well. First we have Marwa Fatafta from Access Now. She leads their policy and advocacy work on digital rights in the Middle East and North Africa, and she’s written extensively on the digital occupation in Palestine and focuses on the role of new technologies in armed conflicts. We also have Chantal Joris from Article 19. She is a senior legal officer focusing on platform regulation and freedom of expression. [transcript garbled] … the U.N. Guiding Principles on Business and Human Rights. In 2014, we conducted a survey on heightened human rights due diligence with over 104 technology companies operating in or providing services to the occupied Palestinian territories and/or Israel, and only three companies got back to us in detail, actually responding to the questions of the survey, making it nearly impossible to determine if and how heightened human rights due diligence is occurring at all in a context that has long been exposed to conflict-related risk. We have also reached out to a number of tech companies in the Middle East.
An astonishingly low 4% response rate from companies is unprecedented in the resource centre’s history. Previously, we had sent a similar survey to tech companies that were operating in Russia and Ukraine, and 26% had responded by comparison. Of course, both of these numbers are abysmally low, which means we need more transparency about what is happening around the world where conflict is taking place and what the impact looks like. [transcript garbled] One other thing I will note is that companies are not only profiting from conflicts but are exacerbating them, furthering harms. We do have a handful of instances where we have seen corporate accountability for aiding war crimes and crimes against humanity playing out in courts and boardrooms, whether it’s convictions or sanctions, even as companies are collectively rolling back their commitments to uphold their principles. We’re going to talk about a few of these. We have heard a lot of things today, such as governments placing export controls on companies that are selling dual-use tech to malign actors. There’s currently a shareholder resolution before Alphabet requesting that the company carry out heightened human rights due diligence regarding its operations in conflict zones. And there are a number of Norwegian investors that have taken action against companies over their misconduct in relation to conflict and international law. And while these examples are incredibly important and noteworthy, and hopefully we surface even more examples during this discussion today, these should not be the exception.
Guaranteeing that tech companies are not involved in breaching international humanitarian law should be the minimum requirement. And for the tech sector, we have yet to set a strong enough precedent for accountability: to date, no major tech companies or executives have been criminally convicted for violating international humanitarian law, although there is mounting evidence, and, as we know, tech companies are not neutral actors in many conflicts. So we’re going to spend the rest of our time today, about 50 minutes, discussing this topic together and diving into what’s needed for greater corporate accountability for the tech sector’s largely unchecked and increasingly powerful and pervasive role in conflict. We can start off with our expert interventions, and then we’ll open up to the floor to talk a bit further together. First I’ll start off all the way to my left with Marwa, asking her to kick us off. Can you reflect a little bit more on the different ways in which tech companies have been involved in conflict? And have we seen any meaningful positive change in regard to corporate behavior in response to civil society or regulatory pressure?
Marwa Fatafta: Thank you very much, Meredith, and thanks, everyone, for attending this session. I will start with the point you ended with, to emphasise on the fact that tech companies are never neutral actors in situations of armed conflict. They exacerbate the dynamics of the conflict and sometimes even drive them or fuel them, particularly in contexts where there are asymmetries of power between warring parties. They can facilitate human rights abuses or even in some cases contribute to atrocity crimes. I have been primarily focused on the unfolding genocide in Gaza over the past year and a half. Most of my examples will derive from this particular context, which is important because in some ways it might be a foreshadow to the future of cyber warfare and the involvement of tech companies. I will expand on that in due course. I can summarise the ways in which tech companies have been involved in conflict in three notable patterns. Firstly, tech companies can be responsible for directly causing adverse human rights impacts that undermine or violate people’s rights, including the right to freedom of expression, the right to a peaceful association and assembly, the right to bodily security, non-discrimination, among other rights encoded and enshrined in the International Covenant on Civil and Political Rights, and also economic, social and cultural rights. An example of that is what you mentioned with regards to censorship by social media companies and the systematic removal of Palestine-related content online. The second trend is that some companies indeed can contribute to adverse impacts via third parties, such as, for example, the Israeli government or another military. Companies provide direct technological assistance, products, services in the context of Gaza. 
This includes cloud computing services, AI tools such as LLMs, facial recognition technologies, among others, which have been linked to egregious violations of international law, gross human rights abuses, crimes against humanity, war crimes, and possibly the crime of genocide, which is pending before the International Court of Justice. And here I want to mention maybe just a few examples. We know, for instance, that Google and Amazon have this $1.2 billion project providing a national cloud service or infrastructure to the Israeli government, called Project Nimbus. Since the start of the war, those services have surged in demand. So we know that Google has deepened its business relationship with Israel, particularly with the Ministry of Defense in March 2024, and provided them with a landing zone. They even created, according to media reports, a classified team composed of Israeli nationals with security clearances, specifically tasked with receiving sensitive information from the Israeli government, and also with providing specialized training with government security agencies and participating in joint drills and scenarios tailored to specific threats. Amazon Web Services also, according to media reports, provided Israel’s military intelligence with server farms that allow for endless storage of the mass surveillance data that Israel has gathered on almost everyone in the Gaza Strip. And beyond supplying cloud infrastructure, according to media investigations, they have, on occasion, [transcript garbled]. This is just to demonstrate that the services they provide are substantial in nature, indicated by the surge and high demand that we’ve seen across these different companies.
Another point that I would like to mention is that this is not only about providing technological support, but actually providing human resources, trainings, joint exercises. For example, some leaked documents from Microsoft had shown that the Israeli Ministry of Defense had purchased approximately 19,000 hours of engineering and consultancy services from Microsoft. And Microsoft teams had provided assistance on-site, so on military bases, as well as remotely, to the IDF, including to units called Unit 8200 and Unit 9900, which are notorious for surveillance, and military surveillance in particular. A third way in which tech companies can be or are involved in armed conflict is that sometimes companies can contribute to adverse impacts in parallel with a third party, in this case with the Israeli government or the military, leading to cumulative impacts. For instance, a number of tech companies have mirrored Israel’s state policy of apartheid and segregation in the way they provide, prohibit or withdraw their services to the Palestinians. One clear example of that is Google Maps, which, if you use it in the West Bank or in the occupied Palestinian territories, only treats you as if you are an Israeli settler. So you’re only given roads and maps that connect Israeli settlements, but you’re not given any roads between, for example, Palestinian towns or villages, putting people at direct safety risk. PayPal is another interesting example, where if you are an Israeli settler living in an illegal settlement, you can access, for example, PayPal financial services.
But if you’re a Palestinian, you’re deprived of that service, which contributes to the development of the Palestinian communities and economy, something that has been written about by the World Bank and other UN agencies, showing again the degree in which these tech companies, by depriving or refusing to provide their services for whatever reason, to communities that are going through a situation of military occupation, for instance, can contribute to the cumulative impact of the occupying power or the policies that they have enshrined in that area. To your question, have we seen any positive shifts? Quick answer, no. Unfortunately, and particularly in the Palestine context, the survey that you have shared in the beginning is really an experience that we share ourselves. We have written letters and have tried to meaningfully engage with tech companies to point out, first to point and show evidence, a body of evidence of the harms that they are directly or indirectly contributing to. But most of those attempts have gone unproductively. We haven’t gotten any productive answers from the companies with regards to their conduct, or, for instance, they were unable to even answer very simple questions such as have you conducted a heightened human rights due diligence in order to mitigate and identify and mitigate such harms? There’s zero transparency with regards to that conduct. But also, even when they do succumb to pressure, for example, Microsoft had recently issued a statement after a year and a half of public mobilization, not only from civil society, but particularly from their tech workers, in which they said, well, we conducted an audit to see whether our technologies have contributed to harm or targeting of civilians in the Gaza Strip. And while we don’t have an insight into how our technologies are used by Israel, especially in air-gapped military bases, we concluded that we have not contributed to any harm. 
And that contradiction in itself shows you how even UNGPs have, you know, when companies say we’re going to do a heightened human rights due diligence, that results in a box-sticking exercise where they really don’t have any insight or ability to control how their technologies are being used. Finally, I do want to end on a note that I think quite the opposite of what, where or where we want to see companies going. There’s a surge in increasing militarization of civilian tech provided by those companies. For example, both Google and OpenAI have both quietly dropped their voluntary commitments earlier this year not to build AI for military use or surveillance purposes, signaling their readiness to deepen their ties with the arms industry. Within a month of amending its AI principles, Google signed a formal partnership with Lockheed Martin, one of the biggest defence contractors in the world. Open AI, which maintains a non-profit status, also announced a partnership with a defence tech company called Anduril to deploy its technologies in the battlefield. Anduril and Meta are also partnering to design, build and field a range of integrated virtual reality products for the US military. And last week, there was a very disturbing announcement that senior executives from high-tech firms, specifically Meta, Open AI and Palantir, are joining the US Army Reserve at a new unit called Executive Innovation Corp. In the rank of lieutenant colonels to provide tech advice for the US military. So there we see a trend where not only tech companies are providing militaries with their services, but are actually possibly even combatants or taking an active role in the military, which has implications that Chantal maybe can…
Meredith Veit: This is a perfect segue. Thanks, Marwa. Chantal, can you tell us a little bit more: amongst these blurred lines, confusing commitments, and the changing definitions of what actually is a tech company and what is a defence company, and whether they are one and the same now, what does the hard law actually tell us when we’re talking about international humanitarian law? What friction exists in applying IHL to companies, and how does this relate to other existing frameworks that should be helping to guide states and companies, like international human rights law?
Chantal Joris: Thank you, Meredith, and thanks everyone for joining. I will try to unpack some of these questions a little bit. I think there are a few interesting developments, also ones that Marwa mentioned. One thing I remember, I believe it was in the Washington Post, was the headline that the social media company Meta is providing these virtual reality products for the US military, and I thought, well, is it then still really a social network? These companies are clearly merging into something so much bigger, and that has obviously been going on for a long time. And when we look at this increased integration into the military, legally speaking, I think it also raises some quite significant questions around attribution. So we think about where responsibility sits, and whether it becomes a state obligation if a Meta executive operates within the military. Then again, he might be a combatant, but it also gives rise to state obligations directly. So I think there are a lot of questions based on facts that are becoming more merged and more complicated to answer. Maybe I will first talk a little bit about the legal obligations and then share some thoughts when it comes to enforcement. I think we need to look at both international humanitarian law and human rights law, because in a sense they are different in their application, although they both apply during armed conflicts, and they offer different opportunities for enforcement and accountability, I would say. So, for example, when it comes to humanitarian law, of course, it starts applying in times of armed conflict, whether non-international or international armed conflict, and it is actually more used to applying to actors that are not state actors, like non-state armed groups, but also to individuals. As for applying it to companies, it wouldn’t necessarily directly apply to a company, but it does apply to individuals that
operate within a company, when those business activities have a nexus to an armed conflict. As we've heard, traditionally you might think more of a mining company that is on the ground, or a private military security company on the battlefield. These technology companies might have seemed a bit further removed, but they are increasingly so closely intertwined with how these battles are fought that in many cases, although it is always context dependent, it is relatively easy to establish this nexus to an armed conflict. And that means their staff would also be bound by IHL. On the human rights law side, we have the very famous soft law instrument, the UN Guiding Principles on Business and Human Rights, which speaks prominently about both human rights and humanitarian law. But in terms of hard law, as long as we don't have an international business and human rights treaty, obligations are of course more established for state actors. That also translates into a difference when it comes to enforcement. When you look at enforcement and accountability in terms of humanitarian law, you will primarily think about international criminal law. Some people in the humanitarian sector would disagree, because they say enforcement is not only litigation, but I speak specifically about legal obligations and litigation in that sense. At the international level, you might under some circumstances be able to go to the International Criminal Court. We've seen recently that the Office of the Prosecutor of the ICC is at the moment drafting a policy on how cyber conduct might fall under the Rome Statute.
At the ICC, legal persons as such cannot be held criminally liable. But corporate executives, in theory, under the very, very high thresholds of the Rome Statute, could be liable under international criminal law. So that's the international level, although we probably still have a bit of a way to go until it is a realistic prospect that such cases are actually brought by the Prosecutor. Let's see, let's hope. On the domestic level, and not only with respect to tech companies, liability of corporate executives for war crimes is not something where we have a huge amount of case law. There is the Lundin Oil case, where corporate executives were potentially liable for aiding and abetting war crimes. There is also the Lafarge case, where it was really the company's own liability that was at stake. These cases are becoming more prominent, but often it really depends on the domestic framework. Is something like universal jurisdiction enshrined in the domestic criminal code? Is there potential corporate criminal responsibility, or does it have to go through individuals? Depending on that, you know what evidence you might need to provide and what legal grounds you need to prove, and you can start building a case. Just to finish up: in some domestic jurisdictions there can also be a domestic legal obligation to conduct human rights due diligence, which might include heightened human rights due diligence as well. We have also seen cases brought in the UK against parent companies over some of their operations in Zambia. I think there is something we can learn there: how did they bring these cases, and what evidence did they bring?
How can we translate this into the tech sector, which is probably more opaque than other sectors we might be used to?
Meredith Veit: And this is a great setup for how we’re going to open up the floor to questions soon. We’re going to discuss what kind of evidence is needed and how can we work more collaboratively to get there. When it comes to investors, there may be different criteria about what constitutes evidence for investor action. What kind of information do investors typically need? Or at what point can investors actually act when it comes to portfolio companies being implicated with regards to allegations of potential violations of international law? Are there any examples that you can talk about from your experience with KLP?
Kiran Aziz: Thank you very much, and not least for having this conversation, which is much needed. I thought I'd start by very briefly giving an introduction to how we work as investors, what kind of opportunities we have, and not least our limitations. What I will say is mostly the case for a lot of the large institutional investors. Most international investors have a long-term perspective, because they are investing people's pension and savings money. And when you take a long-term perspective into account, it's not just about the financial returns; there is also material risk which lies in a company, and this is where you try to embed respect for human rights. These practices are becoming a legal obligation: in Norway, for example, we have a Transparency Act which requires investors to perform due diligence on their investments. And how do we know that there is a risk within a company? We as investors rely on information which is in the public domain. This could of course include companies' own reporting, but when it comes to human rights, especially in conflict areas, I wouldn't say the companies' own resources or reporting is a really helpful tool. It mostly comes from civil society.
And this is where, when we engage on this topic, it's not just about engaging with companies; it's also really vital to engage with other stakeholders. The UN High Commissioner for Human Rights and organizations such as yours have done a really great job in conducting reports showing that companies are present in certain conflict areas where they have direct involvement. And this is where it is challenging because, as you said, there are about 110 armed conflicts, and we won't necessarily have exposure to all of them. There will be a few, such as the war in Gaza, the West Bank, Myanmar, Yemen, and Sudan, to mention some, where we see companies play a vital role compared to a lot of other conflicts in the past, such as in Afghanistan. But again, our challenge as investors is very much about getting information that can link human rights violations to a company's contribution. This is where we struggle, and we really need help from other stakeholders, not just the companies. That's one thing. The other thing is that if you look at the UNGPs, performing heightened due diligence is the core tool, which we are also trying to implement in our investments. When you are a passive investor, say in index funds, companies normally become part of your portfolio and then you perform a kind of due diligence later on. But what we have done now, given how much focus there has been on the UNGPs, is to start screening companies up front. We did that when Saudi Arabia came into our portfolio as a new market: we screened companies up front before we decided to invest, partly due to Saudi Arabia's involvement in the war in Yemen. And we have traditionally been used to working with quite traditional business models.
The expectations and standards which are out there are very much tailored for traditional businesses, arms companies and so on. But tech companies are playing a much more important role, and we as investors are really struggling to get any engagement with these companies at all. If there is an engagement, very often they would just refer to their policies; they are not really interested in discussing the concrete matter. That's one thing. The other thing we are seeing is that we as investors have some tools, but they are limited, because we really depend on governments taking responsibility. Sadly, the development is going in a direction where fewer and fewer governments are taking their responsibility, and most of it is left to investors and the business community. When it comes to conflict areas, we try as much as we can, but I have to say that especially when it comes to tech companies, and this goes beyond KLP to most of the investors we engage with, we struggle to get any engagement with the company at all. And then you sit there with information from civil society which says that the companies might be contributing, and you are not able to get engagement with the companies. I would say that the burden of proof is then on the company's side, because we have given them a really fair chance to express why their involvement should or should not be seen as a contribution to human rights violations. If they are not responding to our queries, then exclusion would be our choice. It's not that we would like to do that, but this is where it's really important. When we exclude a company, at least at KLP, we publish a quite thorough exclusion document. This is a way to hold companies accountable, but it also helps other investors gain insight into where we draw the line between what is acceptable and what is not. I think this is also a way to put companies and their contribution on the agenda. As you mentioned in your introduction, we have excluded a lot of companies, and being transparent about the exclusions has really led more investors to follow them. I think it's also important for companies to understand that if they improve their practices, they can be re-included. So I would stop there, and I'm happy to address any questions you might have.
Meredith Veit: And KLP adopts a very important best practice that not all investors do, which, as you mentioned, is this public list and explanation, clearly citing the human rights harms or concerns in relation to international law. That signals to the rest of the market that in order for funds to flow, you have to respect investor principles and policies with regard to heightened due diligence. So we have about 20 minutes left. Now, with the help of our tech colleagues, we'll put a discussion question up on the board and open it up to all of you. We want to dive a little deeper into this aspect of accountability and enforcement, which we've heard about from different angles, with different opportunities and different challenges. The question for all of you is: what kind of evidence would lead to stronger enforcement actions against tech companies that facilitate violations of international humanitarian law, or international law in times of conflict more broadly? We can consider this from the different angles we've tried to open up. Among the enforcers that have leverage over tech companies are, primarily, states, which need to abide by international law according to their obligations. We have investors, who also have a responsibility to use their leverage with regard to tech companies within their portfolios, according to the UN Guiding Principles. And we have courts: strategic litigation is going to be a very key phrase for the coming five-plus years, perhaps, in order to really push for accountability where we can. So with that, feel free to come up to the microphones on the side of the room. We'd like to hear your experiences, ideas, or reflections based on what we've just heard: what kind of evidence, what else do we need in order to start pushing for more enforcement in this sector? Any brave souls out there? Or anyone online as well?
I know we have some people tuning in online.
Audience: My name is Anriette Esterhuysen; I work partly with the Association for Progressive Communications, and we've actually started a best practice forum in the Internet Governance Forum that is also looking at some of these things. I think the kind of evidence that is really useful is the impact: the stories of what the results are. There are many people in the investment and digital justice space who might not be aware that this is not just bad action or irresponsible behavior; it can actually affect people's lives. So certainly the stories of what the impact is. The other kind of evidence which is also important for civil society is the relationships. I know it can be quite difficult to gather this kind of evidence, but in the way that Marwa was revealing different companies' partnerships with Lockheed Martin, that's very useful for us as well. It's important to understand how those corporate actors operate, not just in relation to one particular conflict, but what the ethical backing, or lack of it, is in how they form partnerships and make decisions about what they do where. So different types of evidence could definitely make it easier for us to try to hold them accountable. And maybe Marwa can say a little bit about when it is as well. For people interested in this topic, come to the Best Practice Forum meeting, which will be later this week, on Thursday. I think that's the right time. Thanks for the panel. Very good.
Meredith Veit: Thursday at 2 p.m. Yes, at 2 p.m.
Audience: Hi, my name is Monika Ermert. I'm a reporter. I understand it's difficult to engage with the companies. Is there any pathway, via governments, to get them to engage? And did you try to engage with them to come here and sit on this panel? Did you reach out to them? It would be interesting to know.
Kiran Aziz: No, we haven't reached out to them for this particular forum. But if you look at a lot of the tech companies, if they are based within the EU, it would be easier to engage via governments, and we try to use that. But Meta, Amazon, all of them have their headquarters in the US, and I think it's very much linked to the current administration and how they perceive things. International law as such, everything, is being challenged at this time. Before the current administration, at least some of the companies made an effort when we engaged with them, but now it's as if there are no boundaries for them. And I think that's also because they know there is a lack of accountability from several places, and there isn't anybody to hold them to account. Even if we as investors exclude these companies, the vital part here is that these companies have so much influence that, even if it's a really, really difficult path, it's really important that we and civil society are still there chasing them, even if they don't want to engage. Because we have also heard from some voices inside these companies that it helps when investors knock on the door.
Marwa Fatafta: I just want to quickly add that, since we're talking mostly and predominantly about US-based companies, the Trump administration is taking an extremely protectionist approach to its tech sector. They see it as a sector that needs to be protected against regulation and accountability, particularly from the EU, or in fact from any other state that may use its national jurisdiction to oblige companies to take one course of action or another. For example, it caught my attention a couple of weeks ago that there was an announcement, I think from the State Department, saying that they will not grant visas to foreign officials who have mandated, quote-unquote, censorship by these companies. I think they were mostly referring to the Brazilian Supreme Court judge who, if I remember the case correctly, ordered X to take certain action against accounts spreading disinformation. So in such a context, yes, exactly: what do you do? Have we engaged with the companies? We have, at every opportunity and turn, though not for this panel, because we know what the outcome would be: they would come here, rehash the same press lines, and leave the room. So we can save you, and save ourselves, the time.
Meredith Veit: And something that we’re seeing with regard to power imbalances at play, you know, at the IGF with so many states here, I think we’re also hearing a strong call from all of us in different words that we need states to take tech regulation much more seriously. At the Resource Center, we’re constantly reaching out to companies about allegations of harm for a number of different sectors. And the tech sector consistently has a lower response rate than others like mining or oil or garments. And we think that’s because there’s less regulation. So there’s less pressure. There’s been more of a buildup for different sectors and industries over time with relation to business and human rights and states taking action. And for the tech sector, I mean, we see it now within the United States, for example, saying that they want to put a ban for the next 10 years on regulating AI and calls to try to deregulate in the EU as well. So if we want companies to actually give us the transparency that we need for our societies, we need governments to mandate human rights due diligence and transparency, as we’ve heard.
Chantal Joris: Yeah, one thought is that it is also about government transparency, because a lot of the time we hear that there are direct service contracts and procurement: a service provided to a government. If you look at freedom of information requests, you would often face a national security exemption. And for litigation, for example, what you need are really the hard facts: you need the contracts, you need to know exactly what services were provided, what was said about potential violations of international law; you might even need internal minutes showing whether an executive was aware of certain risks, and so on. Of course, to prove the impact you need the public reporting by civil society, but you also need a lot of information that unfortunately often relies on whistleblowers and journalists. So, looking at the fact that many governments attend the IGF, I think government transparency is a very, very good point to start. Of course, questions around defense and national security are prone to find justification in secrecy, but it's where accountability efforts can start, and where we can also measure whether states are upholding their own international obligations.
Meredith Veit: I saw we had someone coming up to the microphone here as well. Yeah, please.
Audience: Hi, everyone. Audrey Moklay from OpenMIC. Thank you so much for your panel. My question, and you kind of just touched on it, is around to what extent there's a defense of due diligence for these corporate executives, or the company itself, in international human rights law, and to what extent you would need evidence of the due diligence being done on the corporate side, and in how much detail. Because in our engagements with certain companies, we've seen that they won't even tell us who they hired as a third party to do the assessment; they're very opaque about how the due diligence is being done. So I guess my question for you is: what's the burden of proof there, and how do we place it onto the companies? Is it through the investors? Is it through the states? Thanks.
Chantal Joris: Around the burden of proof, again, to sound boring, this generally depends very much on domestic law. What sort of litigation do you bring? Is it a tort claim? Is it a criminal complaint? The burden of proof, and the granularity of the information you need, would depend on that. I will say that in some jurisdictions, notably the UK and the US, there is also the possibility of making a disclosure request, where the judge can order the company to disclose certain information that is necessary to actually adjudicate the claim properly. So one needs to be quite granular and creative, and really think about all the different means you might have to collect what you need for the case. And then it will depend on the exact legal basis, what exactly the chain of causation is that you might have to prove, and how you can access that information as much as possible. But it remains a challenge, which is a reality, of course.
Kiran Aziz: I can say from the investors' point of view that we work from a risk-based perspective. For us, we need to know that there is a risk, and then we need to assess how high the risk is. So I would say the bar is lower compared to litigation in court. That said, I think it's still important that the evidence, or the reports, come from trustworthy actors we can rely on. And if more actors emphasize the same risk, then it becomes really clear to us that this is something we need to take into account.
Marwa Fatafta: Just to add to what you said: in the absence of legally mandated human rights due diligence, if we follow the UNGPs, it's a risk-based approach as well, right? So you have heightened risk because of armed conflict, with companies contributing to harm. Take Meta's due diligence on its content moderation of Palestine-related content in 2021: there was a strong push from civil society, and also from their Oversight Board, to see whether their actions had resulted in violating human rights, and at the beginning of that engagement the company pushed back very strongly against taking that approach, saying, we will internally investigate. I think that's companies' favorite phrase: we'll take care of that. So these exercises are not independent, and we don't know who they've talked to. Take some decisions made by companies, for instance whether to build a data center in an extremely repressive and authoritarian state like Saudi Arabia, where the company said, oh, we've done our human rights due diligence. If the outcome is to greenlight such a project, then there are of course huge question marks over that exercise: what kind of questions have they asked? What risks have they interrogated and scrutinized? Who are the people they're talking to? Are they talking to the rights holders, to the impacted communities? Or are they talking to some international NGOs that are not directly linked to, or don't really understand, the context? I would even call this a watering down of what the UNGPs were supposed to achieve.
Meredith Veit: Okay, I’m given the four minute signal. Any other inputs from the audience? Yes.
Audience: Hi, thank you so much. My name is Sadhana. I just had a quick question on the enforcement front. We heard from the panelists about the relevance of international criminal law and IHL as well, but in situations like in Palestine at the moment when genocide is so inextricably linked to armed conflict, I wanted to know whether the Genocide Convention imposes any additional duty on private actors and companies to act positively to prevent genocide and whether there are any enforcement lessons from that convention that might also help us understand how corporate accountability might function where genocide happens in the context of an armed conflict.
Chantal Joris: Good question. As you say, there can be a risk of genocide happening, or genocide might already be happening, but the Genocide Convention's obligations are triggered well before it has been established whether a genocide is happening, in the context of an armed conflict or outside of it. And there are the state's obligations, of course, to ensure that no one under their jurisdiction contributes to genocide, or to incitement to genocide, and so on. But I wouldn't say it's a direct legal instrument that you can base yourself on, at the international level at least, to hold companies accountable, because it's a state treaty. Of course, many states have established genocide, crimes against humanity, and war crimes as crimes in their domestic legal frameworks, potentially connected with universal jurisdiction clauses, and they might be able to pursue companies under those provisions. As I mentioned, as far as I'm aware, it has been more war-crimes-based complaints and criminal prosecutions, or prosecutions for crimes against humanity. I'm not aware of a corporate executive or a company directly facing genocide charges recently, post-World War II. There was also the Rwanda Tribunal, of course, but there we talk again about individual criminal responsibility. Still, learning from the cases brought in other sectors, and, as you say, under crimes against humanity provisions as well, is definitely something we should do if we seek to look at strategic litigation in the tech sector.
Marwa Fatafta: I think the Genocide Convention criminalizes complicity in genocide, and the Rome Statute outlines what the modes of liability are.
Chantal Joris: But basically it says that states should, at the domestic level, criminalize complicity in genocide. Exactly, so that's what you need to…
Meredith Veit: And looking across different sectors too: what exactly was it that convicted the Dutch businessman who was selling chemicals to Saddam Hussein's regime? What exactly was it about the sharing of information about individuals with the Argentinian regime in the Ford Motor Company case? What are the pieces that we could take from case law and previous jurisprudence and actually apply to tech? Because sharing names and personally identifiable information can translate to sharing names and biometric IDs in the modern context. We are definitely out of time at this point, so I will just thank our fantastic panelists and everyone who participated in the audience. Hopefully this served at least as a launching point to spark some ideas and get more people involved in thinking about this, because as you can see, there's a lot of work to be done from all angles. Thank you all so much for your time and for your interventions today. I appreciate it. Thank you.
Marwa Fatafta
Speech speed
138 words per minute
Speech length
1932 words
Speech time
838 seconds
Tech companies are never neutral actors in armed conflicts and can exacerbate conflict dynamics through power asymmetries
Explanation
Fatafta argues that tech companies actively participate in and worsen conflicts rather than remaining neutral observers. They particularly impact situations where there are power imbalances between warring parties, potentially facilitating human rights abuses or contributing to atrocity crimes.
Evidence
Examples from the Gaza conflict over the past year and a half, which she describes as potentially foreshadowing the future of cyber warfare and tech company involvement
Major discussion point
Tech Companies’ Role in Armed Conflicts
Topics
Cyberconflict and warfare | Human rights principles
Agreed with
– Chantal Joris
Agreed on
Tech companies are actively contributing to conflicts rather than remaining neutral
Companies directly cause adverse human rights impacts through censorship and systematic removal of Palestine-related content
Explanation
Tech companies violate fundamental rights including freedom of expression, peaceful assembly, and non-discrimination through their content moderation policies. This represents direct harm caused by corporate policies rather than indirect contribution to conflict.
Evidence
Systematic removal of Palestine-related content online by social media companies, violating rights enshrined in the International Covenant on Civil and Political Rights
Major discussion point
Tech Companies’ Role in Armed Conflicts
Topics
Freedom of expression | Content policy | Human rights principles
Tech companies contribute to violations through third parties by providing cloud computing, AI tools, and facial recognition technologies to militaries
Explanation
Companies provide technological infrastructure and services that enable military operations linked to serious violations of international law. This includes direct technological assistance to governments and military forces engaged in conflicts.
Evidence
Google and Amazon’s $1.2 billion Project Nimbus providing cloud services to Israeli government; Google’s deepened relationship with Ministry of Defense including classified teams and joint drills; Amazon Web Services providing server farms for mass surveillance data; Microsoft selling 19,000 hours of engineering services to Israeli Ministry of Defense including on-site military base assistance
Major discussion point
Tech Companies’ Role in Armed Conflicts
Topics
Cyberconflict and warfare | Privacy and data protection | Human rights principles
Companies mirror state policies of apartheid and segregation in their service provision, as seen with Google Maps and PayPal in occupied territories
Explanation
Tech companies implement discriminatory policies that parallel and reinforce state-level segregation and apartheid systems. By selectively providing or denying services based on identity or location, they contribute to cumulative impacts of occupation and discrimination.
Evidence
Google Maps in West Bank only shows roads connecting Israeli settlements, not Palestinian towns; PayPal allows Israeli settlers in illegal settlements to access services while denying them to Palestinians
Major discussion point
Tech Companies’ Role in Armed Conflicts
Topics
Human rights principles | Digital access | Consumer protection
Companies fail to conduct meaningful heightened human rights due diligence despite operating in high-risk conflict zones
Explanation
Despite UN Guiding Principles requirements for enhanced due diligence in conflict areas, tech companies either refuse to engage with civil society concerns or conduct superficial audits. When they do respond to pressure, their due diligence processes lack transparency and meaningful oversight.
Evidence
Microsoft’s audit after public pressure concluded no contribution to harm despite admitting no insight into technology use in air-gapped military bases; companies unable to answer basic questions about due diligence processes
Major discussion point
Corporate Accountability and Due Diligence Failures
Topics
Human rights principles | Legal and regulatory
Agreed with
– Meredith Veit
– Kiran Aziz
Agreed on
Tech companies consistently fail to engage meaningfully on human rights due diligence and transparency
Disagreed with
– Kiran Aziz
Disagreed on
Effectiveness of investor exclusion as accountability mechanism
Even when companies claim to conduct audits, they lack insight into how their technologies are used, making due diligence ineffective
Explanation
Companies perform box-ticking exercises rather than genuine due diligence, admitting they have no visibility into how their technologies are actually deployed by military clients. This contradiction undermines the entire premise of effective human rights due diligence.
Evidence
Microsoft’s statement that while they don’t have insight into how technologies are used in air-gapped military bases, they concluded no contribution to harm
Major discussion point
Corporate Accountability and Due Diligence Failures
Topics
Human rights principles | Legal and regulatory
Major tech companies are quietly dropping voluntary commitments against building AI for military use and forming partnerships with defense contractors
Explanation
There is an increasing militarization trend where tech companies are abandoning their previous ethical commitments and actively seeking military partnerships. This represents a fundamental shift toward embracing rather than avoiding military applications of civilian technology.
Evidence
Google and OpenAI dropped commitments not to build AI for military use; Google signed partnership with Lockheed Martin; OpenAI partnered with defense tech company Anduril; Meta and Anduril partnering on VR products for US military
Major discussion point
Militarization of Tech Sector
Topics
Cyberconflict and warfare | Future of work
Senior executives from Meta, OpenAI, and Palantir are joining US Army Reserve as lieutenant colonels, blurring lines between civilian tech and military roles
Explanation
The creation of the Executive Innovation Corp represents an unprecedented integration of tech executives directly into military command structures. This development raises fundamental questions about the distinction between civilian technology companies and military actors.
Evidence
Senior executives from Meta, OpenAI, and Palantir joining US Army Reserve Executive Innovation Corp as lieutenant colonels to provide tech advice
Major discussion point
Militarization of Tech Sector
Topics
Cyberconflict and warfare | Future of work
US protectionist approach under current administration shields tech companies from regulation and accountability measures
Explanation
The Trump administration’s protective stance toward US tech companies creates barriers to international accountability efforts. This includes threatening foreign officials who attempt to regulate US tech companies, effectively creating a shield against external oversight.
Evidence
State Department announcement refusing visas to foreign officials who mandate ‘censorship’ by US companies, referencing Brazilian Supreme Court judge’s actions against X
Major discussion point
Regulatory and Political Challenges
Topics
Legal and regulatory | Jurisdiction
Chantal Joris
Speech speed
155 words per minute
Speech length
1752 words
Speech time
674 seconds
International humanitarian law applies to individuals within companies when business activities have nexus to armed conflict, though enforcement primarily relies on international criminal law
Explanation
While IHL doesn’t directly apply to companies as entities, it does bind individual employees when their business activities are sufficiently connected to armed conflicts. This creates potential liability for corporate staff, though enforcement mechanisms are primarily through criminal law rather than corporate liability.
Evidence
Traditional application to mining companies on the ground or private military security companies, but tech companies are increasingly intertwined with military operations
Major discussion point
Legal Framework and Enforcement Challenges
Topics
Legal and regulatory | Cyberconflict and warfare
Agreed with
– Marwa Fatafta
Agreed on
Tech companies are actively contributing to conflicts rather than remaining neutral
Corporate executives could theoretically be liable under ICC jurisdiction, but realistic prospects remain limited due to high thresholds
Explanation
The International Criminal Court is developing policies on cyber conduct under the Rome Statute, which could potentially hold corporate executives criminally liable. However, the extremely high legal thresholds make actual prosecutions unlikely in the near term.
Evidence
ICC prosecutor’s office drafting policy on cyber conduct under Rome Statute; legal persons cannot be held liable but corporate executives theoretically could be under high thresholds
Major discussion point
Legal Framework and Enforcement Challenges
Topics
Legal and regulatory | Jurisdiction
Domestic frameworks vary significantly in their capacity for universal jurisdiction and corporate criminal responsibility
Explanation
The ability to hold tech companies accountable depends heavily on individual countries’ legal systems and whether they have universal jurisdiction provisions and corporate criminal responsibility frameworks. This creates an uneven patchwork of potential accountability mechanisms.
Evidence
Lundin Oil case with corporate executives potentially liable for aiding war crimes; Lafarge case involving company liability itself; UK cases against parent companies over operations in Zambia
Major discussion point
Legal Framework and Enforcement Challenges
Topics
Legal and regulatory | Jurisdiction
Government transparency is crucial as many service contracts fall under national security exemptions, limiting access to evidence
Explanation
Strategic litigation requires detailed evidence including contracts and internal communications, but government procurement with tech companies is often classified under national security. This creates a fundamental barrier to accountability efforts that could be addressed through improved government transparency.
Evidence
Freedom of information requests often blocked by national security exemptions; need for contracts, internal minutes, and evidence of executive awareness of risks
Major discussion point
Legal Framework and Enforcement Challenges
Topics
Legal and regulatory | Privacy and data protection
Agreed with
– Meredith Veit
– Kiran Aziz
Agreed on
Government regulation and mandates are essential for corporate accountability
The integration of tech executives into military structures raises questions about attribution and state obligations
Explanation
When tech company executives operate within military command structures, it becomes unclear whether their actions should be attributed to the state or the company. This blurring of lines has significant implications for determining which legal frameworks apply and who bears responsibility.
Evidence
Meta executives operating within the military, raising questions about combatant status and state obligations
Major discussion point
Militarization of Tech Sector
Topics
Cyberconflict and warfare | Legal and regulatory
Meredith Veit
Speech speed
145 words per minute
Speech length
1816 words
Speech time
749 seconds
Survey response rates from tech companies are abysmally low (4% for Palestine/Israel context vs 26% for Russia/Ukraine), showing lack of transparency
Explanation
A survey of 104 technology companies operating in occupied Palestinian territories received only a 4% response rate, compared with 26% for a similar survey about operations in Russia and Ukraine. This demonstrates an unprecedented lack of engagement from tech companies on human rights due diligence in conflict zones.
Evidence
Survey of 104 tech companies in Palestine/Israel with 4% response rate vs 26% for Russia/Ukraine survey; described as unprecedented in the resource center’s history
Major discussion point
Corporate Accountability and Due Diligence Failures
Topics
Human rights principles | Legal and regulatory
Agreed with
– Marwa Fatafta
– Kiran Aziz
Agreed on
Tech companies consistently fail to engage meaningfully on human rights due diligence and transparency
Government mandates for human rights due diligence and transparency are essential since voluntary approaches have failed
Explanation
The consistently low response rates from tech companies compared with other sectors demonstrate that voluntary corporate responsibility frameworks are insufficient. Government regulation requiring mandatory human rights due diligence and transparency reporting is necessary to create accountability.
Evidence
Tech sector consistently has lower response rates than mining, oil, or garments sectors; calls for bans on AI regulation in the US and for deregulation in the EU
Major discussion point
Regulatory and Political Challenges
Topics
Legal and regulatory | Human rights principles
Agreed with
– Chantal Joris
– Kiran Aziz
Agreed on
Government regulation and mandates are essential for corporate accountability
The tech sector has lower regulatory pressure compared to other industries like mining or oil, resulting in lower corporate response rates
Explanation
Tech companies face less regulatory scrutiny and pressure compared to traditional industries that have been subject to business and human rights frameworks for longer periods. This regulatory gap explains why tech companies are less responsive to accountability efforts.
Evidence
Tech sector consistently lower response rates than mining, oil, or garments when contacted about human rights allegations
Major discussion point
Regulatory and Political Challenges
Topics
Legal and regulatory | Human rights principles
Kiran Aziz
Speech speed
160 words per minute
Speech length
1503 words
Speech time
563 seconds
Institutional investors rely on long-term perspectives that incorporate material risks including human rights violations as financial risks
Explanation
Large institutional investors managing pension and savings funds take long-term investment approaches that consider human rights risks as material financial risks. This creates a business case for embedding human rights considerations into investment decisions beyond just ethical concerns.
Evidence
Norway’s Transparency Act requiring due diligence on investments; screening companies upfront before investment decisions
Major discussion point
Investor Leverage and Limitations
Topics
Economic | Human rights principles
Investors depend heavily on civil society reports and public domain information since companies provide inadequate reporting on human rights impacts
Explanation
Institutional investors cannot rely on corporate reporting for human rights risk assessment, particularly in conflict areas, and instead depend on civil society organizations and UN agencies for credible information. This highlights the critical role of civil society in corporate accountability.
Evidence
Company resources and reporting not helpful for human rights assessment; reliance on civil society reports and documentation from the UN Office of the High Commissioner for Human Rights
Major discussion point
Investor Leverage and Limitations
Topics
Economic | Human rights principles
Exclusion of companies from investment portfolios can be effective when done transparently with public documentation of reasons
Explanation
Public exclusion lists with detailed explanations of human rights concerns can influence other investors and put pressure on companies to improve practices. Transparency about exclusion criteria helps set market standards and can lead to company re-inclusion if practices improve.
Evidence
KLP’s transparent exclusion documents helping other investors follow similar exclusions; companies can be re-included if they improve practices
Major discussion point
Investor Leverage and Limitations
Topics
Economic | Human rights principles
Disagreed with
– Marwa Fatafta
Disagreed on
Effectiveness of investor exclusion as accountability mechanism
Tech companies are increasingly difficult to engage with compared to traditional sectors, often only referencing policies without discussing concrete matters
Explanation
Unlike traditional business sectors, tech companies are particularly resistant to investor engagement on human rights issues. When engagement does occur, companies typically deflect with generic policy references rather than addressing specific concerns or evidence of harm.
Evidence
Struggle to get engagement with tech companies at all; when engagement occurs, companies reference policies without discussing concrete matters
Major discussion point
Corporate Accountability and Due Diligence Failures
Topics
Economic | Human rights principles
Agreed with
– Marwa Fatafta
– Meredith Veit
Agreed on
Tech companies consistently fail to engage meaningfully on human rights due diligence and transparency
Investor engagement is limited by companies’ unwillingness to discuss concrete matters and lack of government accountability
Explanation
Investors face significant limitations in their ability to influence tech company behavior due to corporate resistance and insufficient government oversight. The burden increasingly falls on investors and business communities rather than governments taking responsibility for regulation.
Evidence
Companies unwilling to engage beyond policy references; governments taking less responsibility leaving burden on investors and business communities
Major discussion point
Investor Leverage and Limitations
Topics
Economic | Legal and regulatory
Agreed with
– Meredith Veit
– Chantal Joris
Agreed on
Government regulation and mandates are essential for corporate accountability
Audience
Speech speed
158 words per minute
Speech length
586 words
Speech time
222 seconds
Impact stories showing real-world consequences of corporate actions are crucial for demonstrating harm beyond just irresponsible behavior
Explanation
Many stakeholders in the digital justice space may not understand that corporate actions in conflict zones have life-and-death consequences for real people. Personal stories and concrete examples of impact are essential for making the human cost of corporate behavior visible and compelling.
Evidence
Stories of what the results are and how corporate actions can affect people’s lives
Major discussion point
Evidence Requirements for Accountability
Topics
Human rights principles | Content policy
Corporate relationship mapping and partnership analysis help reveal patterns of ethical decision-making across different conflicts
Explanation
Understanding how tech companies form partnerships and make decisions across multiple conflicts provides insight into their ethical frameworks and decision-making processes. This type of evidence helps establish patterns of behavior rather than isolated incidents.
Evidence
Partnerships such as those with Lockheed Martin; understanding the ethical backing, or lack thereof, in how companies form partnerships across different conflicts
Major discussion point
Evidence Requirements for Accountability
Topics
Economic | Human rights principles
Agreements
Agreement points
Tech companies consistently fail to engage meaningfully on human rights due diligence and transparency
Speakers
– Marwa Fatafta
– Meredith Veit
– Kiran Aziz
Arguments
Companies fail to conduct meaningful heightened human rights due diligence despite operating in high-risk conflict zones
Survey response rates from tech companies are abysmally low (4% for Palestine/Israel context vs 26% for Russia/Ukraine), showing lack of transparency
Tech companies are increasingly difficult to engage with compared to traditional sectors, often only referencing policies without discussing concrete matters
Summary
All speakers agree that tech companies demonstrate unprecedented resistance to transparency and meaningful engagement on human rights issues, with extremely low response rates to surveys and superficial responses when they do engage
Topics
Human rights principles | Legal and regulatory
Government regulation and mandates are essential for corporate accountability
Speakers
– Meredith Veit
– Chantal Joris
– Kiran Aziz
Arguments
Government mandates for human rights due diligence and transparency are essential since voluntary approaches have failed
Government transparency is crucial as many service contracts fall under national security exemptions, limiting access to evidence
Investor engagement is limited by companies’ unwillingness to discuss concrete matters and lack of government accountability
Summary
The speakers reached consensus that voluntary corporate responsibility frameworks have failed and that government intervention through regulation, transparency requirements, and accountability mechanisms is necessary
Topics
Legal and regulatory | Human rights principles
Tech companies are actively contributing to conflicts rather than remaining neutral
Speakers
– Marwa Fatafta
– Chantal Joris
Arguments
Tech companies are never neutral actors in armed conflicts and can exacerbate conflict dynamics through power asymmetries
International humanitarian law applies to individuals within companies when business activities have nexus to armed conflict, though enforcement primarily relies on international criminal law
Summary
Both speakers reject the notion of tech company neutrality in conflicts, with Fatafta providing extensive evidence of active participation and Joris explaining the legal framework that makes individuals within companies liable
Topics
Cyberconflict and warfare | Human rights principles | Legal and regulatory
Similar viewpoints
Both speakers identify and are concerned about the increasing militarization of the tech sector, with companies abandoning ethical commitments and executives directly joining military structures
Speakers
– Marwa Fatafta
– Chantal Joris
Arguments
Major tech companies are quietly dropping voluntary commitments against building AI for military use and forming partnerships with defense contractors
The integration of tech executives into military structures raises questions about attribution and state obligations
Topics
Cyberconflict and warfare | Future of work
Both emphasize the critical importance of civil society documentation and real-world impact evidence for accountability efforts, as corporate reporting is inadequate
Speakers
– Kiran Aziz
– Audience
Arguments
Investors depend heavily on civil society reports and public domain information since companies provide inadequate reporting on human rights impacts
Impact stories showing real-world consequences of corporate actions are crucial for demonstrating harm beyond just irresponsible behavior
Topics
Human rights principles | Economic
Both identify regulatory capture and protection of tech companies as major barriers to accountability, particularly in the US context
Speakers
– Marwa Fatafta
– Meredith Veit
Arguments
US protectionist approach under current administration shields tech companies from regulation and accountability measures
The tech sector has lower regulatory pressure compared to other industries like mining or oil, resulting in lower corporate response rates
Topics
Legal and regulatory | Jurisdiction
Unexpected consensus
Investor exclusion as an effective accountability mechanism
Speakers
– Kiran Aziz
– Marwa Fatafta
– Meredith Veit
Arguments
Exclusion of companies from investment portfolios can be effective when done transparently with public documentation of reasons
Companies fail to conduct meaningful heightened human rights due diligence despite operating in high-risk conflict zones
Survey response rates from tech companies are abysmally low (4% for Palestine/Israel context vs 26% for Russia/Ukraine), showing lack of transparency
Explanation
Despite their different perspectives (investor, civil society advocate, moderator), the speakers reached unexpected consensus that transparent investor exclusion can be an effective accountability tool when companies refuse to engage, representing a market-based solution to regulatory gaps
Topics
Economic | Human rights principles
The fundamental inadequacy of current due diligence frameworks for tech companies
Speakers
– Marwa Fatafta
– Chantal Joris
– Kiran Aziz
Arguments
Even when companies claim to conduct audits, they lack insight into how their technologies are used, making due diligence ineffective
Domestic frameworks vary significantly in their capacity for universal jurisdiction and corporate criminal responsibility
Tech companies are increasingly difficult to engage with compared to traditional sectors, often only referencing policies without discussing concrete matters
Explanation
All speakers from different expertise areas (advocacy, legal, investment) agreed that existing due diligence frameworks are fundamentally inadequate for the tech sector, which was unexpected given their different professional backgrounds and typical approaches to corporate accountability
Topics
Human rights principles | Legal and regulatory
Overall assessment
Summary
The speakers demonstrated remarkable consensus across multiple critical issues: tech companies’ active role in conflicts, failure of voluntary accountability mechanisms, need for government regulation, and inadequacy of current due diligence frameworks. There was also agreement on the militarization trend in tech and the importance of civil society documentation.
Consensus level
High level of consensus with significant implications – the alignment between civil society advocates, legal experts, and investors suggests a broad coalition for reform. This consensus indicates that the current system of tech accountability is fundamentally broken and requires systemic change rather than incremental improvements. The agreement across different stakeholder types strengthens the case for regulatory intervention and suggests potential for coordinated advocacy efforts.
Differences
Different viewpoints
Effectiveness of investor exclusion as accountability mechanism
Speakers
– Kiran Aziz
– Marwa Fatafta
Arguments
Exclusion of companies from investment portfolios can be effective when done transparently with public documentation of reasons
Companies fail to conduct meaningful heightened human rights due diligence despite operating in high-risk conflict zones
Summary
Kiran Aziz presents investor exclusion as a potentially effective tool that can influence company behavior and help other investors follow suit, while Marwa Fatafta’s examples suggest companies remain largely unresponsive to external pressure and continue harmful practices regardless of accountability efforts
Topics
Economic | Human rights principles
Unexpected differences
Optimism about incremental progress versus systemic failure
Speakers
– Kiran Aziz
– Marwa Fatafta
Arguments
Companies can be re-included if they improve practices
Even when companies claim to conduct audits, they lack insight into how their technologies are used, making due diligence ineffective
Explanation
While both speakers work on corporate accountability, Aziz maintains some optimism that companies can improve and be re-included in investment portfolios, suggesting the system can work with proper incentives. Fatafta's analysis suggests the entire due diligence framework is fundamentally flawed and ineffective, representing a more systemic critique. This disagreement is unexpected because both are advocates for corporate accountability yet reach different assessments of whether current frameworks can be reformed or require complete overhaul
Topics
Human rights principles | Economic
Overall assessment
Summary
The speakers show remarkable alignment on identifying problems with tech company accountability in conflict zones, but subtle differences emerge in their assessment of potential solutions and the effectiveness of current accountability mechanisms
Disagreement level
Low level of disagreement with high consensus on problems but nuanced differences on solutions. The implications suggest that while there is strong agreement on the need for tech company accountability, practitioners from different sectors (legal, advocacy, investment) may have varying levels of optimism about working within existing frameworks versus the need for fundamental systemic change. This could impact strategy coordination and resource allocation in accountability efforts
Partial agreements
Takeaways
Key takeaways
Tech companies are not neutral actors in armed conflicts and actively contribute to human rights violations through direct censorship, providing military technologies, and mirroring state policies of discrimination
Current legal frameworks (IHL and human rights law) can theoretically hold tech companies accountable, but enforcement faces significant practical challenges due to high legal thresholds, jurisdictional issues, and lack of transparency
Corporate accountability mechanisms are failing – tech companies have extremely low engagement rates (4% response rate) and conduct inadequate human rights due diligence despite operating in high-risk conflict zones
The tech sector is becoming increasingly militarized, with companies dropping voluntary commitments against military AI development and executives joining military units, blurring civilian-military distinctions
Investors can leverage exclusion strategies and transparency requirements to pressure companies, but face limitations due to companies’ unwillingness to engage and lack of government accountability
Successful accountability requires multiple types of evidence: impact stories, corporate relationship mapping, hard contractual evidence, and risk assessments from trustworthy sources
Government regulation and transparency mandates are essential since voluntary corporate approaches have proven insufficient – the tech sector faces less regulatory pressure than other industries
Resolutions and action items
Civil society should continue documenting and reporting on corporate relationships and partnerships to reveal patterns of decision-making across conflicts
Investors should maintain transparent exclusion practices with public documentation to signal market expectations and help other investors follow suit
Strategic litigation should learn from cases in other sectors (mining, oil) and apply similar evidence-gathering approaches to the tech sector
Government transparency through freedom of information requests should be pursued to access service contracts and procurement details
Continued engagement with companies is necessary even when they are unresponsive, as internal voices within companies report that external pressure is helpful
Best Practice Forum meeting scheduled for Thursday at 2 p.m. during IGF to continue discussions on these topics
Unresolved issues
How to effectively regulate US-based tech companies given the protectionist stance of the current US administration
What specific burden of proof standards should apply to corporate due diligence and how to enforce meaningful transparency requirements
How to address the attribution challenges when tech executives become integrated into military structures
What mechanisms can compel companies to engage meaningfully rather than simply referencing policies
How to access classified or national security-protected information about government-tech company contracts
Whether existing international legal frameworks are adequate for addressing the unique challenges posed by tech companies in conflict zones
How to establish effective accountability when companies operate across multiple jurisdictions with varying legal standards
Suggested compromises
Risk-based approaches that require lower burden of proof than criminal litigation but still enable investor and civil society action
Combination of hard law enforcement through courts and soft law pressure through investors and civil society engagement
Utilizing both international frameworks (IHL, human rights law) and domestic legal mechanisms depending on jurisdiction and available evidence
Focusing on government transparency as a starting point when direct corporate engagement fails
Learning from successful accountability cases in other sectors while adapting approaches to tech sector specificities
Thought provoking comments
Tech companies are never neutral actors in situations of armed conflict. They exacerbate the dynamics of the conflict and sometimes even drive them or fuel them, particularly in contexts where there are asymmetries of power between warring parties.
Speaker
Marwa Fatafta
Reason
This comment fundamentally challenges the common perception of tech companies as neutral service providers. It reframes the entire discussion by establishing that tech companies are active participants in conflicts rather than passive enablers, which has profound implications for accountability and legal responsibility.
Impact
This opening statement set the foundational premise for the entire discussion, moving the conversation away from whether tech companies should be held accountable to how they should be held accountable. It established the framework for all subsequent examples and legal analysis.
There’s a surge in increasing militarization of civilian tech… both Google and OpenAI have both quietly dropped their voluntary commitments earlier this year not to build AI for military use or surveillance purposes… senior executives from high-tech firms, specifically Meta, Open AI and Palantir, are joining the US Army Reserve at a new unit called Executive Innovation Corp.
Speaker
Marwa Fatafta
Reason
This revelation exposes a dramatic shift in the tech industry’s relationship with military operations, showing how the lines between civilian tech companies and military contractors are completely blurring. The fact that executives are literally becoming military officers represents an unprecedented development.
Impact
This comment created a pivotal moment in the discussion, prompting Chantal to immediately address the legal implications of attribution and state obligations when tech executives operate within military structures. It fundamentally changed the scope of the conversation from service provision to direct military participation.
When you look at enforcement and accountability, in terms of humanitarian law, you will primarily think about international criminal law… corporate executives, in theory, under the very, very high thresholds that are under the Rome Statute could be liable under international criminal law.
Speaker
Chantal Joris
Reason
This comment bridges the gap between theoretical legal frameworks and practical enforcement mechanisms, introducing the possibility of criminal liability for tech executives under international law. It moves beyond civil remedies to criminal accountability.
Impact
This shifted the discussion from corporate responsibility frameworks to individual criminal liability, raising the stakes significantly and introducing new pathways for accountability that hadn’t been previously explored in the tech context.
An astonishingly low 4% response rate from companies is unprecedented in the Resource Centre’s history; previously we sent a similar survey to tech companies operating in Russia and Ukraine, and by comparison 26% responded
Speaker
Meredith Veit
Reason
This stark comparison reveals the exceptional resistance of tech companies to transparency and accountability efforts specifically in the Palestine context, suggesting either heightened sensitivity or deliberate avoidance that goes beyond normal corporate non-responsiveness.
Impact
This statistic provided concrete evidence of the accountability gap and influenced subsequent discussion about the need for mandatory rather than voluntary disclosure mechanisms. It reinforced arguments for stronger regulatory intervention.
We as investors exclude these companies. The vital part here is that these companies have so much influence that, even if it’s a really, really difficult path, it’s really important that we and civil society are still there chasing them, even if they don’t want to engage.
Speaker
Kiran Aziz
Reason
This comment acknowledges the limitations of investor power while simultaneously arguing for persistent engagement despite those limitations. It reveals the power imbalance between even large institutional investors and major tech companies.
Impact
This honest assessment of investor limitations prompted discussion about the need for government intervention and regulation, as market-based solutions alone appear insufficient to address the scale of tech company influence and resistance to accountability.
The Trump administration is taking an extremely protectionist approach to their tech sector… they will not grant visas to foreign officials who have mandated quote-unquote censorship by these companies
Speaker
Marwa Fatafta
Reason
This comment reveals how geopolitical dynamics and state protection of domestic tech companies creates barriers to international accountability efforts, showing how corporate impunity is actively supported by state policy.
Impact
This observation shifted the discussion to acknowledge the political dimensions of tech accountability, explaining why traditional engagement strategies are failing and why new approaches are needed that account for state protection of tech companies.
Overall assessment
These key comments fundamentally shaped the discussion by progressively revealing the depth and complexity of the accountability challenge. The conversation evolved from establishing that tech companies are active conflict participants, to documenting their increasing militarization, to exploring legal frameworks for accountability, to acknowledging the practical barriers created by corporate resistance and state protection. The comments collectively painted a picture of a sector that has outgrown existing accountability mechanisms and requires new approaches that account for unprecedented corporate power, state protection, and the blurring lines between civilian and military technology. The discussion moved from theoretical frameworks to practical challenges, ultimately highlighting the need for coordinated action across multiple stakeholders – civil society, investors, states, and courts – to address what appears to be a fundamental shift in how technology companies operate in conflict contexts.
Follow-up questions
What kind of evidence would lead to stronger enforcement actions against tech companies that facilitate violations of international humanitarian law?
Speaker
Meredith Veit
Explanation
This was posed as the main discussion question for audience participation, seeking input on what evidence is needed from different enforcement angles including states, investors, and courts
How can we better understand corporate relationships and partnerships beyond individual conflicts?
Speaker
Anriette Esterhuysen
Explanation
She emphasized the need to understand how corporate actors operate across different contexts and their ethical backing in forming partnerships, not just in relation to one particular conflict
Is there any pathway from using governments to get tech companies to engage when direct engagement fails?
Speaker
Monika Ermert
Explanation
This addresses the challenge of tech companies’ reluctance to engage with civil society and investors, exploring whether government pressure could be more effective
What is the burden of proof for corporate due diligence and how much detail is needed when companies are opaque about their assessment processes?
Speaker
Audrey Moklay
Explanation
This addresses the challenge of companies not disclosing who they hire for assessments or how due diligence is conducted, questioning how to place the burden of proof on companies
Does the Genocide Convention impose additional duties on private actors to prevent genocide, and what enforcement lessons can be drawn from it?
Speaker
Sadhana
Explanation
This explores whether there are additional legal frameworks beyond IHL and human rights law that could be applied to corporate accountability in contexts where genocide occurs during armed conflict
How can we improve government transparency regarding service contracts and procurement with tech companies?
Speaker
Chantal Joris
Explanation
She identified the need for better access to government contracts and internal communications with tech companies, as this information is often protected under national security exemptions but is crucial for litigation
What can be learned from corporate accountability cases in other sectors that could be applied to the tech sector?
Speaker
Meredith Veit
Explanation
She suggested examining previous jurisprudence from cases involving other industries to identify applicable legal precedents for tech company accountability
How can civil society better document and present impact stories to demonstrate real-world consequences of tech company actions?
Speaker
Anriette Esterhuysen
Explanation
She emphasized the need for evidence showing actual impact on people’s lives, not just documentation of irresponsible behavior, to make the case for accountability more compelling
Disclaimer: This is not an official session record. DiploAI generates these resources from audiovisual recordings, and they are presented as-is, including potential errors. Due to logistical challenges, such as discrepancies in audio/video or transcripts, names may be misspelled. We strive for accuracy to the best of our ability.