Governing Tech for Peace: a Multistakeholder Approach | IGF 2023 Networking Session #78

8 Oct 2023 09:00h - 10:00h UTC


Disclaimer: It should be noted that the reporting, analysis and chatbot answers are generated automatically by DiploGPT from the official UN transcripts and, in case of just-in-time reporting, the audiovisual recordings on UN Web TV. The accuracy and completeness of the resources and results can therefore not be guaranteed.

Full session report

Audience

As a representative of a youth organisation operating under the remit of the United Nations Department of Political and Peacebuilding Affairs (UNDPPA), Manjia exhibits a keen interest in the role that young people might play in the ‘Tech for Peace’ initiative. Her focus on youth involvement stems from a firmly held conviction that engaging this demographic is not merely beneficial but crucial to the success of the undertaking. Her interest, however, is not limited to participation alone; she is also keen to uncover and scrutinise any established best practices or forthcoming plans for fostering youth engagement in ‘Tech for Peace’.

The term ‘Peace Tech’, central to this discourse, warranted clarification, which the director of Access Now duly provided. The term is an umbrella reference encompassing a diverse range of domains, from the advancement of human rights to the safeguarding of environmental justice. The director’s commentary extended beyond terminological elucidation, offering critical perspectives on how ‘Peace Tech’ could be interpreted and utilised.

At the heart of these insights was a concern surrounding ‘techno-solutionism’, or an excessive dependency on digital tools and technology to address inherently complex human challenges. Arguing against overreliance on technological solutions, the director emphasised the continued need for traditional peace-building efforts. Hence, he advocated embedding technical discussions within broader dialogues on peace-building and other interrelated sectors to guarantee a comprehensive approach.

These contemplations and sentiments speak to several Sustainable Development Goals (SDGs) set forth by the United Nations. Notably, these viewpoints reflect the core aspirations of SDG 16, which promotes Peace, Justice and Strong Institutions, and SDG 9, oriented around Industry, Innovation and Infrastructure. The attention to digital rights and the caution regarding ‘techno-solutionism’ underscore the complex relationship between technology and peace-building, shedding light on the concurrent requirements of innovation and circumspection.

Evelyne Tauchnitz

The discourse principally revolved around three domains: peace, human rights, and the influence of technology. Central to the discussion was Evelyne’s comprehensive definition of peace. She emphasised that peace extends beyond the mere absence of direct or physical violence. Peace should also incorporate the eradication of structural and cultural violence, forms of oppression that can prevail in society even without overt conflict.

Evelyne underscored the integral role that human rights have in defining our comprehension of peace and ‘peace tech’. She identified human dignity and freedom as the cornerstone values that bind together peace and human rights. She argued that these principles ought to form the bedrock when evaluating peace technologies. The central argument was rooted in the notion that any advancement in technology should not infringe upon basic human rights under the pretext of facilitating peace. If a technology is found transgressing these human rights, its categorisation as ‘peace tech’ becomes problematic.

However, Evelyne expressed her concerns that not all technologies, notwithstanding being labelled as ‘peace tech’, necessarily harmonise with the values of peace and human rights. This suggests a need for more stringent scrutiny of such technologies and the establishment of robust standards for defining and categorising ‘peace tech’.

The symbiosis between human rights and peace was further analysed, affirming that respect for human rights is necessary but not sufficient for peace. Peace was envisaged as a broader concept that transcends the domain of legal enforcement of rights. In contrast, human rights were characterised as narrower in scope and subject to legal enforceability.

An additional perspective broached was the potential paradigm shift in our perception of peace due to digital transformation. Evelyne posited, optimistically, that digital technologies could be leveraged to forge a more comprehensive peace. However, she also underlined the necessity for prudence, citing examples such as the social scoring systems currently deployed in China, ostensibly to rebuild societal trust.

On the whole, the discussion reiterates that while technology and human rights are pivotal in shaping peace, vigilance is required to ensure that the pursuit of peace does not inadvertently engender inequality or violate fundamental human rights.

Moses Owainy

Moses Owainy, the esteemed CEO of Uganda’s Centre for Multilateral Affairs, brings to light the necessity for a more diversified dialogue in peace tech, asserting the significance of integrating a multitude of global perspectives. Anchoring his argument in SDG 16 (Peace, Justice and Strong Institutions) and SDG 9 (Industry, Innovation and Infrastructure), Owainy stresses the need to embrace a wide range of viewpoints when deliberating peace tech initiatives.

Owainy adeptly highlights a key consideration: the varying impact of peace tech in different geographical contexts. He underscores that results of these tech endeavours differ noticeably between regions such as Uganda, Kenya or Nigeria and their deployment in more advanced economies. This accentuates the pressing need for context-sensitive, adaptive approaches to peace tech in diverse socioeconomic landscapes to ensure equitable outcomes.

Simultaneously, Owainy instigates a discussion on the very terminology used in peace tech dialogue, criticising the commonly used term ‘global peace’. He notes its inherent ambiguity, advocating instead for the term ‘international peace’. Owainy asserts that this revised term would better embody the collaborative efforts between various states and multi-stakeholder groups in their shared pursuit of peace.

In conclusion, Owainy’s insights guide towards a more inclusive and adaptable narrative in peace tech, suggesting a shift in terminology for clearer understanding and collaboration. His observations also underscore the need for nuanced, context-specific application of peace technologies, acknowledging regional differences and the necessity of adaptability.

Marielza Oliveira

Marielza Oliveira acts in the prestigious role of Director for Digital Inclusion, Policies, and Transformation at UNESCO, guiding the organisation’s endeavours towards the advancement of digital inclusion and policy transformation whilst supporting the broad Sustainable Development Goals (SDGs) of fostering innovation and reducing inequalities. Her work is primarily centred around the protection of two fundamental human rights: freedom of expression and the right to access information.

A crucial aspect of this role is her commitment to UNESCO’s ultimate mandate: fostering peace in the minds of men and women and facilitating the unhindered flow of ideas. This aspiration is pursued through various media, underlining the value UNESCO accords to the digital ecosystem.

To guarantee the regulatory approach towards internet platforms also respects this mandate, UNESCO has emphasised the necessity of regulations that uphold human rights. This conversation was furthered in its recent ‘Internet for Trust’ conference, demonstrating the organisation’s positive sentiment towards internet regulation sensitive to human rights.

Moreover, UNESCO intends to build the capacities of stakeholders to counter digital exclusion and inequality. This ambitious goal will enable better participation in the digital ecosystem, striving for a more inclusive internet environment. These capacity-building efforts strongly align with SDGs centred on industry, innovation, and infrastructure, as well as reduced inequalities.

In regard to achieving such aims, significant results are being accomplished through multi-stakeholder approaches. These alliances involving tech companies and government agencies aim at maximising opportunities and fostering a diverse cohort of partners. Examples of these advancements include the ‘AI for the Planet’ project combatting climate change and the United Nations Technology Innovation Lab utilising technology innovatively to foster peace-building. ‘Social Media for Peace’ is another remarkable project designed to combat online polarisation.

UNESCO stresses the importance of ensuring that internet regulation does not infringe upon the rights to freedom of expression and access to information. The organisation is currently planning to release guidelines encouraging an environment of regulation that emphasises processes rather than content, promoting transparency and accountability. This approach counters potential restrictions that could limit the freedom of expression for journalists, activists, and others.

The analysis underscores UNESCO’s leadership role in utilising technology and promoting digital inclusivity to achieve significant societal benefits, and its commitment to embedding these principles within its operational structure and processes. Its robust stance on a balanced approach to internet regulation, which safeguards human rights whilst promoting accountability, underpins its dedication to peace, justice, and robust institutions. Such commitment reflects the close alignment between UNESCO’s operations and the wider SDGs.

Mark Nelson

Co-directors Mark Nelson and Margarita Quihuis are spearheading significant advancements in peace technology at both the Peace Innovation Lab at Stanford and the Peace Innovation Institute in The Hague, striving to commercialise their peace technology research. This pioneering approach, utilising modern technology, aims to reshape how peace is perceived and established globally.

A fundamental component of this research is the use of sensor technology. In the last two decades, sensors capable of detecting and monitoring human social behaviour have seen groundbreaking advancements, transforming how human interactions are observed and understood in real time. This invaluable insight underpins the development of detailed peace metrics in high resolution.

The concept of ‘persuasive peace’ lies at the heart of their innovation, favouring influence-based strategies over conventional coercive tactics. By crafting customised approaches suited to varying individual circumstances, they are redefining how peace is achieved and maintained.

A further element of their endeavours is to establish a link between peace technology and capital markets, creating a viable ‘peace finance’ investment sector. They aim to capitalise on the intrinsic but often unquantifiable value of peace, setting the stage for a new era of peacekeeping and conflict resolution strategies.

To summarise, Nelson and Quihuis’ groundbreaking work embodies a progressive approach to advancing the principles encapsulated in SDG 16 (Peace, Justice, and Strong Institutions) and SDG 9 (Industry, Innovation and Infrastructure), demonstrating the transformative power of technology in the pursuit of peace. Their efforts are leading the way in peace innovation, reshaping our understanding of peace, and promoting peace technology as a crucial aspect of infrastructure development and peacekeeping strategies.

Moderator

The debate primarily delves into the interplay between technology and peace, with a focus on the contemporary role of technology in fostering peace across varying social contexts, as evidenced through initiatives such as the Global Peace Tech Hub. This initiative, championed for advocating the responsible use of technology, centres on instigating cross-sector dialogue and sharing enthusiasm for the possibilities that technology can offer in the attainment of peace.

The discourse also recognises the dual aspect of technology as both an agent of positive change and a potential risk. This duality was brought into the spotlight, with positive influences such as the enhancement of democracy and increased access to services being weighed against negative implications such as the spread of misinformation and the potential to catalyse societal polarisation. Going forward, the panel agreed unanimously that strategies must be formulated to mitigate these risks if technology’s full potential for promoting peace is to be realised.

UNESCO’s efforts at encouraging climate action with projects like AI for the Planet are highlighted as an epitome of how collaborations among diverse parties in the technological arena can result in meaningful outcomes, even when corporations are unable to commit substantial resources. The use of technology to counteract climate change receives a positive appraisal, especially when it supports initiatives that allow various actors to make a pronounced difference without substantial, long-term commitments.

Additionally, the dialogue underscored the significance of the partnership between UNESCO and the European Commission, embodied in the Social Media for Peace project. These efforts, seen as integral to initiatives such as the Internet for Trust guidelines, aim to regulate internet platforms not through content, but through process, asserting the importance of transparency and accountability in this space.

An essential reflection in the debate is the profound transformation peace technology has undergone. Innovative advancements enabling real-time analysis of human behaviour and interactions have led to a significant shift from coercive peace mechanisms towards more persuasive peace technology. This evolution has been lauded as a historic shift for the human species, opening the potential for fostering peace through persuasion rather than coercion.

The role of youth in propelling human rights and peace movements, despite the inherent risks associated with digital activism, is highlighted. The willingness of young people from countries including Thailand, Sudan, and the USA to remain at the forefront of these movements, despite potential threats like spyware attacks and concerns over digital permanence, is noted.

Also significant is the sense of mistrust within certain societal factions. Phrases such as ‘peace tech’ and ‘tech for good’ are seen as potentially contributing to a trust deficit, with neither the technology sectors nor large humanitarian agencies being considered entirely trustworthy when it comes to personal data.

Finally, with the array of perspectives on peacekeeping through technology, the shared consensus was the need for continuous dialogue within the tech community. Seen as crucial to dismantle existing silos and establish a common understanding of technology’s role in peace, this call for consistent multi-stakeholder dialogue served as a central theme throughout the discussion. The breadth of opinions within the peace tech community, from seeing technology as a tool for peace to viewing it as a new battleground, or even a threat to peace, further underlines the importance of these dialogues.

Speaker

The concept of peace transcends the domains of security, extending to incorporate values of freedom and equality. It constitutes a comprehensive vision for societies’ development that endeavours to eliminate all forms of violence – direct, structural, and cultural. However, this idealistic vision can face threats from inappropriate use of technology. Instances where technologies infringe upon human rights pose a significant issue for peace and security on a global scale. Regrettably, the notion of peace can be misappropriated as a brand for technologies that may not contribute to harmonious societies.

Concurrently, recognising the potential of technological advancements, the United Nations is diligently prioritising digital transformation and technology. The UN Secretary General has been vigorous in advocating for this priority, particularly within the realm of cyber security. A testament to this focused effort is the creation of the Office of the Technology Envoy, established to champion these changes on a grand scale. Indeed, there’s a burgeoning understanding of the need for the UN Security Council to incorporate the monitoring of digital and cyber roles into their mandates, signalling the significance placed on technology in global security guidelines.

Nevertheless, amid these advancements, concerns regarding the right to access the internet and freedom of expression persist as substantial challenges in the digital era. Access Now, a focused human rights organisation, primarily champions these digital rights. Incidents like the deployment of spyware during the Armenia-Azerbaijan conflict highlight how tech misuse can affect peace negotiators and exacerbate conflicts. Furthermore, the role of social media platforms in content governance can hold far-reaching implications during crisis times, a stern reminder that technology can be a double-edged sword. In extreme situations, internet shutdowns have inflicted harm on human rights and obstructed conflict resolution.

Indeed, human rights are a necessary condition for peace. Nonetheless, peace as a concept demands a more comprehensive understanding and acceptance, exceeding that of human rights alone. Peace is not exclusively about respect for human rights but also significantly involves the mechanisms and ethics by which political decisions are made.

The digitisation of society could conceivably impact our comprehension and notion of peace. Young individuals are poised to play a key role within this transformative process. Impressively, youth across the globe are already spearheading movements championing human rights and peace, frequently risking their personal safety in pursuit of these causes.

Nevertheless, a pronounced trust deficit exists towards tech sectors and large humanitarian agencies handling personal data. Passive monitoring via ubiquitous sensors is viewed as a threat, illuminating the public’s discomfort with potential breaches of privacy. This underlines the challenge of striking a balance between leveraging technological advances and safeguarding human rights and security. Thus, whilst digital advancement offers vast potential for societal development, it is crucial to remain cognisant of its inherent risks in order to maintain peace, justice, and strong institutions.

Session transcript

Moderator:
I see a few of my participants I don’t know if they are able to share the video or audio to talk during the session or interact because the moment I’ve seen only Mark Nelson was able to activate his video and I’m not sure if they technically can also activate their video or not. How are you in Tokyo? All good? Do you start or? Well I can start anyway this is gonna be a very of course informal session. It’s a networking session. How many of you have ever attended a networking session before? No? Good. So that’s what we’re gonna start sharing this experience together because I have no clue what’s a networking session is but I guess it’s gonna be a very informal moment and to discuss, share enthusiasm, interest around the tech and peace, whatever that means. Exactly something that we could I hope we’re gonna come out with some common ground on on this specific topic. So I’m Andrea Calderaro. I’m Associate Professor International Relations at Cardiff University also affiliated at the European University Institute with the Global Peace Tech Hub. Today we have a series of colleagues that participate in this session from remote. Michele Giovanardi in particular who is the coordinator of the Global Peace Tech Hub and the leader of this initiative. Then we do have online I saw a Mark Nelson as director of the Stanford Peace Innovation Lab. Evelyne Tauchnitz, Senior Researcher at the Institute of Social Ethics. Yes that is and Peter of course who’s in the lead of access now. So and then of course we have other colleagues here in the room and then of course I give you of course the opportunity to introduce yourself. Michele do you have anything else to add? 
Yes so first of all we have a few more online participants but I’m not sure they can activate the camera because of the structure of the online session so they didn’t have the permissions but they’re listening in and if they want to interact of course there’s a chat function and of course the sound I guess they can intervene and talk. As you anticipated the idea is to really having a moment for us we have been running this this networking session for the last three years at IGF and it’s always a space to share some of the ideas and to learn about each other projects and work around technology and peace. So maybe we cannot see the online audience from sorry the in-presence audience from online so it would be nice maybe to start with a round of introduction or I can introduce briefly what is the global peace tech hub and and maybe so that we can have a quick reaction also after that. So up to you if you want to do a first round of introductions and then maybe we can go more into the topic. Go ahead the first of all the audience anyway we are two more people in addition to what you see on the main stage so it’s gonna be a very short round of introduction. So go first with the global peace tech hub and then we’re gonna keep this of course as much informal as possible. Great okay so just like a few words of what is the global peace tech hub what is global peace tech. We also have Marielza Oliveira joining online so if you can please give her co-host rights so that she can also activate a video and participate that’s for the organizers. So here is I just have three slides so nothing much just to understand what is global peace tech. We define global peace tech as a field of analysis applied to all processes connecting local and global practices aimed at achieving social and political peace through the responsible use of frontier technologies. This is very kind of it sounds very official but it’s something that we can just start with as a framework. 
So basically what’s the basic idea of the global peace tech hub. We know that emerging technologies bear great opportunities for positive change but at the same time they have threats and risks that we need to mitigate. So the idea is that in the 21st century we’re battling between these two opposite forces and this is a common trend that we can see with different technologies and with different functions. We know how the hope of the internet to be this force for democracy turned into a force for online disinformation, polarization, violence. We know about the opportunities of digital identities and in terms of giving access to finance and to services to people and marginalized people but also about all the privacy and data ownership and cybersecurity issues that come with that. We know about the great potential of telepresence to build empathy and trust but also about the risk of hate speech and deep fakes. We know about the use of data and potential of early warning response systems but also all the issues related to how this data are managed and secured. So this can go on and on and the list is very long. So the question is this one. So how can we at the same time mitigate some of the risks related to these emerging technologies but at the same time investing in those initiatives that they’re using technology for peace and there are many projects that we mapped more than 170 projects that are using technology in different ways to enhance different functions of peace at the global level. So on the one side is about mitigating the risks with the mapping and sorry with the regulation, capacity building, education, good governance. On the other side is about mapping these peace tech projects and understanding they are assessing their impact on peace processes and also try to shift public and private investment in these peace tech initiatives that work in partnership that we can build between different stakeholders. Hence the topic of this networking session. 
So a multi-stakeholder approach to peace tech, to governing tech for peace and first we would like to know who is on the call, what you’re doing, what you think technology can contribute to peace and second after we get to know you and so we can know each other. We would like to know what you think of the topic of the day. So how can different actors come together, actors from academia, governments, public sector, think tanks, NGOs come together in this puzzle and create synergies to achieve this goal, these peace goals through the responsible use of technology. So I will give the floor back to you and maybe we start with the first round and then we see where the conversation goes and I really invite online participants somehow, chat, audio, video to interact as well and give us their take about this and introduce themselves as well. Super, okay thank you Michele for providing such a good overview about the variety of, I mean the perspective that you’re privileged in in looking at the relationship within tech and peace. Yes I will let like Evelyn maybe you can introduce yourself and provide also your insight on the topic of what is exactly your perspective on

Evelyne Tauchnitz:
tech and peace. Thank you very much Andrea. Yes my name is Evelyn Tauchnitz. I’m a senior researcher at the Institute of Social Ethics at the University of Lucerne, Switzerland and my research focuses on digital change, ethics, human rights and peace of course, what we’re talking about today. So like if we’re really talking about what is peace tech, I think that’s a bit the guiding question here today. We first have to consider also what do we understand by peace and that is a really challenging question by itself and it’s not the first time we’re discussing that but we had a conference last year also where we dedicated some thoughts on that and I don’t think we reached a conclusion. It’s a really difficult question but I can tell you like how I’m using this concept of peace in my own research. So first of all I think that peace really is a vision, like it can show us in which direction we would like to develop as well together as humanity but also like in what kind of societies we would want to be living and although there will never be like absolute peace in all corners of the world or even in any society, we still need to know a bit like what would it look like and there I find it really helpful first to look at this positive definition of peace meaning that it should not only mean the absence of direct violence but also of structural violence and cultural violence so there should be no injustices, there should be equality, poverty, access to health care, access to education, so trying to eliminate the causes that then often give rise to direct violences. 
So it means really that even if you have no direct violence there is still, usually in any society you would still have some forms of injustices, some forms of discrimination which leads me also to the third form of cultural violence which legitimizes the other forms of violence like for example if women are not allowed to go to school this is both a mix of structural ones when they cannot access the school physically but it’s also a form of cultural violence. Why is it not boys that cannot access the school? Why is it girls? So probably there’s never going to be any society free of this violence but so when we talk about peace tech I think it’s really important to keep in mind that it’s more than security only because your security is a very narrow definition of peace like for example if we think about surveillance technologies they might increase security but then again there is different problems like of ambivalency means our personal freedom might be reduced because that has collected about us and we know that or even if it’s in a conflict setting we might not meet certain people anymore like you would not have big assemblies of people because they would be afraid that one person being there also one contact might be suspicious and then I’m meeting that person so rather I should not meet too many people and if it’s a bit a big crowd so more likely it would be that somebody is there who doesn’t have a good record so to say. So it really influences also on democracy it influences on like freedom of expression it how we move even how we move like how do we behave in public squares and I think it’s really important to just look at this from an ethical perspective like when we look at ethics you might have good purposes for instance but still it could have like negative consequences as in the case of surveillance you want to increase security but at the same time it might impact on personal freedoms or civil and political rights. 
So I think what I’m doing in my own research we need some kind of point of reference when we talk about peace like what kind of peace do we want what kind of peace technologies do we want and I’m using their human rights as I really as a baseline so if you have technologies that violate human rights I find it problematic to to say there are peace tech for example because they’re increasing security but peace in my understanding also is really based on on these values of human dignity and freedom as well and human dignity and freedom there are two basic values that connect both peace and human rights and that’s not by coincidence of course but it’s historically linked because human rights were the universe the creation of human rights was adopted after the Second World War after this devastating experience and well I don’t know really if I should go on but that’s really a topic of discussion maybe that I would like to raise rather than me speaking but what do we understand by peace or peace tech concretely or like what kind of technologies should we should we gather under this term like which technology really merit to be called peace tech because sometimes I see a lot of like almost advertising or marketing efforts to like use this brand of peace tech but if you look at it more closely it’s maybe not so much peace peace peace as it’s as it’s supposed to be so yes that’s so far my input or something I would like to discuss with you thank you very much

Speaker:
fantastic thank you and when it comes of course human rights and access now has been at the forefront on variety of human rights battles and you Peter have been also involved in a series of processes in the UN context so you could respond on your yeah your perspective absolutely thank you yeah well I I feel like I want to network with both of you after this I think this isn’t getting at the point of the session and yeah I really appreciate your remarks for two reasons I think there’s that Martin Luther King quote that paraphrase it badly peace isn’t the absence of conflict or the absence of war but the presence of justice which I think yeah leads to accountability and then thank you also yeah for recognizing the the need for this tech to be respecting of human rights that’s certainly where we come from at access now we are a human rights organization that’s got our start during the Green Movement in Iran and have since expanded globally working at the intersection of human rights and new technologies but I think around 2018 we recognized that increasingly we were working in context characterized by conflict fragile and conflict deflected states and then the communities who were in the midst of those fleeing those or you know trying to work in the diaspora to end those conflicts and I recognize this and have built it into our work plans most directly on our on our international organizations team adding a senior humanitarian officer who brings experience from that sector but you know this is really just an acknowledgement that the digital rights that we’ve worked to protect are most at risk in conflict affected persons and communities and that I think from you know monitoring and and prediction to mitigation to resolution to accountability a lot of the digital tools are infused and becoming essential to each of those steps I think so I can mention a few programs that we have to make it more practical we were we have tracked internet shutdowns intentional disruptions 
of access to the Internet, since the Egypt shutdown in 2011, and have brought this narrative, this terminology, and this charge to end shutdowns through our #KeepItOn campaign. We brought that to the UN and have seen a lot of positive resonance in a number of UN bodies and among states that recognize that internet shutdowns go beyond the pale, that increasingly they are linked to conflict events and times of unrest, and that they levy a number of harmful effects on human rights and, increasingly we would say, on the resolution of those conflicts themselves. So that KeepItOn campaign continues alongside our work on content governance, which is in some ways the flip side. I think this came out during much of the conflict in Ethiopia: how overrun a lot of social media platforms in particular were with incitement content, and the unresponsiveness, the perceived indifference, or even ignorance of the tech companies in dealing with what was happening on their platforms, as in Myanmar as well. That led us to work with partners to create a declaration on content governance in times of crisis, and we're following that up with a full report. So whether it's shutting down the Internet or allowing the proliferation of harmful content, there are a number of ways in which freedom of expression is directly at the heart of a lot of these responses and interventions. More recently, especially through our digital security helpline, which is recognized as a tool that DPPA and others use to refer people, and which provides free-of-charge technical guidance and advice to civil society (defined pretty broadly), 24 hours a day in 12 languages, we are increasingly getting reports of really invasive targeted surveillance and spyware. I mention this particularly because a recent report we wrote described the use of spyware in the midst of the Armenia-Azerbaijan conflict
which has flared up again recently. We were actually able to detect infections on the devices of the actual negotiators of the peace on the Armenian side; the ombudsperson who was directly involved in the talks had a phone that was completely infected throughout. So spyware is now a tool of war, and of efforts to monitor peace negotiations. And finally, we are now mapping all of the tech that we can find being used in humanitarian situations, and we'll definitely look at the mapping project on peace tech that was presented just now. With human rights, we're taking a human rights lens: where's the human rights due diligence? What are the procurement processes? How do we determine whether this tech has been analyzed openly? Is it human rights respecting, as you said? So that's the kind of work we do more directly in pursuit of our mission to help communities and people at risk, and then we try to bring lessons from that frontline work to the United Nations through the OEWG cybersecurity process. I'll stop talking soon, but I'm happy to talk more on that later, and I'm joined by my colleague, Carolyn Tackett, who runs our campaigns and rapid response work. Thanks. Super. Thank you. I actually do have a two-finger question, and sorry, I know that we should keep this roundtable going. Since you are following all the UN processes, and since the UN is supposed to be the organization ensuring peace: do you think the UN is moving in the right direction? Of course, the UN Open-Ended Working Group is a process that I'm following as well, and I'm not sure whether we can feel satisfied about that. Well, yeah. I mean, it's undeniable that, from the top down, the Secretary-General has put digital transformation, tech, and cyber at the top of his agenda.
You'll see that in the creation of the Office of the Technology Envoy, and we, for our part, have been pushing the UN Security Council to integrate monitoring of the role of digital and cyber into its mandate and into the situations that come before it. We haven't seen much evidence; well, we don't know a whole lot about what happens at the Security Council, because it is so opaque by design, especially to civil society, but we've not seen really dedicated talks there in ways that could be helpful. Of course, they're going to disagree over Article 51 and the right of self-defense and offensive and defensive measures in cyberspace, but how about just understanding the ongoing situations the council looks at, on Sudan, on Yemen, on Colombia? How is tech being integrated into and influencing those conflicts, and what role could it play in bringing about some resolutions? So we have a focused program on the Security Council now, and we invite others to monitor that body, as well as the First Committee's work. And then there are the cybersecurity and cybercrime processes continuing right now. The Cybercrime Treaty looks like it's going to come imminently, and it's been really interesting to see how they scope it, how wide a lens they take on what constitutes a cybercrime. It's certainly going to be relevant for people involved in peace tech. And I think the Summit of the Future finally does include the New Agenda for Peace, which is supposed to be where they'll speak to cybersecurity, complementing the Global Digital Compact. Those are both meant to be signed and delivered in September 2024. So there are a couple of things to look at. Super.

Moderator:
So, going back online: Michele, you might have a better overview than I do about who's online. Yeah, I just wanted to say that, since we have just one hour and we are partway through, maybe we first check who's in the meeting, both in the room and online, with a really quick round of introductions, and then we move on to answering questions and go deeper into the discussion. I'm just afraid that if we go one by one, we will not get to the end, and I would also like to get a sense of who is in the online audience. So, if you agree, maybe we do just a couple of minutes each, a round to see who's around, and then we go back to the more in-depth discussion. What do you think? Hello? Hello? Yeah. So, yes, please, Mark. Let's also interact with the online audience. Mark Nelson, director of the Stanford Peace Innovation Lab, might have some insight on this. Michele, did you want me to speak while people are doing little intros in text, or should I wait? Let's see if we can do just who you are and what you do, in one or two minutes, and do this round first; when we're done, we go back to you and Marielza, and we get some more in-depth perspectives. Yeah. Okay. Happy to.

Mark Nelson:
So very quickly: my name is Mark Nelson. With my colleague and partner, Margarita Quihuis, I co-direct the Peace Innovation Lab at Stanford and also our independent institute in The Hague, the Peace Innovation Institute, where we are working to commercialize our research in peace tech. I focus on the measurability that is now possible because of technological advances, so peace metrics that are possible in very high resolution in real time, and on how that connects the emerging space of peace technology to capital markets and the potential to create peace finance as an active investment sector; we're working very hard on that lab part. So that's the quick overview. Thank you. So I see Marielza Oliveira online. Yes. Hello, everyone.

Marielza Oliveira:
Marielza Oliveira from UNESCO; I'm the Director for Digital Inclusion, Policies and Transformation, which is a division within the Communications and Information sector of UNESCO. Our task in the CI sector (that's what we call the Communications and Information sector for short) is to defend two basic human rights, freedom of expression and access to information, and to leverage those for the mission of UNESCO, which literally has a mandate of building peace in the minds of men and women, and a mandate to enable the free flow of ideas by word and image, through every type of media. So for us, the digital ecosystem is incredibly important, and we've been making different types of interventions in it for quite a long time. On the freedom of expression side, for example, we recently had the Internet for Trust conference, which looks at the regulation of internet platforms in ways that respect human rights. On the access to information side, we are working on building the capacities of different types of stakeholders so that they can intervene and participate properly in this process, because there's quite a lot of digital exclusion and inequality happening, giving voice to specific sides only, and we want to see a more inclusive internet in that sense. So, in short, I'll stop here, thanks. Thank you so much. And I see Moses Owainy online; do you want to briefly introduce yourself? Yeah, thank you very much.

Moses Owainy:
I am deep in an area where the internet connection is really poor, so I may not be able to turn my video on, but my name is Moses Owainy, and I am the Chief Executive Officer of the Centre for Multilateral Affairs in Uganda. We are a platform that seeks to advance the perspectives of African countries in multi-stakeholder conversations and discussions like this one, and that's why I said in the text that I'm really glad to be in this discussion: because the overall concept of global peace tech, in my view, cannot be fully complete without the perspectives of the many stakeholders that some of the speakers have already alluded to. For example, if you look at the realities in parts of sub-Saharan Africa, where I come from, but also in the Pacific and the Caribbean, the future of peace around technology and cyberspace is shaped by their realities too. Today, people are talking about cyber or technology for peaceful means, but what does that mean for a country like Uganda, Kenya, or Nigeria? That meaning might take a different form than it does in more developed or more advanced economies. So I'm glad that this conversation is here, but I'm imploring the speakers and the organizers that future conversations around this should also draw from all of the multi-stakeholder perspectives and actors that you have clearly talked about. That's my first point. My second contribution is really just a question. You are all university professors, and you know this: a lot of critiques come up around this, especially around the term global peace, because we are in a world of international relations where states relate to each other bilaterally and multilaterally. So, and I stand to be corrected, is it more accurate to say we are in a global world or in an international world?
So for me, the term global peace is, with all due respect, a little bit ambiguous, but if you have something like international peace in tech, then it carries more meaning; it is more relatable, because it implies that many different states are collaborating and working together with all these multi-stakeholder groups to achieve the kind of peace, the kind of prosperity in cyberspace and in the tech industry, that everyone desires. So that's my contribution, and thank you so much for this platform and the opportunity. Thank you. Thank you so much, Moses, for these very insightful comments, which also connect to what Marielza was mentioning. We will move forward with a round of introductions if there are other participants who want to contribute.

Moderator:
I see Youssef Benzekri; I don't know if he wants to introduce himself. I see Wenzhou Li, I don't know if... Okay, I see Yuxiao Li. All right, and Omar Farouk. Okay, you will have a chance to introduce yourselves in the chat as well. I also see Teo Nanezovic, who is also part of the Global Peace Tech Hub online. She's the rapporteur, so after the session she will put together some of the reflections we are sharing now. So I guess let's go back to the room and... Okay, there's somebody who wants to introduce themselves. Yuxiao Li, if you want to introduce yourself, please feel free to do so. Yeah, no problem. Okay, I have no question. Thank you. Okay, thank you. Okay, let's move forward. So I would like to go back to whoever is left in the room, or to Mark, if you want to expand a little bit on the topic of this session. And also, of course, Marielza. Do you see any opportunities in a multi-stakeholder approach in this space? There are many initiatives trying to leverage technology for peace in different aspects and dimensions. There is actually a non-profit sector growing in this space, with many different organizations and networks. What is missing, maybe, is to connect these dots: to connect this non-profit sector with the tech companies, which have the capacity, the data, and the capability to develop the technology itself, and with governments. So do you see opportunities in developing these multi-stakeholder approaches and networks? Maybe we'll start with Marielza and then go to Mark. It's great that you asked this question, because this is one of the approaches that

Marielza Oliveira:
we're taking at UNESCO: to bring together different groups, particularly the tech community, the tech companies, and the other stakeholder groups. We have one good initiative on that, the AI for the Planet initiative, in which we are leveraging this technology to combat climate change. So this is one of the things we see promise in, although quite a few of these hopeful approaches end up in failure. This is one of the things we have to be very frank about: there have been quite a lot of attempts to do exactly that. For the companies, the question is, what's in it for me? At the end of the day, they'll be looking at what the gain is. So bringing them to the table and having them commit to specific initiatives, support them, and dedicate time, money, and resources is quite difficult. You have to be selective, I think, in the types of initiatives that you pick, choosing those where the priorities are high enough and where the companies can make a difference without necessarily a long-term, large commitment. One very successful example is the United Nations Technology Innovation Lab. I don't know whether you've heard about it, but it uses technology in innovative ways to enhance peace-building processes, such as 3D augmented reality to show the delegates in New York the conditions that displaced people were facing in a particular war zone, so they could immerse themselves in it and then have an informed discussion about it. Actually seeing the environment is a completely different thing from just hearing the statistics about it; the increase in empathy, in terms of understanding what needs to be done, is tremendous. So there are quite a few good potential things.
Another way that UNESCO is working on this: we have a partnership with the European Commission on a project called Social Media for Peace, in which we track and map the different types of conflicts happening online in particular pilot countries, and bring together a multi-stakeholder group that then devises and deploys countermeasure tactics, essentially to defuse the polarization and bring back a civil dialogue online. It's been quite interesting; we've had a lot of interesting results, and hopefully we can draw some mechanisms for scaling it up out of the pilots. But of course, the big thing we're looking at is the Internet for Trust conference. We're about to release the guidelines we have been developing for an entire year, together with quite a large group of stakeholders from all sectors of society that provided inputs, on how we actually regulate internet platforms for civil discourse without affecting freedom of expression and access to information, because quite a lot of the measures in place right now sacrifice one right to enhance another. That's one of the things we are not willing to do: to sacrifice freedom of expression and access to information. For example, quite a few countries are deploying measures to block certain apps or features in a not necessarily well-thought-out process. There are some examples of those, and the consequences are tremendous: for example, limiting the voices of the journalists who are reporting and keeping us informed about these issues. The activists and so on lose their voice too. So we can't be indiscriminate about that.
So what the Internet for Trust guidelines propose is that instead of regulating content, we regulate process: that we enable and build together a real governance mechanism that fosters transparency and accountability, because those are the things that are missing on the internet. Accountability is something that is not there, so this is one of the key things we need to target. Accountability is what truly makes people behave in a responsible way. If you don't have it, there is no responsibility, and people act in ways that are counterproductive to societal development. So thanks. Thank you very much.

Moderator:
Andrea, if you agree, I will also move to Mark online, and then if you have other feedback from the room, just feel free to interrupt us. Mark, if you want to share your comment on the topic of this session, also connected to what Marielza was saying about building trust: this is also related to building some measures of impact, some data. And I know your effort is also in this direction, on the measurement side. So how do we measure these impacts? How do we find parameters to describe this space as well?

Mark Nelson:
Thank you, Marielza. And thank you also to Moses for the excellent question; I think these things all tie together. There's a lot of general interest in peace tech without, perhaps, a detailed understanding of what it is, what components make it possible, and why we as a human species are standing at this incredible opportunity right now. What changed? What's different? Because it's not the wheel; it's not the lever. You could make a general case that our ability as a species to construct tools of any kind is interesting and useful and is, in a way, peace tech if we use it right, and that's kind of true. But the thing that is really unique in the last 20 years is the vast proliferation of sensors, sensors that can detect all sorts of changes in the environment in real time. What really makes this powerful is that when you look at the subset of sensors that can detect human behavior, and then again at the subset that can detect human social behavior, how we actually interact with each other, you start realizing that, for the first time ever, we can measure not only what kind of interactions we have between individual people in real time, but also, for the first time ever, the impacts of those interactions over time. So we can very quickly test theories: I thought that if I did this, it would be good for that person, but was it really good for that person or not? Suddenly we can answer those questions, in real time and with huge sample sizes. What this means is that we can structure the data of human interactions to see where we are doing really well, what the best-practice behaviors are that humans can do for each other, how we might design even better behavior sequences for each other, and how we could customize and tailor them for each of our individual situations.
That's where the huge opportunity of peace tech really lies. It allows us to go back to one of the original dreams of Norbert Wiener back in the 1950s, when he was setting up the foundations of information technology and of cybernetics: a closed loop of technology, with a sensor that can detect something happening in the environment that we care about, communications technology that moves that data to processors that do the best sensemaking of it, and the processor connecting to an actuator that can then respond to the environment, so that the sensor can detect: did the response cause the change I wanted? Did it move the needle at all? And if it did move the needle, did it move it in the right direction? That's where the opportunity of peace tech is now incredibly powerful, because when you close that loop, you go from a linear kind of technology to a closed-loop technology, and that closed loop changes things into fundamentally persuasive technologies. I want to underscore why this is so historic. It allows us to move from, quote, peace technology like the Colt Peacemaker from the 1800s, if you all know your Western movies really well: it's a gun, and they called it a peacemaker, and it was built on a Hobbesian theory of the Leviathan, that whoever has the power can enforce peace. Peace, up until this change in technology, has been coercive peace, based on a Hobbesian Leviathan theory. What has changed now is that, for the first time ever, we can build technology that can create persuasive peace instead of coercive peace. That's a huge shift for our species. I'll just pause there and let everybody respond to that.
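Mark's sense-process-actuate cycle can be sketched as a tiny control loop. The following is a purely illustrative toy, not a description of any real peace tech system: the scalar metric, the target, and the simple proportional response are all invented for the example.

```python
from dataclasses import dataclass, field

@dataclass
class ClosedLoop:
    """Wiener-style loop: sense -> process -> actuate -> sense again."""
    environment: float  # the signal we care about (a made-up metric)
    target: float       # where we want the needle to be
    history: list = field(default_factory=list)

    def sense(self) -> float:
        # A sensor detects the current state of the environment.
        return self.environment

    def process(self, reading: float) -> float:
        # The processor does its sensemaking; here, a simple
        # proportional controller stands in for that step.
        return 0.5 * (self.target - reading)

    def actuate(self, intervention: float) -> None:
        # The actuator responds to the environment.
        self.environment += intervention

    def step(self) -> float:
        # Close the loop: after acting, sense again and record
        # whether the needle moved, and in which direction.
        before = self.sense()
        self.actuate(self.process(before))
        after = self.sense()
        self.history.append(after - before)
        return after

loop = ClosedLoop(environment=0.0, target=1.0)
for _ in range(5):
    loop.step()
# After five iterations the metric has moved most of the way toward the
# target (environment == 0.96875), and every recorded delta is positive,
# i.e. each intervention moved the needle in the intended direction.
```

The point of the sketch is the structure, not the numbers: because each new sensor reading feeds back into the next decision, the system can verify the effect of every intervention instead of acting open-loop, which is the shift from linear to closed-loop technology described above.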

Moderator:
Thank you, Mark. Over to you, Andrea. We have eight minutes left, so I just want to remind everybody that maybe we should collect some key takeaways before the end of the session. Yeah, there are people queuing, so I'll let them speak. Okay. Please introduce yourself.

Audience:
Thank you so much. Hello, everyone. My name is Manjia. I'm from China, and I am a youth representative of a youth organization called the Peacebuilding Project, run under UNDPPA. So my question to all panelists is: how do you see young people's role in Tech for Peace, and how can young people be engaged in this initiative or this wider agenda? Do you have any best practices to share, or any current or future plans for this? Thank you so much.

Moderator:
Thank you. Would anybody else like to add anything to the discussion from the floor? No? No. Sure. We're networking. We can chat.

Audience:
Hi, everyone. My name's Carolyn. I'm the director of campaigns and rapid response with Access Now. I think your question about the definition of peace in the first place is a really important one. From our frame, we really think about digital rights as rooted in human rights, and that applies in all contexts; it has a very clear legal structure for understanding what the rules of the game are. So I'd be curious to understand from the Peace Tech folks what the added value of this specific Peace Tech lens is. It seems to cover a lot of ground, from human rights to social justice to environmental justice, and is perhaps a bit of an umbrella term, but understanding what we're trying to get at by using that lens would be helpful. The other thing on my mind from this conversation is how this Peace Tech movement is thinking about the compulsion to move in the direction of techno-solutionism. I'm not saying that that's what's being presented here, but there are many conversations in this space, here at the IGF, across the UN system, and elsewhere, where the instinct is very strong to reach for a particular digital tool to solve a very complex human problem. So could you all share a little bit more about how you're thinking about integrating these technical conversations back into the core work of peace building on all of these many different fronts, and how those things are really feeding back into each other? Wonderful. Yes. Maybe you can reply for the Global Peace Tech Hub,

Moderator:
and someone can reply from the youth perspective. Yeah, very briefly, just 30 seconds each, please, so we can wrap up the session before the sushi reception we are having here. Maybe I'll start with the last question, and then anybody who wants to can jump in with the youth perspective. So, for the Global Peace Tech Hub: first of all, the Peace Tech community is a growing space made up of different actors, and we're not representing them, of course. The Global Peace Tech Hub is an initiative based at the European University Institute in Florence, specifically at the School of Transnational Governance, which is a school of governance for current and future leaders on issues that go beyond the state. That's the context. So our role in this is as a facilitator of different actors, alongside, of course, some independent analysis of what is going on in this space. We cannot speak for the others, but as for where we position ourselves, we see the added value precisely in the connection of these different spaces that don't necessarily talk to each other. The Peace Tech idea was born more in the peace-building sector, where NGOs especially would apply digital tools to peace-building initiatives they were doing already. The way we perceive Peace Tech is a bit different: it starts more from a tech policy perspective. As I was explaining at the beginning, it is equally about how you assess impact and invest in impactful Peace Tech projects and, at the same time, how you govern and mitigate the risks through good governance, regulation, and capacity building. This challenge is very broad, as you say, and still under definition, but at the same time we see the value in the interconnectedness of the different challenges. We were talking about trust before, for instance: how is that separate from any discussion of digital identities? How is it not linked to some applications of blockchain?
How is it not linked with the discussion about data and who owns the data? How is it not connected with internet infrastructure, and with how we provide secure access to the internet? How is it not connected to cybersecurity? So you see the interconnectedness of all these different challenges, and I think that's the kind of added value we were trying to bring. On the concrete side, we also want to foster dialogue between these different sectors, which discuss things in their own bubbles and don't necessarily talk to each other. I'm talking about the tech sector, which didn't have discussions about pro-social tech or peace tech at the top of its agenda; at the same time, NGOs are doing a lot on the ground, so they know what is going on there, but this doesn't always elevate to the governance level; and governments should also be involved. So we're trying to create these connections and this triangulation. As for the definition of peace, we adopt the positive peace paradigm, a little bit like what underpins the Global Peace Index: the positive peace pillars, which are about more than just the absence of violence. They're about to kick us out of the room, they really will, so we need to move on. I think Evelyne had a brief reaction to the point made.

Evelyne Tauchnitz:
Yeah, thank you very much for this question; I just want to take that up. What is the added value of peace as compared to human rights? I think it's a really important question, and I've been asking myself that as well. As I see it, human rights are a necessary condition for peace, but they're not a sufficient condition, meaning peace requires respect for human rights, but it is also something more. As I said, I see it more as a vision. With human rights, you also have a bit of a problem in that a government can interpret them as it wishes, saying, okay, you get this right, but we don't have the resources to fulfill that other right, and so on. Peace, by contrast, is a broader concept, and in a way it's almost easier to start a public debate on it: everybody has a perception of peace, whereas human rights are really focused on rights and are legally enforceable, which is an advantage, but much narrower. And the human rights perspective does not really incorporate the process, for example, how you reach certain political decisions. So this, I think, is the important point: human rights are necessary, but not sufficient. If I can just make a brief comment on the youth question, I think it's a really important one as well. I think it's also important to ask how digital change is changing our perception of peace. What we have been thinking of as peace might change in the future, precisely due to the digital transformation; that conception of peace changes.
It's not only that we can use digital technologies for constructing peace; it's also that, through that process, we might come to a different understanding of peace. And their use is really important, I think. Connected to that, it's funny that you said you're from China: I just had a conversation with a colleague of mine who did research in China, and he told me, for example, that a lot of the social scoring systems have to do with a deficit of trust in present-day China as compared to the China of earlier generations, and that, in order to reconstruct trust, social scoring systems are considered useful. But of course, if we talk about peace, it's this broad condition. Do we want trust built through digital tools, for example, or how do we want to build this trust in society? That would be a really vital question, I think. Yeah, really quickly, so thanks.

Speaker:
I think, on a few of the questions: for youth, first of all, youth are leading movements for human rights and for peace around the world, and they are putting themselves on the line, more and more so in a digitized era where everything they do, their pictures holding protest signs out in marches, will be indelible. They've probably already been scraped for facial recognition, and they're getting attacked by spyware; we see that on the phones of youths. So from Thailand to Sudan to the USA, youth are doing the work, and it's on us to help ensure this doesn't lead to reprisals or permanent problems for them. Just to respond to Mark: I think we were talking about euphemisms here, peace tech, tech for good, digital public goods, and as advocates we do chafe at this, because there is a bit of a trust deficit. We don't see either the tech sector or, frankly, large humanitarian agencies as deserving of the trust of our personal data, of personal information about us, especially for those most at risk. So when I hear talk of the ubiquity of sensors: sensors implies a kind of passive monitoring that is so contrary to the vortexes creating these traps for us to deliver our sensitive personal information into black boxes we have no control over, which characterizes most modern tech, that I think sensors is a dangerous word to use in that context, unless you're talking about tools that are completely divorced from the modern digital economy. Thanks.

Moderator:
Fantastic. This was, of course, a networking session, which means that at least the people on this stage are now pretty well networked. But for those of you who would like to connect to these conversations, we will of course have a follow-up, so please pass your contacts on to me and we will keep you in the loop. The impression is that, even within the tech-for-peace community, there is still no common ground, no agreement on what exactly tech and peace actually means. For some, tech is approached as a tool to achieve peace. For others, it is tech as a new battleground in itself, hence all the discussion about cybersecurity and so on. And for others still, it is tech as a threat to peace, for example the discussion on lethal autonomous weapon systems (LAWS) and how AI is increasingly embedded in various arms. So yes, a lot of conversation still needs to happen within the tech community in order to bridge communities that work in silos, and hopefully we will have additional sessions to try to break these silos. That’s all from here. I don’t know, Michele, if you want a last 10-second word, not even a sentence, just a word. Yeah, perfect. Wrap up, enjoy the sushi dinner. It was a great networking session; let’s keep the contacts and the conversation going. Thank you. Fantastic. Thank you all, also the online contributors to the discussion. Bye all. Bye.

Speaker              Speech speed (words/min)   Speech length (words)   Speech time (secs)
Audience             188                        436                     139
Evelyne Tauchnitz    179                        1579                    529
Marielza Oliveira    131                        1068                    487
Mark Nelson          166                        902                     325
Moderator            158                        3134                    1187
Moses Owainy         154                        519                     202
Speaker              149                        1782                    717