Benefits and challenges of the immersive realities | IGF 2023 Open Forum #20

9 Oct 2023 06:15h - 07:15h UTC


Disclaimer: It should be noted that the reporting, analysis and chatbot answers are generated automatically by DiploGPT from the official UN transcripts and, in case of just-in-time reporting, the audiovisual recordings on UN Web TV. The accuracy and completeness of the resources and results can therefore not be guaranteed.

Full session report

Patrick Penninckx

The Council of Europe is actively examining the impact of new technological developments, such as AI and immersive realities, on human rights, the rule of law, and democracy. They recognize the importance of ensuring that these advancements uphold these fundamental values. To achieve this, the Council is partnering with IEEE to study the metaverse and its potential impact on human rights.

To guide the development of the metaverse, the Council of Europe emphasizes the need for clear benchmarks that uphold human rights principles. They also highlight the importance of transparency, accountability, and the protection of digital rights within this emerging technology, and they stress the significance of involving multiple stakeholders, including the technical community, civil society, businesses, and academics, in decision-making processes regarding the metaverse.

Regarding immersive realities, concerns arise about the ethical decision-making process within private businesses. The Council of Europe acknowledges the risks posed by allowing private businesses to solely determine the development of immersive technologies, and calls for a more inclusive approach involving various stakeholders.

The Council also addresses the implications of immersive realities on privacy, with the collection of new forms of data like biometric and psychographic information. They highlight the potential for issues such as misinformation, disinformation, and freedom of expression. They also emphasize the need for inclusive access to immersive realities, particularly in light of the digital divide exposed by the COVID-19 pandemic.

In terms of governance principles, the Council of Europe has worked on data protection, cybercrime, and artificial intelligence. They are currently identifying ethical principles and existing legislation relevant to the metaverse, as well as addressing any gaps that need to be filled. They also express concerns about the influence of technology on human thought processes and freedom of conscience, stressing the need for careful consideration of these aspects.

In conclusion, the Council of Europe’s work on the impact of new technological developments on human rights, the rule of law, and democracy reflects their commitment to ensuring that these advancements align with fundamental values. Their partnership with IEEE to study the metaverse is a significant step in this direction. The Council emphasizes transparency, accountability, digital rights protection, and multi-stakeholder involvement. They are actively addressing privacy concerns, combating misinformation, and promoting inclusive access to immersive technologies, all while upholding human rights and societal values.

Audience

During the discussion, the speakers expressed concerns about the potential access to comprehensive biometric details in the virtual realms. Users’ immersion into these realms could enable the collection of biometric data such as eye tracking, brain activity, and heart rate. Nina Jane Patel specifically raised concerns about this potential breach of privacy and advocated for the need for regulation and governance on such intimate data in the metaverse. There is a perceived risk of individuals’ biometric data being misused in this virtual environment, highlighting the importance of safeguarding privacy and ensuring data protection.

Another concern raised during the discussion was the impact of immersive technologies on privacy, freedom of conscience, and psychophysical integrity. The speaker from Poland had different considerations regarding privacy and freedom of conscience in the face of these technologies. It was acknowledged that there are technical challenges involved in maintaining the psychophysical integrity of individuals and protecting their freedom of conscience within immersive environments. The speaker’s suggestion was to focus on developing technical solutions to handle these issues.

Content moderation in the metaverse was also a topic of concern. The Clinical Executive Director of the UCLA Institute for Technology Law and Policy highlighted the lack of effective tools for moderating content at scale in these new technologies. The current standards that exist for traditional social media platforms cannot be effectively followed in the metaverse. This raises questions about maintaining safety and regulating content in this evolving virtual space.

Furthermore, it was noted that the impacts of the metaverse will vary based on socioeconomic and geographical disparities. Steve Fosley from UNICEF pointed out that the cost of metaverse technology, such as VR headsets, could be prohibitive for some individuals. Not everyone will have the same quality of access to these technologies, and some may interact with artificial intelligence (AI) and the metaverse in less immersive and sophisticated ways. This highlights the potential for increased inequalities based on access and resources.

Overall, the discussion highlighted concerns about the access and misuse of biometric data, the need for governance and regulation in the metaverse, the impact of immersive technologies on privacy and freedom of conscience, the lack of effective content moderation tools, and the potential for disparities in the metaverse based on socioeconomic and geographical factors. The analysis provides valuable insights into the challenges and considerations surrounding the development and implementation of these emerging technologies.

Irene Kitsara

The increasing use of virtual realms has opened up new possibilities for accessing biometric data, including eye tracking, brain activity, and heart rate. This wealth of information has necessitated a rethink of privacy in response to this emerging technology. Experts have recognized the need to address the potential implications and consequences of such data collection.
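
To make concrete what such signals could look like in practice, the sketch below models a hypothetical telemetry record that an immersive headset might capture. The class, field names and sampling assumptions are illustrative only and do not describe any particular device or API.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class ImmersiveTelemetryFrame:
    """Hypothetical per-frame record an XR headset could capture (illustrative only)."""
    timestamp_ms: int                 # capture time in milliseconds
    gaze_target: Tuple[float, float]  # normalized eye-tracking coordinates on the display
    pupil_diameter_mm: float          # pupil size, often correlated with cognitive load
    heart_rate_bpm: float             # heart rate from an optical or chest-strap sensor
    eeg_band_power: List[float] = field(default_factory=list)  # coarse brain-activity features

# A few seconds of such frames, sampled tens of times per second, already amount to
# a rich biometric and psychographic profile of the wearer.
session = [ImmersiveTelemetryFrame(0, (0.42, 0.57), 3.1, 72.0, [0.6, 0.2, 0.1])]
print(f"{len(session)} frame(s) captured")
```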

One suggested solution is the introduction of “neural rights.” In fact, Chile has already incorporated neural rights into its constitution, demonstrating a growing recognition of the need to protect individuals’ rights and data in the context of advancing virtual realms.

Not only do individuals directly involved in virtual experiences require protection, but the concept of bystander privacy is also a concern. Bystander privacy refers to the privacy of those who may be indirectly captured or impacted by data collection, such as other individuals in the same room as a virtual reality user. Addressing this issue is crucial to ensure the protection and respect of personal privacy in all aspects of virtual realm usage.

When it comes to data governance, experts are divided on the best approach. Some propose self-regulation principles, where individuals, organizations, and industries voluntarily adhere to established guidelines and standards. Others suggest the reinterpretation of existing laws to adapt to the challenges posed by virtual realms. Lastly, the introduction of new laws is also considered a potential avenue for regulating biometric data and ensuring ethical practices.

In conclusion, the growing immersion into virtual realms and the accessibility of biometric data have raised important discussions regarding privacy and data governance. The concept of neural rights has emerged as a potential solution, and bystander privacy is also of significant concern. The best path for data governance remains a topic of debate, with options ranging from self-regulation to the introduction of new legislation.

Adam Ingle

The metaverse and immersive technology have the potential to revolutionise connections among children. Research conducted with UNICEF suggests that social connection plays a vital role in child well-being online, and the metaverse has the capability to enhance this through its connectivity and personalisation features. Avatars and identity in the metaverse enable children to establish unique connections and interact with others in ways that were previously unimaginable. This incredible connectivity has the power to bridge distance and cultural barriers, fostering a global community of children.

Furthermore, the metaverse and digital platforms like Minecraft, Roblox, and Fortnite provide children with an avenue to express and enhance their creativity. These platforms allow children to build imaginative worlds and engage with various forms of artistic expression. Improved technology, interconnectivity, and layered services within the metaverse amplify the creative potential for children, allowing them to develop their creative skills and explore their unique talents.

In addition to fostering social connections and creativity, the metaverse empowers children by enabling them to build their online identity. A strong sense of identity is fundamental to a child’s personal development, and the metaverse provides a digital space for children to shape and express their identity. By creating and managing their online presence, children can gain a sense of confidence, autonomy, and empowerment.

However, it is important to implement the metaverse in a responsible and considered manner, particularly when it comes to children. The potential risks and harms associated with the metaverse necessitate the establishment of high safety standards and responsible design. A collective approach by all stakeholders is essential to address the interconnected and interoperable nature of the metaverse. By ensuring robust safety measures and responsible design, a kid-friendly ecosystem can be created within the metaverse, safeguarding the well-being and protection of children.

Regulation and legislation are key aspects of addressing the challenges and issues in the metaverse. The development of regulatory frameworks and the resolution of existing problems from Web 2.0 platforms are crucial to ensuring a safe and secure metaverse environment. By learning from the experiences and responses to Web 2.0, it is possible to establish effective measures that protect children’s rights and well-being in the metaverse.

Furthermore, it is important to observe and evaluate the evolution of current Web 2.0 regulations and cultural responses. This ongoing assessment will provide valuable insights and guidance in handling the challenges and implications of the metaverse. By learning from the past, we can adapt and develop appropriate strategies and policies to shape a responsible and inclusive metaverse for future generations.

Lego, a prominent advocate for child safety, is committed to creating kid-friendly environments in and beyond the metaverse. Lego emphasises the importance of high safety standards and aims to establish a truly immersive ecosystem that prioritises children’s well-being and protection. Their dedication acts as an example and encourages others to join in implementing stringent safety measures and creating a child-friendly metaverse.

In conclusion, the metaverse and immersive technology have the potential to revolutionise connections among children, foster creativity, and empower them. However, responsible and considered implementation is crucial to mitigate potential risks and ensure the well-being of children. Regulation, safety standards, and observing the evolution of Web 2.0 regulations are vital aspects in handling the challenges of the metaverse. By establishing a collaborative and proactive approach, a safe and inclusive environment can be created, where children can explore, learn, and connect in the metaverse.

Melodena Stephens

The Metaverse, with a potential market size of up to 13 trillion USD, is undergoing rapid adoption in various sectors. Governments, educational institutions, and retail businesses are among those embracing this concept. Cities and countries are implementing digital twin strategies, while industries like manufacturing are creating digital twins for their operations. Education and healthcare sectors are also driving the adoption of Metaverse technologies. However, concerns about employment, behavioural addiction, environmental impact, cultural representation, and the need for effective governance have been raised. Collaboration, transparency, and careful consideration of social and ethical implications are crucial in harnessing the full potential of the Metaverse while mitigating risks.

Hugh

The concept of the metaverse, which was first introduced by Neal Stephenson in a sci-fi novel three decades ago, refers to a digital universe that could exist either alongside or as an extension of our current reality. It has garnered significant interest in the field of digital technology and is seen as the next phase of digital transformation.

Artificial intelligence (AI) plays a crucial role in the development of the metaverse, along with other technologies such as extended senses and actions (XR or spatial computing), persistent virtual worlds (persistent computing), and digital finance and economy (consensus computing). These core technologies, combined with supporting technologies like computation, storage, communications, networking, data, knowledge, and intelligence, are necessary components for creating the metaverse.
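
As a rough, purely illustrative way to visualise the layered landscape described in this paragraph, the sketch below groups the named core and supporting technologies as plain data; the class and field names are invented for this example and are not drawn from the report.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class MetaverseStack:
    """Illustrative grouping of the technology landscape described above."""
    core: List[str]        # core technologies named in the report summary
    supporting: List[str]  # supporting technologies named in the report summary

stack = MetaverseStack(
    core=[
        "extended senses and actions (XR / spatial computing)",
        "persistent virtual worlds (persistent computing)",
        "digital finance and economy (consensus computing)",
    ],
    supporting=[
        "computation", "storage", "communications", "networking",
        "data", "knowledge", "intelligence",
    ],
)
print(f"{len(stack.core)} core and {len(stack.supporting)} supporting technology areas")
```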

The metaverse is believed to have the potential to become the next version of the internet, redefining production and life in the process. It is seen as the natural progression from the current “intelligentization” phase, which is characterized by the rise of AI.

Hugh, in particular, holds the view that the metaverse is the next major advancement in digital transformation. He predicts that it will have a profound impact on various aspects of society, revolutionizing production methods and reshaping daily life.

Overall, the metaverse, with its integration of AI and technological advancements, presents exciting possibilities for the future. It is poised to bring about a new era in digital transformation that will have wide-reaching effects. As discussions around the metaverse continue, it will be interesting to see how these ideas evolve and shape the digital landscape in the coming years.

Clara Neppel

This analysis explores various topics related to virtual reality, immersive realities, digital twins, partnerships, and ethics. Clara Neppel, a prominent figure in this field, emphasizes the importance of architecting virtual reality in a way that encourages happiness and well-being. She believes that a multidisciplinary approach is necessary, involving not only technologists but also individuals with different perspectives such as ethics and social sciences.

Immersive realities, as highlighted in the analysis, contribute to safer flights through extensive pilot training. By allowing pilots to undergo training in immersive simulated environments, they can effectively manage challenging situations and improve their skills.

The analysis also discusses the role of generative AI in revolutionising design, particularly in the automotive industry. Immersive realities are used for testing designs, enabling designers to envision and evaluate various possibilities before implementing them in the physical world.

Digital twins, virtual replicas of cities or ourselves, play a crucial role in achieving goals related to climate and sustainable cities. By creating accurate digital representations, cities can better understand and address environmental challenges. Digital twins also offer opportunities to improve inclusive health and education by providing insights and personalised approaches to healthcare and learning.

Partnerships are highlighted as essential in achieving common goals. Collaboration among various stakeholders, including government bodies, NGOs, and private sector entities, is crucial for addressing complex challenges and advancing sustainable development.

Virtual reality is shown as a tool to help citizens understand the full impact of measures related to climate change. By creating simulated experiences, individuals can gain a deeper understanding of the consequences of their actions and make more informed decisions.

However, the analysis also points out that immersive realities and the metaverse introduce ethical challenges and issues. Concerns such as privacy, data protection, safety, and security need to be carefully addressed to ensure the responsible and ethical use of these technologies.

The governance of virtual spaces, including the metaverse, is highlighted as an area that requires a new system. Discussions are already underway regarding who should control the code and the resulting services. The concept of co-creation of infrastructure and its implications for ownership are also discussed.

The analysis raises concerns about the potential privacy issues that may arise with the omnipresence of technology in the future. It emphasizes the need to carefully navigate the balance between technological advancements and individual privacy rights.

Safety and interoperability of regulations are identified as major concerns in the deployment of AI solutions in various sectors. Poorly designed AI systems can have real impacts on individuals, particularly in the field of healthcare. Therefore, ensuring safety becomes paramount in discussions surrounding AI deployment.

The analysis emphasizes the need for interoperability of regulations through the establishment of global standards. These standards operationalise regulations and move from mere guiding principles to practical implementation.

A combined top-down and bottom-up approach is identified as crucial in developing a comprehensive framework. This approach involves considering the perspectives of both regulatory bodies and grassroots initiatives. The work of the Institute of Electrical and Electronics Engineers (IEEE) on ethically aligned design initiatives is cited as an example of a bottom-up approach.

Content moderation, both in terms of public and private control, is highlighted as a major point of discussion. Clara Neppel believes that this topic lies at the heart of discussions within the Internet Governance Forum.

Additionally, the importance of anonymity in exercising citizen rights is stressed. Anonymity provides individuals with the freedom to express themselves without fear of repercussions and plays a vital role in maintaining a balanced and inclusive society.

In conclusion, this analysis showcases the wide array of topics surrounding virtual reality, immersive realities, digital twins, partnerships, and ethics. It highlights the need for comprehensive approaches and collaborations to tackle the challenges and harness the potential of these technologies in a responsible and beneficial manner.

Session transcript

Irene Kitsara:
Thank you. Good afternoon, ladies and gentlemen, and welcome to the open forum number 20 of the IGF 2023 on benefits and challenges of immersive technology. We have a number of speakers today. I will start with on-site panelists in alphabetical order. So we have with us Adam Ingle, global lead in digital policy from the Lego Group. We have Clara Neppel, senior director of European business operations at IEEE. And we have Patrick Penninckx, head of the information society department from the Council of Europe, and the Council of Europe operating officer from then, the NRA, close to the network step. In remote participation, we have Melodena Stephens, professor of innovation and technology governance from the Mohammed Bin Rashid School of Government, and we have the IEEE SA President. Welcome. I have the pleasure to be your moderator today. This session draws on a very important report on the metaverse and its impact on human rights, the rule of law and democracy. So let me start by asking Patrick and Clara why the Council of Europe is organizing this session today and working on issues related to emerging technologies, and also what is the role of IEEE in this.

Patrick Penninckx:
I think it is very important for the Council of Europe, and the Council of Europe as a collective, to always work at the edge of the developments of technology. Already in the 80s, we worked on the data protection convention. Later on, 20 years ago, we developed the cybercrime convention. So we are always trying to ensure that the new technological developments are compatible with our values, and this now encompasses artificial intelligence and the immersive realities. We also need to see to which extent this coincides with, reinforces or poses a certain number of challenges to the development of human rights. So I will have to start again, I guess. What I was trying to say is that the Council of Europe has always been at the edge when it comes to the development of new technologies. Whether we look at the automated processing of individual data already 40 years ago, or the cybercrime convention more than 20 years ago, for us it was always very important to look at the development of new technologies and the impact that those emerging technologies, and in this way the immersive realities, have on human rights, rule of law and democracy. That is why we decided to work in partnership with IEEE on looking into the metaverse and how the metaverse would impact those human rights, and that is why we decided to organize this workshop here.

Clara Neppel:
Thank you. And thank you for having us here. So my name is Clara Neppel, and I’m the senior director of IEEE in Europe. We are based in Vienna, Austria. And on my flight here, I actually saw a documentary about a famous Austrian architect, Karl Schwanzer, who said that man creates buildings, and buildings create man. And actually, it’s the responsibility of an architect to create buildings which make people happy. And now we are at a time when we create a completely new virtual reality, and we are the architects. And I think that we cannot do it alone as technologists. I think that we need to create an immersive reality which makes people happy, which cares for well-being, and of course also for human rights and society. And we need to bring in, also in this report, and that’s what we try to do, different perspectives: from the technological side, from the ethical side, the social side. And yes, this is basically the bidirectional dialogue that we need to continue, also in this sense. Thank you.

Irene Kitsara:
Thank you, Patrick and Clara. So we are hearing the terms metaverse, immersive realities, and in other sessions we also hear related terms such as virtual worlds. And I think it would be good for our discussion to talk a little bit about these terms, as well as the technologies that are making such realities an option and making it possible for us to experience them. So with that, I would like to turn to Hugh to provide us with his perspective on this.

Hugh:
Thank you, Irene. So as we all know, the metaverse, this term was coined by Neal Stephenson in his sci-fi novel some 30 years ago. But during the past decades, this concept itself has been extended quite a bit. So let me share with you our definition of the metaverse. We are trying to provide the most inclusive definition for the metaverse. In terms of the metaverse, we could agree that this is talking about a digital universe. From the experience perspective, we can say there are three types of metaverses. It could be a digital and different universe, or it could be a digital counterpart of our current universe, or it could be a digital extension of our current universe, which means these three different types of digital universes correspond to virtual reality, augmented reality, and the digital twin. So from that perspective, the metaverse refers to a kind of experience in which the outside world is perceived as a universe. But from another angle, the functional view of the internet… Well, how about now? Let me say again. Could you hear me? Hello? Now we can hear you, thank you. Okay, sorry. So we can also look at the metaverse from another perspective, which we call a functional view. The metaverse could be referred to as the next version of the internet, the next stage of digital transformation. With that being said, let’s take a look at the metaverse technology landscape. We can say that, of course, supporting technologies like computation, storage, communications, networking, data and knowledge, and intelligence are all necessary for enabling the metaverse. But there are also core technologies for the metaverse, namely extended senses and actions, which you can call XR, or spatial computing. The second category is persistent virtual worlds, which we call persistent computing, which is about how to create virtual maps, virtual scenes, virtual objects, and virtual characters collectively constituting virtual worlds. And lastly, digital finance and economy, which you can also call consensus computing, which is about digital assets that may or may not be built upon decentralization and the blockchain. So from this technology roadmap, or landscape, you can see that AI is actually an integral part of the metaverse technology landscape. With that being said, we can say that the metaverse is the next biggest thing. Why? Because if we look at the history of digitalization, or digital transformation, we are actually between two stages. The current stage, which is already exploding, is what we call intelligentization, which is about the rise of AI, using AI everywhere. But the next phase, enabled by AI and now upcoming, is the metaverse. So we are currently between these two stages. And I could also add that, as many of us will agree, AI is transforming production, transforming the forces of production and the relations of production, but the metaverse will redefine production and redefine life. So that’s why we say the metaverse is the next biggest thing. I’ll stop here. Irene?

Irene Kitsara:
Thank you very much, Hugh. And you touched upon the fact, you know, that we have different areas of application of the metaverse. I would like to now turn to Melodena and ask her about some application areas, and then move to Clara and Adam to talk about some of the benefits that can arise from the use of immersive experiences, and ways that the metaverse can also promote, for example, human rights, the rule of law and democracy. Melodena, would you like to start?

Melodena Stephens:
Thank you. So when Facebook changed its name to Meta in October 2021, the market speculated that the total size of Metaverse is 13 trillion US dollars. Over time, that number got revised and went downwards, but I do not think it is a wrong estimate at all. For the first reason is Metaverse is also hardware. So you see this doubling of computing power every 18 months. You also see a lot of the geopolitical tension is pushing the adoption of Metaverse. You can see this in the 5G wars and in the proxy wars currently going on. You also see private sectors, tremendous interest. In fact, the applied research from private sector is greater than government investment. And you see this in things like, for example, Microsoft’s acquisition of Activision Blizzard for about 69 billion US dollars. We also see governments are huge adopters, and I’m gonna go through that very briefly, but we see a standards war coming out and it’s being played by the private sector currently. You see Pokemon Go, which was an augmented reality game, got 50 million customers in 19 days. So that’s huge adoption curve. You see also a price war happening right now with Meta’s Oculus glasses priced at 500 versus Apple’s glasses priced at 3,500 all in time for Christmas. So gaming continues to drive the Metaverse right now. There’s more than 160 virtual worlds. Fortnite, for example, has half a billion customers and generates something like 6 billion US dollars. A lot of this income is also micro-purchasing. We can’t ignore other players which have huge numbers. For example, Meta with 3.88 billion users. Microsoft with most of the Fortune 500, and keep in mind, Microsoft has a Microsoft Mesh and now has Activision, that’s 92 million monthly users and Minecraft, significant number of children. Apple has 1.5 billion users entering into the payment circle and Google has 4.3 billion tens and 1.26. And we see Nvidia, which was typically a hardware provider now entering into this space. So the crossovers are very interesting and that’s why I think it’s very hard to determine market. Now, industry applications are, for example, in digital twins. We have countries adopting, well, cities, for example, in countries. UK has a digital twin strategy, for example. South Korea has one, but we also see cities adopting it. We see manufacturing, there are factories that are adopting and creating digital twins, Siemens, BMW, so definitely Germany. We see it in utility sector, Sydney Water. We see it in AdNoc, which is petroleum, oil and gas. We see 900 cities with smart cities. So with the internet of all things, I think this is also pushing the adoption of the metaverse. We have 125 billion connected devices in 2023. We see government, which historically. has contributed 40% to GDP approximately, maybe at the higher end, but also entering. So for example, tourism. During the pandemic, Dubai was present as World Expo. They had 24 physical visitors coming to the site. It was COVID after all, but 125 virtual visitors. And this becomes part of their legacy. We see KSA with Neom and Finland in Minecraft, actually, with the 3D version of Helsinki as a city. We see education as a huge adopter. Typically, it’s being pushed by engineering and health, and that’s also where a lot of the research is happening right now. There was the first surgery, but it was more to access digital records, and some work is happening on customer care. A lot on re-skilling. 
For example, Accenture bought 60,000 Oculus Quest headsets in 2021 for their employees, and they created the nth floor for training and for networking. We also see retail heavily getting involved in the metaverse. Typically, right now, it’s more experiences. Brands are testing it out. We’ve got luxury brands like Gucci, Burberry, fashion brands like H&M Forever 21. I mean, you name it, they are there, but they’re experimenting right now. There is no doubt we will reach 13 trillion. I think it’s a function of standards or maybe who will win the standards war, and also what is the situation with regulations. I’ll stop there right now, Irene.

Irene Kitsara:
Thank you, Melodena. Patrick?

Patrick Penninckx:
Well, well, is it on? Is it on? Yes, it is on. Okay. Well, the question that Vint Cerf just asked in the high-level opening session was: what is the internet we want, and what is the internet we deserve? These are two different questions, and the same goes for the metaverse. What is the metaverse we want, and which one do we deserve? I think if we want to create a metaverse that is respectful of human rights, that will enhance freedom of expression, that will be inclusive, that will be accessible, that will foster global connections, we need to put those mileposts and benchmarks in place, and that’s why we cannot just let digital development happen. We have to be able to steer that digital development. I wouldn’t say that we need to steer innovation; I think that is for companies to do. But we need to put those benchmarks right that make sure that within the metaverse there are also innovative educational opportunities, that there is democratic participation, that there is digital rights protection. We very often say, at the level of the Council of Europe, that the protection of rights that we ensure offline, we also need to ensure online. The metaverse is the next step up, with the Internet of Things, with connected realities, with 5G, with quantum computing, and with how all of that interrelates, and certain industries are very far ahead. You didn’t say that earlier on, but, for example, testing in the metaverse how it feels to be underwater. These are innovations that we need to be able, not to grasp, but at least to say what usage we want them to have in the future. I could imagine that it not only gives you the feeling of jumping off a cliff into the ocean, which would be a fantastic use of the metaverse, I guess, but if we are able to use the metaverse in order to do waterboarding, this may be a completely different reality. So we need transparency, we need accountability, we need digital rights protection, and I think experience already shows that we need to be able to give a certain guidance on that. We’re trying to do that in the technologies that are being developed. Right now we’re developing a regulation on artificial intelligence, a framework convention that is to deal with this. We hope to finalize that by mid-next year, but the metaverse is also part of our future work plans, and the fact that we can work together with IEEE on those kinds of things seems to me essential, because, as we said before, it’s in this multi-stakeholder context that we need to be able to discuss that from all angles, whether from the technical community, from the engineers’ point of view, from the business point of view, but also from an ethical point of view, from a civil society point of view, from an academic point of view, and be able to govern all of that. So I think the benefits are there and we can work towards the promotion of human rights and the rule of law and democratic participation, but it’s not going to happen by itself. We’ve seen that with the development of the Internet. The Internet has given us a number of opportunities. We want it to be open and transparent and flexible and worldwide, but we’re increasingly getting a more fragmented world, and we also know that if we let things happen, if we get the metaverse we deserve, we may not be getting the metaverse we want.
And I think that’s important from a human rights perspective to look at it.

Clara Neppel:
Clara and Adam, on the benefits. Yeah, thank you. Well, I think we already heard quite a lot on the benefits. I was also thinking, again on my flight to Japan, that immersive realities probably already contributed to making this flight, and your flights as well, safer, because the pilot was probably trained for hours and hours in immersive realities to master situations which we hopefully never encounter, or at least not very often. So this is already an immersive reality which helps us. And now we hear about generative AI. Generative AI is also going to revolutionize design. We have the car industry, which is already testing out different design options in different immersive realities. And I think that we are moving now, we heard about these digital twins of cities, and I think somebody asked to try to map it to the SDGs, so I will just try to do some mapping. The obvious one of course would be SDG 9, industry, innovation and infrastructure. But if we go to the digital twins of cities and even of the planet, of course we are also touching on SDG 13 on climate and also on sustainable cities. And we are moving to the digital twins of ourselves, and I think that this is where our collaboration with the Council of Europe is going to be essential, because there we are entering a realm that we certainly cannot handle alone when it comes to human rights, democracy and the rule of law. And so, digital twins of ourselves, what does it mean? It means of course inclusive health and health care, SDG 3; education was already mentioned, SDG 4. But what is very close to my heart is really SDG 17, and that is partnership. It’s partnership for these common goals. And I think that this is going to be now really a game-changer. If we’re thinking about climate change, we see quite a lot of measures which are very, very difficult to implement, because citizens don’t understand the full impact of them and there’s a lot of fear. What does it mean if a solar panel is very close to my field, or if I have a wind turbine somewhere nearby? What does it mean if my city is going to implement new measures in terms of traffic control? This is something that we can try out in virtual reality, and we can really enhance the democratic participation that Patrick talked about. Thank you.

Adam Ingle:
Thank you. So I think the benefits have been well canvassed. But I’m from the LEGO Group, so I’ll focus my comments on what it might mean for children. And really, it has tremendous potential to amplify the things that kids care about. We’ve undertaken research alongside UNICEF to try and understand what child well-being online is, and what components and elements and building blocks actually make children feel like they’re in a positive space. One of them is social connection. I think the metaverse, the immersivity of it and the interoperability between different layers of the internet and different services can really connect children in a way that is unprecedented. You’re not just a username. You’re an avatar. You have a sense of identity that’s carried across experiences and built up through a history online. And that conveys a unique sense of yourself to your peers and other kids. So you can connect in a way that you haven’t been able to before, and that’s really what kids value. You can create in a way that you can’t do offline, even with LEGO bricks. You are able to really build these worlds around you. You’ve seen the power of Minecraft, Roblox, what’s happening in Fortnite. These are all early metaverses. As the technology improves, not just the graphics but the interconnectivity and the layers of services, the creative potential is huge. And children learn through creation. That’s what we’ve really found. So they can do that in an even better way. You can also empower kids. They have this sense of identity. They’re online. They’re engaging. They’re building their own lives there, and they really value this kind of sense of empowerment. Often, you know, they can find some interactions quite patronizing, but they have a right of access to the benefits of technology, and the metaverse is an avenue for that. So they can learn, create, connect, do all these things. Now, I know we’re getting to the downsides later, but I do want to say there’s a massive caveat to all that, which is that these things need to be done in a responsible way, particularly with children. On social connection, we’ve seen the harms that come through an unconsidered approach to those types of things. So the benefits are tremendous, but it needs to be done right. Hopefully that’s a good segue.

Irene Kitsara:
Absolutely, thank you for that. I think we have the spoiler alert in the title of the session about the challenges, and I think this is part of what a lot of sessions at the IGF are addressing around concerns that come with emerging technologies and applications. So I would like to address this question to all our panelists: what are some of the challenges that could arise from immersive realities, and what is the potential impact they could have on human rights, the rule of law and democracy, remembering the organizer of the event. And let us maybe just give a bit of background on what we have covered in the upcoming report. On one side, we have looked into the enabling environment that the immersive realities and the metaverse can create for exercising human rights, the rule of law and democracy. But other issues we have looked into were related to privacy and data protection, safety and security, protection of children and other vulnerable populations, access and accessibility issues, inclusion and non-discrimination, freedom of expression and censorship, labor environment and of course issues related to the rule of law such as territoriality, enforcement, access to justice and democracy. But before we all despair, maybe let’s start with some of these issues. I will start with Clara, and then we can move to Patrick, Melodena and Adam.

Clara Neppel:
Thank you. So I already mentioned that we have very practical examples of virtual reality. We have autonomous cars being tried out in different scenarios. But even there, there are certain ethical questions. A cow on the street might have a completely different value in India than in Europe. And now, if we have these digital twins or avatars or digital humans, of course we are entering completely new territory. With these digital humans now interacting in a seamless, interconnected space, who is going to control that space? Until now, these immersive realities, and also the rules of engagement, have been designed by private actors. Now, if we have something like a public space, who is going to decide who gets to enter that space, what is acceptable behavior, and when somebody should be excluded? So here again we are also discussing a space that is as inclusive as possible. We already see a paradigm shift from the moderation of content that we know from AI and social media to the moderation of behavior and the moderation of space. What does it mean to be aggressed in a virtual space? And again, if we are discussing virtual spaces, what is a public infrastructure? To what extent can people actually co-create that infrastructure? And what does that then mean for ownership? We already see our children in Minecraft creating magnificent cities and so on. What does it mean if this is then incorporated in a private virtual space? Whose ownership is it? And again, who is dictating the rules? In the digital space, we have in open source the governance of, you know, who actually controls what code gets into it. Some time ago we had something like a benevolent dictator, somebody who dictates which code should be part of that service. So are we going to have something like this in a digital space? Hopefully not. Hopefully we will have democratic participation. And especially when it comes to such a technology, which will very much influence our worldviews, because we are basically going to have a completely different perception of, let’s say, a certain environment if we are immersed in it, who is, again, going to control what this is going to look like? What does it mean for our perception of history, for our perception of reality as such? And I think we already heard about privacy. I think we are entering here into a completely new space. We are going to have this technology which is omnipresent. And we have to get away, let’s say, from the technologies that we hear about now, the headsets, and think about technologies which are upcoming. Last week at a Paris fashion show, something called the Humane AI Pin was presented, just a very small pin which is there all the time, registering basically everything, recording everything. It’s a kind of digital assistant, a Star Trek-like assistant. The question is, what would it mean for this conference if we had such a technology recording, all the time, everything which is happening, recording who is talking to whom, and what feelings they may have? So you can imagine the type of information asymmetry that we are going to have, and also the power of those who can predict certain alliances, certain power games in the future. So you can see we have new aspects to existing ethical challenges, like privacy, bias, accountability. And we also have some completely new challenges.
We had Tom Hanks also, last week, saying that there is a digital Tom Hanks around who is promoting some dental care; he has nothing to do with it. We have more and more of these digital twins which copy not only our physical features, but also our characteristics, the way we talk and the way we feel. So how much can we actually control these digital selves or these digital feelings? Are we going to need authentication not only of content, but also of these digital humans? And last but not least, I want to conclude with safety. I think that safety is also going to play a completely different role from the one we are discussing now in terms of AI. Maybe some of you have heard this advertisement that the metaverse is virtual, but its impact is real. And I think that’s very true. Of course, you will have a very real impact when it comes, for instance, to healthcare; but if it is not designed well, then it has a real impact on the patient. And there are other things which make this need to design it the right way a very important one.

Patrick Penninckx:
Now, human rights activists, but also organizations that stand for human rights, are very often seen as a little bit alarmist and as not seeing the positive sides sufficiently. But it is also for a human rights organization to be able to point that out. Let’s say the evangelists, if I may call them that, of the future developments, including the immersive realities, will point at the advantages. They also make serious efforts. I’m not speaking for that business community now, but it’s not as if that business community goes about developing things in a completely unethical way. They put quite a number of resources into place. Meta, unfortunately, was not able to participate in this panel discussion, but I know they put a lot of effort into ensuring that ethical principles, human rights principles and legal principles are also being respected. Adam will certainly say something more about it afterwards as well, because that’s their prime concern. Well, not their prime concern; their prime concern remains doing business, obviously. But the question is not so much how many ethical principles are being put forward by private business. It is also to which extent this new universe is going to be regulated by private business, or to which extent a democratic society, with the principles that it endorses and tries to promote, has an impact on the development of this new immersive reality. None of us here are immersive natives. I’m an analog native. Some of us may be digital natives, and I’m not looking at anyone in particular, but none of us are immersive natives. We will have to be able to look into a completely new reality of which we do not necessarily yet see the contours. And in order to be able to see those contours, let us not be naive. I’m old enough to have looked at the start of the internet and the positive feelings about democratic governance and participation and the improvement of, let’s say, grassroots democracy. But we also see that that was maybe a little bit naive, and that there are a number of things we need to ensure, especially when our societies, instead of growing more democratic, are getting more defensive about human rights; we’re regressing, we’re backsliding. So let us see what that means. If some of the information and data that have been collected, even until now, fall into the wrong hands, I think we are very badly off.
I think we’re moving towards, in order to be able to represent yourself through an avatar, it basically means that you have to have a complete picture of yourself, including of your expressions, et cetera, et cetera, to make it more realistic. Will we in 30, no, in 2034, will the IGF take place in an immersive world, Irina? So these are the kind of things that we need to. ask, and what are the consequences of that for privacy and digital security? How do we identify ourselves? Not only Tom Hanks, but also everyone in our room here. What about anonymity? Can we still be anonymous? We’re outraged about video surveillance, and some countries and some cities are excelling in that, but what about anonymity? What about private life? At least for the European Convention on Human Life, privacy is one of the pillars, Article 8. What about freedom of expression? What about the counterpart of disinformation and misinformation? We see, especially now with the ongoing war, how misinformation and disinformation are being used in a 30s, 1930-like manner, but in a much more efficient manner, to be able to stifle freedom of expression, but also to control all forms of population. That immersive reality can only be an extra layer of that, and I think we need to not be naive in terms of thinking that everyone is nice. Not everyone is nice. At the IGF, of course, everyone is nice, but there are other people out there which may be not so nice, and that have different intentions on how your private information will be used. Let’s also think about inclusivity. The speeches earlier today were all about how can we make the next 2.6 billion people connect to the Internet. But how are we going to connect the next 8 billion people to the metaverse? Who is going to be included? What are the elements of inclusion? I see the potential for educational purposes and so on and so forth, but in order to be able to benefit from those educational goals, we have to be able to ensure that people can also participate. So, inclusivity, accessibility. How are we dealing with the digital divide, not only worldwide, but also within our societies? And that is something that has also been shown during the COVID crisis, how the digital divide in our countries has been extremely difficult to overcome. So, governance and accountability. It’s good to be accountable to yourself, but you can also get away with certain things. I try to be accountable, but I’m not always so accountable. Don’t tell anyone, but that’s the reality. If you’re judge and party, you cannot be totally objective. So, we need to, in this multi-stakeholder approach, come to common sense. I think this IGF also points at it. That is that we need to be able to, on the basis of a number of common principles, common values, how do we want to see the next step, not only in internet governance and artificial intelligence, but how do we also measure that in terms of the immersive realities and how are we going to position ourselves to that? Are we going to be naive in hoping that the next generation will be simple and will be defensive or not?

Irene Kitsara:
Thank you. Let’s now move to Melodena. And being aware of time, I’m asking all the speakers from here onward to be conscious of that, so that we leave time for the Q&A. Melodena?

Melodena Stephens:
Yes, thank you. So, I would like to very briefly talk about Article 23 of the Universal Declaration of Human Rights, which says everyone has the right to work, to free choice of employment, to just and favorable conditions of work and to protection against unemployment. The metaverse is data hungry, so it basically consumes your data, just like Clara and Patrick have mentioned. And the worry is that it will remove jobs. For the first time, the World Economic Forum, in their 2023 report, has actually said AI technologies, like the metaverse, will mean a net job loss, not a net job increase. And that means we will not be prepared, because now skills don't matter, your experience doesn't matter, it is all saved on the metaverse, and the cost of not preparing people to get jobs or to keep jobs will be something like 11.5 trillion for training, but even more if you look at things like pensions or social security. The bigger worry is that the jobs that are being formed are often low-paying jobs, so the human being is coming to the bottom of the supply chain, and we see this already, because some of the jobs are things like tagging content or content moderation. I'll give you an example. Roblox has a very active community, with 4.25 million developers, and if you want to earn on Roblox and convert their money, that is the Robux, to actual US dollars, you have to make a minimum amount of money, and out of 4.25 million developers only 11,000 qualified. This has a direct impact on health, and that's another universal right, and the impact is on well-being; especially the uncertainty of whether I get to keep my job, I think, is important. This also raises questions on IP: assuming my experiences and my skill sets are because of the amount of years I spend, and are uniquely mine, do I have IP on this? We also see another important thing coming in, which is perhaps behavioral addiction to technologies like this. I mentioned right at the beginning that a lot of the metaverse has been built from gaming, so we try to gamify behavior, and we know that many, not just children but adults also, can get addicted to games. This was declared a psychiatric disorder in 2019 by the WHO. But the worry is, as we start putting it into our daily life, in shopping, in work and in education, at what point will the so-called magic circle, the circle between reality and imagination, disappear? And this is something we aren't actually putting enough research into. I would also like to very briefly bring in the environment. Clara mentioned that, but the metaverse is something that requires huge amounts of data and computing power, hence it has a significant carbon footprint. Just take the semiconductor chip, which is embedded in most of our technology. If you've got a mobile phone or a laptop, the average chip, when you take all of its components, travels 50,000 kilometers, and it's embedded in 169 industries. So we're looking at environmental costs in carbon and in terms of water, and because chips are not recycled, we see that e-waste is growing exponentially and less than 17 percent is recycled. So this will get into your groundwater, and something like mercury we see in fish across the ocean, so it's not contained. I also just want to briefly mention one more thing: culture representation becomes extremely important in the metaverse, and I think this will be something nations will have to consider, whether it's stereotypes that are being represented in the metaverse, or how you actually do that. So with that, Adam, over to you.

Adam Ingle:
thanks um i’ll keep it brief because a lot of the challenges have been discussed um i think one thing that has come out though and always comes out in these discussions is how so many of the issues aren’t unique there exists today and we’re still grappling with the solutions today and now regulation and legislation is forming a response to these issues so i think we’ll actually have to wait and see how the issues in web 2.0 and the regulatory response and the cultural response to these issues plays out to see whether you know we’ll actually start in earnest with the metaverse from a better playing field um but when it comes to kids and the challenges they they face you know i think from our mind um from our mind we want to create a really kid-friendly ecosystem one with high safety standards responsible design um you know limited um ways for harmful contact conduct contract um and in order to do that to create a truly immersive ecosystem we need others to join us and also share our standards because you know we can create all these great lego experiences but a metaverse is interconnected that it’s interoperable so everyone needs to lift their game if we’re going to have a collective approach to address a lot of the harms that that children are going to be facing

Irene Kitsara:
Thank you again, Adam, for leading into the last question, and again, because of time, I would ask the rest of you to cover it briefly. We are at the IGF, so naturally the last question is around governance of the metaverse. Could you share some key concepts? The issues, considerations and challenges we have been hearing are, I think, very much known issues from ongoing or previous discussions related to AI, generative AI, social platforms and gaming. How can we address some of the challenges that we heard, and what could be some of the considerations and elements we should bear in mind while considering governance of immersive realities? Patrick, would you like to start? Or Melodena, would you like to start?

Melodena Stephens:
Sure. So, when we look at governance right now, I just want to quote something from the ITU in 2003: the IGF committed to the WSIS principles, which include the “commitment to build a people-centric, inclusive and development-oriented information society.” I sometimes worry whether we put technology before people. We see that there is a lot at the national level in terms of policies: the OECD reports 800 AI policies, most of them in North America and Europe. We also see a lot of data regulations, 62 countries with 144 data regulations, but most of it is fragmentary. The metaverse will be global, and it really requires collaboration across governments. The few governments that have put out policies on the metaverse mostly recommend self-governance, and I think this is because of the adoption curve. So you see South Korea came out with ethical principles; the Agile Nations, a coalition group, an intergovernmental network with the UK, Canada, Denmark, Italy, Singapore, Japan and the UAE, is coming out with a report this week, and again it talks about self-governance. China, for the first time, has actually said you can file trademarks for NFTs and virtual goods, and this is a big shift that is coming in. Australia has a white paper on standards. But again, self-governance, because the time to collaborate and put together an overarching policy would take too long, and we need the private sector to work with that. Now, there are standards coming out; look at something like the Metaverse Standards Forum, an association with 2,400 members, most of them from the private sector. One of the challenges I would like to bring up is open source. The metaverse builds on top of open source, with a proprietary layer, and this really creates a problem. Take, for example, a database of faces: Megapixel had a data set of 4.7 million faces scraped from Flickr. Today you could do it from Instagram or from YouTube. Eighty percent of that data set was from these places, and it is used in 900 research papers. So we see this open source does have some challenges that I would like to highlight. Another one is Apache software: there is something called Log4j, which is used for logging errors like the 404 errors you see, and they found out there was a problem in its code that created a vulnerability. What is interesting is that it is embedded everywhere: in Amazon, in Apple, in Minecraft, and in Java systems running on some 3 billion devices. So we can see that this kind of problem will persist, and it is not really about how much foresight you have, but about how quickly and how transparently we can work together. If we penalize the private sector for being transparent, they will hide it, and that will make the vulnerability worse. That is something we need to resolve. We also find that there is not much of a way forward in some areas. For example, Barbados wanted to put an embassy online, but the 1961 Vienna Convention talks only about physical embassies. These are countries with limited resources, and if they need to be represented around the world, virtual embassies work; but again, this is a negotiated matter where there is not much information. I just want to highlight one more thing: most governments that are represented in the metaverse are represented on top of private-sector platforms, using something like Decentraland or The Sandbox. I think this also raises interesting questions, and at this point I would like to stop and hand over.
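To make the point about ubiquitous open-source components more concrete, the sketch below (a minimal illustration, not part of the session, using a purely hypothetical advisory list rather than any real vulnerability feed) shows the kind of dependency audit organizations had to run at scale when the Log4j vulnerability was disclosed: enumerate what is installed and flag anything matching a known-vulnerable version.

```python
# Illustrative sketch: flag installed Python dependencies that appear on a
# hypothetical "known vulnerable" list, in the spirit of how organizations
# had to hunt for Log4j across their software estates.
from importlib.metadata import distributions

# Hypothetical advisory data, for illustration only.
KNOWN_VULNERABLE = {
    "examplelib": {"1.0.0", "1.0.1"},
}

def find_vulnerable_installs():
    """Return (name, version) pairs of installed packages on the advisory list."""
    hits = []
    for dist in distributions():
        name = (dist.metadata["Name"] or "").lower()
        if dist.version in KNOWN_VULNERABLE.get(name, set()):
            hits.append((name, dist.version))
    return hits

if __name__ == "__main__":
    for name, version in find_vulnerable_installs():
        print(f"Potentially vulnerable dependency: {name}=={version}")
```

The hard part in practice is not this loop but knowing every place the component is embedded, which is exactly the transparency problem raised above.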

Irene Kitsara:
Thank you, Melodena. Patrick?

Patrick Penninckx:
Yeah, when you speak about governance, I think there are a number of governance principles which are already enshrined in what we have done on data protection, what we have done on cybercrime, and what we are now trying to do on artificial intelligence within the Council of Europe: questions related to responsibility, transparency, explainability, revocability, and the right to contest. All of those elements need to be looked at. And obviously, when we started to work on the new Convention on Artificial Intelligence, the first thing we did was a kind of feasibility study: that is, to look at all the ethical principles that are already out there and applicable, and at the existing legislation that would be applicable to the metaverse, then to look at where the gaps are, and, once we have identified the gaps, to look at which elements could constitute a future governance framework. I think I will leave it at that.

Clara Neppel:
Thank you, Patrick. Well, I think that what we hear now more and more from the private sector as well is that there is a need for interoperability of regulation, of regulatory requirements, and one way to achieve this could be through global standards. I think it is important to say that standards are there, of course, to move from principles to practice, to actually operationalize regulation; this would be the top-down approach, and it is important, but we also see a bottom-up approach. In IEEE, we have been working since 2015 on ethically aligned design initiatives, which resulted in a set of standards, from value-based design, which can also be used for the metaverse, to defining more closely what transparency is and what it means to have age-appropriate design. I think, Adam, you are a part of that. So I think we need to bring together these top-down and bottom-up principles in order to create a framework that works for everyone. I will just leave it here because we want to have some questions as well. Thank you.

Irene Kitsara:
Yes, and I would now like to turn to the audience and see if you have any questions for our panelists. I hear we also have an online question; maybe we can start with that, and you can think in the meantime.

Audience:
Does this work? Can you? Yes. So I will just read the question in the chat from Nina Jane Patel: with the increasing immersion of users into these virtual realms, there is potential access to a plethora of biometric data, from eye tracking to brain activity to heart rate. How do you envision the governance and regulation of such intimate data in the metaverse? Furthermore, what steps do you believe need to be taken to ensure that individuals’ biometric data remains private and protected from misuse? Thank you.

Irene Kitsara:
So I can address what we have identified in the report; maybe that will give an overview of some of the issues identified by the experts. Indeed, we will be looking into much more invasive practices of supervision and censoring. Our experts have been looking into the idea of rethinking privacy, rethinking what it means. There are various defenders of the introduction of so-called neural rights; in Chile, these have even been covered in the constitution. On the other side, there are differing views. There are also issues around bystander privacy: not just your own privacy, which you can potentially consent to, but also, for example, that of the people who may be in the same room with you and do not know that they are being recorded along with you. So there is indeed a plethora of questions, and there are different views around the governance of that: whether there may be some self-regulation or self-governance principles that could help, or whether we should be looking at a reinterpretation of existing hard law or the introduction of new law. Do we have any questions? Please, the gentleman.
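One concrete technique that often accompanies these governance options is data minimization: processing raw biometric streams on the device and sharing only coarse aggregates. The sketch below is a minimal illustration of that idea for eye-tracking data, with made-up function names and fields; it is not a proposal from the report or the panel, just one possible pattern.

```python
# Minimal data-minimization sketch (illustrative only): aggregate raw gaze
# samples on-device and expose only a coarse summary, rather than streaming
# the raw biometric signal off the device.
from statistics import fmean

def summarize_gaze(samples: list[tuple[float, float]]) -> dict:
    """Reduce raw (x, y) gaze samples to a coarse, less identifying summary."""
    xs, ys = zip(*samples)
    return {
        "mean_x": round(fmean(xs), 2),   # coarse fixation center only
        "mean_y": round(fmean(ys), 2),
        "sample_count": len(samples),    # no timestamps, no raw trajectory
    }

# Example: only the summary would ever leave the device.
raw = [(0.42, 0.55), (0.44, 0.53), (0.47, 0.51)]
print(summarize_gaze(raw))  # {'mean_x': 0.44, 'mean_y': 0.53, 'sample_count': 3}
```

Whether such technical measures suffice, or whether hard-law protections like the neurorights mentioned above are needed, is exactly the open governance question raised here.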

Audience:
Good morning. Thank you for letting me participate in your panel town hall. It is very interesting, especially when you talk about immersive life, technology, or maybe existence. From that perspective, we in Poland, because I come from Poland, have a different consideration right now. The biggest tension is not about freedom of expression, nor even personal data and privacy, but much more, and maybe it is only one of the future tensions, about freedom of conscience, not from the religious point of view, but from the point of view of the psychophysical integrity of the person. From that perspective, I would like to ask whether you can suggest how to deal with this. It is, of course, one part of fundamental rights, but from the technical point of view it is challenging, I understand this. I thought it was worth putting the question on the table. Thank you.

Patrick Penninckx:
I remember chanting in one of the demonstrations in Belgium in the 1980s that thoughts are free. I don’t know if thoughts will still be free, and that is indeed freedom of conscience. Once we start to look into the interaction between machine and man, and if we see that technology already enhances or has the capacity to influence our behavior, to what extent will it influence our thought processes? I think our thought processes are already being influenced by the messages we receive very directly; otherwise, how could you explain that entire populations can be influenced in a certain manner? When I looked at the Edelman Trust Barometer, I saw that in authoritarian regimes, trust in public service media is the highest. This seems contradictory, but it is also quite revealing of how a regime, whether a private or a public entity, can actually influence the way people act, maybe not the way they think, but at least the way they act according to what is expected of them. So freedom of thought, and freedom of religion, because this is also enshrined in the European Convention on Human Rights, are definitely at stake and would need to be looked at. Thank you.

Clara Neppel:
If there are no other questions, this is my personal view. I think that we are now discussing moderation much more, practically content moderation, and whether it should be private or public. And probably, in order to have a certain balance, we need to have multi-stakeholder moderation at some point. Since we are here at the Internet Governance Forum, this should be at the heart of the discussions, I think, because, as Patrick mentioned before, the democratic process cannot happen if you do not have anonymity; first of all, I think that is important. If you do not have anonymity, you cannot actually exercise your rights as a citizen. But that is my private view.

Irene Kitsara:
Just briefly on that: in the report we identify this notion of mental privacy and mental autonomy, and practically the reinterpretation of notions that we knew, like freedom of expression, and what they mean nowadays with these technologies, which have the potential of changing not just our perception of reality but even our thought processes and even the facts. Thank you.

Audience:
Oh, hi, my name is Michael Kernikos, I am the Executive Director of the UCLA Institute for Technology Law and Policy. I wanted to pick up on what you said about content moderation, because as far as I understand it, the tools to moderate content effectively at scale do not exist for these technologies. So it is fine right now, as long as adoption rates are where they are, but if these things take off rapidly, there is no actual way to follow the standards that already exist for traditional social media platforms. Is that something that you are looking into? Is this a legal challenge and a policy challenge as well as a technical challenge? Okay, thank you very much.

Thank you very much. I am Steve Fosley from UNICEF. We also did a short report on the metaverse, children and some of their rights, so hopefully that was useful. My question is, and sorry, maybe this is too big a question for this time, but I am from South Africa originally: how do you think the metaverse will play out over time? Because not everybody can afford the $500 or $3,500 headset, and not everybody will. So if these technologies are going to actually scale globally, and that is also a question, but I think they will, they are going to look very different for users in South Africa, in Johannesburg or Cape Town, than for, say, children in New York. And perhaps we have seen some signals of this already, of people beginning to talk to cloned characters; you might be talking to them on WhatsApp, it does not have to be in an immersive environment, but it is beginning to normalize talking to AI, basically, and you are not always sure whether that is a person or not. So any thoughts on how this might play out? And if there is no time now, I will be here for the next few days, so I would love to have a coffee and pick your brains. Thank you.

Irene Kitsara:
Who would like to?

Melodena Stephens:
I need the microphone. Yes. Or was it Adam who was going to go ahead? Please. Oh, so I was just going to say one thing: when we look at the metaverse, the standards generally come from the IT sector, or the technology sector. But we are now seeing health coming into it. So it is really important that we do not approach this in silos; ministries have to work together across portfolios. That means health has to sit with social affairs, because we see an impact on people, communities and society, but they also have to sit and work with technology, and that is missing right now. For example, content is being developed for schools, and I do not know whether a psychologist or a sociologist is involved. I think in Adam’s company they do this, but in many cases it is not necessarily true. On the question of inclusiveness, these technologies will get cheaper and cheaper, so I see that actually happening, because these technologies are only viable at scale; that is the only way they will work. But then there is the danger that they will be affordable and embedded and you cannot get rid of them. Think of ChatGPT: everyone is using it, and now we are trying to figure out how we can use it more, or what we can do to regulate it. So we are right now at that wonderful time; we have a ten-year window to have these conversations and come up with the safeguards. And that is why I think these dialogues are so critical. Thank you.

Irene Kitsara:
Thank you, Melodena. I think we need to stop here, but, talking about partnerships, I would like to share with you the result of the digital partnership between IEEE and the Council of Europe: stay tuned for the upcoming report on the metaverse and its impact on human rights, the rule of law and democracy, which is expected to be released in early 2024. Thank you very much, and thanks to our panelists, the organizers and, of course, our host. Thank you.

Speakers (speech speed, speech length, speech time):

Adam Ingle: 172 words per minute, 700 words, 244 secs
Audience: 181 words per minute, 644 words, 213 secs
Clara Neppel: 162 words per minute, 1918 words, 713 secs
Hugh: 158 words per minute, 541 words, 205 secs
Irene Kitsara: 172 words per minute, 1334 words, 465 secs
Melodena Stephens: 176 words per minute, 2440 words, 832 secs
Patrick Penninckx: 144 words per minute, 2769 words, 1150 secs