Lightning Talk #136 The Embodied Web: Rethinking Privacy in 3D Computing

19 Dec 2024 10:30h - 11:45h

Session at a Glance

Summary

This discussion, led by Stanford Law School professor Brittan Heller, focuses on the privacy implications of emerging 3D computing technologies, particularly extended reality (XR) and spatial computing. Heller explains how these technologies, which blend physical and digital realms, collect deeply personal data including body movements, eye tracking, and physiological responses. This data collection is far more extensive than that of traditional computing platforms and poses significant privacy risks.

Heller highlights that current privacy laws are ill-equipped to handle the nuances of immersive technologies. For instance, opt-out mechanisms are ineffective as spatial computing relies on body-based data for basic functionality. Recent studies have shown that behavioral data from XR devices can uniquely identify individuals and reveal sensitive information about age, gender, and even political affiliation.

The discussion delves into the potential misuse of this data, including targeted advertising based on involuntary bodily responses and the extraction of medical information unknown even to the user. Heller emphasizes the need for new privacy frameworks that address these unique challenges, including protections for environmental and body-based data.

The talk also touches on recent developments in generative AI for creating 3D virtual worlds, which, while exciting, further complicate privacy concerns. Heller advocates for integrating privacy by design principles into these technologies as they evolve. She concludes by calling for proactive measures to develop legal, technical, and ethical standards that ensure user control over personal data in this new “embodied web” era.

Keypoints

Major discussion points:

– The rise of spatial computing and extended reality (XR) technologies that blend physical and digital realms

– Privacy risks associated with XR devices collecting deeply personal biometric and behavioral data

– Gaps in existing privacy laws and frameworks in addressing XR-specific data collection and use

– Recent developments in generative AI for creating 3D virtual environments

– Need for new privacy frameworks and safeguards to protect user rights in spatial computing

Overall purpose:

The purpose of this discussion was to raise awareness about the privacy and ethical implications of emerging 3D and spatial computing technologies, particularly extended reality (XR) devices. The speaker aimed to highlight the unique challenges these technologies pose to existing privacy frameworks and advocate for proactive development of new safeguards and regulations.

Tone:

The overall tone was informative and cautionary. The speaker presented the topic with a sense of urgency, emphasizing both the exciting possibilities of these technologies and the critical need to address their potential risks. While highlighting concerns, the tone remained optimistic about the potential to develop responsible and ethical approaches to spatial computing if action is taken proactively.

Speakers

– Brittan Heller: Professor at Stanford Law School

– Nouha Ben Lahbib: Project manager for an incubator for creative startups using new technologies such as VR and XR

Full session report

Extended Reality (XR) and Privacy: Navigating the Challenges of Spatial Computing

This discussion, led by Stanford Law School professor Brittan Heller, explores the privacy implications of emerging 3D computing technologies, particularly extended reality (XR) and spatial computing. The conversation emphasizes the urgent need for new privacy frameworks and safeguards in light of these technologies’ unique data collection capabilities and potential risks.

Introduction to XR and Spatial Computing

Heller introduces the concept of the “embodied web,” where our physical bodies become the interface for digital interactions. This new paradigm of computing blends physical and digital realms, creating immersive experiences characterized by presence, immersion, and embodiment. While offering exciting possibilities, these technologies also present unprecedented privacy challenges.

Key Privacy Concerns in XR Technologies

XR devices collect deeply personal data that is far more extensive than what traditional computing platforms gather, including body movements, eye tracking, and physiological responses. The privacy risks associated with this data collection are significant:

1. Unique Identification: Recent studies have shown that behavioral data from XR devices can uniquely identify individuals and infer over 40 personal attributes, including age, gender, substance use, and political affiliation (a toy re-identification sketch follows this list).

2. Sensitive Information Extraction: Eye tracking data can reveal highly sensitive medical and personal information, including truthfulness, sexual attraction, and preclinical signs of physical and mental health conditions.

3. Targeted Advertising: The potential misuse of involuntary bodily responses for targeted advertising, as illustrated by Heller’s scenario of receiving car insurance advertisements after playing a virtual reality racing game.
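To make the re-identification risk in point 1 concrete, the following is a minimal toy sketch, not the method of the Berkeley study itself. It assumes each user can be summarized by a small vector of motion statistics (for example, mean head tilt or hand-height variance; all names and numbers here are hypothetical) and shows how a simple nearest-neighbour match can link a nominally anonymous session back to an enrolled user:

```python
# Toy sketch: linking an "anonymous" XR session to a known user via
# motion statistics. Illustrative only -- not the Berkeley study's method.
import numpy as np

rng = np.random.default_rng(0)

n_users, n_features = 1000, 12  # hypothetical enrolled users, motion stats each

# Enrollment: each user's motion "profile" (e.g., mean head tilt,
# hand-height variance) acts as a stable personal signature.
profiles = rng.normal(size=(n_users, n_features))

# A new, nominally anonymous session from user 42: their signature
# plus some session-to-session noise.
target = 42
session = profiles[target] + rng.normal(scale=0.3, size=n_features)

# Re-identification: nearest neighbour in feature space.
distances = np.linalg.norm(profiles - session, axis=1)
guess = int(np.argmin(distances))
print(f"matched user {guess} (true user {target})")  # reliably prints 42
```

The point of the sketch is that no explicit identifier is needed: the motion statistics themselves act as the fingerprint.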

Challenges in Regulating XR Technologies

The discussion highlights several key challenges in regulating and protecting privacy in XR environments:

1. Inadequacy of Current Laws: Existing privacy laws are ill-equipped to handle the nuances of immersive technologies.

2. Essential Data for Functionality: XR devices rely on body-based data for basic functionality, complicating privacy protection efforts.

3. New Data Categories: Environmental and body-based data collected by XR devices are not adequately covered by existing regulations (see the policy sketch after this list).

4. Limitations of Opt-Out Mechanisms: Traditional opt-out approaches are ineffective in XR environments due to the essential nature of data collection for device functionality.
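One way engineering teams prototype the kinds of protections points 3 and 4 call for is policy-as-code: tagging every sensor stream with a data category and a handling rule before anything leaves the device. The sketch below is a hypothetical illustration; the category names loosely follow the talk’s vocabulary (biometric, body-based, environmental, neural) rather than any statute, and the default policies are assumptions:

```python
# Hypothetical policy-as-code sketch: classify XR sensor streams and
# decide how each may be handled. Categories follow the talk's
# vocabulary, not any statute; default policies are assumptions.
from dataclasses import dataclass
from enum import Enum, auto

class Category(Enum):
    BIOMETRIC = auto()      # e.g., iris images used for authentication
    BODY_BASED = auto()     # e.g., head/hand motion, gait, pupillometry
    ENVIRONMENTAL = auto()  # e.g., room geometry from outward cameras
    NEURAL = auto()         # e.g., EEG / brain-computer-interface signals

@dataclass(frozen=True)
class Policy:
    may_leave_device: bool
    retention_days: int

# Hypothetical defaults: the most sensitive streams stay on-device.
POLICIES = {
    Category.BIOMETRIC:     Policy(may_leave_device=False, retention_days=0),
    Category.BODY_BASED:    Policy(may_leave_device=False, retention_days=1),
    Category.NEURAL:        Policy(may_leave_device=False, retention_days=0),
    Category.ENVIRONMENTAL: Policy(may_leave_device=True,  retention_days=30),
}

def check_export(category: Category) -> bool:
    """Gate any network export of a sensor stream on its policy."""
    return POLICIES[category].may_leave_device

assert not check_export(Category.BODY_BASED)  # motion data stays local
```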

Advancements in Generative AI and 3D Environments

Recent developments in generative AI have revolutionized the creation of 3D virtual worlds. Heller notes that NVIDIA, MIT, OpenAI, and Google DeepMind have all made significant strides in this area, allowing navigable 3D environments to be generated rapidly from text prompts. While these advances open up creative possibilities, they also further complicate privacy concerns in XR environments.
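At a high level, these text-to-3D pipelines share one shape: a prompt goes in, latents are sampled from a text-conditioned diffusion model, and a mesh or navigable scene comes out. The sketch below shows that flow only schematically; every function name is a hypothetical placeholder, not the API of NVIDIA’s, OpenAI’s, or Google DeepMind’s actual tools:

```python
# Schematic of a text-to-3D generation pipeline. All function names are
# hypothetical placeholders, not a real vendor SDK.
def load_text_to_3d_model(checkpoint: str):
    ...  # load a text-conditioned 3D diffusion model (hypothetical)

def sample_3d_latents(model, prompt: str, guidance_scale: float = 15.0):
    ...  # denoise latents conditioned on the prompt (hypothetical)

def export_mesh(latents, path: str):
    ...  # decode latents to a textured mesh, e.g. glTF (hypothetical)

def text_to_world(prompt: str, out_path: str) -> None:
    """Prompt -> latents -> mesh: the flow shared by tools like Shap-E."""
    model = load_text_to_3d_model("text-to-3d.ckpt")
    latents = sample_3d_latents(model, prompt)
    export_mesh(latents, out_path)

text_to_world("a red McLaren parked in a neon-lit garage", "scene.gltf")
```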

Psychological Impact and Legal Implications

The immersive nature of XR experiences necessitates considering them as extensions of lived reality. Heller cites the example of a UK public prosecutor investigating the sexual abuse of a minor in the metaverse, highlighting the need for strong safeguards to protect users’ rights and safety in virtual spaces.

Approaching XR Privacy Issues

Heller suggests four steps for starting to address XR privacy concerns in one’s home jurisdiction:

1. Research XR privacy implications, based on how the technology is being deployed by companies and governments

2. Collaborate by engaging with user groups and policy discussions

3. Innovate by developing privacy-preserving XR technologies that allow devices to be used and calibrated without personally identifying users (see the sketch after this list)

4. Educate by raising awareness of the risks and benefits of 3D computing
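For step 3, one family of privacy-preserving techniques the talk alludes to is perturbing body-based telemetry on the device before it is shared, so that the system can still calibrate while re-identification becomes harder. The sketch below adds Laplace noise to motion features in the spirit of local differential privacy; the sensitivity and epsilon values are illustrative assumptions, not a vetted mechanism for XR data:

```python
# Minimal local-DP-style sketch: perturb motion features on-device
# before sharing. Sensitivity and epsilon are illustrative assumptions;
# a real deployment would need a carefully analyzed mechanism.
import numpy as np

rng = np.random.default_rng(1)

def privatize(features: np.ndarray, sensitivity: float, epsilon: float) -> np.ndarray:
    """Add Laplace noise scaled to sensitivity/epsilon to each feature."""
    scale = sensitivity / epsilon
    return features + rng.laplace(scale=scale, size=features.shape)

head_tilt_stats = np.array([0.12, -0.47, 0.88])  # hypothetical motion features
shared = privatize(head_tilt_stats, sensitivity=1.0, epsilon=0.5)
print(shared)  # noisy values leave the device; raw telemetry stays local
```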

The Future of Privacy Forum has also promoted the term “body-based data” to describe the unique data generated in XR environments, a category that, unlike biometric data, has no settled legal definition.

Awareness and Education

Both Heller and Nouha Ben Lahbib, a project manager for an XR startup incubator, stress the importance of awareness and education regarding XR privacy for developers, users, and the general public.

Conclusion

While XR technologies offer exciting possibilities for innovation and creativity, they also present significant privacy challenges that require urgent attention. The discussion concludes with a call for proactive measures to develop legal, technical, and ethical standards that address the unique challenges posed by XR technologies. As these immersive technologies continue to evolve, it is crucial to ensure responsible development and use that prioritizes user privacy and safety in the emerging landscape of the embodied web.

For further information or inquiries, Brittan Heller can be contacted at brittan.heller@stanford.edu.

Session Transcript

Brittan Heller: I am a professor at Stanford Law School. I teach international law and study new forms of computing hardware like 3D computing, spatial computing, and AI. I’ve done so for about eight years at this point. What I’m going to talk to you about today is what happens when AI grows legs and starts walking around amongst us. I know this is a little different than most of the content that we get at IGF, but I think that this is a forum where we can talk about the future of computing and the kind of privacy it requires. Imagine this. The three of us are playing a car racing game in virtual reality. So we put on the headsets, and what car do you pick? What type of car do you pick? So you pick a Volkswagen buggy. You do? What kind of car do you pick? A Bugatti. That’s a good one. Maybe something fast and pure, like a racing car or something. I don’t know. I pick a cherry red McLaren. And when I see that car, I race and I beat your VW buggy and even your Ferrari, I think you said, but I really like this car. And what happens is that my heart starts to race, my pupils dilate, my voice changes: bodily reactions that become data, because I really like what I’m seeing. Later on in virtual reality, I go about my day. I check my email and I see I’ve gotten advertisements about why now is a great time to renew my auto insurance. I go into a social club, and somebody who looks like a person that I find to be very attractive asks me what car I drive. Finally, I go back into the game and I see the red McLaren go by, driven by somebody who looks more than a little bit like me. This sounds like science fiction, but all of the capabilities I talked about are already present and deployed to some extent in virtual reality environments. The hardware tracked my heart rate increase, my body’s instinctual reactions when I saw something I liked, my pupil dilation rate, and my gaze vectors in particular. And these types of preferences and behaviors are the kind of data that can be shared with advertisers and data brokers in most jurisdictions. So it’s not distant, and it’s not hypothetical. It’s not just virtual reality; it’s actual reality, happening now. We’re transitioning away from traditional flat-screen computing, and what this shift reveals is a critical issue: making sure that our frameworks match the sophistication of the new 3D computing technologies. So over the last few years, we’ve seen the rise of spatial computing. And when I say spatial computing, I mean technologies that blend physical and digital realms. These are called a couple of things. The term that seems to be winning is extended reality, or XR. You can see that with legislation going out, and you can see it with the types of hardware that are getting the most investment: they’ll call it extended reality. But you’ll also see terms like virtual reality, augmented reality, or, to a lesser extent now, mixed reality. Companies that were selling what they called mixed reality headsets are now phasing them out. But XR is the term that seems to be winning. These technologies allow for immersive experiences, and they are transforming industries like gaming and healthcare and education in particular. This is because, unlike traditional computing platforms, XR devices collect deeply, deeply personal data, including body movements, eye tracking, and your physiological responses to stimuli. But they also create a record of the stimuli itself that you are reacting to.
And this makes it the richest of any data flow that we’ve experienced in a computing platform. It’s the same reason that people get excited about XR, where they say this is the best tool for learning we’ve ever developed. A study just came out from Harvard Business School verifying that. But that’s the reason: because there are these reciprocal data flows from your body to the computer and back again. It also creates really significant privacy risks that were not contemplated when we were writing laws for the traditional flat-screen internet. Some of the challenges to traditional privacy protections are that, basically, and I say this as a professor of international law, these laws around the world are not equipped to handle the nuances of immersive technologies. One example is that opt-out mechanisms were kind of the standard that many legal regimes relied on: you could opt out of your data collection. But that’s not effective when you’re looking at a 3D computer. Spatial computing relies on body-based data for functionality. The way these headsets are built, you have six cameras facing in, six cameras facing out. And you need that to position yourself in physical space to put the digital overlays on it. You also need it to calibrate the device so that you don’t feel nauseous or seasick when you’re using it. So if you take out this eye tracking information, you can’t use the computer. So having these opt-out mechanisms for sharing your biometric data, which I’ve seen a lot of legal proposals sort of contemplate, just won’t work based on the way the computers actually function. These bodily biometric responses can also be exploited for targeted advertising without users being fully aware of what’s going on. Behavioral data, like head and hand motions, is actually unique enough to identify individuals. And when I started doing this work, I was saying that privacy law may not be the best regime to handle these data flows, because privacy law is premised on personal identifying information, and until about two years ago, you couldn’t uniquely identify a person from these data flows. But last year, there was a study that came out from Berkeley. It used VR motion data from Beat Saber, which is the most popular video game you can play in XR. If you haven’t tried it, it’s actually really fun. Blocks come at you and you chop them to music. Great exercise. There was a publicly available data set with the locomotion of how people were playing the game. The study was by Vivek Nair, and it demonstrated how over 40 personal attributes, including age, gender, even substance use and political affiliation, could be inferred from motion patterns alone, all from a publicly available data set. There were other studies, done by both Stanford and Berkeley, that found that the way you tilt your head and point is as physically identifying as your fingerprint. There, the same kind of data set was used to try to identify one person, and it took 90 seconds of recorded data of the way you tilt your head and point. Stanford did it first with about 2,000 people, and they were able to uniquely identify one person. Berkeley redid the study with 55,000 people. So not one person in a university class, but one person in a football stadium. And they were able, based on the way you moved, to identify one person out of the crowd by your telemetry.
So in many privacy laws, we talk a lot about our digital fingerprints, but when you look at a 3D version of the internet, the way you move is as fundamentally identifying as what you say and the kind of mosaic of information available about you. The move to 3D technology also brings different risks that extend beyond traditional concerns. I think foremost, based on what I described, is the privacy invasion. Sensitive data can be obtained from the eye tracking information that you need to calibrate these devices. And by that, I mean the example that I gave at the beginning about the car racing game, the way that your eyes react to the light. You have six cameras in, six cameras out, and it’s normally an infrared camera looking at your eyes. It gauges the way your pupils dilate in response to stimuli, and that can also give you medically significant data; most people, and most legal structures, don’t understand that it is that rich. Through your pupillometry information, I could tell you whether or not you were likely to be telling the truth. I interviewed one of the first creators of these headsets, who worked for the U.S. military, and he asked me why I would want to put a polygraph with six cameras on my face. You can tell whether or not somebody is sexually attracted to a person that they’re looking at, which is why I had the example of the very sexy person at the bar. So you can tell somebody’s protected characteristics, like their sexual orientation, through their involuntary bodily responses to what they’re looking at. There’s no way to control your pupils dilating when you see a person you like. Finally, there are other physical and mental health indicators contained in these data sets, and they are preclinical signs, so they’re things your doctor does not know about you yet, things your doctor doesn’t necessarily know to look for at this point. A lag in your pupil dilation can be a sign of Parkinson’s disease, Huntington’s disease, autism, schizophrenia, or some forms of ADHD. So this is very rich data, and significantly, data that is protected by human rights laws and is supposed to be protected data in many, many jurisdictions around the world. In many jurisdictions, there is also no bar on this type of personal information being shared or sold without consent. So there’s a possibility that this could be misused and abused by creating targeted advertising to you based on personal characteristics you weren’t aware you were giving away, or things that you and your doctor don’t know about you yet. Profiling and surveillance risks can also increase with the granularity of the data collected. And XR is not just the self-contained headsets. How many people here have tried a virtual reality headset? A couple of people. How many people have used a Snapchat or Instagram lens or filter on your pictures? How many people have used a QR code to order food at a restaurant? So congratulations, you’re all part of the embodied web. If you think about it, it’s digital overlays on physical space, not necessarily a self-contained, Tron-looking, video-game-playing headset. That’s the type of hardware that we need to be aware of and thinking about when encompassing these new privacy risks. With the profiling and surveillance risks, we’ve actually seen this come mostly out of the video game context, where some video game companies who had massively popular games around the world were criticized by users for effectively building real-time location trackers.
So when you played their game, it showed where your location was in relation to other people playing the game, and users were not aware that that was being transmitted as they were playing a game like Pokemon Go. So people talked about it as a new form of stalkerware, where as you were playing the game and using the digital overlays, it was actually transmitting your physical location to other people with minimal levels of consent. That may just be the way that some of the games operated, but it is a risk when you are playing with something that carries this physical-digital hybrid information. I think the real risk is that, and I come from an American jurisdiction, so at least under American law, behavioral and inferential data is just starting to be included in privacy law. It’s not a very common thing, and it’s really not common around the world unless you live in a jurisdiction that focuses on neuro-rights. So unless you’re from Chile, basically, it’s not necessarily going to be covered. There are people who argue that it is covered under GDPR and in European contexts, but it’s not clear that these applications were contemplated in the formation of the law, and sometimes the laws are written before we even know what the overlays or the headsets are actually going to look like. Hopefully, we’ll get there. Behavioral and inferential data can be exploited and used to influence users or push people towards purchases or beliefs, and there’s a high risk of targeted manipulation when you’re looking at this type of context. It would not be a tech talk today if we weren’t talking about generative AI. So, recent developments for generative AI in 3D environments. In the last few weeks, we’ve actually seen some pretty cool developments using generative AI to create content for 3D virtual worlds, and I think this is really exciting. I know I talk a lot about the risks of this, but this is the stuff that creates the future of the internet and makes it accessible to people without an engineering background. NVIDIA has tools that can now automate the creation of virtual worlds from a text-based input. MIT’s diffusion models transform 2D images into realistic 3D shapes just based on a description. OpenAI has Shap-E, which enables 3D model creation from text or image inputs, making 3D design even more accessible. Google DeepMind’s Genie builds interactive 3D environments from text prompts, and this enhances training and immersive experiences. Basically, at this point, if you are familiar with generative AI tools, you can go from a text-based prompt to a navigable 3D world in a way that used to take video game studios six months to 18 months to build. It’s really impressive. I think it was two weeks ago we saw the creation of a 3D world from one still photographic image. So this is not a future concern if you’re actually looking at the trajectory of the hardware. This is something that is incumbent on us now to look at, when people can create these types of worlds through a narrative. They open up creative possibilities, but again, they will also result in new privacy and ethical concerns. So the way that we should look at this is trying to integrate privacy by design principles into these technologies as they continue to evolve. I could throw more developments and more companies at you, but unless they’re thinking about privacy and safety concerns at this point, it dampens my enthusiasm for the creative possibilities. What this means is that we have a need for new privacy frameworks.
We need new measures to address the unique challenges that come with 3D spatial computing. So first is identifying gaps in existing laws; for example, GDPR has limited coverage of XR-specific data. I think this is a very critical first step. Solutions from jurisdictions around the world should include protections for environmental data and body-based data. And when I say environmental data, it’s because of the way the hardware works, like we described at the beginning: six cameras in, six cameras out. Or, with other headsets, more sensors facing out, so you don’t even need a controller. You can use your hands to navigate the world. Yeah, I see you’ve used it. Or the absolute newest ones use your eyes, so eye tracking, or can even use your thoughts to control the controllers, in kind of a very rudimentary form of brain-computer interface. So you see, we’re getting into territory where privacy concerns will be more serious when we look at the way that the hardware will develop in new forms. Solutions need to include protection for environmental data. When we’re looking at the way that trust and safety regimes will work in this, normal flat-screen trust and safety regimes look at the conduct and content of a world. These need to encompass conduct, content, and environment. And all the examples I just gave you about generative AI being able to create your immersive world really make it clear now that the environment is another vector that traditional content moderation systems don’t look at. They haven’t had the technical capability to do that in the past, but now that computer vision-based systems are getting better and are helping generative AI walk off the screen and into the world, we are going to have to think about different ways to help keep those environments safe and to provide adequate controls to help companies meet their legal obligations, if content moderation is found to apply to 3D worlds in addition to 2D platforms. I say body-based data because there’s another legal hole where biometric data does not necessarily include some of the risk factors that I’ve talked about in every jurisdiction yet. So certain groups like the Future of Privacy Forum have started saying body-based data, which does not have a legal definition, as opposed to biometric data, which does. I think, looking at this from a comparative law perspective, you may be able to say biometric data depending on the jurisdiction you’re from, but that means you have to really look at what is encompassed under those types of protections in your home state. I would also look at the way your home jurisdiction treats new challenges like neural data. I described how the cutting-edge systems will let you control the way you interact with the environment through either your thoughts or your eyes. If neural data is not included as a protected category of data in your jurisdiction, you will have a problem. So you need to start thinking about this stuff now, and not just in the limited context of brain-computer interfaces, but looking at it as a more general type of data category, and include eye tracking in it, because your eyes are not just the window to your soul in poetry. When you’re looking at the sensors that we need for 3D computing, your eyes are the way you identify and access the central nervous system. It may not be as sophisticated as reading your thoughts, but it’s basically reading your thoughts.
So, looking at some of the further implications for rights and safety, the kinds of things we’re looking at with spatial computing actually affect our fundamental human rights. So we have to have strong safeguards to protect data from unauthorized use. And with this, technological innovation can align with individual rights to assure ethical deployment. In the last panel that I was privileged to speak on, I said that it’s like we have a second bite at the apple. All puns intended if you work for Apple. But we have a chance to look at this new ecosystem and create laws and standards, technical, legal, and ethical standards, that will really ensure that users have control over their personal data in a different way than we saw with the evolution of the flat-screen internet. And I think that’s really exciting. But it does create an obligation to look at this proactively in a way that so far we haven’t. Transparency and user control over our personal data will be integral in building trust. And this trust is important because then we can support broader goals, like some of the Sustainable Development Goals through the UN. So there can be a wider application of this, if you want a further rationale for developing these types of standards in your jurisdiction. It can work under reducing inequalities, which is Sustainable Development Goal 10, or maybe fostering strong institutions for justice, Sustainable Development Goal 16. There are lots of ways that you can look at the uses of 3D computing now, which are fundamentally about industry and education and creativity, and see how that intersects with your national plans. Oh, and Sustainable Development Goal 9, industry, innovation, and infrastructure. All of this comes together in what I’ve been calling the embodied web. And that’s because of this reciprocal relationship with information gathered from your body by sensors, and it’s the sensors that separate it from the traditional flat-screen web and computers. You feed your information to the computer, it calibrates and sends it back to you, and it’s this circular relationship. Both sides are needed to create the environment that you want to live in, this new type of web, which is why I’m calling it the embodied web. And it won’t just be virtual reality; you’ll see lots of things. I just had a 20-minute conversation with a robot down the street. Or I guess down the road. So, it’s coming, and it’s exciting, but it also means we should be mindful. There are factors in immersive computing: presence, immersion, embodiment. I could go into what those mean, but basically, these are all psychological characteristics that work together and that also make virtual reality different from flat-screen computing, because it feels real to your cognition. You process all of your interactions in VR through your hippocampus. It is your actual reality when you’re in it, and you respond to it like your actual reality, which makes any of the harms that you experience in there even more acute. To the point where the UK public prosecutor’s office has opened an investigation into the sexual abuse of a minor in the metaverse, because of the psychological impact on the victim. So, states are slowly starting to look at this not as a separate reality, but as an extension of people’s actual lived experience. Because it feels real, it fundamentally changes how users perceive and interact with content.
And this transition creates immense opportunities across industries. But we will have to prioritize user rights and human dignity. Spatial computing must evolve responsibly and balance innovation with ethical considerations. So, if you’re looking for four steps into how to start approaching this at home: you can do research on XR privacy implications, based on how you see it being deployed by your companies and your government. You can collaborate, and really engage with user groups and policy discussions. You can innovate, and look at privacy-preserving XR technologies. So it may not be about limiting access to data, like we have done for flat-screen computing, but about developing privacy-preserving technologies on the other end that allow the technologies to be used and calibrated but still protect people’s right not to be personally identified. And you can educate, to raise awareness about 3D computing risks and benefits. Because it’s not science fiction; the time for 3D computing is here. So we should start thinking about it now to get ahead of all of these risks, so that we can maximize the benefits. I know my time is almost up. I wanted to leave a few minutes for questions if we have any, and I’m happy to stay afterwards if that’s more comfortable for people.

Nouha Ben Lahbib: Thank you for this insightful session and information. I’m Nouha, and I’m a project manager for an incubator for creative startups that use new technologies like VR and XR. Today, hearing about these challenges, especially around data, I realize we are simply pushing them to develop these kinds of experiences, especially because they enhance our art and cultural identity, and the data they work with is not otherwise available in AI tools or in the digital space. So today I’ve learned that, as a developer of immersive experiences, you need to be aware of what types of data you collect, how you can offer privacy for your customers’ data, and how to manage it. Developers need to inform their clients and customers about these important points because, yes, they are now using AI to develop 3D models and offering VR experiences, especially for events. A lot of people are putting on VR headsets, and data is being collected from them; are people aware of that? So maybe this is not a question, but I want to learn more about this subject so that, through a program I run called “Let’s Talk,” we can offer them this insightful topic to discuss. It’s important to be aware that, sure, you are developing a new experience, but you also need to know what types of data are involved and how to manage that data for your clients and customers.

Brittan Heller: Thank you. I think that’s really insightful. I’ve done a lot of workshops for national governments with groups of their top content creators in 3D, to actually have people think about privacy and safety and, if they use cloud computing, how they’re exposing people’s data elsewhere. I’ve also talked to startups and hardware providers in the country’s home jurisdiction so that they look at privacy-preserving technologies, so that individuals don’t have to carry this burden. They can be aware of the risks and see the ways to mitigate them, and it makes for a more responsible ecosystem. It looks like our time is up. Thank you very much for coming today. My email is brittan.heller at stanford.edu, and I’m very happy to continue the discussion later. Thank you.

Brittan Heller

Speech speed: 139 words per minute

Speech length: 3,988 words

Speech time: 1,711 seconds

XR devices collect deeply personal data including body movements, eye tracking, and physiological responses

Explanation

Extended Reality (XR) technologies gather highly personal information from users. This includes data on physical movements, eye tracking, and bodily responses to stimuli.

Evidence

Example of car racing game where user’s heart rate, pupil dilation, and voice reactions are tracked.

Major Discussion Point

Privacy Risks of Extended Reality (XR) Technologies

Agreed with

Nouha Ben Lahbib

Agreed on

XR technologies collect sensitive personal data

Behavioral data like head and hand motions can uniquely identify individuals

Explanation

The way people move their heads and hands in XR environments is unique enough to identify specific individuals. This creates a new form of biometric data.

Evidence

Studies from Berkeley and Stanford showing that 90 seconds of recorded motion data can uniquely identify a person out of thousands.

Major Discussion Point

Privacy Risks of Extended Reality (XR) Technologies

Eye tracking data can reveal sensitive medical and personal information

Explanation

Eye tracking technology in XR devices can capture data that reveals highly sensitive information about users. This includes potential medical conditions and personal characteristics.

Evidence

Examples of eye tracking data revealing truthfulness, sexual attraction, and preclinical signs of conditions such as Parkinson’s disease, Huntington’s disease, autism, schizophrenia, and ADHD.

Major Discussion Point

Privacy Risks of Extended Reality (XR) Technologies

Current privacy laws are not equipped to handle nuances of immersive technologies

Explanation

Existing privacy laws were not designed with XR technologies in mind. They fail to address the unique challenges and data types associated with immersive experiences.

Evidence

Example of opt-out mechanisms being ineffective for spatial computing due to the necessity of body-based data for functionality.

Major Discussion Point

Challenges in Regulating XR Technologies

Opt-out mechanisms for data collection are not effective for spatial computing

Explanation

Traditional opt-out methods for data collection don’t work well with XR technologies. This is because spatial computing relies on certain types of data for basic functionality and user comfort.

Evidence

Example of eye tracking data being necessary for device calibration and preventing nausea in users.

Major Discussion Point

Challenges in Regulating XR Technologies

Existing privacy laws have limited coverage of XR-specific data

Explanation

Current privacy laws do not adequately cover the types of data collected and used by XR technologies. This leaves gaps in protection for users of these immersive technologies.

Evidence

Mention of GDPR having limited coverage of XR-specific data.

Major Discussion Point

Challenges in Regulating XR Technologies

Need for new privacy frameworks to address unique challenges of 3D spatial computing

Explanation

The unique nature of XR technologies requires new approaches to privacy protection. These frameworks need to account for the specific types of data and interactions in 3D spatial computing environments.

Evidence

Suggestion to include protections for environmental data and body-based data in new privacy frameworks.

Major Discussion Point

Challenges in Regulating XR Technologies

Recent tools enable creation of virtual worlds from text or image inputs

Explanation

New generative AI tools have made it possible to create complex 3D virtual environments from simple text or image inputs. This dramatically reduces the time and expertise needed to create immersive digital worlds.

Evidence

Examples of tools from NVIDIA, MIT, OpenAI, and Google DeepMind that can create 3D environments from text or image prompts.

Major Discussion Point

Advancements in Generative AI for 3D Environments

Generative AI opens up creative possibilities but also raises new privacy concerns

Explanation

While generative AI tools for 3D environments offer exciting creative opportunities, they also introduce new privacy and ethical challenges. These need to be addressed as the technology develops.

Major Discussion Point

Advancements in Generative AI for 3D Environments

Strong safeguards needed to protect data from unauthorized use

Explanation

Given the sensitive nature of data collected by XR technologies, robust protections are necessary to prevent misuse. This is crucial for protecting individual rights and ensuring ethical deployment of these technologies.

Major Discussion Point

Implications for Rights and Safety

Opportunity to create new laws and standards for user control over personal data

Explanation

The emergence of XR technologies provides a chance to develop new legal and ethical standards. These can be designed to give users greater control over their personal data than was achieved with traditional internet technologies.

Evidence

Reference to this being a ‘second bite at the apple’ in terms of creating user-centric data protection standards.

Major Discussion Point

Implications for Rights and Safety

Psychological impact of XR experiences necessitates considering them as extensions of lived reality

Explanation

XR experiences can have significant psychological effects on users, feeling as real as physical experiences. This requires treating these digital interactions as extensions of real life, particularly in legal and ethical contexts.

Evidence

Example of UK public prosecutor’s office investigating sexual abuse of a minor in the metaverse due to psychological impact on the victim.

Major Discussion Point

Implications for Rights and Safety

Need for education on 3D computing risks and benefits

Explanation

As 3D computing technologies become more prevalent, it’s crucial to raise awareness about both their potential benefits and risks. This education is necessary for informed use and development of XR technologies.

Major Discussion Point

Awareness and Education on XR Privacy

Agreed with

Nouha Ben Lahbib

Agreed on

Need for awareness and education on XR privacy

Nouha Ben Lahbib

Speech speed: 129 words per minute

Speech length: 272 words

Speech time: 126 seconds

Developers need to be aware of data collection and management in XR experiences

Explanation

Creators of XR experiences should understand the implications of data collection in their products. This awareness is crucial for responsible development and use of immersive technologies.

Major Discussion Point

Awareness and Education on XR Privacy

Agreed with

Brittan Heller

Agreed on

XR technologies collect sensitive personal data

Importance of informing clients and customers about data privacy in XR

Explanation

It’s essential for XR developers to communicate clearly with their clients and end-users about data privacy issues. This transparency is key to building trust and ensuring ethical use of XR technologies.

Major Discussion Point

Awareness and Education on XR Privacy

Agreed with

Brittan Heller

Agreed on

Need for awareness and education on XR privacy

Agreements

Agreement Points

XR technologies collect sensitive personal data

Brittan Heller

Nouha Ben Lahbib

XR devices collect deeply personal data including body movements, eye tracking, and physiological responses

Developers need to be aware of data collection and management in XR experiences

Both speakers acknowledge that XR technologies gather highly sensitive personal data, which requires careful management and awareness from developers and users.

Need for awareness and education on XR privacy

Brittan Heller

Nouha Ben Lahbib

Need for education on 3D computing risks and benefits

Importance of informing clients and customers about data privacy in XR

Both speakers emphasize the importance of educating developers, clients, and users about the privacy implications and risks associated with XR technologies.

Similar Viewpoints

Both speakers recognize that the current understanding and regulation of data privacy in XR technologies are inadequate, and there’s a need for increased awareness and potentially new frameworks to address these challenges.

Brittan Heller

Nouha Ben Lahbib

Current privacy laws are not equipped to handle nuances of immersive technologies

Developers need to be aware of data collection and management in XR experiences

Unexpected Consensus

Psychological impact of XR experiences

Brittan Heller

Psychological impact of XR experiences necessitates considering them as extensions of lived reality

While not explicitly agreed upon by multiple speakers, Brittan Heller’s point about the psychological impact of XR experiences being treated as extensions of real life is an unexpected and significant consideration in the discussion of XR privacy and regulation.

Overall Assessment

Summary

The main areas of agreement revolve around the sensitive nature of data collected by XR technologies, the need for increased awareness and education on XR privacy, and the inadequacy of current privacy frameworks to address the unique challenges posed by these immersive technologies.

Consensus level

There is a moderate level of consensus between the two speakers on the importance of addressing privacy concerns in XR technologies. This agreement implies a growing recognition of the need for new approaches to data protection and privacy in the context of immersive technologies, which could potentially drive future policy discussions and technological developments in this field.

Differences

Overall Assessment

Summary

The speakers shared concerns about privacy implications of XR technologies and the need for awareness and education.

Difference level

Minimal to no disagreement. Speakers were largely in agreement, with Nouha Ben Lahbib’s question reinforcing Brittan Heller’s points about the importance of data privacy awareness in XR development.



Thought Provoking Comments

Imagine this. The three of us are playing a car racing game in virtual reality… Later on in virtual reality, I go about my day. I check my email and I see I’ve gotten advertisements about why now is a great time to renew my auto insurance.

speaker

Brittan Heller

reason

This opening scenario vividly illustrates how XR technologies can collect and use personal data in ways users may not expect, making abstract privacy concerns concrete and relatable.

impact

It set the tone for the discussion by immediately highlighting the privacy implications of XR technologies in a way that captured attention and made the topic feel relevant and urgent.

Spatial computing relies on body-based data for functionality. The way these headsets are built, you have six cameras facing in, six cameras facing out. And you need that to position yourself in physical space to put the digital overlays on it. You also need it to calibrate the device so that you don’t feel nauseous or seasick when you’re using it. So if you take out this eye tracking information, you can’t use the computer.

speaker

Brittan Heller

reason

This explanation reveals a fundamental challenge in protecting privacy in XR – that the very data that raises privacy concerns is essential for the technology to function properly.

impact

It deepened the discussion by highlighting the complexity of the issue and the need for novel approaches to privacy protection that go beyond simple opt-out mechanisms.

There was a study that came out from Berkeley… It used VR motion data from Beat Saber, which is the most popular video game you can play in XR… The study was by Vivek Nair and it demonstrated how over 40 personal attributes, including age, gender, even substance use and political affiliation could be inferred from motion patterns alone.

speaker

Brittan Heller

reason

This reference to scientific research provides concrete evidence of the extent to which seemingly innocuous data can reveal sensitive personal information in XR environments.

impact

It elevated the discussion from theoretical concerns to documented risks, underscoring the urgency of addressing privacy in XR technologies.

Through your pupillometry information, I could tell you whether or not you were likely to be telling the truth… You can tell whether or not somebody is sexually attracted to a person that they’re looking at… Finally, there are other physical and mental health indicators that are contained in these datasets and they are preclinical signs, so they’re things your doctor does not know about you yet.

speaker

Brittan Heller

reason

This explanation of the depth and sensitivity of information that can be gleaned from XR data reveals privacy risks that go far beyond what most users might expect.

impact

It significantly expanded the scope of the privacy discussion, moving it from concerns about targeted advertising to potential impacts on personal relationships, health privacy, and even human rights.

In the last few weeks, we’ve actually seen some pretty cool developments using generative AI to create content for 3D virtual worlds, and I think this is really exciting… Basically, at this point, if you are familiar with generative AI tools, you can go from a text-based prompt to a navigable 3D world in a way that used to take video game studios six months to 18 months to build.

speaker

Brittan Heller

reason

This comment highlights the rapid pace of technological development in XR and AI, showing how quickly the landscape is changing and potentially outpacing regulatory efforts.

impact

It shifted the discussion to consider not just current privacy concerns, but also the need for forward-looking policies that can adapt to rapidly evolving technologies.

Overall Assessment

These key comments shaped the discussion by progressively revealing the depth and complexity of privacy issues in XR technologies. Starting with a relatable scenario, the speaker built a comprehensive picture of the unique challenges posed by XR, from the necessity of collecting sensitive data for basic functionality to the unexpected insights that can be gleaned from this data. The inclusion of scientific research and recent technological developments grounded the discussion in concrete realities while also emphasizing the urgency of addressing these issues. Overall, these comments transformed what might have been a speculative discussion about future technologies into a pressing examination of current and imminent privacy challenges.

Follow-up Questions

How can privacy-preserving technologies be developed for XR that allow the technologies to be used and calibrated while still protecting people’s right to not be personally identified?

speaker

Brittan Heller

explanation

This is important to balance the functionality of XR devices with user privacy, as current opt-out mechanisms are not effective for spatial computing.

How can existing laws and regulations be updated to address the unique challenges of 3D spatial computing and XR technologies?

speaker

Brittan Heller

explanation

Current legal frameworks are not equipped to handle the nuances of immersive technologies, creating gaps in protection for users’ sensitive data.

How can environmental data and body-based data be effectively protected in XR contexts?

speaker

Brittan Heller

explanation

These types of data are fundamental to XR functionality but also pose significant privacy risks not covered by existing regulations.

How can trust and safety regimes be adapted to encompass conduct, content, and environment in 3D virtual worlds?

speaker

Brittan Heller

explanation

Traditional content moderation systems are not designed to address the environmental aspects of 3D worlds, creating new challenges for safety and moderation.

How can neural data and eye-tracking information be protected as categories of sensitive data in various jurisdictions?

speaker

Brittan Heller

explanation

These emerging forms of data collection in XR pose significant privacy risks but may not be covered by existing legal definitions of protected data.

How can XR developers and companies be educated about the types of data they are collecting and the importance of managing this data responsibly?

speaker

Nouha Ben Lahbib

explanation

Many developers may not be aware of the extent and sensitivity of the data they are collecting through XR experiences, highlighting a need for education and awareness.

How can the psychological impact of experiences in virtual reality be addressed in legal and ethical frameworks?

speaker

Brittan Heller

explanation

The immersive nature of VR can make experiences feel real, potentially leading to psychological harm that current frameworks may not adequately address.

Disclaimer: This is not an official session record. DiploAI generates these resources from audiovisual recordings, and they are presented as-is, including potential errors. Due to logistical challenges, such as discrepancies in audio/video or transcripts, names may be misspelled. We strive for accuracy to the best of our ability.