WS #211 Disability & Data Protection for Digital Inclusion

18 Dec 2024 06:30h - 08:00h

Session at a Glance

Summary

This discussion focused on digital inclusion and data protection for persons with disabilities in the context of internet governance. Speakers highlighted the importance of involving persons with disabilities in policy-making and technology development processes. They emphasized the need for a more nuanced understanding of disability that goes beyond medical definitions and considers social barriers.

The conversation addressed several key issues, including the challenges of data collection on disability, the importance of digital accessibility, and the risks posed by automated decision-making systems. Speakers criticized the tendency to infantilize persons with disabilities in data protection laws and stressed the need for more agency and autonomy in digital spaces.

The discussion also explored the potential of local digital inclusion initiatives and the role of assistive technologies. However, concerns were raised about the unchecked optimism surrounding AI-powered assistive technologies and the need for greater accountability in their development and deployment.

Speakers highlighted the lack of representation of persons with disabilities in AI fairness conversations and governance policies. They called for more inclusive approaches to technology design and deployment that consider the diverse needs of persons with disabilities.

The importance of accessible education platforms and inclusive pedagogies was emphasized, with UNESCO’s work on guidelines for ethical AI development in education being highlighted. The discussion concluded with a call for more comprehensive definitions of disability in policy frameworks while also addressing concerns about potential misuse of disability classifications.

Key points

Major discussion points:

– The need for more inclusive data protection and privacy frameworks that consider the diverse needs of persons with disabilities

– Challenges in gathering accurate data on disability while maintaining privacy and autonomy

– The importance of involving persons with disabilities directly in technology development and policymaking

– Concerns about AI and automated systems potentially discriminating against or excluding persons with disabilities

– The need for a social model approach to disability rather than a purely medical one

Overall purpose:

The discussion aimed to explore how to make the internet and digital technologies more inclusive for persons with disabilities, particularly in the context of data protection and privacy regulations. It sought to identify gaps in current approaches and suggest ways to center disability perspectives in internet governance.

Tone:

The tone was largely collaborative and solution-oriented, with speakers building on each other’s points. There was a sense of urgency about addressing exclusion and discrimination. The tone became slightly more critical when discussing shortcomings of current policies and AI systems, but remained constructive overall. Speakers emphasized the importance of moving beyond surface-level inclusion to more fundamental changes in approach.

Speakers

– Fawaz Shaheen: Moderator

– Tithi Neogi: Analyst at the Center for Communication Governance

– Angelina Dash: Project Officer at CCG, NLU Delhi

– Eleni Boursinou: Consultant with UNESCO’s communication and information sector

– Osama Manzar: Director of the Digital Empowerment Foundation

– Maitreya Shah: Tech policy fellow at UC Berkeley, affiliate at the Berkman Klein Center for Internet and Society, disabled lawyer and researcher from India

Additional speakers:

– Dr. Mohammad Shabbir: Coordinator of Internet Governance Forum’s Dynamic Coalition on Accessibility and Disability, from Pakistan

Full session report

Digital Inclusion and Data Protection for Persons with Disabilities: An IGF Session Summary

This summary provides an overview of an Internet Governance Forum (IGF) session on digital inclusion and data protection for persons with disabilities. The discussion brought together experts from various fields to address key challenges and propose solutions for creating a more inclusive digital landscape.

Introduction

The session began with an acknowledgment of the collaborative document of best practices being compiled during the workshop. The moderator, Fawaz Shaheen, walked through the workshop's accessibility policy, which asked each speaker to briefly describe their physical appearance in their own words, and noted that on-site participants could follow the session on audio channel 2.

Digital Accessibility and Inclusion

The discussion emphasized the critical importance of making digital technologies and services accessible and inclusive for persons with disabilities. Tithi Neogi, from the Centre for Communication Governance (CCG) at National Law University Delhi, highlighted the need for accessible consent mechanisms and digital services. Osama Manzar, founder of the Digital Empowerment Foundation, stressed the importance of involving persons with disabilities in technology development and service provision, stating, "Persons with disabilities must be part of the ecosystem in all aspects of research, data collection, and implementation."

Eleni Boursinou, from UNESCO, addressed challenges in online education accessibility for learners with disabilities. She shared an example from Rwanda where including teachers with disabilities in digital skills development initiatives proved highly effective.

Maitreya Shah, a lawyer and researcher from India, pointed out the lack of representation of persons with disabilities in AI and technology conversations, highlighting a significant gap in current approaches to digital inclusion.

Data Protection and Privacy

The conversation revealed significant concerns about data protection and privacy for persons with disabilities. Angelina Dash, from the Centre for Communication Governance in India, criticized the tendency to treat persons with disabilities like children in data protection laws, arguing for the need to recognize their agency and autonomy. She advocated for the reintroduction of sensitive personal data as a category in India’s data protection law to provide additional safeguards for vulnerable data.

Maitreya Shah raised concerns about privacy risks associated with AI-powered assistive technologies and data collection practices. This point was further emphasized by audience members who expressed worries about private data of persons with disabilities being used to train AI systems without proper consent or oversight.

AI and Automated Decision-Making Systems

The discussion highlighted the potential risks and challenges posed by AI and automated decision-making systems for persons with disabilities. Maitreya Shah pointed out the dangers of bias and discrimination against persons with disabilities in AI systems, noting that his research at Harvard revealed how AI fairness metrics and governance policies often explicitly exclude disability from their scope.

Eleni Boursinou discussed UNESCO’s work on guidelines for ethical AI development in education, emphasizing the importance of considering the needs of persons with disabilities in these frameworks. The speakers agreed on the critical need for disability representation in AI fairness and governance conversations to ensure that these technologies do not perpetuate or exacerbate existing inequalities.

Disability Data and Definitions

A significant portion of the discussion focused on the challenges and limitations of current approaches to disability data collection and definition. Maitreya Shah criticized the medical or impairment-based approaches to disability data collection, arguing for a shift towards a social model of disability aligned with the UN Convention on the Rights of Persons with Disabilities (UNCRPD).

This perspective sparked a debate about the need for more inclusive and comprehensive definitions of disability in policy frameworks. However, audience members raised concerns about potential misuse or impersonation if definitions became too broad, highlighting the complex balance required in this area.

Conclusion and Future Directions

The discussion highlighted the complex challenges in achieving digital inclusion and data protection for persons with disabilities. It emphasized the need for more inclusive approaches to technology design, policy-making, and data collection that center the perspectives and needs of persons with disabilities.

Dr. Mohammad Shabbir, an audience member, contributed to the discussion by emphasizing the importance of considering the diversity of disabilities and the need for tailored approaches in technology development.

The session concluded with a reminder about the collaborative document for best practices and an invitation for participants to contribute their insights. This document aims to compile practical strategies for improving digital inclusion and data protection for persons with disabilities, serving as a valuable resource for future policy and technology development efforts.

Session Transcript

Fawaz Shaheen: Yes, I think it's working now. Thank you so much. We'll just start our session now. Welcome to all our on-site and our online participants. For our on-site participants, it is channel number 2. You can get in on channel number 2. And we also have a small, I mean, if you, as the session progresses at some point, you can come and check the workshop policy. And there's a shared document that we can all be working on. We'll talk more about it as the session goes, but at the front of the desk is my colleague, Nidhi. She has those QR codes. If anyone wants to scan and get those documents, you can do that. Now, before we start, I would just like to do some housekeeping and check if our online participants are able to speak and come on. So Maitreya, can you unmute? Or Angelina, can you unmute them one by one and see if they're able to speak, if you're able to hear them? Sure, I'll just. Hello, hello. Angelina, you're audible. You're audible, thanks. Yes, morning, morning. You're audible, thank you. Yeah, yeah. Hi, am I audible? Yes, yes, you're audible. Thank you so much. I hope everyone's able to hear them. Now, as I said, this is a session where we would like to talk about some of the most invisibilized conversations, some of the conversations that we often miss out. This is an opportunity to discuss how to make the internet more inclusive in a broader sense, but also more particularly as we all move towards establishing data protection regimes, establishing privacy regimes, including those of us in India. This is a chance to have a conversation about how data protection and privacy regimes can make the internet more inclusive instead of more exclusive. That's the basic idea, the basic sense with which we've started. For our session today, we have a workshop policy to make sure that people are able to access us, people who are joining us both online and on site with diverse abilities, with different kinds of disabilities, they're able to experience this session as well as the rest of us. So just a couple of pointers. We are requesting all of our speakers to briefly describe their physical attributes in their own words. For instance, I'm Fawaz, I'm a bearded man, kind of big, and I'm wearing a gray jacket today, I have short hair. Just something brief like that, whenever you're speaking, it'll be helpful to bring everybody in to make it a little bit more of an inclusive conversation. So thank you in advance for that. And apart from that, we have, if you want a detailed look at the workshop policy, although I'm sure there's nothing very difficult, there's nothing extraordinary that we're asking. It's just basic stuff, being respectful, being inclusive, bringing everyone into the conversation. What we do have is the workshop policy, the QR code is with Nidhi. You can scan that, take a look. You can also use it for your own sessions, customize it, give us feedback. We also have another document during the Q&A session. We'll show you that document. It is a shared Google document in which we'd like to build some code of best practices for an inclusive internet. So that's something we'll invite all of you to participate in. For the online participants, my colleagues, Angelina and Tithi will be sharing them in the chat. For the onsite participants, my colleague Nidhi has a QR code that you can scan and you can get access to. Now, before we begin, before I introduce, this session has been organized and conceptualized by my colleagues at the Centre for Communication Governance, Angelina and Tithi. 
You can see them both on the screen right now. And they’ve also recently authored a policy brief on disability and data protection, which looks particularly at the new law in India, and it looks at its interaction with the Persons with Disabilities Act. And it tries to give a sense of where we are standing and an overall position. So before we begin, I would like to request Angelina, Tithi, just take 10 minutes, walk us through your brief and lay out what you envision for this session. And after that, we have an excellent panel of speakers. I’ll be introducing them one by one and we’ll take this forward. Thank you.

Tithi Neogi: Thank you for that introduction. Hi everyone, I'm Tithi Neogi. I'm an analyst at the Centre for Communication Governance. My pronouns are she and her. I wear glasses, I have wavy hair and I'm wearing a red hoodie today. Over to you, Angelina, you can introduce yourself.

Angelina Dash: Hi everyone. My name is Angelina Dash and I'm a project officer at CCG, NLU Delhi. My pronouns are she and her and today I'm wearing a red jacket and I have wavy hair. Over to you, Tithi.

Tithi Neogi: So today's session is based on the policy brief on data protection for persons with disabilities in India that, like Fawaz mentioned, Angelina and I have co-authored and have been working on for a while. So while India has indeed taken a step forward towards inclusion of persons with disabilities in its data protection framework online, we have identified some gaps in its nuances and we have tried to plug in some loopholes to further advance digital inclusion for persons with disabilities. So some common themes that we have identified in this data protection framework, specifically for persons with disabilities, are the themes of digital access and inclusion, data autonomy and data protection. So I'll start off with something on what we have found on digital accessibility, digital access and inclusion. Specifically, in our research we have identified that digital accessibility is a precursor to persons with disabilities giving meaningful consent on the internet. While the disability rights framework in India guarantees the right to accessible information and communication technologies to persons with disabilities, the data protection law does not really mandate the data fiduciaries to operationalize or implement consent mechanisms that have an accessible interface and can be used easily by persons with disabilities. Also, another thing that we have identified through our research is that the data protection law allows guardians of persons with disabilities to give consent on their behalf. So that reduces or takes away the onus of giving consent from the person with disability and shifts that to the guardian of the person with disability. And this, in turn, takes away the incentive that data fiduciaries might have had to make their consent mechanisms more accessible to persons with disabilities, because now it's not the persons with disabilities who are giving consent directly on these consent mechanisms, but their guardians. So based on our findings on digital accessibility, we recommend making notice and consent mechanisms compliant with accessibility standards and compatible with assistive technologies, and using audiovisual formats in these consent mechanisms, electronic braille, et cetera. I'll hand over to Angelina to continue this discussion on digital accessibility. Over to you.
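To make the recommendation above more concrete, here is a minimal, hypothetical sketch, in the editors' words rather than the presenters', of what an assistive-technology-friendly consent prompt could look like in a web interface. It assumes the WAI-ARIA dialog pattern and WCAG-style practices (native buttons, keyboard focus management, plain-language text); the element IDs and the wording of the notice are invented for illustration.

```typescript
// Editors' sketch only: a consent prompt that screen readers and keyboard-only
// users can operate. Element IDs and notice text are hypothetical examples.

function showConsentDialog(onDecision: (granted: boolean) => void): void {
  const dialog = document.createElement("div");
  dialog.setAttribute("role", "dialog");          // announced as a dialog by screen readers
  dialog.setAttribute("aria-modal", "true");
  dialog.setAttribute("aria-labelledby", "consent-title");
  dialog.setAttribute("aria-describedby", "consent-text");

  dialog.innerHTML = `
    <h2 id="consent-title">Your permission is needed</h2>
    <p id="consent-text">
      We would like to store your contact details to send appointment reminders.
      You can say no, and you can change your mind later in Settings.
    </p>
    <button id="consent-accept">I agree</button>
    <button id="consent-decline">I do not agree</button>
  `;
  document.body.appendChild(dialog);

  const accept = dialog.querySelector<HTMLButtonElement>("#consent-accept")!;
  const decline = dialog.querySelector<HTMLButtonElement>("#consent-decline")!;

  // Move keyboard focus into the dialog so screen-reader and switch users land
  // on the first actionable control rather than the page behind it.
  accept.focus();

  const close = (granted: boolean) => {
    dialog.remove();
    onDecision(granted);
  };
  accept.addEventListener("click", () => close(true));
  decline.addEventListener("click", () => close(false));

  // Escape dismisses the prompt without granting consent.
  dialog.addEventListener("keydown", (e) => {
    if (e.key === "Escape") close(false);
  });
}
```

The same prompt could additionally offer an audio or video explanation of the notice, in line with the audiovisual formats and electronic braille compatibility mentioned in the recommendation.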

Angelina Dash: Thanks, Tithi. So I think another concern that we have in terms of access and inclusion is in the context of limited access to digital services without the consent of a lawful guardian. So under the data protection law in India, persons with disabilities require the consent of their lawful guardian in order to access digital services on the internet. This is concerning because of two scenarios that may arise. What do these scenarios look like? In the first case, we have persons with disabilities who cannot access digital services where the support and consent of the lawful guardian may not be required at all. For instance, maybe something like a digital encyclopedia. In the second case, we have persons with disabilities who are hindered from accessing digital services which provide vital disability resources, like perhaps online communities for persons with disabilities. Access may also be curbed in cases where a conflict of interest may arise between the lawful guardian and the person with disability. This could include access to digital helplines for physical and sexual abuse, among other digital services. So our recommendation in this context is that persons with disabilities must be allowed to access digital services on the internet without the consent of their lawful guardian in certain cases. Over to you, Tithi.

Tithi Neogi: Thanks, Angelina. I'll now speak about the second theme that we have identified, and that is data autonomy. So in India, indirect consent through another person is collected from two kinds of data principals, i.e. children and persons with disabilities. So in the Indian data protection law, children and persons with disabilities have been clubbed together and subjected to similar treatment while giving consent, i.e. their parent (in the case of children) or their lawful guardian (in the case of persons with disabilities) gives consent on their behalf. This treatment leads to infantilization of persons with disabilities on the internet, since their guardians are now being equated to the status of parents, which is not the case, because if we look at the disability rights framework in India, a lawful guardian for a person with disabilities is envisioned to be somebody who aids them or assists them and helps them with the decision-making process in a mutually consultative framework. So in the data protection law that India has right now, lawful guardians of persons with disabilities have the ability to give consent on their behalf completely, without accounting for any mutual decision-making. So in our research, we have recommended decoupling persons with disabilities from children and introducing a separate provision in the data protection law that defines persons with disabilities. This definition for persons with disabilities could take into account temporary disabilities and the various degrees of support that a person with disability might need. We have also suggested that any consent collection processes online be informed by a consultative framework between persons with disabilities and their guardians, which is driven by principles of mutual decision-making. I'll now hand over to Angelina to speak about the third common theme that we have identified. Over to you.

Angelina Dash: Thanks, Tithi. So I think another gap, as you mentioned previously as well, that we identified was in the context of data protection, specifically with regard to the absence of sensitive personal data. So the data protection law that was introduced in India last year came after a long consultative process, and previous iterations of this law had carved out sensitive personal data, or SPD. This was not included in the final law. Now, firstly, what is SPD? SPD was a distinct category of personal data warranting additional safeguards. And what did these safeguards look like? These include specific grounds for processing, including grounds like explicit consent or specific purposes of state action. Now, it's important to question at this stage, why does the need for a separate category of SPD arise in the first place? So there are some concerns which scholars have been raising regarding sensitivity of data being used as a basis for categorization. However, we feel that India's data protection framework is currently at a very nascent stage. And additionally, certain data of persons with disabilities can be more vulnerable, like health data or financial data, and this data is more susceptible to misuse for the purpose of discriminating against persons with disabilities, particularly in terms of employment, health care and social welfare. Therefore, our recommendation in our policy brief is that sensitive personal data should be reintroduced as a category of personal data within India's data protection law. And with this, we'd now like to move from the policy brief to today's IGF session, where we'll be continuing the discourse on centering disability in internet governance more broadly. So through our insights from working on the policy brief and lived experiences from stakeholders who are persons with disabilities, we have gained an understanding of how there are certain gaps in internet governance discourse with regard to disability. And I'd like to hand over to Tithi at this point, and she'll elaborate a bit more on the gaps we identified. Over to you, Tithi.

Tithi Neogi: Thank you, Angelina. So these are some of the loopholes that we have discovered in the discourse on data governance globally as well as in India, and we would really like the participants today to share their experiences and what they feel about this. So the first kind of loophole that we saw was that this explicit categorization of persons with disabilities as data principals in the Indian data protection law happens to be an anomaly of sorts, because we haven't really come across this distinct categorization of persons with disabilities as data principals in any other major statute which has been enacted. So this is something that we have been discussing: whether this separate categorization of persons with disabilities was a good measure, is this going to benefit persons with disabilities, is this the right approach or the wrong approach. We are aware that the GDPR refers to certain vulnerable classes of data principals but does not mention persons with disabilities exclusively. We would like to take this opportunity to get insights from the participants as to whether they feel that the GDPR approach is the way to go, or whether the Indian approach of specifying a certain category of data principals as persons with disabilities, and having specific measures with respect to consent collection from them, is the approach to go with. We are also aware of discussions on the Brussels effect on data governance, and whether GDPR would be a good influence or whether the novel approach that the Indian law is taking is the way forward. We would really like to hear some discussions on this from our speakers as well as from the participants. I will now hand over to Angelina to discuss some other loopholes. Over to you, Angelina.

Angelina Dash: Thanks, Tithi. I think another loophole that we identified was the lack of a global south or a global majority perspective in discourse regarding disability in internet governance. Persons with disabilities are not a homogenous group. Members of this community, especially those from global south or global majority jurisdictions like India, can often face unique and diverse challenges in terms of internet accessibility and data autonomy. This may arise from intersecting marginalization in terms of gender, caste, poverty, illiteracy, as well as broader infrastructural concerns and the digital divide. Internet governance discourse currently does not adequately account for these challenges, and that's where we come in and we aim to highlight some of these issues through the discussions in the workshop today. And with this context and background that we've just provided and the gaps that we have identified, we intend to build upon the work in our policy brief and sort of extend the conversation to also address disability and data protection in the context of AI and automated decision-making systems. We hope to use this session as a forum to facilitate multi-stakeholder conversations and collaboration. This will enable the co-creation of best practices towards digital inclusion for persons with disabilities. With this, I would like to conclude our presentation and I now open the floor for any questions. Fawaz will be assisting us in this Q&A round.

Fawaz Shaheen: Thank you. Sorry, I think we're doing a slight change in the format because we're running a little short of time. So, what we'll do is we'll take all questions together. We'll have two rounds of questions, so the presentation questions we'll also take after the first round. We just had to change it on the fly, sorry. But now, without further ado, let's move towards the conversation. Thank you for starting the conversation. We'll now move towards the discussion. And first off, I would like to invite our on-site speaker, Dr. Osama Manzar, to please join me on the stage. We also have two excellent speakers joining us online. We have Eleni Boursinou, who's a consultant with UNESCO. And we also have Maitreya Shah, a lawyer, researcher, currently a tech policy fellow at UC Berkeley. I'll be introducing all three of our speakers in more detail as I ask them questions. But also, just to remind all of you, the discussion, we'd like to have it as conversational as possible, as open-ended as possible. So, we'll be doing a first round of questions. We'll have four to five minutes each for each of the speakers. Then we'll open it for questions. I encourage all our on-site participants to please ask all our speakers, as well as Tithi and Angelina, about this topic, about this issue. And also, all our online participants, please feel free to put your questions in the chat. Tithi, who's our moderator online, will be taking those questions and relaying them to us for our speakers. So, after the first round of speakers and questions, we'll have another round, and we'll end with a round of interventions. So we have about one hour left. 60 minutes is a good time to do this, I feel. And now to begin the conversation, I think I'll first invite Eleni, who's joining us. Eleni Boursinou, who's a consultant who works with UNESCO's communication and information sector, especially on universal access to information. And Eleni, if you could unmute yourself. We would like to begin this conversation by asking you, what role do you think digital accessibility plays in furthering sustainable development goals? Especially working from your own experience, what would you describe as the role of digital accessibility and inclusive design in enhancing digital autonomy? Not just access, but also digital autonomy for persons with disabilities. But Eleni, over to you. And thank you for joining us.

Eleni Boursinou: Thank you. Thank you for having me and for the invitation. I'm very honored to be on this panel today. My name is Eleni. My pronouns are she, her. Today, I'm wearing a very Indian shirt, and I have wavy hair. You can, I mean, I don't know if my camera can be switched on. But if you want, you can switch on. I think you can switch on the camera. Tithi, can you just check? So thank you very, very much for this, and for the very meaningful question. So, digital accessibility and the SDGs: it plays a very critical role by fostering inclusion and equity, particularly for what you call persons with disabilities, but in general any marginalized groups. So by removing usability barriers, it bridges the digital divide, enabling participation in the digital economy and what we call in the UNESCO communication and information sector the knowledge societies. So this supports the IGF and the Global Digital Compact agenda of leaving no one behind and aligns with the UN Convention on the Rights of Persons with Disabilities, promoting social justice and equity. On the other hand, there are also accessible tools such as OER (open educational resources) and UDL (universal design for learning) that play a significant role in empowering education, that is SDG 4, reducing dropout rates from school, enhancing educational outcomes and promoting lifelong learning. Additionally, accessible digital solutions can address challenges related to gender and disability, that is SDG 5, and can empower women and girls with disabilities to access education, employment and leadership opportunities. And finally, for SDG 17, that is international collaboration, accessibility standards can provide open solutions that contribute to global cooperation, fostering access to information and promoting partnerships. So in the context of data autonomy, digital accessibility and inclusive design play a huge role in enabling individuals with disabilities to engage with and control their data. Accessible data systems and platforms allow users to interpret and manage data independently, ensuring that everyone, regardless of ability, can participate in data-driven decision-making. This is what we call in UNESCO open solutions, that is, solutions that are cost-effective with open licensing; it can be free and open source software or open educational content, and OER platforms provide resources that empower individuals to control their data and engage in lifelong learning. Accessible digital formats and assistive technologies enhance understanding and trust in data-driven systems, while universal and inclusive design mitigate biases in automated decision-making, ensuring fairness and safeguarding marginalized communities from discrimination. So the key takeaway is that embedding digital accessibility and UDL principles in policies and practices can ensure equitable participation in the digital economy and knowledge society. And first, by improving data collection and analysis, AI can support more inclusive and equitable decision-making, ensuring that marginalized communities, including those with disabilities, are considered in policies aimed at achieving the SDGs. And second, a human-centric approach to AI and digital tools is essential for ensuring that the benefits of AI are distributed equitably and contribute to the sustainable development goals, including promoting data autonomy and accessibility. So that's all for me for now, but I'll be happy to answer any questions you might have.

Fawaz Shaheen: Thank you, Eleni. We'll come back to you. And now I'd like to go to our on-site speaker, Osama Manzar. Osama, as we know, is director of the Digital Empowerment Foundation, and he works with a large community network of digital fellows who are not just working on improving access to the internet, but who also have a mandate effectively to train people, to work with people, to make the internet a safer and more inclusive space. And today, particularly, we are very happy to have Osama with us, because recently, DEF has also come out with a report that is looking at ICT for empowerment, inclusivity, and access. And it's a report in which they've spoken to more than 250 persons with disabilities and mapped the various challenges, also the opportunities. And so I would like to invite Osama to share some of those things. But also, to start off, I would like you to talk a little bit about doing this kind of work when you look at the challenges that are associated with gathering data on disability, talking to persons with disability, while also maintaining autonomy, maintaining anonymity where it's required, doing that and balancing it with the need for having good data to work on disability. This becomes an even more urgent question when it comes to issues of census. One thing that your report is talking about is the need for a new disability census in India. So some of those things, I know it's a very broad question, but I'd like to start with that and we can move on from there.

Osama Manzar: Yeah, thank you, Fawaz. I will focus my discussion more on the ground realities and also on the entire community of PWDs, or whatever we call it. We have only four or five minutes, so I'll say that there are three things that are very important. One is that we treat our people with disability as subjects, you know, or I don't want to say object, I would have said object, but more like somebody about whom we need to do something, where there is no role played by themselves, right? Yeah, so exactly. Like we can, you know, exclude it and all that. So that's the very behavior of the doers, whether it's the corporate sector, government sector, or able people, philanthropists, anybody. You know, everybody thinks that there's something that needs to be done. It has been done for so long that if you talk to the disabled people or people with disability, they feel like somebody will do something someday, you know, and we are just waiting, you know? So the whole ability to come out, to assert themselves, demand, or, you know, ask for accountability is almost negligible. You know, almost negligible. I would say that they are not treated better than any other poor people in remote areas, or the people who are caste-wise or class-wise treated as absolutely downtrodden. So I'm coming also from the perspective of large scale. We have about, what, 50 million population? Even more, about 5% of the population of India is people with disability. It's about equal to the indigenous community. They are also about 5%. You know, that's a huge, huge number in India. With that one, and then when we started seeing that in the last 20 years, digital development was becoming more like an enabler, and an automated enabler, for people with disability. And let me also say, people with disability also see it like, now, if I don't have access to information, I am more able with a digital access device, right? If I don't want to talk to people, then I have something to talk to online, and there is this. So it enables me, my confidence, my requirement, and so on and so forth. At Digital Empowerment Foundation, we realized this by going on the ground, that even though when we were going on the ground to provide digital access or infrastructure, we were seeing that disability was almost like invisible. You know, why were they invisible? They were not coming forward. We are not including them. We are not even thinking about them. All the government entitlements that we ought to deliver for them, they are not even coming to take. We don't know how to talk to them. And then we also realized there is a lot of dogmatic, you know, traditional look-down-upon kind of behavior in the community. We look at them with a lot of distance, you know. We don't want to be in close conversation with them or touch them or feel them. They are considered as a curse on the society, you know. That's the last thing that one can imagine if you want to do a multi-stakeholder way of growing things, you know. They are not even part of the stakeholdership. So then, I'll just take the next one minute: what we did is that we started talking to them and including them in doing, rather than talking about what's your problem and what's my problem and all that. We thought that, okay, this person is doing connectivity. You can also do connectivity. If this person is running an access point or a public access point, you can also run an access point. If this person is accessing a computer and finding a payment for somebody, you can also do that. 
So everything that we thought anybody can do, if a disabled person can do it, we started working with them on it. So they become part of the working ecosystem and suddenly they became a social entrepreneur, or entrepreneur, or a provider. So earlier their life was that of a seeker. Now they are more like a provider. And we created not one, but 300, for example, now 500 people who run digital centers, digital access points at the village level, and they provide service to all the able people, actually. And then what we learned from them is that now you can talk about disability, you can talk about their miseries, you can talk about their requirements, and so on and so forth. And that actually brought us at par, now on an equal footing to discuss, right? And then we thought, can we replicate this model? So what I am now coming to is the last part: how, now that they are able people for providing service, for talking, for infrastructure development, for digital access, everything, right? They also have more knowledge than others because they also know extra about the special ability to serve the disabled people. And then we did this entire research and found out that, oh my God, the government has not done the census for people with disability for so long. Why don't we have that? Now we are asking them, you demand it rather than we demand it for you. Because we are also going to demand, we are going to demand the census for the whole country, right? Which is also not done. But can we have a special one only for that? Yeah, it's about 15, 20 years actually. Two decades. Yeah, yeah. One decade is lost and another decade has not come. So how can they want? The second thing, they started telling us that all the government facilities are actually not implemented on the ground. Like the government says there is ICT enablement for access of people with disability, but the whole digital center at a public access point is not even enabled for that. It's not wheelchair friendly. The point is that actually digital inclusion in our country and many other countries makes even the normal people disabled, let alone the disabled people. They can't type, they can't go to the place. You have to do extra. You have to become absolutely audiovisual. And those kinds of things have started coming in between. Third is that, being people with disability, they have more network and data about their own community, right? So they can become a secondary source of information about the special needs or special abilities of the people with disability. And that we must take advantage of this is one of the recommendations that we have made. And the last part is that data in any case, for anybody, is very important, right? About the protection. Why a more protection-oriented approach is required for people with disability is because of the extraordinary deprivation-oriented way of looking at them. So therefore you have to be more protective about them. You have to, you know, make sure that their participation, their sense, is required. And the last part is that do not treat people with disability as if they have, you know, let's say, a mental disability; like our researchers clearly said, the government has created a legislation where you are infantilizing their ability. I mean, they are very much able, but why should somebody else take a decision on their behalf, you know? Just because they cannot walk or they cannot do by hand? No, that's not fair. 
You know, from that point of view, the whole illiterate population in our country, about, you know, 40% of the population who cannot access the internet, is disabled for accessing the internet, just because you are not digitally literate. I mean, how can you do that? So that's very, very important. But my last, you know, point in this one is that we must do conversation, action, intervention with people with disability in everything, in everything, whether you are doing research or data collection or doing something on work, they must be part of the ecosystem. Only then does it become complete, you know, rather than us saying that only the conversation needs to change, only something needs to change. They must be part of it. It's something like we say, don't do a manel, always do a panel, which must have a woman. Similarly, can we do that? There is always a panel with at least one person with disability, you know; can you always have your discussion with one person with disability being part of it, so that we normalize participation of disabled people in the normal conversation? That is what I would like to say.

Fawaz Shaheen: No, I think that's an excellent point. And it's important also to highlight how even sometimes very well-intentioned interventions by civil society, by human rights actors, can also be very infantilizing, patronizing. I think the point you're making is the same point that Tithi and Angelina were also making. And now on that note, I would like to move to our next speaker, Maitreya Shah. Maitreya is currently a tech policy fellow at UC Berkeley, also an affiliate at the Berkman Klein Center for Internet and Society. He's a disabled lawyer and researcher from India, and he has a unique insight into the challenges that persons with disabilities face with regards to digital access and data protection. I would also encourage you to actually go and read some of his writings. You can find a lot of them at his page on the Berkman Klein Center, as well as on the UC Berkeley website. Do read some of his writings. But Maitreya, I would like to thank you for joining us. And also, I'd like to start by asking you, considering the increasing digitization of essential services, what are the gaps in legislation in India regarding the rights of persons with disabilities in the context of data protection? And also, how can policies that we adopt encourage user-centric approaches in the development of technology that is accessed by persons with disabilities? Again, a sort of broader question, but we would like you to come in on that and take us through some of these perspectives with your insight. Maitreya, if you're able to introduce yourself. Can you hear me?

Maitreya Shah: Yeah. Hi. Yes, we can. Hi. Thank you for having us. Thank you so much for inviting me. My pronouns are he, him. And I'm wearing a button-down with a large pair of headphones. I just started my camera. Hopefully, you can see me. I am an Indian with curly brown hair. Sorry, curly hair. So I think this is a great question. Yes, we can see you now. Thank you. Thank you. So I think this is a great question. And I'm wondering what sort of legislation I can talk about that we see. And as I said, I've already covered the issues of the current legislative frameworks in India. I think I'll speak about digital accessibility and data protection a little broadly in the Indian context. So I think to start with, you know, India has always had this issue of privacy for people with disabilities when it comes to digital technologies or even other forms of emerging technologies. A lot of my recent work has been on how, you know, Aadhaar, the biometrics-based system of India, has, you know, not adequately considered the privacy and accessibility implications for people with disabilities. You know, the iris scans and fingerprints that Aadhaar collects often exclude people with disabilities because the algorithms and the infrastructure that they use, quote unquote, treat people with disabilities as outliers or as non-normative. So in India, I think this issue has been long-standing. Aadhaar started, you know, the project started way back in 2009, implementation in 2012. And even as of today, people with disabilities are facing issues in enrolling with the technology and also authenticating their identity, you know, especially with public services such as cash transfers or welfare programs. So this has been, I think, a long-standing issue. But coming to kind of more legislative, more legal, more policy issues in this area, you know, as Tithi and Angelina rightly said, the data protection law of India treats people with disabilities on par with children. And I think there are several issues with this. To start with, you know, the Rights of Persons with Disabilities Act, when it was enacted, in a way takes precedence over other legislations when it comes to disability matters, especially issues such as guardianship. I think my question is if the data protection law even has a prerogative to, you know, design a consent framework where guardianship and other very complex social-legal issues are involved. You know, it is, I think, the disability law that has done this efficiently, and I think that is the right framework to address these issues. But the data protection law suddenly starts a multiplication of, you know, these legislative provisions and starts adding new complexities for people with disabilities, right. And I think, you know, Tithi and Angelina raised this very pertinent question as to whether the GDPR approach is correct or the Indian approach is correct. And I think this is again a very complicated question because, you know, India, although it borrowed heavily from GDPR for the larger legislation that we enacted, when it came to disability, we, I think, did a lot of innovation when we brought in this provision. And I think probably, you know, as I said, we don't need a separate consent mechanism or a separate provision for people with disabilities at all. We might be good with the GDPR way of doing things because, you know, I think the idea is to respect the agency of people with disabilities, not develop new consent mechanisms. 
Instead, the focus should be on, you know, making your technologies privacy-preserving, making your technologies more accessible, you know, seeing that technologies do not unlawfully access the disability or health information of people with disabilities.

I think the focus needs to shift from users to data fiduciaries or the corporations that are building technologies. I think in India, one of the biggest problems that we have had through our legislations is that a lot of onus is placed on people with disabilities. And with privacy, you know, there's been a lot of scholarly research and a lot of criticism of these individual privacy frameworks; individual privacy might not be the best way forward, especially with AI and emerging technologies, where usually user agency is already curtailed. So I think we need to think about making technologies more privacy-preserving rather than putting additional complexities on people with disabilities to, you know, coordinate with their guardians and then work on consent to access even the very basic necessities, because I think digital access and internet access is now a necessity. And I can give you an example of how this is actually playing out on the ground. Recently, I was doing a training session for assistive technology manufacturers in India who are specifically building technologies for people with disabilities. And a lot of them told me how this guardianship provision, this complex consent-based provision in the data protection law, is raising many issues for them, because it's not giving them adequate opportunity to provide their technologies to people with disability. They also wrote to the government saying that, you know, this provision needs an amendment, specifically when it comes to assistive technology. I think this is one very broad issue. The other broad issue is that there has been this inherent trade-off between, you know, accessibility and privacy. So, at times people with disabilities are compelled to give away their privacy to access digital technologies. To give you an example, earlier this year I wrote an article on how these companies are developing automated tools, claiming that these automated tools can fix websites and make them accessible without changing the source code, and how these practices are inherently deceptive. So what it essentially does is, you know, in the garb of making websites accessible, it violates user privacy, because these overlay tools, as they are called, can infer the disability status of individuals through, you know, collecting data on their screen reader usage, or their use of a magnification device, and so on. And so there has been some conversation on this outside India, especially in the United States. In India, this conversation is not happening. And why this is particularly problematic when it comes to privacy is that whereas the adoption of these technologies is, in a way, restricted now in the United States, companies and even the government are increasingly adopting these problematic technologies in India. And to give you an example, you can go back and check this.

The IndiaAI portal managed by the Ministry of Electronics and Information Technology, quote unquote "the knowledge platform for India's AI economy," itself uses an accessibility overlay tool. So I think my question is, you know, we need to think about this comprehensively; digital accessibility is not a tick-box solution. We have to think about this a little comprehensively, especially when it comes to privacy. And there are additional challenges due to AI and emerging technologies that we are seeing that are posing, you know, issues for people with disabilities. And thirdly, I'll very briefly touch on the issue that, you know, India's digital accessibility laws are still very nascent. We have not adequately made our websites, whether government or private, accessible, and I think it's also quite singular, you know, in a way, that a lot of our digital accessibility frameworks in India focus on accessibility for people who are blind but do not adequately cater to people who have other forms of disabilities. So, you know, there are challenges of accessibility and privacy even for people, say, who have learning disabilities like dyslexia, or who are autistic, or who have other intellectual disabilities, and I think our laws are still quite nascent on that front, in that we're not adequately thinking about a cross-disability approach when it comes to accessibility and privacy. So I think with India there are many issues at a larger policy level: people with disabilities are not adequately represented in the conversation, there is a duplication and a lack of harmonization in our regulations, and, you know, I think we're not adequately regulating the larger sector when it comes to preserving the privacy of people with disabilities and ensuring their accessibility at the same time. I'm happy to talk more about this in the question and answer and so on.
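To make the privacy risk described above more tangible, the sketch below illustrates the kind of client-side signals an accessibility overlay script could plausibly combine to infer assistive-technology use. It is a simplified, hypothetical illustration by the editors, not documentation of any real product: the heuristics and the /collect endpoint are assumptions.

```typescript
// Editors' hypothetical sketch: each of these reads is innocuous on its own,
// but combined and transmitted they can act as a proxy for disability status,
// which is the privacy risk discussed in the session.

interface InferredSignals {
  prefersReducedMotion: boolean;   // often enabled by users with vestibular or cognitive disabilities
  forcedColors: boolean;           // high-contrast mode, common among low-vision users
  pageZoom: number;                // heavy zoom can suggest magnifier use
  keyboardOnlyNavigation: boolean; // screen-reader and switch users rarely use a mouse
}

let sawMouse = false;
let sawTabKey = false;
window.addEventListener("mousemove", () => { sawMouse = true; }, { once: true });
window.addEventListener("keydown", (e) => { if (e.key === "Tab") sawTabKey = true; });

function collectSignals(): InferredSignals {
  return {
    prefersReducedMotion: window.matchMedia("(prefers-reduced-motion: reduce)").matches,
    forcedColors: window.matchMedia("(forced-colors: active)").matches,
    pageZoom: window.visualViewport?.scale ?? window.devicePixelRatio,
    keyboardOnlyNavigation: sawTabKey && !sawMouse,
  };
}

// A real overlay might transmit such a profile to its servers; "/collect" is a
// placeholder endpoint used purely for illustration.
function reportSignals(): void {
  navigator.sendBeacon("/collect", JSON.stringify(collectSignals()));
}

setTimeout(reportSignals, 10_000); // profile the visitor shortly after page load
```

Each signal is harmless in isolation; the concern raised in the session is that, once aggregated and transmitted, they become a proxy for disability status collected without meaningful consent.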

Fawaz Shaheen: Thank you, Maitreya. I think that's a good point to pause and get our first round of questions and observations. I know some of the conversation has been very India-centric, but these are problems which are quite universal, and we are fortunate to have a lot of on-site participants from different areas, different countries. So if you have questions or any observations, any intervention from your perspective, we'd welcome that. Tithi, also please feel free to tell us any questions from the chat from our online participants. But yeah, does anyone have a question here or an observation? Yes.

Audience: Thank you for this opportunity. My question is to our in-person speaker, since he tried to be practical. I am coordinating the implementation of the African Union Data Policy Framework. And as part of that, we are holding several stakeholder sessions as part of supporting some African countries to develop their national data policy. And our project wants to be participatory and inclusive. But there is a challenge when engaging persons with disability. So I want to know from your work, what is the best approach when it comes to engaging persons with disability, if there are any recommendations that we can also try? Thank you.

Osama Manzar: OK, very practical question. And thank you, because I have a good answer. So I'll just tell you about one typical African country. Which country did you say that you are from? I am from Ghana, but my work covers several countries. OK, so I've been to Accra, so I can see some of the villages in Accra or in Ghana and many other African countries. So listen, imagine a village where, when you go there, generally people are not connected. People do not know computers. And they still need to educate themselves on computers. There are still people there who want their work to be done, or to withdraw their money, or anything; hundreds of services that you avail are now digitally done, right? That's the scenario everywhere you can see. Now, when you go there, you try to see how we can provide all this service to the people, right? It's very simple: whether you go to a school and you, you know, establish a lab, or you create a public access point where people can come and withdraw money or file an application or whatever. What we did is that, to do that, we found out: is there any disabled person there, preferably a woman, you know, who is ready to learn? Not educated, not necessarily; that's not the basic qualification. Are you ready to take a chance? Are you ready to sit at a computer? Are you ready to handle computers? And all this service that we are talking about, you provide through this center, you know, or this facility. So suddenly, your job was actually just to find a person with disability with an intention to serve people, right? And that itself was such an empowering thing, because now the disabled person who can't even move is sitting and giving you education on the computer, or switching computers on and off, and letting you know that this is the way you can work, this is the way you can fill a form, this is the way you can access YouTube, this is the way you can do something, right? So digital became an ammunition for that person to provide a service. Earlier, he or she was sitting at home and waiting for somebody to do some favor for them, right? Suddenly, and you do this from your home, so you don't have to go to a shop. So everybody is coming there, you know, to get those services. For that same service, people used to go to faraway places, like two kilometers, five kilometers or 10 kilometers, which is very expensive. And also, if you are very highly educated, then you are also very arrogant, you know, and this person is very humble, you know? And then the rest, most of the people, also start coming there because they think that, oh my God, this person is doing so many things, why don't we go there and get it done from this person? And that one example becomes a viral thing for the rest of the people; the imagination changes. And this we did not in one location; now, out of 2,000 locations, 500 locations are run by the disabled people. So the whole narrative changed, the area changed. And then when you ask those questions, which I'm not going to share now, maybe if I get a chance, I will share with you what the statistics, the numbers, say, because 85% of the people are with mobility disability. So you don't have to think about another 15% to be done while you can do 85% of the things. Why don't you just first do the low-hanging fruit? Just the mobility people, just catch them. They are the visual representative of the people with disability. 
So those are the things that we did and we can share with you the whole research that we have done with all these 500 people and how it can actually become a replicative initiative in many other countries.

Fawaz Shaheen: Thank you. We would also encourage more participants to come up with their questions, interventions. Before we move to the next round of discussion, any questions from the online chat? Otherwise we can move to the next round of discussion and come back also. There are no questions on the online chat, but there's one comment from Eleni commenting on Osama Manzar's point on involving persons with disabilities in the process. We could ask Eleni to come in and… Is it better now? No, there's still an echo. Can you ask Eleni to come in? Hello? Yeah, I think we are better now. Sorry for that. Sorry for that. A bit of a disruption. But Eleni, could we ask you to come in? And I think there was a comment that you had. Oh, I'm on mute. Thank you. Yes. So, I was saying that it is very interesting. The point is, I think, in the meanwhile, we can continue the conversation here, especially because you already asked the question about, you know, the work that you're doing at the digital center. The next round of discussion that we're having is around automated decision-making. But before we go there, there is the question that we wanted to ask Maitreya also, and start with you, Mr. Manzar, maybe. Because, is it? Yeah, I think there's some problem. Should we wait a couple of minutes or we can continue? Okay. Yes. What I wanted to ask you was if we could understand a little more about setting up a local digital inclusion initiative, because it's one aspect of what you've been talking about, empowering people, making them providers and enablers instead of, you know, being recipients. But there is also the question that the data around disability is also extremely skewed. We don't have data that adequately represents how persons with disabilities are also a diverse group. So the picture of who is a disabled person is a very settled, very stereotypical picture, if I may say. But through your experience, through your work, including the interventions in the field as well as the research work, you have of course a much better view. At a government level, or when we are trying to design interventions at a larger level, how do you think this diversity among persons with disabilities can be approached? How can that be accounted for?

Osama Manzar: So I would say it calls for innovation in data collection and also in data contextualization. For example, suppose we have data that shows the diversity of various kinds of disability. Right now the tick boxes list disabilities, but there is no tick box for abilities, even though you have to build on the abilities of the same person who has the disability. Let me give an example. Say my hands do not work; that is my disability. Should we only work on making that hand work, or should we work with the fact that my mouth works, my voice works, my eyes work, my mind works, the other parts of my body work, and ask how devices can build on those so that I do not even have to think about my hand? It is like remote controls: we used to operate them by hand, but now most remote controls are voice-enabled, so I control them by voice and do not need my hand at all. That is just one example. So rather than thinking, "I will create something non-handheld," think about what is already voice-enabled, eye-enabled, visually enabled. That is where real innovation and contextualization matter. I am not denying that our census and our data collection on this diverse population must be done. But alongside that, we should put equal effort into cataloguing the ability-oriented features that already exist and can be contextualized. For example, can we have a list of everything a mobile phone does that is directly relevant to persons with disabilities? The mobile device is the most enabling of all devices: it can transliterate, it can read your messages aloud so you do not have to see them, and it does many other things while keeping your data secure. The mobile does all this, but I do not know it all, and many persons with disabilities do not know it either. And if I, who act as an enabler in some communities, do not know it, that is a pity. Can there be a checklist of everything we must apply? A public digital access centre at the village level should have a ramp, yes, but it should have a hundred other things, which is not being done. For example, none of our websites are voice-enabled, none are accessible for people with hearing or visual impairments. That is the first thing we should fix. To do that, you largely just have to make them mobile-enabled; if you do, your content can automatically be read out, rather than staying on a typical or traditional web version. So my last sentence is this: we must do data collection and data solutions with a very high level of contextualization, cross-pollinating all the ability-oriented facilitation that can enable people, rather than focusing only on disability.

Maitreya Shah: Thank you so much. I'd like to add something, and to differ with my friend a little bit here, because I think the problem with the entire data collection effort is not the focus on disability or ability. In fact, the very distinction between ability and disability is a very medicalized conception of disability, because it focuses only on particular organs, senses or impairments. We have come a long way, after a lot of battle and advocacy, in trying to do away with that in society, and especially with governments, who have increasingly relied on impairment-based metrics to collect data about people or to categorize people with disabilities. In India this is especially evident: the 2016 Act recognizes only 21 categories of disability, which is a very narrow way of defining disability. If you look at the UN Convention on the Rights of Persons with Disabilities, it takes a much broader approach. It talks about social barriers, about attitudinal barriers, and disability is defined more from a social model perspective, where you think about how society poses barriers for people with disabilities rather than about the ability or disability of a particular individual. That is why these data collection efforts have been failing. The official data says that people with disabilities are 2 to 5 percent of the Indian population, but according to the World Health Organization, 15 percent of the world's population lives with some form of disability. If you ask me, the number is only going to be higher in the Indian context, because we also have high numbers of people who are poor and who face other forms of marginalization, including lack of access to healthcare, so the number of people with disabilities will be higher still. Our data collection efforts have been failing because we do not want to count those people; we do not want to count everyone who might be eligible to identify as a person with disability. So I feel this ability-disability distinction is very medicalized, and we need to take the UNCRPD approach, take a more social model approach, and think about larger barriers and larger universal access issues when we think about disability and data collection. Thank you.

Fawaz Shaheen: All right. Thank you, Maitreya. I think that's well taken. I could see you nodding along with most of it. We can continue this exchange if you want to.

Osama Manzar: We can. I don't have any disagreement with what you're saying, because whenever you narrate something, there can always be different articulations of different things. The people who deal with data, with laws and orders, that is not how I work. But I totally agree that there is a perception of looking at disability from a very medical perspective, and that is exactly why I mentioned that most of our activities are about not looking at it that way, but looking at the several other abilities a person has and taking a positive stance on those to make things work.

Fawaz Shaheen: Sure. This is an interesting conversation. We are down to the last 20 minutes, so we'll have one more question each for Maitreya and Eleni, and then some time for questions and observations from our participants. Before moving forward, I would like to remind all the participants who just joined us that we have a document with a suggested code of best practices for an inclusive internet. It is a collaborative document; before leaving, please get access to it from my colleague Nidhi, who is standing here. We will continue to take suggestions until the end of the day and will try to get some of them included in the IGF call to action. Now, without further ado, Maitreya, I would like to put the next question to you, especially because of the work you have been doing. Fairness in AI is a conversation that needs to include persons with disabilities much more, and in your last answer you were talking about how most of the automated accessibility plugins we see are inherently deceptive. Some of that, as we said, is because of the corruption of the dataset itself, the fact that it doesn't account for the diversity that exists among persons with disabilities. But my question to you now is this: given this context, how do you think we can begin to ask for accountability from automated decision-making systems, especially from the perspective of persons with disabilities?

Maitreya Shah: Thank you so much, that's a great question. I'd like to break my answer down by the two broad kinds of technologies I see in the market: one, quote-unquote technologies that are specifically designed for people with disabilities, and the other, mainstream technologies that people with disabilities also use as users. The first category covers automated and AI-based technologies, assistive technologies in particular, that are designed specifically for people with disabilities. When we think about accountability here, my first argument is that there is a lot of pseudoscience and a lot of broad, unchecked optimism around AI-powered assistive technologies. Everyone these days seems to be bringing an AI-powered assistive technology to market without understanding what its implications will be for people with disabilities. To give a small example from the care space: people with disabilities such as muscular dystrophy or cerebral palsy may, depending on the severity of the disability, require caregiving support. A lot of AI-powered smart robots are being adopted across the world on the claim that these technologies will be transformative for people with disabilities, without anyone realizing that they might violate the privacy of people with disabilities, or asking whether these technologies are needed at all. Why would you need to replace humans with a robot when it comes to disability caregiving? There are many other issues. There are assistive technologies that claim to fix certain disabilities, like autism, and I think that is deeply problematic, because disability is not an ailment or disease that needs to be fixed, or an abnormality that needs to be fixed. Yet there are many AI-powered technologies currently on the market that claim to, quote-unquote, cure disabilities. So for me, the fairness conversation starts here, by asking whether we need a certain technology for people with disabilities at all. And if we do need it, who decides what is good for people with disabilities? Should it be a corporation sitting somewhere in the US, run by non-disabled people, deciding whether a technology would be good or bad for a person with a disability? Or should it be a person with a disability, or a group of people with disabilities, thinking about these technologies themselves? The second broad point is about mainstream technologies. There is so much AI and so much automation around us, and that is where people with disabilities are very marginalized in the conversation today. A lot of research has focused on racial minorities or on gender, but very little has focused on how AI technologies impact people with disabilities or how fairness metrics can cater to disability. A lot of my recent research at Harvard has been on how AI fairness metrics and AI governance policies both explicitly exclude disability from their ambit. To give you an example, a lot of LLMs these days, like ChatGPT, are trained not to discriminate against people of color or people belonging to racial minorities.
But they are not trained to avoid discrimination against people with disabilities, and in my work I have illustrated how discrimination actually manifests with ChatGPT and other generative AI tools like Gemini for people with disabilities. There are several issues here. There are issues with the data, because this is a long-standing problem: we do not have enough data on people with disabilities, and the data we have is often very biased, because there has been so much societal stigma against people with disabilities in almost every jurisdiction and every country. The second problem is that people with disabilities are usually never considered, never thought about, when a technology is designed or deployed. They are usually not at the table; workforce participation of people with disabilities, especially in the technology sector, is very low, so representation is an issue. And the third is governance policies, which also do not adequately consider disability. When you think about how a technology should be free of biases, or when it poses risks, disability is not considered, and you can see this in other jurisdictions too. The European AI Act, for example, does not consider how a biometric technology might impact people with disabilities and places biometric technologies in a lower risk tier, whereas to me and to a lot of other researchers, biometric technologies pose significant risks for people with disabilities. So there are many issues with the larger AI fairness and AI bias conversations when it comes to disability. To end, I would group this into three broad themes: one, the lack of representation of people with disabilities in these conversations; two, a lot of false optimism and hype around AI-powered assistive technologies without understanding whether they are even effective at what they seek to do, or what the intent is of the companies manufacturing them; and three, the larger marginalization of disability across the different stages of a technology's life cycle, from design to deployment to governance. That is broadly my understanding of the current landscape of automated systems.

Fawaz Shaheen: Thank you, Maitreya. That was very detailed, and I know you were trying to compress a lot into very little time, so thank you for doing that. We have 10 minutes now, so very quickly, Eleni, if we could go to you for three or four minutes. UNESCO has done so much work on open and distance learning. From your own experience, could you talk a little about the challenges of automated decision-making, especially in a field like education, and about some guidelines, ethical safeguards or principles we should keep in mind when approaching AI systems that impact persons with disabilities? And of course, when I say AI systems, I mean AI systems that are absolutely necessary, as Maitreya would say, not systems built just because we want to build them, but AI systems that are necessary or that are going to impact persons with disabilities. Eleni, if you could respond to that in three or four minutes.

Eleni Boursinou: Thank you, thank you, Fawaz. The work UNESCO has been doing in education starts from the recognition that learners with disabilities face a great many challenges in online education. The accessibility gaps in content and in platforms have to be recognized: there are tools that remain inaccessible, often incompatible with assistive technologies, which leaves learners with disabilities excluded. These barriers are also accentuated by digital skills gaps and limited access to resources. A lack of inclusive pedagogies, and of capacity building and training for educators in universal design for learning, further exacerbates these issues. Automated decision-making systems in education pose even more risks for persons with disabilities, because biases in data and algorithms often result in discriminatory outcomes such as biased admissions decisions, assessments or resource allocation. Transparency and accountability remain significant concerns, and ADM systems often make unjust decisions and fail to consider the diverse needs of persons with disabilities. What we are trying to do in addressing these challenges is to work with Member States on guidelines and implementation policies for ethical AI development in education, and on guidelines whose policies cover not only persons with disabilities but, as Mr. Shah said in his comment, a broader definition that includes all marginalized communities and vulnerable groups, including people with autism, dyslexia and other learning difficulties. I am going to share in the chat the links to some of the documents UNESCO has been working on. I also want to end with a field example involving the Rwanda Education Board. We worked with the Rwanda Education Board and the NGO Light for the World to foster digital skills development among teachers who have disabilities, and we really saw how important it is to have teachers with disabilities represented within their communities and the whole education ecosystem. We had a very interesting case study based on that; the teachers provided constructive feedback on the guidelines, and we actually enriched the guidelines for governments based on their feedback. That is all for me; I will share all the links in the chat.

Fawaz Shaheen: Thank you, Eleni. And we have some observations from participants. We'll just get the mic to you, sir. Thank you.

Audience: To introduce myself, I am Dr. Mohammad Shabbir from Pakistan, and I am the coordinator of the Internet Governance Forum's Dynamic Coalition on Accessibility and Disability. I am really impressed by the discussion, and I am sorry that I joined a little late due to another session I was speaking at. As the coordinator of the Dynamic Coalition on Accessibility and Disability, I would just like to flag that we have accessibility guidelines. The Dynamic Coalition is one of the oldest setups within the IGF system, and it has been advising the IGF on organizing accessible meetings for people with disabilities. We have revised guidelines available at the DC booths, which in-person participants can pick up there, and if someone needs a braille copy, that is also available. Secondly, I would agree with the speaker from Harvard, I think it was Maitreya, pardon me if I am pronouncing the name wrong. There are deceptive practices within data systems, particularly when we talk about AI and algorithm-based systems. Many of the things have already been said, but one concern I have as a person with disability is this: we all use a number of AI-based systems, and we all know that data, the oil of AI, is being used to train these systems. As persons with disabilities, we sometimes use these applications and systems for many of our personal documents, personal tasks, some work, et cetera, so a lot of private data of persons with disabilities is being fed into these applications and systems. The same is true for people who are hard of hearing and who use these technologies for interpretation and translation, be it sign language interpretation or otherwise, in their conversations. A lot of data is going into these applications. There are privacy policies, but to my understanding, except for Be My Eyes and a few other applications, many of them have not changed their privacy policies. So these also need to be looked at from a privacy perspective. One last point, which does not need a response, but I want to leave it with you as an afterthought. There is a disability definition in the CRPD which states that people have impairments and that disability occurs when those impairments interact with societal barriers; if those barriers are removed, disabilities are removed. Broadening this definition to include people with learning disabilities, autism and so on is important, but my friends from India would understand that there are also many attempts to abuse these definitions, with people posing as persons with disabilities to obtain advantages meant for them. So while I agree that we need to encompass all disabilities in our definitions, we also need policies that keep impersonators out of the facilities meant for persons with disabilities, so that they cannot come in and take advantage in the name of persons with disabilities. Thank you so much.

Fawaz Shaheen: Thank you so much, Dr. Shabbir. I know there is scope for more conversation, but unfortunately we are very much out of time. Thank you for joining us, even if only for a little while. We are still here, and the document is still up; you can comment on it and leave your suggestions. I would particularly like to thank our on-site speaker, Osama Manzar, for joining us and sharing his learnings and experiences. I would like to thank Maitreya for joining us from such a different time zone and bringing his insights; I am sure we have a lot to learn from you. I encourage everyone to look up Maitreya Shah online and check out his page on the UC Berkeley website, which has some very interesting articles; I was recently reading one on how the AI conversation needs to include the voices of people with disabilities. Also, thank you so much, Eleni, for joining us and sharing your very valuable insights. And thank you, everyone, for taking the time. We will keep this conversation going. See you around. Thank you, everyone. Bye.

Tithi Neogi

Speech speed

0 words per minute

Speech length

0 words

Speech time

1 second

Need for accessible consent mechanisms and digital services

Explanation

Digital accessibility is crucial for persons with disabilities to give meaningful consent online. Current data protection laws do not mandate accessible consent mechanisms for persons with disabilities.

Evidence

The Indian data protection law allows guardians to give consent on behalf of persons with disabilities, reducing incentives to make consent mechanisms accessible.

Major Discussion Point

Digital Accessibility and Inclusion for Persons with Disabilities

Agreed with

Osama Manzar

Eleni Boursinou

Maitreya Shah

Agreed on

Need for digital accessibility and inclusion for persons with disabilities

Osama Manzar

Speech speed

160 words per minute

Speech length

2804 words

Speech time

1050 seconds

Importance of involving persons with disabilities in technology development and service provision

Explanation

Persons with disabilities should be involved in providing digital services rather than being treated as subjects. This empowers them and changes the narrative around disability.

Evidence

Example of setting up digital centers run by persons with disabilities in villages, providing services to the community.

Major Discussion Point

Digital Accessibility and Inclusion for Persons with Disabilities

Agreed with

Tithi Neogi

Eleni Boursinou

Maitreya Shah

Agreed on

Need for digital accessibility and inclusion for persons with disabilities

Differed with

Maitreya Shah

Differed on

Approach to disability data collection

Eleni Boursinou

Speech speed

104 words per minute

Speech length

881 words

Speech time

505 seconds

Challenges in online education accessibility for learners with disabilities

Explanation

Learners with disabilities face significant barriers in online education due to inaccessible content and platforms. There is a lack of inclusive pedagogies and capacity building for educators in universal design for learning.

Evidence

UNESCO’s work with the Rwanda Education Board and Light for the World NGO to foster digital skills development for teachers with disabilities.

Major Discussion Point

Digital Accessibility and Inclusion for Persons with Disabilities

Agreed with

Tithi Neogi

Osama Manzar

Maitreya Shah

Agreed on

Need for digital accessibility and inclusion for persons with disabilities

Maitreya Shah

Speech speed

0 words per minute

Speech length

0 words

Speech time

1 second

Lack of representation of persons with disabilities in AI and technology conversations

Explanation

People with disabilities are often not considered when technologies are designed or deployed. There is low workforce participation of people with disabilities in the technology sector.

Evidence

Example of how AI fairness metrics and governance policies often explicitly exclude disability from their ambit.

Major Discussion Point

Digital Accessibility and Inclusion for Persons with Disabilities

Agreed with

Tithi Neogi

Osama Manzar

Eleni Boursinou

Agreed on

Need for digital accessibility and inclusion for persons with disabilities

Privacy risks from AI-powered assistive technologies and data collection

Explanation

There is unchecked optimism about AI-powered assistive technologies without understanding their implications on people with disabilities. These technologies may violate privacy and promote problematic ideas about ‘fixing’ disabilities.

Evidence

Example of AI-powered smart robots being adopted for caregiving without considering privacy implications.

Major Discussion Point

Data Protection and Privacy for Persons with Disabilities

Agreed with

Angelina Dash

Audience

Agreed on

Concerns about data protection and privacy for persons with disabilities

Risks of bias and discrimination against persons with disabilities in AI systems

Explanation

AI systems and large language models are often not trained to avoid discrimination against people with disabilities. This leads to biased outcomes in various applications of AI.

Evidence

Research showing how discrimination manifests in ChatGPT and other generative AI tools for people with disabilities.

Major Discussion Point

AI and Automated Decision-Making Systems

Importance of disability representation in AI fairness and governance conversations

Explanation

People with disabilities are marginalized in conversations about AI fairness and governance. Disability is often not considered in policies about technology risks and biases.

Evidence

Example of the European AI Act not considering how biometric technologies might impact people with disabilities.

Major Discussion Point

AI and Automated Decision-Making Systems

Limitations of medical/impairment-based approaches to disability data collection

Explanation

Current data collection efforts often fail because they rely on a narrow, medicalized conception of disability. This approach excludes many people who might identify as disabled under a broader definition.

Evidence

Contrast between official Indian data showing 2-5% of the population as disabled, versus WHO estimates of 15% globally.

Major Discussion Point

Disability Data and Definitions

Differed with

Osama Manzar

Differed on

Approach to disability data collection

Need for social model and broader definitions of disability aligned with UNCRPD

Explanation

A social model approach to disability, focusing on societal barriers rather than individual impairments, is needed. This aligns with the UN Convention on Rights of Persons with Disabilities and provides a more inclusive framework.

Major Discussion Point

Disability Data and Definitions

Angelina Dash

Speech speed

150 words per minute

Speech length

820 words

Speech time

326 seconds

Issues with treating persons with disabilities like children in data protection laws

Explanation

The Indian data protection law treats persons with disabilities similarly to children, requiring consent from guardians. This approach infantilizes persons with disabilities and doesn’t account for their autonomy.

Evidence

Comparison of treatment of children and persons with disabilities in Indian data protection law.

Major Discussion Point

Data Protection and Privacy for Persons with Disabilities

Agreed with

Maitreya Shah

Audience

Agreed on

Concerns about data protection and privacy for persons with disabilities

Need for sensitive personal data category in data protection laws

Explanation

The absence of a sensitive personal data category in Indian data protection law is problematic. Certain data of persons with disabilities can be more vulnerable and susceptible to misuse for discrimination.

Evidence

Examples of health and financial data of persons with disabilities being more vulnerable to misuse.

Major Discussion Point

Data Protection and Privacy for Persons with Disabilities

Agreed with

Maitreya Shah

Audience

Agreed on

Concerns about data protection and privacy for persons with disabilities

Audience

Speech speed

124 words per minute

Speech length

620 words

Speech time

298 seconds

Concerns about private data of persons with disabilities being used to train AI systems

Explanation

Persons with disabilities often use AI-based systems for personal tasks, inputting private data. There are concerns about how this data is being used to train AI systems and whether privacy policies adequately protect this sensitive information.

Evidence

Examples of applications used by people who are hard of hearing for interpretation and translation.

Major Discussion Point

Data Protection and Privacy for Persons with Disabilities

Agreed with

Angelina Dash

Maitreya Shah

Agreed on

Concerns about data protection and privacy for persons with disabilities

Challenges in balancing inclusive definitions with preventing misuse/impersonation

Explanation

While broader definitions of disability are needed, there are concerns about potential abuse and impersonation. Policies are needed to prevent impersonators from taking advantage of facilities meant for persons with disabilities.

Major Discussion Point

Disability Data and Definitions

Agreements

Agreement Points

Need for digital accessibility and inclusion for persons with disabilities

Tithi Neogi

Osama Manzar

Eleni Boursinou

Maitreya Shah

Need for accessible consent mechanisms and digital services

Importance of involving persons with disabilities in technology development and service provision

Challenges in online education accessibility for learners with disabilities

Lack of representation of persons with disabilities in AI and technology conversations

All speakers emphasized the importance of making digital technologies and services accessible and inclusive for persons with disabilities, highlighting various aspects such as consent mechanisms, education, and technology development.

Concerns about data protection and privacy for persons with disabilities

Angelina Dash

Maitreya Shah

Audience

Issues with treating persons with disabilities like children in data protection laws

Need for sensitive personal data category in data protection laws

Privacy risks from AI-powered assistive technologies and data collection

Concerns about private data of persons with disabilities being used to train AI systems

Multiple speakers raised concerns about the inadequate protection of personal data of persons with disabilities, highlighting issues in current laws and risks associated with AI technologies.

Similar Viewpoints

Both speakers advocate for broader, more inclusive definitions of disability that go beyond medical models, while also acknowledging the need to prevent misuse of such definitions.

Maitreya Shah

Audience

Limitations of medical/impairment-based approaches to disability data collection

Need for social model and broader definitions of disability aligned with UNCRPD

Challenges in balancing inclusive definitions with preventing misuse/impersonation

Unexpected Consensus

Importance of involving persons with disabilities in technology development and service provision

Osama Manzar

Maitreya Shah

Importance of involving persons with disabilities in technology development and service provision

Lack of representation of persons with disabilities in AI and technology conversations

Despite coming from different backgrounds (grassroots implementation vs. academic research), both speakers strongly emphasized the need for direct involvement of persons with disabilities in technology development and deployment.

Overall Assessment

Summary

The speakers generally agreed on the need for greater digital accessibility, inclusion, and data protection for persons with disabilities. They also emphasized the importance of involving persons with disabilities in technology development and policy-making processes.

Consensus level

There was a high level of consensus on the main issues, with speakers complementing each other’s perspectives from different angles (policy, grassroots implementation, research). This consensus suggests a strong foundation for developing more inclusive policies and practices in digital accessibility and data protection for persons with disabilities.

Differences

Different Viewpoints

Approach to disability data collection

Osama Manzar

Maitreya Shah

Importance of involving persons with disabilities in technology development and service provision

Limitations of medical/impairment-based approaches to disability data collection

Osama Manzar emphasizes focusing on abilities and involving persons with disabilities in service provision, while Maitreya Shah argues for moving away from medicalized conceptions of disability towards a social model approach in data collection.

Unexpected Differences

Optimism about AI-powered assistive technologies

Osama Manzar

Maitreya Shah

Importance of involving persons with disabilities in technology development and service provision

Privacy risks from AI-powered assistive technologies and data collection

While Osama Manzar appears optimistic about the potential of digital technologies to empower persons with disabilities, Maitreya Shah unexpectedly raises concerns about unchecked optimism regarding AI-powered assistive technologies, highlighting potential privacy risks and problematic assumptions about ‘fixing’ disabilities.

Overall Assessment

summary

The main areas of disagreement revolve around approaches to disability data collection, the role of AI-powered assistive technologies, and the extent of representation needed for persons with disabilities in technology development and policy discussions.

Difference level

The level of disagreement is moderate. While speakers generally agree on the importance of inclusion and representation for persons with disabilities, they differ in their specific approaches and areas of concern. These differences highlight the complexity of addressing disability issues in the context of digital technologies and data protection, suggesting the need for continued dialogue and diverse perspectives in policy-making.

Partial Agreements

Partial Agreements

Both speakers agree on the importance of involving persons with disabilities in technology development and conversations. However, they differ in their approach, with Osama Manzar focusing on practical involvement in service provision, while Maitreya Shah emphasizes representation in broader AI and technology policy discussions.

Osama Manzar

Maitreya Shah

Importance of involving persons with disabilities in technology development and service provision

Lack of representation of persons with disabilities in AI and technology conversations

Takeaways

Key Takeaways

Resolutions and Action Items

Unresolved Issues

Suggested Compromises

Thought Provoking Comments

We must do conversation, action, intervention with people with disability in everything, in everything, whether you are doing research or data collection or doing something on work, they must be part of the ecosystem.

speaker

Osama Manzar

reason

This comment emphasizes the critical importance of including people with disabilities in all aspects of research, policy-making, and implementation related to disability issues. It challenges the common practice of making decisions for people with disabilities without their input.

impact

This comment shifted the discussion towards the importance of representation and inclusion of people with disabilities in decision-making processes. It led to further exploration of how to meaningfully involve people with disabilities in various contexts.

The problem with the entire data collection effort is not the focus on disability or ability. In fact, I feel that the very distinction between ability and disability, I think, is a very medicalized conception of disability.

speaker

Maitreya Shah

reason

This comment challenges the traditional medical model of disability and introduces the social model perspective. It highlights how current data collection methods may be fundamentally flawed due to their underlying assumptions about disability.

impact

This comment deepened the discussion by introducing a more nuanced understanding of disability. It led to a conversation about the limitations of current data collection methods and the need for a more holistic approach to understanding disability.

A lot of my recent research at Harvard has been on how AI fairness metrics and AI governance policies both explicitly exclude disability from their ambit.

speaker

Maitreya Shah

reason

This comment highlights a significant gap in current AI ethics and governance frameworks, pointing out how disability is often overlooked in discussions of AI fairness.

impact

This comment shifted the discussion towards the intersection of disability and AI, leading to a more in-depth exploration of the challenges and risks posed by AI systems for people with disabilities.

We worked with the Rwanda Education Board and the Light for the World NGO to foster digital skills development with teachers that have disabilities. And we really saw that it is very, very important to have teachers with disabilities be represented within the communities and the whole education ecosystem.

speaker

Eleni Boursinou

reason

This comment provides a concrete example of how including people with disabilities in educational initiatives can lead to more effective and inclusive outcomes.

impact

This comment grounded the discussion in practical examples, demonstrating the real-world impact of inclusive practices. It led to a discussion of best practices and successful case studies in disability inclusion.

Overall Assessment

These key comments shaped the discussion by shifting it from a theoretical understanding of disability issues to a more nuanced, practical, and inclusive approach. They challenged traditional perspectives on disability, highlighted the importance of representation and inclusion, and brought attention to emerging challenges in areas like AI and data governance. The discussion evolved from focusing on barriers and problems to exploring solutions and best practices for meaningful inclusion of people with disabilities in various contexts.

Follow-up Questions

How can we make consent mechanisms more accessible for persons with disabilities?

speaker

Tithi Neogi

explanation

This is important to ensure persons with disabilities can give meaningful consent online and access digital services independently.

Should sensitive personal data be reintroduced as a category in India’s data protection law?

speaker

Angelina Dash

explanation

This is crucial for providing additional safeguards for vulnerable data of persons with disabilities, which could be misused for discrimination.

How can we address the lack of a global south perspective in disability and internet governance discourse?

speaker

Angelina Dash

explanation

This is important to account for unique challenges faced by persons with disabilities in global south countries, including intersecting marginalization.

What is the best approach for engaging persons with disabilities in policy development processes?

speaker

Audience member from Ghana

explanation

This is crucial for ensuring policies are inclusive and address the actual needs of persons with disabilities.

How can we improve data collection efforts to better represent the diversity among persons with disabilities?

speaker

Fawaz Shaheen

explanation

This is important for developing more accurate and inclusive policies and interventions for persons with disabilities.

How can we ensure accountability from automated decision-making systems, particularly from the perspective of persons with disabilities?

speaker

Fawaz Shaheen

explanation

This is crucial to prevent discrimination and ensure fairness in AI systems that impact persons with disabilities.

How can we address the privacy concerns related to AI-based assistive technologies collecting personal data from persons with disabilities?

speaker

Dr. Mohammad Shabbir

explanation

This is important to protect the privacy and data rights of persons with disabilities who rely on these technologies.

Disclaimer: This is not an official session record. DiploAI generates these resources from audiovisual recordings, and they are presented as-is, including potential errors. Due to logistical challenges, such as discrepancies in audio/video or transcripts, names may be misspelled. We strive for accuracy to the best of our ability.