Safeguarding Processing of SOGI Data in Kenya | IGF 2023

8 Oct 2023 03:15h - 04:00h UTC

Event report

Speakers:
  • Muthuri Kathure, ARTICLE 19 Eastern Africa
  • Representative of the LGBT community in Kenya
  • Representative of the Intersex Persons Implementation Coordination Committee
Moderator:
  • Muthuri Kathure, ARTICLE 19 Eastern Africa


Disclaimer: It should be noted that the reporting, analysis and chatbot answers are generated automatically by DiploGPT from the official UN transcripts and, in case of just-in-time reporting, the audiovisual recordings on UN Web TV. The accuracy and completeness of the resources and results can therefore not be guaranteed.

Full session report

Jeremy Ouma

The analysis offers a wide-ranging examination of Kenyan society, legislation, and practice, with significant emphasis on LGBT rights and data protection.

A primary concern is the deficient legal protection for LGBT individuals against discrimination grounded in their sexual orientation or gender identity. The current Kenyan legal framework falls short of adequately defending these marginalised groups, fostering an environment in which the disclosure of personal demographic data can invite bias and negative repercussions. The situation is aggravated by the fact that particular practices linked to LGBT identities are outlawed in the country.

However, the situation is not entirely devoid of optimism. A discernible societal shift is under way, with growing momentum in Kenya to repeal or amend the penal code provisions criminalising same-sex acts. Despite hindrances to these initiatives, such as the refusal to register organisations, identifying as LGBT is not itself outlawed; it is the acts that are criminalised. Activists highlight the long-running court case between LGBTQ organisations and the NGO board as a landmark in this advocacy.

Prominent worries regarding data protection and processing standards also attract attention. The differential handling of sexual orientation and gender identity data under the current legal framework is deemed problematic. To address these issues, organisations need to register with the Office of the Data Protection Commissioner. They are urged to prepare and update data protection policies, and to raise awareness about privacy during data management. Internal capacity building within civil society organisations is particularly underscored as a way to foster greater awareness of and engagement with data protection.

Furthermore, the analysis points out persisting challenges relating to digital platforms and their content management strategies. These include the potential amplification of harmful content and a lack of understanding of local context in content moderation processes. The scarcity of transparency within these systems further exacerbates the situation, yet progress is recognised through ongoing legal cases, such as the one involving Meta and its former content moderators. This case serves as a tangible pursuit to hold platforms accountable within the Kenyan jurisdiction. Efforts are underway to bridge the gap between local users and platforms, nudging platforms to better comprehend and respond to their user base.

Finally, the analysis extends its purview beyond Kenya, expressing concern about Uganda's anti-homosexuality laws. The continued enforcement of these unjust laws, and the prosecutions brought under them, is recognised as a significant violation of human rights.

In summary, the analysis identifies a complex network of obstacles within Kenyan society, but simultaneously showcases several steps taken to address and surmount these challenges. It provides a detailed account of ongoing efforts and the dire need for progress towards a more inclusive society.

Angela Minayo

Angela Minayo emphatically discusses the significance of well-regulated data management in preventing human rights violations. She emphasises that while data affords immense opportunities, poorly regulated data can be exploited to facilitate human rights abuses. Robust data protection legislation is therefore essential to safeguarding human rights.

Minayo affirms the necessity of a harmonised protection framework for sensitive data, encompassing gender and sexual orientation. The continuing debate about data protection, confidentiality, and personal autonomy prompts her to argue for more comprehensive legislation that provides protection not just for conventional data but also for sensitive information regarding an individual’s gender identity and sexual orientation.

Conversely, Minayo voices reservations concerning the efficacy of Kenya's existing Data Protection Act. Despite the law's progressive human rights outlook, she regards it as deficient in failing to cover gender identity as sensitive data, and as contradictory in its handling of health data and sexual orientation data.

Minayo spotlights non-profit entities, highlighting the need for these organisations to comply with data protection regulations. She argues that while data protection is often associated with corporate compliance, both for-profit and non-profit entities need to ensure adherence; mishandling of personal data, especially sexual orientation and gender identity data, can have severe human rights ramifications.

She goes on to emphasise the complexity of data protection and advocates increased resources for it, notably for non-profit entities that may lack the means for comprehensive data security. Minayo further calls on organisations to register as data controllers and processors, to legitimise their roles and allow for efficient allocation of budgets and responsibilities for data protection.

Minayo highly commends the data processing templates from ARTICLE 19, designed to assist non-profits, noting that they serve as a checklist for various data protection procedures. She also underscores the importance of organisations obtaining consent for personal data use and documenting that consent.
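The consent-documentation practice described above can be sketched as a minimal record structure. This is an illustrative sketch only: the `ConsentRecord` class and its field names are hypothetical, not part of the ARTICLE 19 templates.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import List, Optional

@dataclass
class ConsentRecord:
    """Hypothetical minimal record documenting a data subject's consent."""
    subject_id: str                  # pseudonymous identifier, not a real name
    purpose: str                     # what the data will be used for
    data_categories: List[str]       # e.g. ["sexual orientation"]
    obtained_at: datetime
    withdrawn_at: Optional[datetime] = None

    @property
    def is_active(self) -> bool:
        # Consent counts only while it has not been withdrawn.
        return self.withdrawn_at is None

# Example: documenting, then withdrawing, consent for processing SOGI data.
record = ConsentRecord(
    subject_id="participant-001",
    purpose="anonymised research brief",
    data_categories=["sexual orientation"],
    obtained_at=datetime.now(timezone.utc),
)
assert record.is_active
record.withdrawn_at = datetime.now(timezone.utc)  # subject withdraws consent
assert not record.is_active
```

Recording the purpose and categories alongside the consent itself is what turns a signature into an auditable record: withdrawal can be honoured and demonstrated later.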

In the context of protecting sensitive data, she references the severe implications for marginalised groups, pinpointing the increase in online homophobia following the Supreme Court ruling in Kenya that allowed LGBTQ organisations to be formally registered. She asserts that existing as queer in Kenya is akin to a political act, exposing individuals to stigma and even death.

Finally, the digitisation of sectors such as sex work, and the data protection concerns that arise when Kenyan users' data is handled outside the country, lead Minayo to recommend improved awareness of data protection laws, particularly for the evolving digital economy.

In conclusion, Minayo centres her discussion on the breadth of data protection: the necessity of robust regulation to protect sensitive data, the urgency of more resources for non-profit entities, the relevance of data protection across all sectors, and the significance of stakeholder awareness.

Audience

The discourse primarily centres on pressing issues of data sensitivity and the recognition of diverse gender identities within Kenya's legal framework. One of the main criticisms concerns the rigid categories of gender officially recognised in Kenya, currently male, female and, occasionally, intersex. This limited classification, as suggested in the discussion, disregards the broad spectrum of gender identities, and the lack of inclusivity draws negative sentiment.

Additionally, the classification and handling of sensitive information in the country are scrutinised, with specific emphasis on the differential treatment of sexual orientation and gender identity data under the Data Protection Act. Whilst data on sexual orientation is treated as sensitive personal data, data on gender identity is treated as general personal data. This discrepancy compounds concerns about the absence of a specific law protecting against discrimination based on sexual orientation or gender identity in Kenya.

The conversation further extends to the issue of data protection measures, particularly with respect to sex workers. A unique point highlighted by the audience is the necessity to safeguard data associated with this group, emphasising non-profits’ interaction with sex workers and the requirement to guarantee adequate protection measures.

Another point of interest stemming from the dialogue pertains to the influence of Uganda’s Anti-Homosexuality Act on Kenya’s data protection laws. Particularly, concerns are raised about whether the stringent regulation enacted by neighbouring Uganda might impact the LGBT community’s data protection in Kenya.

Platform accountability in Kenya also draws concern from the audience, with particular focus on the efficiency and procedure of incident reporting in cases of data breaches. The police’s involvement in such instances is queried, implying an underlying need for more robust incident response protocols.

A significant part of the conversation is dedicated to the management and confidentiality of health data. Clarification is sought on the mechanisms for sharing health data between facilities, with additional questions about whether individual consent should be obtained each time data is shared. Participants inquire about the legal coordination between the Health Act and the Data Protection Act, seeking to understand which of the two primarily governs patient privacy. These discussions shed light on the evident gaps and ambiguities in the country's data protection and privacy laws, highlighting the public's demand for a more transparent and protective legal system.

Session transcript

Jeremy Ouma:
I’m going to ask Angela Minayo to introduce herself. I will probably allow my colleague to introduce herself first, then we can get into it. Also, another colleague of mine is on the way. I think he’ll introduce himself as soon as he gets here. Thank you. Over to you.

Angela Minayo:
My name is Angela Minayo and I’m the Director of the Center for Human Rights. I’m interested in the topic because I believe that data unlocks a lot, but also it can be an area where, if not well regulated, can lead to further human rights violations. So I’m grateful for this panel, and I await questions from Jeremy.

Jeremy Ouma:
Thank you, Angela, and thank you to all of you for coming. So, we have done some research, a brief, a very brief one, by ARTICLE 19 Eastern Africa. As I've said, I work with ARTICLE 19 Eastern Africa, and we work on issues of freedom of expression, association, and, of course, gender identity. Specifically for this session, we have this brief produced as part of our data, our voice project, supported by the GIZ Digital Transformation Center in Kenya. The paper provides an overview of the processing of sexual orientation and gender identity data specific to the Kenyan context. It also provides an overview of data protection in Kenya, particularly for LGBTQ people, who face heightened privacy and discrimination risks in comparison to cisgender populations. What we hope to do with this paper is to increase awareness of data protection among different stakeholders in Kenya, be it regulators, be it data controllers, be it data processors. Through this awareness, people get to understand their rights; that's for whoever is part of the community. That's one. Two, people also understand their obligations, be it as data controllers or data processors, and adhere to these data protection laws, building the trust that the community has in them. So I'll briefly go into some of the findings that we had before we get to some questions with my colleague here. One is the insufficient legal protections for LGBT people in the Kenyan context: it is not identifying as LGBT that is outlawed in the country, but the practice or the act.
So this has negative impacts on the disclosure and collection of this kind of information, sexual orientation or gender identity data, for example in a hospital or in a bank, and the legal framework doesn't provide protection from this discrimination. There's a blanket protection for everyone under the law, but there's no specific protection from discrimination on the basis of sexual orientation or gender identity. That's one. Also, the socio-cultural context continues to have a great impact on the treatment of sexual and gender minorities in the country. There's this line that keeps being thrown around, that it's not our culture, by government, by leaders, so it's not a very good environment. The second finding is the differentiated treatment of SOGI data under the legal framework currently in place. Sexual orientation is classified and processed as sensitive personal data, but gender identity is classified and processed as personal data. So one is sensitive personal data, the other is general data under Section 2 of the Data Protection Act, which covers most things under data protection in the country. In effect this means that data controllers and processors handling this kind of data must differentiate and accord higher levels of protection to sexual orientation data, despite gender identity also exposing data subjects to similar risks and consequences. That's finding number two. Number three is the restriction of SOGI data collection to legally recognised categories in both public and private sectors. If you go, for example, to a bank or a hospital where they need to collect data, the categories are mostly male and female. Sometimes they'll say intersex, sometimes they'll just put other. This is basically attributable to the failure of the law to recognise other gender identities and sexual orientations.
So this paper was guided by contributions from key stakeholders: we did key informant interviews and held focus group discussions with key industry players specific to Kenya. I'll leave those as the very key findings; we also have some other findings, but let me go to some of the conclusions we drew and some recommendations. After that we can speak to my colleague and have a brief overview of the current situation. So I'll go straight to some of the recommendations. I've divided them into two: recommendations for data controllers and processors, and recommendations for, let's call it, civil society. One, for data controllers and processors, is basically around compliance. All public and private organisations and individuals processing personal data are required to register with the regulator, the ODPC, the Office of the Data Protection Commissioner, as stipulated under the Data Protection Act. So we encourage anyone that is processing data, be it government, be it a hospital, be it a business, to register. Recommendation number two is to implement technical and organisational measures for compliant processing of this kind of sensitive data, including doing a data protection impact assessment prior to processing, and also engaging these communities. That's one. Two, appoint a data protection officer to oversee compliance with the Data Protection Act and other relevant privacy and data protection laws. And for entities in the public and private sectors to prepare and update data protection policies and notices so that they are up to date with the needs of the community.
Finally, there is awareness: internal awareness for these entities, so that they have a privacy-aware culture and know what to do when processing this kind of data, and how to handle it in the right way. I'll go to the very last group, the civil society actors. One is to build internal capacity and undertake training to, first, understand the frameworks of data protection and the impact they have on the communities we work with, and second, once you have this knowledge, to engage public and private sector entities to create awareness, so that they understand the impact this processing has and their obligations under the law. Another is to advocate for the Data Protection Commissioner to expressly include gender identity as a form of sensitive personal data. In light of the risk of significant harm that processing may cause to data subjects, it's important to have this captured and acknowledged by the Data Protection Commissioner, with frameworks to protect this kind of data. And finally, to advocate for better laws, removing the discriminatory laws, whether by repealing them or amending them to be in line with international standards. So yes, those are the key findings and some of the recommendations we have in this brief. If anyone has any questions up to that point, I'll happily take them before we go to my colleague. Then we can have a discussion about other experiences in the country. Yes, please.

Audience:
Why can't I use the mic? So from your presentation, it seems that there's a kind of embedded contradiction: on the one hand, diverse gender identities are not officially recognised, but at the same time there is that risk when banks and medical facilities collect this data, which is the focus of data protection. But in parallel, is there a movement or a drive towards having these identities formally recognised? Is something happening in Kenya?

Jeremy Ouma:
Okay, thank you. I think that's the only question. Yes, there's a drive to do that. Over the past couple of years, there's been a push to repeal sections of the penal code, I think it's two sections, 162 and 163, that criminalise the act, not necessarily the person. In Kenya, identifying as diverse is not criminalised; it is the act that is criminalised. But most times the law is abused: people are denied registration for organisations, they're denied. But there's been some good precedent; there's been a long court case going on with the regulator, is it called the NGO board?

Angela Minayo:
So there was a petition to the Constitutional Court to declare section 162 of the penal code unconstitutional on the ground of discrimination. That petition was not successful, and section 162 is still operational in our country. Kenya's policy around it is to act like they don't exist. Whenever they're asked about it, the government always says that it is not a priority for Kenya, that we're a third world country and we have more pressing development issues to be concerned about. What has helped Kenya and the LGBTQ community is that our constitution is very progressive. The penal code is a relic of the colonial laws that we inherited from the British, but the constitution is from 2010, very new in terms of constitutional law practice, very, very new in constitution making. And our constitution enshrines the right to non-discrimination very strongly, in the Bill of Rights, which enumerates human rights. So when LGBTQ organisations were being denied registration based on the penal code, they went to court, and the case went up to the highest court, which is the Supreme Court. The court ruled that while what they're doing might be a violation of the penal code under section 162, they have the constitutional right to assembly and association, and therefore, when the Registrar of NGOs refuses to register them, the registrar has contravened the constitution. So it's the robustness of Kenya's constitution, even on the right to privacy under article 31, and now the Data Protection Act, that gives us a very progressive human rights outlook. But we still have this elephant in the room, which is section 162, and you can see the contradictions. I hope that gives you an idea of what we are working with.

Jeremy Ouma:
And to also mention that there's a lot of pushback from government. For example, in the petition, the registrar tried to argue that you're not supposed to do this. They are not ready to have these conversations. So yeah, I hope that answers you. Thank you. Then I'll go straight to a couple of questions for my panellist here. The other one will arrive, I'm not sure when, but he should be on the way. So we can start by looking at the legal framework for the processing of data, not necessarily just sexual orientation or gender identity data, but in general. What's the framework governing this processing of data?

Angela Minayo:
Yeah, so as I stated earlier, it starts in our constitution, with the right to privacy. Then we have international human rights commitments, some of which you know: the International Covenant on Civil and Political Rights and the African Charter on Human and Peoples' Rights. These are the bases and frameworks for the right to privacy. Then in 2019 we operationalised the Data Protection Act. Like many other African countries' laws, it is a replication of the EU GDPR, and that comes with pros and cons. Some of the pros are that the GDPR provided a very good framework, with a complaints-handling mechanism and an independent office. I'll put 'independent' in quotes, because you can say it's independent, but who's appointing? So I won't say independent, I'll just say it has a body, because the independence is questionable. So we have the Data Protection Act, and it's a very elaborate framework. I'm not going to focus so much on the downsides, other than to say that, just like under the EU legislation, we expect that when data is being transferred from Kenya to other countries, there are enough safeguards to provide equal or similar protection for data being handled in those third countries. So I hope that answers it. Maybe we can now go to sexual orientation and gender identity. I work at KICTANet as a gender digital rights programmes officer, and last year we also had conversations around gender and data protection, and this conversation cannot be had without talking about sexual orientation and gender identity data. Just a fun fact, even before I delve into the Kenyan ecosystem: there's a very progressive approach to data protection from the southern African region.
I don't know if you knew this, but SADC, the regional bloc in southern Africa, came up with a model framework for privacy law for its member countries, and they put gender as one of the sensitive categories of personal data. So while Kenya just talks of sex, they talk about gender, and there's a difference. When you just talk about sex, you're referring to biological sex, which is what is assigned at birth: you're male or female or intersex. But when you talk about gender, you're talking about someone's expression, and that might sometimes not align with their biological sex. So that's very progressive. I always try to talk about the SADC model law because it took a different approach, even from the GDPR, which is something we want to see more of: regional blocs and countries taking their own approach to data protection in a way that makes sense for them, but also in a progressive way, in a feminist way. So I just like to mention that. In Kenya you will find that gender identity is not specifically provided for under sensitive data. It is treated as just personal data, and that means it can be processed, one, when there's consent and, two, even when there's no consent, when the data processor or the data controller can prove that the data is necessary for performing certain tasks. They'll say you entered into a contract, and part of my obligation was to do ABCD, and that obviously means I have to process your data to do that. So, how can I say this, there is no consent necessary in certain aspects of personal data processing. How is this a problem? It is a problem because we can still see how gender identity identifiers can lead to human rights violations. It could be job applications, it could be loan applications, so it could lead to further violations.
Now, for sexual orientation, I think we've already set the scene for you to understand how sexual orientation is grappled with in our legal system: the act is a crime, and yet the related data is still protected in the data protection framework, which sends contradicting viewpoints. And it's not just sexual orientation data, if I may add. Health data has also been one of the things we've grappled with, because health data is dealt with in different frameworks. There's our Health Act, which empowers health practitioners to collect data necessary to perform their work, and at the same time you're seeing that health data is sensitive data that cannot be processed unless certain safeguards are given. This is something you will keep seeing in countries that pass a Data Protection Act but don't review or reform the laws that existed before. You end up with a very interesting set of laws, if I may say. So, when data is deemed sensitive, it means there are more safeguards towards its protection. You'll find that if there's no consent, then the data controller or the data processor must prove the necessity of collecting or processing this data. Again, that falls under data minimisation: we want you to collect only data that truly is necessary for what you're trying to do. So what happens when you treat sexual orientation data and gender identity data separately? Let me give you an example. You're saying sexual orientation data is protected, right? But gender identity data is not sensitive data; it can be processed in other ways. But when we create links between datasets, we can tell that this is Angela or this is Jeremy. Whether Jeremy is male or female is not protected, so we can tell: male. Jeremy is on an app that is for the queer community. So while Jeremy is only recorded as male, we can identify Jeremy's sexual orientation from the apps he's using.
So we need a harmonised protection framework that protects Jeremy both for his identity as male and his orientation as queer. And this is of course just for example purposes; after this meeting, please don't harass Jeremy. But you get the point: what I like to say is we need an equilibrium, a spectrum of protection, that cuts across and doesn't stop at a certain point.
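The linkage risk described in the example above, where a "general" gender record combined with app-usage data reveals an attribute the law treats as sensitive, can be sketched as follows. All identifiers, app names, and data here are invented purely for illustration.

```python
# Illustrative sketch of a linkage attack: joining a dataset treated as
# "general" personal data with app-usage records on a shared identifier
# can expose an attribute the law treats as sensitive. All data is invented.

# Dataset A: registry treated as general personal data (gender is not sensitive).
registry = {
    "user_17": {"gender": "male"},
    "user_42": {"gender": "female"},
}

# Dataset B: app-usage logs keyed by the same identifier, with no gender field.
app_usage = {
    "user_17": {"news_app", "queer_community_app"},
    "user_42": {"news_app", "fitness_app"},
}

# Apps whose use implies a sensitive attribute (sexual orientation).
SENSITIVE_APPS = {"queer_community_app"}

def link_and_infer(user_id: str) -> dict:
    """Join the two datasets and flag whether a sensitive attribute
    (sexual orientation) becomes inferable from the combination."""
    profile = dict(registry[user_id])
    profile["orientation_inferable"] = bool(
        app_usage.get(user_id, set()) & SENSITIVE_APPS
    )
    return profile

# Neither dataset alone names an orientation, but the link does:
assert link_and_infer("user_17") == {"gender": "male", "orientation_inferable": True}
assert link_and_infer("user_42") == {"gender": "female", "orientation_inferable": False}
```

This is the point of a harmonised framework: protecting only one of the two fields still leaves the combination exposed.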

Jeremy Ouma:
Thank you, Angela, and you've preempted my next question, which is about your experience with the practice of processing data, especially sensitive data.

Angela Minayo:
Thank you. I think, for this talk to be important for the people in the room, I'd really like to talk about the processing of data in non-profit entities. For a long time we've been talking about data protection as a matter of company compliance. The message that sends is that it's regulatory, it's compliance, that's what companies do. How do non-profits comply? When they're sending financial reports to donors; compliance is a very foreign word to non-profit entities. Yet you'll find non-profit organisations process a lot of sexual orientation and gender identity data which, if mishandled, has serious human rights ramifications. So we need to understand data protection as something that applies both to non-profit and for-profit entities, the companies in this case. You'll find that even the conversations we have around data protection are all about big tech, and of course I understand why we do this, because those are the examples that make the most impact and the most sense to the people in the room. But we also need to start talking about the processing of data by non-profit entities. What will happen, for instance, if a non-profit like ARTICLE 19 operates in Kenya, and Kenya is becoming maybe a draconian state, and the state can have access to your documents? Do these organisations have a plan? Do they know how to fight back when this data is requested from them? We keep saying Apple is such a good company because it will never comply with requests for information from governments. Do we ask the same questions when it comes to non-profit entities? So I think it's very important to have those conversations about data protection from a non-profit point of view as well.
So, from practice, what is very worrying is this idea that data protection is a concern for companies and not for non-profits, yet you will find that the people who handle most sexual orientation and gender identity data, the people doing the research and collecting data in these areas, are non-profit entities. Another worrying practice I've seen in my country, and I'm speaking from Kenya's perspective, and during the question and answer session I'd like to invite you to give us perspectives from your countries, is that there are so many myths and misconceptions around data protection and processing. I'll give you an example. Last month, let me just say last month because I've lost my sense of time, our Office of the Data Protection Commissioner issued penalty notices to three companies for breaching the Data Protection Act. One of the entities that was fined was a club, a place people go to have fun, a partying joint, and they were taking photos of people there. In Kenya, for some reason, clubs have this obsession with taking photos of revellers having fun at their joint. I don't know why; I don't know if they feel they wouldn't make enough sales without it. It's a whole data minimisation and necessity question: do you really need it? But they did, and they ended up being fined because the data subjects complained about them to the Data Protection Commissioner. I'm also made to understand that we should not take it for granted that our Commissioner can issue penalties. Apparently, in some jurisdictions the investigative powers and the powers to issue penalties are curtailed, so I just wanted to put that as a side note.
And what the other clubs have understood from this penalty notice is to put emojis on the faces in the photos they take, and they did this immediately after the penalty notice. That's how unserious a country Kenya is; we use humour to get through, it's a very tough place to live in. Anyway, the point is they think putting emojis on the photos is complying with data protection; it's that bad, exactly. There's a very pedestrian approach to understanding data protection, because if there are applications that can remove the emojis and de-anonymise the photos, then they have not complied with the Data Protection Act if there's no consent or necessity. I'll end it there, because I think it's a light note and it tells you the problem we're dealing with.

Jeremy Ouma:
Thank you. If you have any questions, I will take them at the very end. I’ll just throw a few more questions to you. Building on what you’ve just highlighted, are there worrying impacts that you have seen from this kind of processing of sensitive data, or of any data in general?

Angela Minayo:
Actually, I’ll give an example of an activity we were doing at KICTANet. Last year we ran something known as a Gender Internet Governance Exchange under the Our Voices Our Futures project by APC, and we had people from the queer community among the participants. Before, we used to work only with women, and we would take photos and post them, partly for reporting but also for the social media campaigns. They told us that some of them are closeted, and putting their photos online in an activity that is clearly for queer people would put them at risk. I get comments suggesting that data protection is only “serious” when it’s about penalties of 50 million in the EU and Facebook and Meta, but that minimises the harm such data breaches can have on ordinary people who are not celebrities, who are not in the EU, and whose complaints cannot attract the penalties that exist in the European Union. So understanding that in the context of stigma, and even the deaths we’ve seen in our country of queer people, that is a serious risk we need to have in mind. I’ll give another example of the homophobia we saw online once the Supreme Court made a ruling allowing LGBTQ organisations to be formally registered. We had a lot of disinformation online, and what people understood that ruling to mean was that Kenya had formalised LGBTQ relationships, which was not even the case. We wish that was the position; it is not. And the homophobia, the messaging online, was “we will kill them, we are never going to accept that”, and I kid you not, it was not just from people online, it was even from leadership at the national level. When there is this understanding that this is an undesired people among us, it also warrants, justifies, motivates and incites hatred against that group. So being queer on its own, existing as queer, is a political act in Kenya and in certain other countries.
So let me just end it at that.

Jeremy Ouma:
Okay, thank you. For the final one: do you have any recommendations or insights on best practices for the collection and subsequent processing of this kind of data?

Angela Minayo:
Yes. First of all, I would like ARTICLE 19 to actually publish this resource; I just want to call you out here. You need to publish it, because they have annexed amazing templates that people can use when processing data. Data protection is a very complex field, I always have to remind people, and a 45-minute panel session cannot cover all the bases. There is a need for more resources, and not just for for-profit entities; those ones have enough money to get DPOs and the people to help them comply. But what happens to non-profits, whose resources are quite minimal? What ARTICLE 19 has done is come up with templates for non-profits, like a checklist, which is what we need for such a complex process: you have this data, have you obtained consent? If you don’t have consent, do you have another basis for it? Have you documented it? Documentation is so important in data protection because you need to pre-empt what can happen in future. Will you ever need to provide proof of consent, for instance? Those are things we might not be thinking about, especially operating as non-profits, but that is the age of data protection we’re in: you need to be documenting consent, and documenting the contracts you have with data controllers and data processors. Let me explain this. Sometimes you’ll find there are two entities involved in the processing of data. There is a data controller, the one who collects the data and directs how it is going to be processed, and then you can have another entity as the data processor: the one who stores the data, anonymises the data, analyses the data, and so on. So they have different obligations depending on whether they are a data controller or a data processor. Sometimes we use these words among people in the data protection field without explaining what the ramifications are.
To put it more simply, a data processor is like an agent or an employee of the data controller. At the very end, responsibility for the data protection issues, the breaches and the consent, will sit with the data controller and not the data processor. Having non-profits registered with the data protection authorities in their countries is very important, because it also gives them the justification for budgets towards compliance. So let’s have this resource online, please, because for non-profits it is very important.
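The documentation point can be made concrete with a small sketch (my own illustration, not drawn from the ARTICLE 19 templates): the kind of consent record a small non-profit could keep so that proof of consent, the lawful basis, and the controller/processor roles are all documented. Every field name and entity name here is hypothetical.

```python
# A minimal consent-record log: documents who consented, for what purpose,
# under which lawful basis, and which entity is controller vs processor.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional


@dataclass
class ConsentRecord:
    data_subject_id: str           # a pseudonym, not a real name
    purpose: str                   # what the data will be used for
    lawful_basis: str              # "consent", or another documented basis
    controller: str                # entity deciding why and how data is processed
    processor: Optional[str] = None  # entity processing on the controller's behalf
    obtained_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )
    withdrawn: bool = False

    def withdraw(self):
        """Consent can be withdrawn; the record itself is kept as proof."""
        self.withdrawn = True


# Usage: document consent at collection time, and keep the record.
record = ConsentRecord(
    data_subject_id="participant-042",
    purpose="event photography for project reporting",
    lawful_basis="consent",
    controller="Example Non-Profit",         # hypothetical names
    processor="Example Photo Archive Ltd",
)
assert record.lawful_basis == "consent" and not record.withdrawn
record.withdraw()
assert record.withdrawn   # proof of the original consent survives withdrawal
```

The design choice mirrors the point above: the record names one controller (who carries responsibility) and optionally a processor acting on its behalf, and it is never deleted on withdrawal, because the controller may later need to prove consent existed.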

Jeremy Ouma:
Okay, thank you. And just a disclaimer: the resource will be published by the end of October, so sometime in November it will be fully ready. It is ready; it’s just that we haven’t yet published it. There is a whole process to go through, but it will be published. The main aim of this resource is to create awareness about data protection. Today we looked at some of the challenges and the impacts this processing has on specific groups, but the paper also aims to create awareness about data protection and to give some recommendations on best practices for data controllers, data processors and civil society actors in general. I think I’ll leave it at that, but I’ll take some questions if there are any. We can take them at the same time, and then we can end after that. Over to the floor.

Audience:
Hi, I have three questions; I hope that’s okay. My first question has to do with how data from sex workers is being saved, used and protected, because we know that data from sex workers tends to be more sensitive, and non-profits also work with sex workers, perhaps in Kenya as well. I would like to know if there is a difference, or if you have any particular remarks on the privacy of sex workers’ data. My second question is how the Anti-Homosexuality Act in Uganda has affected Kenya and Kenya’s data protection laws. And my third question is: what is it like for you as civil society to work with platforms in terms of platform accountability? We know we have the data protection laws, but how accountable are platforms in Kenya when you register a report or an incident? How does it work, also together with the police and the judiciary? Those are my three questions. Thank you.

Jeremy Ouma:
Any other questions? Yes, please.

Audience:
Hi. This is actually more of a clarification, because you gave the example of the health data-sharing scheme. My question is: is there a Health Privacy Act? And in that case, when healthcare providers are getting the data, what is the sharing mechanism? Say one hospital is taking you as a patient, and there is some kind of electronic health record: are they sharing and uploading to a repository that is shared across facilities, so every facility has access to that data? Is that how it works, or do they need to get consent from the patient every time? That is one clarification. And second: if there is a Health Privacy Act and there is also a Data Protection Act, how does coordination between the two work, and which regime does patient privacy fall under, the Data Protection Act or the Health Privacy Act? Thank you.

Jeremy Ouma:
Thank you. Is that the last question? OK. Do you want to take any of the questions? OK. Then take the two questions.

Angela Minayo:
I’ll start with Paisa’s question, and this is a topic I really like, so I hope I don’t get too passionate and talk too much. It’s very interesting that you raised it: we currently have a bill tabled in parliament called the eHealth Act. It talks about telemedicine, but it also talks about the protection of health data, which is very interesting; I think they should just have called it a health privacy law, because that is what it does. It provides a framework where, first, collection will be consent-based. Two, there will be sharing of data across health facilities. Three, there will be health identifiers: unique numbers assigned to both patients and health facilities. And four, they want the data to be portable, giving control of the data to the patient, so the patient will have all their records in a portable format. We don’t yet know what this portable format will be, but that is data portability. It is still being debated, but that is what they have in mind. On how it will operate together with the Data Protection Act: you’ll often find, in the caveats and exceptions, language such as “if prescribed by any other law”. That is normally how we interlink laws: if one law refers to prescription by another law, we go to that other relevant law. But we can talk about this after the session. On the question about sex workers: sex work is also penalised in our country, and of course we understand the situation differs from place to place. But we also understand that sex work is becoming digitised; there is OnlyFans and many other webcam-based apps, so sex workers are still part of the digital economy.
And that also means this data is sometimes being processed outside Kenya. Again, the level of awareness of data protection accountability is low even within Kenya, so how bad can it be for a user based in Kenya whose data is being processed outside Kenya, for example in the EU? Those are questions and conversations we are yet to grapple with in Kenya. But I’m glad that KICTANet is part of the OVOF project, and this could be part of the research we conduct to understand how it is being dealt with. That is the context for those two things. I’ll let Jeremy take the questions on Uganda’s homosexuality laws and how they are affecting Kenya, and on platform accountability.
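The four features Angela lists for the proposed eHealth framework — consent-based collection, sharing across facilities, unique health identifiers, and a portable patient record — can be sketched as a toy data model (purely illustrative; the bill is still being debated and specifies none of these names or mechanisms):

```python
# Toy sketch of the four described eHealth features. Hypothetical throughout.
import uuid


class HealthRegistry:
    """Holds patient records keyed by a unique health identifier."""

    def __init__(self):
        self.records = {}  # patient_id -> list of entries

    def register_patient(self):
        patient_id = str(uuid.uuid4())  # feature 3: unique health identifier
        self.records[patient_id] = []
        return patient_id

    def add_entry(self, patient_id, facility_id, entry, consented):
        # Feature 1: collection is consent-based.
        if not consented:
            raise PermissionError("patient consent is required")
        self.records[patient_id].append(
            {"facility": facility_id, "entry": entry}
        )

    def shared_view(self, patient_id):
        # Feature 2: the record is visible across facilities.
        return list(self.records[patient_id])

    def export_portable(self, patient_id):
        # Feature 4: the patient receives their record in a portable format.
        return {"patient": patient_id, "entries": self.records[patient_id]}


registry = HealthRegistry()
pid = registry.register_patient()
registry.add_entry(pid, "facility-A", "consultation notes", consented=True)
assert registry.shared_view(pid)[0]["facility"] == "facility-A"
assert registry.export_portable(pid)["patient"] == pid
```

The sketch only shows how the four features relate structurally; questions such as the actual portable format, and how the framework coordinates with the Data Protection Act, remain open in the debate described above.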

Jeremy Ouma:
Okay, thank you. I’ll start with platform accountability. First of all, there is an ongoing case at the moment between Meta and some of its former content moderators, touching on matters of accountability. There was what we could call good precedent, in that platforms can now be held accountable for their actions within the Kenyan jurisdiction. That case is still developing, but we see it as good progress. From our point of view, there are one or two things we’ve tried to do. First, there is a coalition we have tried to bring together specifically around content moderation. We did some work on current content moderation practices in a couple of countries, with a specific focus on Kenya, basically understanding the experiences and challenges of Kenyans around content moderation, takedowns and so on. Some of the things we found are that platforms are potentially amplifying harmful content; that there is a lack of understanding of local context, which is why we are pushing for some decentralisation of content moderation so that we can eventually hold platforms accountable for what happens on their platforms; and that there is insufficient transparency in content moderation. Finally, we are trying to bridge the gap between local stakeholders and users and the platforms, to get some kind of conversation going on how we can make the platforms better. In the interest of time, there was a second question: Uganda.
On the case of Uganda: it’s not somewhere we want to be. We have recently been hearing of cases of people being prosecuted under this law, and of some very bad cases. In Kenya the situation is similar, but not as bad. As to how it has affected Kenya, there has been some…

Angela Minayo:
There is some potential legislation. We have the Culture Bill; it started out as a family values protection bill. Just to wrap up: these bills are funded by Eurocentric far-right evangelical radicals, and it’s really sad. This is not African; it is actually Western ideals being imposed on Africans. We’ll end at that. Thank you so much for attending our session. Thank you.


Angela Minayo
Speech speed: 168 words per minute
Speech length: 3903 words
Speech time: 1395 secs

Audience
Speech speed: 153 words per minute
Speech length: 607 words
Speech time: 238 secs

Jeremy Ouma
Speech speed: 145 words per minute
Speech length: 2503 words
Speech time: 1034 secs