Main Topic 2 – European approach on data governance
18 Jun 2024 16:00h - 16:45h
Disclaimer: This is not an official record of the session. The DiploAI system automatically generates these resources from the audiovisual recording. Resources are presented in their original format, as provided by the AI (e.g. including any spelling mistakes). The accuracy of these resources cannot be guaranteed.
Session report
Full session report
EuroDIG Panel Explores the Complexities of Data Governance and Protection in Europe
In a detailed panel discussion at EuroDIG, moderated by Moritz Taylor from the Council of Europe, experts from diverse backgrounds delved into the intricacies of data governance, protection, and sharing within Europe. The panel featured Mirja Vujeda from the Croatian Data Protection Authority, Aurélie Merquiol from the French research institute IRD, and Darius Amilevičius from Lithuania’s Information Society Development Committee.
Mirja Vujeda commenced the session by acknowledging the EU’s pioneering role in open data legislation and its comprehensive data strategy. She focused on the European Data Governance Act (DGA), which encompasses both personal and non-personal data, aiming to facilitate data sharing and promote data altruism for societal benefits like scientific research.
Aurélie Merquiol highlighted the practical challenges in aligning scientific research with GDPR compliance, particularly in cross-border data transfers and the interpretation of “public interest” by different authorities. She underscored the lengthy process of obtaining transfer authorisations and the limitations of consent in public research contexts.
Darius Amilevičius discussed Lithuania’s public sector data governance reforms, emphasizing the goal of making data more accessible and reusable. He shared the technical and regulatory hurdles in providing public sector data for advanced analytics and machine learning, citing personal experiences of the protracted permissions process.
The panel addressed the DGA’s impact on personal data protection and the safe sharing of data across borders and institutions. They also explored the challenges of making public sector data available for machine learning, with Amilevičius illustrating the time-consuming nature of obtaining necessary data access permissions.
Audience questions enriched the conversation, touching on the global influence of European data governance, ownership issues in sensitive medical data, the relationship between GDPR and newer regulations like the DGA and the Data Act, personal data ownership, and the risk of economic inequality in data protection.
The panelists concurred on the importance of transparency and security in building trust for data sharing. They stressed the need for educating the next generation on data management and the imperative for regulations to keep pace with technological advancements.
In summarising the session, Moritz Taylor reflected on the complex regulatory landscape, acknowledging the good intentions of those working in the field and the potential of international cooperation to enhance data governance. The discussion underscored the need to balance individual rights with the collective good, particularly in the domains of research and innovation, and highlighted the ongoing efforts to improve data governance practices.
Session transcript
Moderator:
So, our break has come to an end, and let’s start the new session, European Approach on Data Governance. As the moderator, Moritz, said, it will be about data, lots of things about data, and I want to ask our moderator, Moritz Taylor from the Data Protection Unit of the Council of Europe, to present his panelists. Moritz?
Moritz Taylor:
Okay. Good afternoon, everyone. My name, as she said, is Moritz Taylor, and I work at the Data Protection Unit of the Council of Europe. Our session is on the European Approach on Data Governance. The Council of Europe has a bit of experience in this: in 1981 it made Convention 108, which was the first convention on the automatic processing of personal data. Since then we’re up to 55 parties, and we’ve got Convention 108 Plus, the updated version, which needs a couple more ratifications to come into force but assures extra protection for people. Based on that expertise and knowledge, we invited our experts today. Our approach was to look at the topic from a research perspective and from a regulatory perspective, and we thought we’d have people who work with data on a global level, on a regional level in Europe specifically, and then more on a national level. So I’ll just introduce our panelists. First, we have Mirja Vujeda; she’s a Senior Expert Advisor at the Croatian Data Protection Authority. Then we have Aurélie Merquiol, the Data Protection Officer at the IRD, a French research institute on sustainable development. And then we have Darius Amilevičius, the Chief Data Architect at the Information Society Development Committee, and Chief Researcher and Head of the AI Research Group at a university here in Lithuania. I’ll allow everyone to give a brief three-minute statement before we move on to questions. Our hope is that there’ll be audience participation, but if not, we have plenty of things to discuss. Please, Mirja.
Mirja Vujeda:
Thank you, Moritz, and good afternoon to everyone. I would first like to express my gratitude for the chance to speak on this panel today, gratitude to the organizers and to the Council of Europe. As an introduction, I will share a few thoughts about data governance, more precisely about the Data Governance Act and the importance of open data from the perspective of a Data Protection Authority. The European Union has been at the forefront of promoting open data through its comprehensive legislation, and this legislation is a cornerstone of the broader European strategy for data, launched in 2020. At the EU level, the great economic and social potential of data, and the importance of open data, have been recognized. Some of the key objectives of the European open data policies are fostering economic growth and innovation: by making public sector data more available, the EU aims to stimulate economic growth. Other key objectives are ensuring transparency and accountability, promoting citizens’ engagement, and encouraging the reuse of resources. So, what are the benefits of open data? One of the main benefits is the possibility of informed decision-making: open data provides valuable insights for both the private and the public sector to make informed strategic decisions. Other benefits we could highlight are innovation and product development; for example, an entrepreneur who has data can use it to identify gaps in the market and provide innovative solutions. Open data also improves government and community communication and evidence-based policy development. One of the most important acts when we talk about open data is certainly the European Data Governance Act, which entered into force in June 2022 and has been fully applicable since September 2023. The scope of the DGA covers both personal and non-personal data, and it is a cross-sector legal instrument that aims to regulate the reuse of protected data held by the public sector, boosting data sharing through the regulation of novel providers of data sharing services, known as data intermediaries, and by encouraging the sharing of data for altruistic purposes. These are the two novelties, we can say, which the Data Governance Act regulates and which we expect will encourage and boost data sharing between data holders and data users. I believe that later in the discussion we will have an opportunity to speak more about that. Thank you.
Aurélie Merquiol:
Does it work? Do you hear me? Yeah. Good afternoon everybody, I’m very glad to be here, thank you for having me. So I am a data protection officer, and I’m sure you know what that means, in the public sector, in scientific research, at a French institute which works all around the world. I would like to explain the issues and difficulties I face in my job to ensure compliance with the GDPR. The first one is the non-official recognition of adequacy by local DPAs, Data Protection Authorities, and I will explain. Many countries in the world have imported European values into their data protection regulation, you know that, but what happens when public research wants to transfer data from the rest of the world to the EU? I will give you an example. An African data protection authority told me that even if the GDPR is applicable in the EU, they need to officially deliberate in order to recognize our legislation. So in practice, we needed an authorization to carry out the public research, and obtaining that authorization from the DPA takes around one year. Another question I would like to raise here is the interpretation of the text of the regulation by Data Protection Authorities. The text of the GDPR, as you know, requires a legal basis to carry out data processing: consent, the performance of a contract, the performance of a task carried out in the public interest, and three other cases. The notion of public interest is subject to interpretation. In West Africa, for example, you need to have the nationality of the country to rely on this public interest. In other words, processing of data by a French public research organization like the IRD cannot be based on this public interest; in fact, we need an authorization and have to ask for consent. You know that consent under the GDPR is very hard to prove, and it can be withdrawn, so it is not well suited to public research. I will just finish there, I see the prompt.
Darius Amilevičius:
Thank you, hello everyone. I am very happy and very glad to be here. I am working at the Information Society Development Committee as a chief data scientist. In Lithuania, a reform of public sector data governance is now under way. This is very important because, under the umbrella of this reform, our committee is carrying out a big project to create a new architectural structure for public sector data and to create an ontology, to make sense of how data moves from one user to another. This matters because it lets us make data more findable, more accessible and more reusable. We are introducing common standards for describing and transmitting the data, but while doing our project we found that the technical part, describing the data and so on, is the easiest part. We can easily push data into the public sector data space, but it is very hard for customers to pull data out of this data space and use it, because of the protection requirements. Public sector data is full of sensitive data and personal data, so it is very interesting data for everyone. In Lithuania, for example, the public sector is the biggest provider of data. For this reason, public sector data is very important for scientific research and for startups, so for science and for business. On the other side, I am working at one of the Lithuanian universities, where we are creating a scientific research data space. This reform is only at its beginning; we are trying to become part of the European scientific research data space. In both the public data space and the scientific data space we have big problems with regulation: in the scientific research data space, for example, intellectual property rights; on the public sector side, personal data and other things. Thank you.
Moritz Taylor:
Thank you very much. Well, before we move on to opening the floor to everybody, I think it might be valuable for me to be able to ask you each a question as well. Maybe I’ll start with Mirja. So how do you think, since you mentioned it now, this Data Governance Act, how do you think that’s going to impact data protection specifically, especially of personal data?
Mirja Vujeda:
Well, as mentioned before, the scope of the Data Governance Act includes both personal and non-personal data, but within the open data legal framework, data protection law plays a crucial role: the principles of the GDPR have an important role in applying open data legislation and, in general, in any situation that involves data processing. The Data Governance Act makes it very clear that if an activity involves the processing of personal data, then the GDPR prevails. So the Data Governance Act reinforces the protection provided by the GDPR, and it ensures that all the data processing and all the activities carried out under the Data Governance Act respect and comply with existing data protection regulation. The providers of data sharing services are a novelty introduced by this Data Governance Act, and their aim is to connect data holders with data users, to create a relationship between them, and to ensure security of data and transparency standards. In this activity they will be completely neutral, in the sense that they will not use the data themselves; they will simply connect those who need data with those who provide it. When it comes to data altruism organizations, under the Data Governance Act it will be possible to voluntarily share data for public purposes, like scientific research; in this case, the data subjects who wish to share their data will give consent for that sharing, which must comply with the GDPR. So all activities under the Data Governance Act should be compliant with the GDPR, and the Act is actually strengthening these data protection principles. Thank you.
Moritz Taylor:
Okay, thank you very much. And perhaps, Aurélie, something adjacent to what Mirja has been talking about: how can data be shared safely, or how could we improve sharing data safely across borders and across institutions, in a way that protects people’s data and keeps them safe?
Aurélie Merquiol:
I think the best way is anonymization, but in public scientific research it is not the right way, because we have to prove the veracity of the research, so anonymization is not possible. We can only pseudonymize. Moreover, the details of the data must be made available to other scientists so they can reproduce the research. So we have solutions, but we also have other issues. Thank you very much.
Moritz Taylor:
And so Darius, before we open the floor, one last question for you as well. How can we ensure that data is available for machine learning, deep learning, and in particular, how can we ensure that it’s public sector data that is available for these tools?
Darius Amilevičius:
A very difficult but very good question. In Lithuania, we have a data lake that serves as a sandbox for data analytics purposes. It is a safe place to run experiments in data analysis with a vast amount of data. But this data lake is not suitable for machine learning: for my purposes, I need a big amount of data now, not tomorrow, today. Let me give one example. My team at the university and the Central Police Department of Lithuania were doing a joint project to create a very specific chatbot for the Lithuanian police. To train this chatbot, I had to obtain data from the police: they have thousands of emails and thousands of hours of recorded telephone conversations. But I spent four months obtaining all the permissions to research and use this data. As you know, a project is limited in time, so four months of my project went only on getting permission. We are also working in the health sector, and it is the same story: five months to obtain the permissions to be able to analyze health records. So for analysis purposes, we have to solve this problem with a data sandbox for analytic purposes. In our Lithuanian artificial intelligence strategy, we have stated the need to create a sandbox for artificial intelligence, to provide access to big data, because, as I said, the principal provider of data in Lithuania is the public sector. But for now, unfortunately, this is very difficult, and the question is not resolved.
Moritz Taylor:
Thank you, and I think that provides a bit of a basis for people in the room or online who have questions about data protection issues, about the transfer of data for research, and of course about machine learning. So I’ll open the floor and allow anyone who wishes to raise their hand and participate, to ask questions of the panel as a whole or of specific people. We have two over there, one there. So please, up there first. I think there are some important questions to think about here, about what we’re willing to sacrifice. As you said, you had four months; four months in the old days wasn’t that much time, and now four months is an eternity.
Audience:
Thank you. I’m Pilar, I’m from the YouthDIG org team. My question is: how can we ensure that our European approach on data governance, and the new Data Governance Act, the new Data Act and the more recent regulations, also become a beacon for the rest of the world? Thank you.
Moritz Taylor:
Thank you. Great question. Go ahead.
Audience:
Should I ask questions?
Moritz Taylor:
Yeah, we’ll take a few. Okay.
Audience:
I’m Mariam, I’m from Georgia, from the YouthDIG group. I’m actually studying biomedical engineering now, and we have problems with patient data, because when we try to do something new and build an innovation, we always face the question: who is the owner of this data? Is it the patient? Is it us, who are trying to do the research on it? Or is it a third party, the provider, for example the healthcare provider? Could you clarify for me how we can define the ownership of this kind of sensitive data? Thank you.
Moritz Taylor:
Brilliant. Thank you. Perhaps we’ll start, and you can already start finding the next question; there’s one behind you as well if you want. Maybe question one, about spreading European standards. It’s a good one, and one that of course we’re very interested in, not only for the GDPR but for other instruments too. I’ll let Mirja take this one.
Mirja Vujeda:
Okay. Thank you for the question. Of course, these are European standards, but compared with the rest of the world, maybe in Europe we recognize the value of data, because all these regulations are, in the end, for the benefit of society and for the benefit of individuals. All these regulations are not there to make our life or work more complicated; they regulate how data can be used. As I mentioned at the beginning, among the benefits of open data is the huge potential for economic growth, and also for policy making, to make informed decisions at the government level and in the public sector. So we believe that all these standards that we have in Europe will be recognized in the rest of the world as well. And when we speak about data protection, here in Europe we have the GDPR, but it is also important to mention Convention 108 and the 108 Plus protocol, a convention which creates a global standard for data protection and is open for signature and ratification to any country in the world. So I believe that countries, and also the private and public sectors, recognize the importance of following these European values and this European approach when it comes to data protection and data governance. So I agree.
Moritz Taylor:
Perhaps Aurélie as well, because especially when it comes to the next question, maybe you have some input on medical research and how it…
Aurélie Merquiol:
Could you repeat the next question for me?
Moritz Taylor:
The main point that I understood was that there’s not a lot of clarity on the ownership of data in medical and biomedical research: who is the owner of the data in medical research, who is responsible for handling and keeping that data safe, for example. And maybe we can add to it the part about having global standards for exchanging that data, because I think that’s also quite interesting.
Aurélie Merquiol:
I really don’t like to say “ownership of data”. We are in Europe, so there is no ownership of data; we are not in the United States of America, where you may have another opinion. But as to who has to ensure the compliance of data processing, of course it is the data processors and all the professionals and scientists who are processing data every day, because we know that it is humans who increase the risk in data processing.
Moritz Taylor:
Did you have a comment as well Darius?
Darius Amilevičius:
Yes, medical data, this is my field. I think the biggest point is that our approach to who is the proprietor of this data is wrong, because I think we must distinguish between individual rights and group rights. In the current regulation, the right of the individual is put above the right of the group. For example, if I am creating a new tool for the health sector, it will give benefits to many people. If I must seek 10,000 permissions from the owners of the data to build my new tool, that is something terrible. So in our protection framework we must introduce some kind of collective interest, a public interest, and in that case, in a safe environment, some kind of sandbox, researchers like you must have access to that data without this kind of reasoning about who is the proprietor of the data, I think.
Moritz Taylor:
Thank you very much. I know there are online questions as well, before we move on to the in-person ones. But maybe one in-person question first, and then, do you have many? It’s on. Okay.
Audience:
Me or not?
Moritz Taylor:
Please, go on. Okay.
Audience:
I’m Jacques Begringer of Swiss IJF, but I had the privilege to work at the Council of Europe, or to represent business at the Council of Europe, when we negotiated the recommendation on health data in 2018. Already at that time there was a sentiment that maybe data protection had gone too far and needed some corrective. Now, my provocative thesis is that the new EU Data Governance Act and the EU Data Act are somehow cutting back on the GDPR, I mean, on the overall idea of data protection, and in particular that the idea that every piece of data must be avoided at all costs has come to an end and the pendulum is swinging back. So that is my thesis; what do you think of that? And maybe you can also explain the difference between the Data Governance Act and the Data Act. This creates much confusion in business.
Moritz Taylor:
Thank you. Online questions.
Audience:
Okay, thanks for taking this question. Again, Deborah Allen Rogers, coming to you from the Hague digital fluency lab called Find Out Why, representing civil society. I think the idea of data ownership should be compared to the idea that we own our own money. I would like to own my own data, and I would like to be able to opt in and opt out, especially in the case of medical data, of whether someone has access to my data. So I do think data ownership is something we should think about; it should be on a personal level, like we own our money or we own our property. The other thing I want to say, before I ask my question, is that having worked at the Transatlantic Partnership this past summer, I often hear American and European values described as overlapping; in many ways they are not alike at all, and in other ways they are quite similar. And I do think that the time it takes to build better systems, and to design the governance and the encryption for safe data transfers, is worthwhile. To the data scientists on the panel: the young people in your classes are certainly helping you with this, I’m sure, because whenever I’m in a classroom I see that their minds think this way. So here is my question: if we were to have ownership of our own personal data and hold the encryption keys, and then have certain levels of protection if the data gets breached, or certain levels of protection for ourselves if we give access, are these things that anybody on the panel is thinking about? Because I think ownership of personal data should be a real concern, as part of the collective and as part of individual freedom and security. Thank you for taking my question. And I’ll add a question from Alexandra Yevdokimova: given the bureaucratic burden, how can we ensure that the public sector remains competitive in open data provision, and that the big private actors don’t just come in with paid but efficient and quick solutions?
Moritz Taylor:
Okay. Thank you. These are some pretty hefty questions, I think. I think, Mirja, you said you’d volunteer to see if you can answer that first question.
Mirja Vujeda:
The first question, if I remember well, was about the European Data Governance Act in relation to the Data Act, and also about the lawfulness of processing.
Moritz Taylor:
Yeah, about the pendulum swinging back. So, basically, whether it takes away some of the provisions of the GDPR and goes against some of the spirit of the GDPR.
Mirja Vujeda:
Well, it is just my opinion, but I would not say that it is taking away the spirit of the GDPR, because in the end, if there is some kind of conflict between the provisions, the GDPR will prevail. The scope of the Data Governance Act is personal and non-personal data, so in case the data contain personal data, you will still need a lawful basis for the processing; you cannot process or access that data without a lawful basis. So I would say that this Act does not conflict with the GDPR; the provisions of the Data Governance Act make the relationship between the Data Governance Act and the GDPR very clear. But I understand your concerns when it comes to the practical application of the Act. We have a lot of EU legislation regarding open data, including the Data Governance Act and the Data Act, which is the second pillar of the European data strategy, and the Data Act complements the Data Governance Act. The aim of the Data Act is to enhance the EU economy and foster a competitive data market: it regulates who has the right to the value of the data that will be shared under it, and it focuses more on making industrial data more accessible and usable, encouraging data-driven innovation and increasing data availability. To put the difference simply, from my perspective, the Data Governance Act is about making protected data held by the public sector more accessible for users. Thank you.
Moritz Taylor:
Do you have comments on some of the other questions, perhaps?
Aurélie Merquiol:
Yes, about ownership and data. I’m sure that we don’t have any ownership of our data, but we have rights: the right of access, the right to withdraw consent, and so on and so forth. About anonymization, I agree with encrypted protocols, but I have a question: where are the keys? Who has the keys? In my practice, I often see that the keys are kept by American companies, which is another problem.
Darius Amilevičius:
I think that encryption of data, of course, costs additional money; you must have additional technical resources. But I think encryption is a good starting point, because maybe that additional cost is the price of using the data safely without anonymization. In the anonymization case, the data becomes unusable for machine learning. So maybe encryption is the way to go. Thank you.
Moritz Taylor:
Thank you. I think there’s one question behind you over there. And maybe one more. Other than that, we’re starting to run out of time already, which is quite impressive. Was there anyone else?
Audience:
Hello. I’m Emily Khachatryan, coming from the Advisory Council on Youth of the Council of Europe, also representing the Youth Development Center of Armenia. Thank you very much for the panel. My question is about opting out of the usage of data, because there are now some implementations where users can pay an additional fee to platforms to opt out of their data being used. But don’t you think this will cause inequality between different social groups? People who are more economically privileged will be able to do this, and the less well-off, unfortunately, won’t be able to protect their data, which is a big issue in our society now. Thank you.
Moritz Taylor:
I think that’s a fair question. Inequality in data and the power of the data. Who would like to answer?
Aurélie Merquiol:
I will give an answer. I totally agree with you. It’s not equal, and I think it’s not legal, but the European Court of Justice will have to examine this problem.
Darius Amilevičius:
I am a big fan of open data and open source. I think that all software should be open source and all data should be available free of cost. It is like blood donation: when I give my blood for free, I can ensure its quality. I think the best way to go is to have all data open to all. I agree that producing data also costs money, but we must find resources to fund the creation of data, and so I think fees are not the way to go for open data.
Moritz Taylor:
Okay. Since I think we’re out of questions at the moment, unless I’ve missed someone somewhere in the corners, we could move on to some solutions. Each of you works in a particular area of expertise: what would be a way to enable the free flow of data in a safe manner in your area of work, and how do you perceive the risks and opportunities?
Mirja Vujeda:
In this digital era, on the internet, we are much more aware that data protection is more essential than ever. I believe that one way to make data flows and data sharing easier is to build trust about how we use the data, especially through the principle of transparency, which I see as very important, and by ensuring the security of the data. The consequence will be that data sharing becomes easier: if we respect all the principles of data protection, if data subjects know for which purposes we use their data, and if we build a safe and secure environment, especially on the internet. I would say that security and transparency will, in the future, support data sharing, to give a general answer. Thank you.
Aurélie Merquiol:
I think that today our data are mainly transmitted and managed by large companies, big data or big pharma companies. I think the future of the internet will be different, because it will be decentralized and data will pass directly between users, and maybe that will help us take care of our data. However, I think the most important thing is to train the new generation to manage and use their data.
Darius Amilevičius:
In our times of the digital economy, data is fuel; it is the most important thing of our times. I like the European Union’s proactive position on regulation, but the biggest problem is that regulation and technical progress must go step by step, side by side. When the two are separated in time, there is a danger that regulation becomes over-regulation or under-regulation, comes at the wrong time, or regulates the wrong things. So I think these two must go side by side. Thank you.
Moritz Taylor:
All right, thank you very much. I think, surprisingly enough, we’ve managed to reach the end, unless there are final comments. I’d like to thank the panelists very much for participating. It was my first EuroDIG, and I know it was the first EuroDIG for some of you too. Thank you very much to all the organizers, and to the co-organizers of this session, who helped me in setting up the panel discussion. And thank you, Mirja, Aurélie, and of course Darius, for taking part and answering everyone’s questions. I think on that note, we can end it now.
Moderator:
Moritz, as a bit of a journalist myself, I would ask you one last question. Can you say in one sentence what idea you want everyone here to take away from your session? Okay, two sentences, but you know, the main idea.
Moritz Taylor:
Each or just me?
Moderator:
You, you’re the moderator, come on, do the hard work.
Moritz Taylor:
I think what we’re seeing is that there’s a tangled web of regulations and rules that people are sometimes frustrated with and sometimes encouraged by. We’re getting better at it, I think. From what we’ve heard here, people are doing their very best to adapt to the situation, and I think most regulators and most people who work in this field have good interests and good intentions. That’s what I would take away overall: with international cooperation, with people working on a global, regional and national scale, we’re moving forward.
Moderator:
Thank you, Moritz.
Speakers
Audience
Speech speed: 173 words per minute
Speech length: 898 words
Speech time: 311 secs
Arguments
Ensuring that Europe’s data governance frameworks can lead globally
Supporting facts:
- Europe is introducing comprehensive data protection and governance laws
- There is a global need for better data management and privacy standards
Topics: Data Governance Act, Data Act, Global Data Regulation Standards
The ownership of patient data is unclear.
Supporting facts:
- Patient data is sensitive and its ownership is a common issue in healthcare research.
- Different entities including patients, researchers, and healthcare providers may lay claim to the ownership of such data.
Topics: Biomedical Engineering, Data Privacy, Healthcare Research
Data protection measures may be too stringent and need revision
Supporting facts:
- The sentiment in 2018 during the Council of Europe negotiations on health data suggested the need for corrective measures in data protection
- Concerns about data protection potentially hindering other priorities
Topics: Data Protection, GDPR, Data Governance Act, EU Data Act
Distinction between Data Governance Act and Data Act is unclear
Supporting facts:
- Businesses express confusion regarding the differences and implications of the Data Governance Act and the EU Data Act
Topics: Data Governance, Data Economy, Legal Framework
Report
Europe is actively enhancing its position as an exemplar in data governance with the development of comprehensive data protection and governance laws, such as the Data Governance Act and the EU Data Act. This initiative is propelled by a global demand for advanced data management and privacy standards, positioning Europe at the forefront.
These efforts are commended for their alignment with Sustainable Development Goals (SDGs) 9 and 16, which highlight the promotion of innovation, infrastructure resilience, peace, justice, and strong institutions. In healthcare, the conundrum of patient data ownership raises pressing ethical issues. The sector grapples with determining the rightful custodian of patient data, which has repercussions for SDG 3’s advocacy for good health and well-being.
Advocates urge a move towards well-defined ownership parameters to fuel innovation while upholding privacy, critical for the healthcare industry’s progress. While the new data governance framework is viewed in a largely positive and reformative light, there are reservations about the severity of past regulations such as those enacted by the GDPR.
Some critics argue that the GDPR’s rigidities may hinder other priorities. Questions also arise as to whether new legislation will adequately relax GDPR’s stringency, troubling stakeholders who anticipate more open data sharing conducive to a single data market, underscoring concerns pertinent to SDGs 9 and 16.
The clarity, or lack thereof, surrounding the distinctions between the Data Governance Act and the EU Data Act provokes confusion amongst businesses, calling for more lucid explanations of legal requirements and the impact of these regulations on the data economy.
The growing complexities of data law require careful navigation to ensure comprehensive understanding and compliance. An evident trend is the shift from a previously stringent policy of data avoidance to an informed, balanced viewpoint that permits data use under certain circumstances.
This change reflects a broader pivot towards harmonising data protection needs with the functional demands of data use, though it does not align distinctly with a specific SDG. In summary, while the consensus is that Europe is setting a commendable precedent for global data governance, reflective of several SDGs’ principles, it faces challenges in legal clarity and in balancing innovation with privacy.
Europe is thus at a critical juncture, with its actions on the international stage for data governance expected to have profound, multifaceted effects.
Aurélie Merquiol
Speech speed: 105 words per minute
Speech length: 728 words
Speech time: 418 secs
Arguments
In Europe, data is not viewed as something that can be owned
Supporting facts:
- Aurélie Merquiol mentioned the lack of data ownership in Europe
Topics: Data Protection, European Union Law
Data processors are responsible for ensuring the compliance of data processing
Supporting facts:
- It is mentioned that professionals and scientists processing data are responsible for its compliance
Topics: Data Security, Compliance
Individuals do not have ownership of their data
Supporting facts:
- Aurélie Merquiol is sure that individuals do not own their data.
Topics: Data Ownership, Privacy Rights
Individuals have rights regarding their data
Supporting facts:
- Individuals have the right to access their data and the right to withdraw consent.
Topics: Data Protection, Privacy Rights
Concerns about the location and possession of encryption keys for anonymized data
Supporting facts:
- Aurélie Merquiol questions where the encryption keys are and who has them, noting often they are held by American companies.
Topics: Data Security, Encryption
The future of internet will be decentralized
Supporting facts:
- Data will not only pass between large companies but also directly between users
Topics: Internet decentralization, Data management
Report
The discourse on data rights within the European context, brought to the fore by insights from Aurélie Merquiol, delineates a scenario where data is not regarded as a property that can be owned—a viewpoint that is in sharp contrast with the stance observed in the United States.
This underscores a significant cultural and legal divergence in the attitudes toward data on either side of the Atlantic. Importantly, while the ownership of data may not be recognised, European citizens are endowed with specific rights pertaining to their personal data.
These include the entitlement to access their private data and the prerogative to retract consent for its utilisation, indicative of progressive steps towards the affirmation of personal data rights. Nevertheless, a prevailing disquietude emerges due to the acknowledgment that, despite having certain rights, individuals find themselves without outright ownership of their data, creating a paradox where personal control is somewhat circumscribed.
Data security surfaces as a contentious issue, particularly in relation to the guardianship of encryption keys which serve as the linchpin for the safeguarding of anonymised data. The observations by Aurélie Merquiol shine a spotlight on a troubling reality where these keys, and by extension the oversight of data security, are often in the possession of American corporations.
This raises questions about the level of autonomy Europeans truly possess over their data protection and privacy. In matters of regulatory adherence, there is a clear delineation of responsibility, identifying data processors as those accountable for ensuring that data practices align with established data protection legislation.
Such a requirement places considerable onus on professionals and scientists in the data industry to remain diligent and compliant, thus upholding the data protection standards as mandated by the European Union. Prospectively, there is a sense of optimism about the decentralisation of the internet, a progressive movement that is poised to democratise data management by facilitating direct data exchanges between end-users rather than solely through large corporations.
This evolution portends a shift in the dynamics of data management, offering a more equitable and empowering proposition for individuals. Education, particularly in topics of data usage and digital literacy, is emphasised as a focal area for future generations. Ingraining a thorough understanding of data management is essential for equipping individuals to adeptly manoeuvre the intricacies of digital rights and responsibilities.
Emphasising education aligns with SDG 4: Quality Education, underscoring its vital role in preparing citizens who are knowledgeable, capable, and digitally proficient. In drawing together the facets of this analysis, the picture that emerges is multi-faceted, with mixed sentiments, where rights over personal data exist yet without extending to ownership.
Challenges persist, particularly around data security and the responsibility for compliance that rests with data processors. However, through educational empowerment and advancements such as the decentralisation of the internet, the foundations are being laid for a digital milieu that is more fair and secure.
This perspective is in harmony with the aspirations of SDGs 9 and 16, which advocate for innovation, resilient infrastructure, and strong institutions that foster justice—cornerstones for a just and comprehensive data governance framework.
Darius Amilevičius
Speech speed: 115 words per minute
Speech length: 1057 words
Speech time: 551 secs
Report
The Chief Data Scientist from Lithuania discussed the extensive reform taking place within the country’s public sector data governance. The aim is to create a modern architectural structure to manage public sector data with greater efficacy, adhering to the principles of making data findable, accessible, and reusable.
The reform is supported by established uniform standards for data description and transmission. Technical progress notwithstanding, the speaker highlighted the challenges imposed by stringent data protection regulations, particularly when dealing with sensitive and personal data. Such regulations lengthen the duration required to obtain permissions for data access, as evidenced by the several months it took to procure data for initiatives like a police chatbot project and health sector analysis.
The Data Scientist advocated for a reconsideration of the existing data protection frameworks, proposing a more balanced approach that safeguards individual rights while also acknowledging the collective interest, such as in improving public health. Their view is that the current emphasis on individual rights potentially hinders advancements in technology and science that could benefit society.
Encryption was presented as a viable alternative to anonymisation – which often degrades the data’s utility for machine learning. Despite the higher cost and greater technical demands of encryption, they argued that the benefits of securely using sensitive data justify the investment.
Open data and open source principles were fervently supported by the speaker, who believes in the free availability of data and access to software, likening it to the ethos behind blood donations for the collective benefit. They opined that while producing data incurs expenses, alternative funding mechanisms should be sought to sustain openness, avoiding reliance on user fees.
Emphasising data’s critical role as the lifeblood of the digital economy, the speaker cautioned about the risks associated with a discrepancy between regulatory frameworks and technological advances. They stressed the importance of evolving regulation in tandem with technology to avoid the pitfalls of inappropriate regulation, which could be excessive, inadequate, or poorly timed.
In conclusion, the Chief Data Scientist’s discourse effectively encapsulated the complexities of advancing public sector data governance reform in Lithuania. It highlighted the need for structural transformation, a balanced approach to data protection, and the harmonisation of policy with technological development.
These insights underscore the growing need for agile and thoughtful data regulation that supports open data initiatives and innovation, ensuring that data continues to propel economic and societal growth.
Mirja Vujeda
Speech speed: 131 words per minute
Speech length: 1641 words
Speech time: 751 secs
Report
During a panel discussion, a speaker began by expressing gratitude to the organisers and the Council of Europe, paving the way for a comprehensive exploration of data governance, with a spotlight on the advancements within the European Union’s legislative framework, specifically the Data Governance Act, which has been fully applicable since September 2023.
The EU’s shift towards embracing open data was commended. This approach is integral to the overarching data strategy launched in 2020, which aims to fuel economic growth and innovation via the broad availability of public sector information, leading to greater transparency, citizen engagement and improved decision-making.
The use of these data sets can identify market opportunities, stimulate innovation, and support strategic decision-making across both the public and private sectors. A key element of the discourse was the Data Governance Act (DGA), which targets both personal and non-personal data.
Its design serves to boost data-sharing, chiefly from public bodies, while implementing stringent regulations for data intermediaries. These intermediaries, connecting data providers with users, must adhere to rigid criteria on neutrality, transparency and security. The DGA introduces the innovative concept of ‘data altruism’, enabling voluntary data sharing by individuals and organisations for the public good, such as for scientific research.
This is to be done with informed consent, in accordance with the General Data Protection Regulation (GDPR). The speaker clarified that the DGA complements and exists alongside the GDPR, upholding the high standards of privacy and data protection established by the GDPR.
The global impact of EU data standards was also discussed, with an optimistic view that they would be adopted internationally, thus spreading the societal and economic benefits. A cooperative international perspective on data governance, in line with European values, was championed.
It was underscored that the DGA complements the GDPR, with both demanding lawful processing of personal data, maintaining the GDPR’s central role in data protection and ensuring that this foundational standard remains unaffected by the DGA. The interoperability between the DGA and other EU data legislation, such as the Data Act, was highlighted.
While the DGA is concerned with the accessibility of protected data, the Data Act aims to galvanise the EU economy by facilitating access to industrial data, and by nurturing a competitive and innovative data market. The speaker concluded by emphasising the importance of trust, transparency, and security as fundamental aspects of data sharing and protective strategies.
Called for was an ecosystem that supports substantial data sharing, upholds public trust, and aligns with data protection laws. In our digital age, such a framework reassures contributors that their data is handled responsibly, and ethically, creating an environment that promotes data-driven progress while protecting individual privacy.
Moderator
Speech speed: 178 words per minute
Speech length: 129 words
Speech time: 43 secs
Report
The session titled “European Approach on Data Governance,” eagerly awaited to reconvene, will be presided over by Moritz Taylor, of the Data Protection Unit at the Council of Europe. As moderator, Taylor’s role is to introduce the panel members, each poised to offer their perspective on data governance within Europe.
The forum is designed to spur discourse on crucial data-related issues, expected to underpin the day’s discussions and presentations. Journalists present are charged with the task of distilling the complexity of the debates into public-friendly narratives. Their challenge is to extract and communicate the essence of the in-depth discussions on data governance, offering clear, digestible takeaways that capture the core messages of the event.
Ahead of this, a light-hearted gauntlet is thrown down to Taylor: to succinctly summarise the rich content and aims of the session in one, or at most two, potent sentences. The request not only jests at the moderator’s ability to synthesise multifaceted arguments but also serves as a beacon for journalists, guiding them in shaping their coverage.
Anticipated to traverse the nuanced terrain of data protection principles, privacy issues, regulatory compliance, and ethical considerations, the session aims to strike a balance between data utility and personal privacy rights from a European standpoint. The measure of success for the talks will be gauged by how effectively they can be condensed into a straightforward yet profound capsule that reverberates with the lay audience, mirroring the core objectives of European data policy and governance.
Moritz Taylor
Speech speed: 171 words per minute
Speech length: 1368 words
Speech time: 480 secs
Report
Moritz Taylor expertly chaired a comprehensive panel on the European Approach to Data Governance, hosted by the Council of Europe. The panel’s discussions largely focused on the implications of emerging legislation on data protection, methods for secured data sharing internationally, and the enhancement of public sector data use to advance artificial intelligence.
A key point of debate was the imminent implementation of Convention 108 Plus, which is an enhancement of the original 1981 convention. Its ratification is expected to strengthen the protection of personal data, reflecting the Council’s enduring commitment to adapt to technological advancements in data protection.
The panel featured a diverse range of expertise:
– Mirja Vujeda, a Senior Expert Advisor at the Croatian Data Protection Authority, shed light on the potential effects of the Data Governance Act on the safeguarding of personal data. She deliberated on possible overlaps with established privacy protocols like the GDPR, analysing the ensuing challenges for data controllers and processors.
– Aurélie Merquiol, a Data Protection Officer at IRD, provided insight into the complexities of data sharing and focused on achieving it securely across different jurisdictions and organisations, emphasising the balancing act between privacy and the benefits of data transfers for scientific progress and sustainability.
– Darius Amilevičius, Chief Data Architect at the Information Society Development Committee in Lithuania, emphasised the public sector’s responsibility in enabling data access for advancing AI, discussing how governments could contribute to innovation through the ethical provision of data.
Audience interactions brought up several perceptive concerns about data governance, including:
– The global influence and application of European data protection standards, addressed by Vujeda.
– The intricacies of data ownership in medical and biomedical research, with Merquiol discussing the current ambiguities and the necessity for concise guidelines.
– The potential conflicts between the GDPR and new legislative measures, suggesting the critical balance needed between innovation and safeguarding individual rights.
The panel conceded that there is a complex web of regulations that might occasionally cause frustration but also drive advancements in data governance. In summing up, the discourse underlined a shared aspiration for progress, anchored in well-meaning initiatives and bolstered by international collaboration.
The session closed on a hopeful tone, acknowledging the possibility of navigating the intricate terrain of data governance through concerted efforts across different levels of governance, thereby enhancing the digital future for society.