Main Topic 2 – European approach on data governance

18 Jun 2024 16:00h - 16:45h

Disclaimer: This is not an official record of the session. The DiploAI system automatically generates these resources from the audiovisual recording. Resources are presented in their original format, as provided by the AI (e.g. including any spelling mistakes). The accuracy of these resources cannot be guaranteed.

Full session report

EuroDIG Panel Explores the Complexities of Data Governance and Protection in Europe

In a detailed panel discussion at EuroDIG, moderated by Moritz Taylor from the Council of Europe, experts from diverse backgrounds delved into the intricacies of data governance, protection, and sharing within Europe. The panel featured Mirja Vujeda from the Croatian Data Protection Authority, Aurélie Merquiol from the French research institute IRD, and Darius Amilevičius from Lithuania’s Information Society Development Committee.

Mirja Vujeda commenced the session by acknowledging the EU’s pioneering role in open data legislation and its comprehensive data strategy. She focused on the European Data Governance Act (DGA), which encompasses both personal and non-personal data, aiming to facilitate data sharing and promote data altruism for societal benefits like scientific research.

Aurélie Merquiol highlighted the practical challenges in aligning scientific research with GDPR compliance, particularly in cross-border data transfers and the interpretation of “public interest” by different authorities. She underscored the lengthy process of obtaining transfer authorisations and the limitations of consent in public research contexts.

Darius Amilevičius discussed Lithuania’s public sector data governance reforms, emphasizing the goal of making data more accessible and reusable. He shared the technical and regulatory hurdles in providing public sector data for advanced analytics and machine learning, citing personal experiences of the protracted permissions process.

The panel addressed the DGA’s impact on personal data protection and the safe sharing of data across borders and institutions. They also explored the challenges of making public sector data available for machine learning, with Amilevičius illustrating the time-consuming nature of obtaining necessary data access permissions.

Audience questions enriched the conversation, touching on the global influence of European data governance, ownership issues in sensitive medical data, the relationship between GDPR and newer regulations like the DGA and the Data Act, personal data ownership, and the risk of economic inequality in data protection.

The panelists concurred on the importance of transparency and security in building trust for data sharing. They stressed the need for educating the next generation on data management and the imperative for regulations to keep pace with technological advancements.

In summarising the session, Moritz Taylor reflected on the complex regulatory landscape, acknowledging the good intentions of those working in the field and the potential of international cooperation to enhance data governance. The discussion underscored the need to balance individual rights with the collective good, particularly in the domains of research and innovation, and highlighted the ongoing efforts to improve data governance practices.

Session transcript

Moderator:
So, our break has come to an end, and let's start the new session, European Approach on Data Governance. So, as the moderator, Moritz, said, it will be about data, lots of things about data, and I want to ask our moderator, Moritz Taylor of the Data Protection Unit, Council of Europe, to present his panelists. Moritz?

Moritz Taylor:
Okay. Good afternoon, everyone. My name, as she said, is Moritz Taylor, and I work at the Data Protection Unit at the Council of Europe. Our session is on the European Approach on Data Governance. The Council of Europe has a bit of experience in this. In 1981, it adopted Convention 108, which was the first convention on the automatic processing of personal data. Since then, we're up to 55 parties, and we have Convention 108 Plus, the updated version, which needs a couple more ratifications to come into force but assures extra protection for people. So, based on that, and our expertise and our knowledge, we invited our experts today. First, for us, the approach was to look at it from a research perspective and from a regulatory perspective. And we thought we'd have representation from people who work with data on a global level, on a regional level in Europe specifically, and then more on a national level. So, I'll just introduce our panelists. First, we have Mirja Vujeda. She's a Senior Expert Advisor at the Croatian Data Protection Authority. Then we have Aurélie Merquiol, the Data Protection Officer at the IRD, which is a French research institute on sustainable development. And then we have Darius Amilevičius, who is the Chief Data Architect at the Information Society Development Committee, and the Chief Researcher and Head of the AI Research Group at the university here in Lithuania. Thank you. I'll allow everyone to give a brief three-minute statement before we then move on to questions. Our hope is that there'll be audience participation, but if not, we have plenty of things to discuss. Please, Mirja.

Mirja Vujeda:
Thank you, Moritz, and good afternoon to everyone. I would first like to express my gratitude for the chance to speak today on this panel, gratitude to the organizers and to the Council of Europe. As an introduction, I will share a few thoughts about data governance, more precisely about the Data Governance Act and the importance of open data from the perspective of a Data Protection Authority. The European Union has been at the forefront of promoting open data through its comprehensive legislation, and this legislation is a cornerstone of the broader European strategy for data, launched in 2020. At the EU level, the great economic and social potential of data, and the importance of open data, have been recognized. Some of the key objectives of the European open data policies are fostering economic growth and innovation: by making public sector data more available, the EU aims to stimulate economic growth. Other key objectives are ensuring transparency and accountability, promoting citizens' engagement, and encouraging the reuse of resources. So, what are the benefits of open data? One of the main benefits is the possibility of informed decision-making: open data provides valuable insights for both the private and public sectors to make informed strategic decisions. Other benefits worth highlighting are innovation and product development. For example, if entrepreneurs have data, they can use it to identify gaps in the market and provide innovative solutions. Open data also improves government and community communication and evidence-based policy development. One of the most important acts when we talk about open data is certainly the European Data Governance Act, which entered into force in June 2022 and has been fully applicable since September 2023.
So, the scope of the DGA covers both personal and non-personal data, and this is a cross-sector legal instrument that aims to regulate the reuse of protected data held by public sector bodies, by boosting data sharing through the regulation of novel providers of data sharing services known as data intermediaries, and by encouraging the sharing of data for altruistic purposes. These are two novelties which the Data Governance Act regulates and which we expect will encourage and boost data sharing between data holders and data users. I believe that later in the discussion we will have an opportunity to speak more about that. Thank you.

Aurélie Merquiol:
Does it work? Do you hear me? Yeah. Good afternoon everybody, I'm very glad to be here, thank you for having me. So I am a data protection officer, I'm sure you know what that means, in the public sector, in scientific research, at a French institute which works all around the world. I would like to explain the issues and difficulties I face in my job in ensuring compliance with the GDPR. The first one is the non-official recognition of adequacy by local DPAs, Data Protection Authorities. Many countries in the world have imported European values into their data protection regulation, you know that, but what happens when public research wants to transfer data from the rest of the world to the EU? I will give you an example: an African data protection authority told me that even if the GDPR is applicable in the EU, they need to officially deliberate to recognize our legislation. So in practice, we needed an authorization for the public research, and obtaining that authorization from the DPA takes around one year. Another question I would like to raise here is the interpretation of the text of the regulation by Data Protection Authorities. The GDPR, as you know, requires a justification to carry out data processing: you can rely on consent, or the performance of a contract, or the performance of a task carried out in the public interest, and three other cases. The notion of public interest is subject to interpretation. In West Africa, for example, you need to have the nationality to rely on this public interest. In other words, processing data by a French public research organization like the IRD cannot be based on this public interest. In fact, we need to have an authorization and to ask for consent. You know that consent under the GDPR is very hard to prove, it can be withdrawn, and it is not well suited to public research. I will finish there, I see the prompt.

Darius Amilevičius:
Thank you, hello everyone. I am very happy and very glad to be here. I am working in the Information Society Development Committee as a chief data scientist. In Lithuania, a reform of public sector data governance is now under way. It's very important because, under the umbrella of this reform, our committee is doing a big project to create a new architecture for public sector data, to create an ontology, to make sense of how the data moves from one user to another. This is very important because in this way we can make data more findable, more accessible, more reusable. We are introducing common standards for describing the data and for transmitting the data, but in doing our project we found that the technical part, describing the data and so on, is the easiest part. We can easily push data into the public sector data space, but it's very hard for customers to pull data out of this data space and use it, because of the protection requirements. Public sector data is full of sensitive data, of personal data, so this is very interesting data for everyone. In Lithuania, for example, the public sector is the biggest provider of data. For this reason, public sector data is very important for scientific research, for startups, for science and for business. On the other side, I am working at one of the Lithuanian universities, and we are creating a scientific research data space. This reform is only at its beginning. We are trying to become part of the European scientific research data space, and in both the public data space and the scientific data space we have big problems with regulation: in the science research data space, for example, intellectual property rights; on the other side, in the public sector, personal data and other things. Thank you.
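The common standards Darius mentions for describing public sector data are, in EU open-data practice, typically DCAT-style catalogue metadata. A minimal sketch of such a record with a completeness check follows; the field names and the validation rule are illustrative only, not Lithuania's actual schema:

```python
# Minimal sketch of catalogue metadata for a public sector dataset,
# loosely modelled on DCAT-style fields. Field names are illustrative,
# not the actual Lithuanian schema.

REQUIRED_FIELDS = {"identifier", "title", "publisher", "license", "access_level"}

def validate_metadata(record: dict) -> list:
    """Return a sorted list of missing required fields (empty if valid)."""
    return sorted(REQUIRED_FIELDS - record.keys())

dataset = {
    "identifier": "lt-gov-dataset-001",
    "title": "Road traffic accidents, 2023",
    "publisher": "Information Society Development Committee",
    "license": "CC-BY-4.0",
    "access_level": "public",  # vs. "restricted" for protected data
    "keywords": ["transport", "safety"],
}

missing = validate_metadata(dataset)
print("valid" if not missing else "missing: %s" % missing)
```

Uniform required fields like these are what make datasets findable and comparable across a shared data space, which is the "easiest part" Darius refers to.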

Moritz Taylor:
Thank you very much. Well, before we move on to opening the floor to everybody, I think it might be valuable for me to be able to ask you each a question as well. Maybe I’ll start with Mirja. So how do you think, since you mentioned it now, this Data Governance Act, how do you think that’s going to impact data protection specifically, especially of personal data?

Mirja Vujeda:
Well, as was mentioned before, the scope of the Data Governance Act includes both personal and non-personal data, but within the open data legal framework, data protection law plays a crucial role: the principles of the GDPR have this important role in applying open data legislation and, in general, in any situation which involves data processing. And the Data Governance Act makes it very clear that if an activity involves the processing of personal data, then the GDPR should prevail. So the Data Governance Act reinforces the protection provided by the GDPR, and it ensures that all data processing and all activities under the Data Governance Act should respect and comply with existing data protection regulation. These providers of data sharing services are a novelty introduced by the Data Governance Act. Their aim is to connect data holders and data users, to create a relationship between them, and they should ensure security and transparency standards; but in this activity they will be completely neutral, in the sense that they will not use the data themselves, they will just connect those who need data with those who provide it. And when it comes to data altruism organizations, under the Data Governance Act it will be possible to voluntarily share data for public purposes, like scientific research; but in this case, the data subjects who would like to share their data will give consent for the sharing, which should comply with the GDPR. So under the Data Governance Act all activities should be compliant with the GDPR, and it is actually strengthening these data protection principles. Thank you.

Moritz Taylor:
Okay, thank you very much. And perhaps, Aurélie, this is adjacent to what Mirja has been talking about. How can data be shared safely, or how could we improve sharing data safely across borders and across institutions, in a way that protects people's data and keeps them safe?

Aurélie Merquiol:
I think that the best way is anonymization, but in public scientific research it's not the right way, because we have to prove the veracity of the research, so anonymization is not possible. We can only pseudonymize. Moreover, the details of the data must be made available to other scientists to reproduce the research. So we have a solution, but we have other issues. Thank you very much.
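Pseudonymization, as Aurélie contrasts it with anonymization, typically replaces direct identifiers with a keyed token, so records stay linkable for research while re-identification requires a secret key held by the controller. A minimal sketch, where the field names and key handling are illustrative only, not a vetted scheme:

```python
import hashlib
import hmac

# Keyed pseudonymization: the same subject always maps to the same
# token, so records remain linkable across a study, but reversing the
# mapping requires the secret key. Illustrative sketch only.
SECRET_KEY = b"kept-by-the-data-controller"  # placeholder key

def pseudonymize(identifier: str) -> str:
    """Derive a stable 16-hex-character token from a direct identifier."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

record = {"patient_id": "FR-12345", "diagnosis": "J45"}
record["patient_id"] = pseudonymize(record["patient_id"])
print(record)
```

Unlike anonymization, this keeps the longitudinal linkage researchers need to verify results, which is exactly why the GDPR still treats pseudonymized data as personal data.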

Moritz Taylor:
And so Darius, before we open the floor, one last question for you as well. How can we ensure that data is available for machine learning, deep learning, and in particular, how can we ensure that it’s public sector data that is available for these tools?

Darius Amilevičius:
Very difficult, but very good question. In Lithuania, we have a data lake that serves as a sandbox for data analytics purposes. It's a safe place to run experiments in data analysis with a vast amount of data. But this data lake is not suitable for machine learning. For my purposes, I must have a big amount of data now, not tomorrow, today. Let me give one example. My team at the university and the central police department of Lithuania were doing a common project to create a very specific chatbot for the Lithuanian police. To train this chatbot, I had to obtain data from the police. They have thousands of emails, thousands of hours of telephone conversation records. But I spent four months obtaining all the permissions to research and use this data. As you know, a project is limited in time, so four months of my project were spent just obtaining permission. We are also working in the health sector, and it is the same story: five months to obtain the permissions to be able to analyze health records. So for analysis purposes, we have to solve this problem with a data sandbox for analytic purposes. In our Lithuanian artificial intelligence strategy, we have stated our need to create a sandbox for artificial intelligence needs, to create access to big data. Because, as I said, the principal provider of data in Lithuania is the public sector. But for now, unfortunately, this is very difficult, and the question is not resolved.

Moritz Taylor:
Thank you, and I think that provides a bit of a basis for people in the crowd or online who have questions about data protection issues, about the transfer of data for research, and of course about machine learning. So I'll open the floor and allow anyone who wishes to raise their hand and participate, to ask us any questions, to the panel as a whole or to specific people. We have two over there, one there. So please, up there first. I think there are some important questions to think about here, about what we're willing to sacrifice. We always get the impression, and as you said, you had four months: four months in the old days wasn't that much time, and now four months is an eternity.

Audience:
Thank you. I'm Pilar, from the YouthDIG org team. My question is: how can we ensure that our European approach on data governance, the new Data Governance Act, the new Data Act and the more recent regulations also become a beacon for the rest of the world? Thank you.

Moritz Taylor:
Thank you. Great question.

Audience:
Should I ask questions?

Moritz Taylor:
Yeah, we’ll take a few. Okay.

Audience:
I'm Mariam, from Georgia, from the YouthDIG group. My question is: I'm actually studying biomedical engineering, and we have problems with patient data, because when we try to do something new and build an innovation, we always face the question: who is the owner of this data? Is it the patient? Is it us, who are trying to do the research on it? Or is it a third party, the provider, for example the healthcare provider? Could you clarify for me how we can define the ownership of this kind of sensitive data? Thank you.

Moritz Taylor:
Brilliant. Thank you. Perhaps we'll start, and you can already look for the next question. There's one behind you as well, if you want. Maybe question one first, about spreading European standards. It's a good one, and one that of course we're very interested in, not only for the GDPR but for other instruments, but I'll let Mirja take this one.

Mirja Vujeda:
Okay. Thank you for the question. Of course, these are European standards, but compared to the rest of the world, maybe in Europe we recognize the value of data, because all these regulations are, in the end, for the benefit of society and for the benefit of individuals. All these regulations are not there to make our life or work more complicated; they regulate how to realize, as I mentioned at the beginning with the benefits of open data, the huge potential this data has for economic growth, and also for policy making, for making informed decisions at the government level and in the public sector. So we believe that all these standards that we have in Europe will also be recognized in the rest of the world. And when we speak about data protection, okay, here in Europe we have the GDPR, but it is also important to mention Convention 108 and the 108 Plus protocol, a convention which creates a global standard for data protection and is open for signature and ratification to any country in the world. So I believe that all countries, and also the private and public sectors, recognize the importance of following these European values and this European approach when it comes to data protection and data governance. So I agree.

Moritz Taylor:
Perhaps Aurelie as well, because especially when it comes to the next question, maybe you have some input on medical research and how it…

Aurélie Merquiol:
Could you repeat the next question for me?

Moritz Taylor:
The main point that I understood was that there's not a lot of clarity about the ownership of data in medical and biomedical research: who is the owner of the data in medical research, who is responsible for handling and keeping that data safe, for example. And maybe we can add to that this part about having global standards for exchanging that data, because I think that's also quite interesting.

Aurélie Merquiol:
I really don't like to speak of ownership of data. I think we are in Europe, so there is no ownership of data. We are not in the United States of America, where you can have another opinion. But as for who has to ensure the compliance of data processing, of course it's the data processor, and all the professional scientists and people who process data every day, because we know that it is humans who increase risk in data processing.

Moritz Taylor:
Did you have a comment as well Darius?

Darius Amilevičius:
Yes, for medical data, this is my field. I think the biggest point is that our approach to who owns this data is wrong, because I think we must distinguish between personal data and personal rights, between individual rights and group rights. Now, in regulation, the right of the individual is put above the right of the group. For example, if I am creating a new tool for the health sector, it will give benefits to many people. So if I must look for 10,000 permissions from the owners of the data to make my new tool, this is something terrible. So in our protection we must introduce some kind of collective interest, public interest. And in that case, in a safe environment, some kind of sandbox, researchers like you must have access to that data without this kind of reasoning about who owns the data, I think.
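The sandbox Darius sketches, where access turns on an approved collective-interest purpose rather than per-subject permission, can be caricatured as a simple policy check. The purpose names and rules below are entirely hypothetical, for illustration only:

```python
# Toy policy check for a research sandbox: access is granted on the
# basis of an approved collective-interest purpose, not per-subject
# consent. Purposes and conditions are hypothetical.

APPROVED_PURPOSES = {"public_health_research", "official_statistics"}

def sandbox_access(purpose: str, inside_sandbox: bool, pseudonymized: bool) -> bool:
    """Grant access only for an approved purpose, only inside the
    controlled environment, and only on pseudonymized records."""
    return purpose in APPROVED_PURPOSES and inside_sandbox and pseudonymized

print(sandbox_access("public_health_research", True, True))  # True
print(sandbox_access("ad_targeting", True, True))            # False
```

The point of such a gate is that the safeguards (controlled environment, pseudonymization, vetted purpose) replace the 10,000 individual permissions, which is the trade-off the panel is debating.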

Moritz Taylor:
Thank you very much. I know there’s online questions as well before we move on to real life ones. But maybe one real life question and then do you have many? It’s on. Okay.

Audience:
Me or not?

Moritz Taylor:
Please, go on. Okay.

Audience:
I'm Jacques Begringer of the Swiss IGF, but I had the privilege to work at the Council of Europe, or rather to represent business at the Council of Europe, when we negotiated the recommendation on health data in 2018. Already at that time there was this sentiment that maybe data protection has gone too far and needs some corrective. Now, my provocative thesis is that the new EU Data Governance Act and the EU Data Act are somehow cutting back on the GDPR, I mean, on the overall idea of data protection, and in particular that the idea that every data use must be avoided at all costs has come to an end, and the pendulum swings back. So that's my thesis; what do you think of that? And maybe you can also explain the difference between the Data Governance Act and the Data Act. This creates much confusion in business.

Moritz Taylor:
Thank you. Online questions.

Audience:
Okay, thanks for taking this question. Again, Deborah Allen Rogers, coming to you from the Hague digital fluency lab called Find Out Why, representing civil society. I think the idea of data ownership should be compared to the idea that we own our own money. I would like to own my own data, and I would like to be able to opt in and opt out, especially in the case of medical data, of whether someone has access to my data. So I do think we should think in terms of data ownership: it should be on a personal level, like we own our money or our property. The other thing I want to say, before I ask my question, is that having worked this past summer at the Transatlantic Partnership, between American and European values, I often hear them sort of overlapping; in many ways they're not alike at all, and in other ways they're quite similar. And I do think that the time it takes to build better systems, to design the governance and the encryption for safe data transferring, is worthwhile. So, to the data scientists on the panel: the young people in your classes are certainly helping you with this, I'm sure, because whenever I'm in the classroom I see that their minds think this way. So here's my question. If we were to have ownership of our own personal data and hold the encryption keys, with certain levels of protection if the data gets breached, or certain levels of protection for ourselves if we give access, are these things that anybody on the panel is thinking about? Because I think ownership of personal data should be a real concern, as part of the collective and as part of individual freedom and security. Thank you for taking my question. And I'll add a question from Alexandra Yevdokimova: given the bureaucratic burden, how can we ensure that the public sector remains competitive in open data provision, and that big private actors don't just come in with paid but efficient and quick solutions?

Moritz Taylor:
Okay. Thank you. These are some pretty hefty questions, I think. I think, Mirja, you said you’d volunteer to see if you can answer that first question.

Mirja Vujeda:
The first question, if I remember well, was about the European Data Governance Act in relation to the Data Act, and also about the lawfulness of processing.

Moritz Taylor:
Yes, about the pendulum swinging back. So, basically, that it takes away some of the provisions of the GDPR and goes against some of the spirit of the GDPR.

Mirja Vujeda:
Well, I would not say, and it is just my opinion, that it is taking away the spirit of the GDPR, because in the end, if there is some kind of conflict between the provisions, the GDPR will prevail. And under the Data Governance Act, the scope of the Act is personal and non-personal data; so in case the data contains personal data, you will still need a lawful basis for the processing. You cannot process or access these data without a lawful basis. So I would say that this Act does not conflict with the GDPR; the provisions of the Data Governance Act make the relationship between the Data Governance Act and the GDPR very clear. But I understand your concerns when it comes to the practical application of the Act. We have a lot of EU legislation regarding open data, including the Data Governance Act and the Data Act, which is the second pillar of the European data strategy, and this Data Act complements the Data Governance Act. The aim of the Data Act is to enhance the EU economy and foster a competitive data market. It regulates who has the right to the value of the data shared under it, and it focuses more on making industrial data more accessible and usable, encouraging data-driven innovation and increasing data availability. The Data Governance Act, from my perspective, to put the difference simply, is more about making protected data from the public sector more accessible for users. Thank you.

Moritz Taylor:
Do you have comments on some of the other questions, perhaps?

Aurélie Merquiol:
Yes, about ownership and data. I'm sure that we don't have any ownership of our data, but we have rights: the right of access, the right to withdraw consent, and so on and so forth. About anonymization, I agree with encrypted protocols, but I have a question: where are the keys? Who has the keys? In my practice, I often see that the keys are kept by American companies; that is another problem.

Darius Amilevičius:
I think that encryption of data, of course, costs additional money; you must have additional technical resources. But I think that encryption is a good point to start, because maybe the additional cost is the cost of safely using the data without anonymization. Because in the anonymization case, the data becomes unusable for machine learning. So maybe encryption is the way to go. Thank you.
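The key-custody concern Aurélie raises comes down to the fact that whoever holds the symmetric key controls access to the plaintext. A toy sketch of that property follows; this XOR-keystream construction is for illustration only, not a production cipher, and in practice one would use a vetted library (for example, AES-GCM from the `cryptography` package):

```python
import hashlib
import secrets

# Toy symmetric encryption: a SHA-256 counter-mode keystream XORed with
# the plaintext. Illustration of key custody only -- not for real use.

def keystream(key: bytes, length: int) -> bytes:
    """Expand the key into a pseudo-random byte stream of given length."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """Encrypt or decrypt (the XOR operation is its own inverse)."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

key = secrets.token_bytes(32)            # whoever holds this controls the data
ciphertext = xor_cipher(key, b"health record: J45")
print(xor_cipher(key, ciphertext))       # round-trips only with the same key
```

The design point this illustrates: encrypted data can be moved and stored anywhere, but the question of who holds the key, the institution, a cloud provider, or the data subject, is where the governance debate actually lives.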

Moritz Taylor:
Thank you. I think there’s one question behind you over there. And maybe one more. Other than that, we’re starting to run out of time already, which is quite impressive. Was there anyone else?

Audience:
Hello. I'm Emily Khachatryan, coming from the Advisory Council on Youth of the Council of Europe, also representing the Youth Development Center of Armenia. Thank you very much for the panel. My question was regarding opting out of the usage of data, because there are now some implementations where users can pay an additional fee to platforms to opt out of their data being used. But don't you think that will cause inequality between different social groups? Because people who are more economically privileged will be able to do this, and the less well-off, unfortunately, won't be able to protect their data, which is a big issue in our society now. Thank you.

Moritz Taylor:
I think that’s a fair question. Inequality in data and the power of the data. Who would like to answer?

Aurélie Merquiol:
I will give an answer. I totally agree with you. It’s not equal, and I think it’s not legal, but the European Court of Justice will have to examine this problem.

Darius Amilevičius:
I am a big fan of open data and open source. I think that all software must be open-sourced and all data must be available free of cost. It is like blood donation: when I give my blood for free, I can ensure its quality; if I had to pay for it, it would be different. So I think that the best way to go is all data open to all. I agree that producing data also costs money, but we must find some resources to fund the creation of data, and so I think fees are unusable for open data.

Moritz Taylor:
Okay. Maybe, since I think we're out of questions at the moment, unless I've missed someone somewhere in the corners, we could move on to some solutions. Each of you works in a particular area of expertise: what would be a way to enable the free flow of data in a safe manner in your area of work, and how do you perceive the risks and opportunities?

Mirja Vujeda:
In this digital era, on the internet, we are much more aware of how data protection is more essential than ever. I believe that one way to enable easy data flows and data sharing is to build trust about how we use data, especially through the principle of transparency, which I see as very, very important, and by ensuring the security of the data. The consequence will be that data sharing becomes easier: if we respect all the principles of data protection, if data subjects know for which purposes we use their data, and if we build a safe and secure environment, especially on the internet. So security and transparency will, in future, support data sharing; that is my general answer. Thank you.

Aurélie Merquiol:
So, I think that today our data are mainly transmitted and managed by large companies, big data or big pharma companies. I think the future of the internet will be different, because it will decentralize, and data will pass directly between users rather than through these companies, and maybe that will help us take care of our own data. However, I think the most important thing is to train the new generation to manage their data.

Darius Amilevičius:
In our times, in the digital economy, data is fuel, the most important thing of our era. I like the European Union's proactive position on regulation, but the biggest problem is that regulation and technical progress must go step by step, side by side. Because when the two are separated in time, there is a danger that regulation becomes over-regulation or under-regulation, or regulates the wrong things at the wrong time. So I think these two must go side by side. Thank you.

Moritz Taylor:
All right, thank you very much. I think, surprisingly enough, we've managed to reach the end, unless there are final comments. I'd like to thank the panelists very much for participating. It was my first EuroDIG, and I know it was the first EuroDIG for some of you too. Thank you very much to all the organizers, and to the co-organizers of this group, by the way, who helped me in setting up the panel discussion. And thank you, Mirja, Aurélie, and of course Darius, for taking part and answering everyone's questions. I think on that note, we can end it here.

Moderator:
As a journalist, Moritz, me being a bit of a journalist myself, I would ask you one last question. Can you say in one sentence what idea you want everyone to take away from your session? Okay, two sentences, but you know, the main idea.

Moritz Taylor:
Each or just me?

Moderator:
You, you’re moderator, come on, do the hard work.

Moritz Taylor:
I think what we’re seeing is that there’s a tangled web of regulations and rules and that people are sometimes frustrated with, sometimes encouraged by. We’re getting better at it, I think. And I think from what we’ve heard here, people are doing their very best to adapt to the situation. And I think most regulators and most people who work in this field have good interests and good intentions. And that’s what I think I would take away overall is that with international cooperation, with people working on a global scale, on a regional scale and on a national scale, we’re moving forward.

Moderator:
Thank you, Moritz.

Speaker statistics

Speaker                   Speech speed   Speech length   Speech time
Audience (A)              173 wpm        898 words       311 secs
Aurélie Merquiol (AM)     105 wpm        728 words       418 secs
Darius Amilevičius (DA)   115 wpm        1057 words      551 secs
Mirja Vujeda (MV)         131 wpm        1641 words      751 secs
Moderator (M)             178 wpm        129 words       43 secs
Moritz Taylor (MT)        171 wpm        1368 words      480 secs