Strengthening Worker Autonomy in the Modern Workplace | IGF 2023 WS #494

12 Oct 2023 00:30h - 01:30h UTC


Disclaimer: It should be noted that the reporting, analysis and chatbot answers are generated automatically by DiploGPT from the official UN transcripts and, in case of just-in-time reporting, the audiovisual recordings on UN Web TV. The accuracy and completeness of the resources and results can therefore not be guaranteed.

Full session report

Eliza

The analysis explores the impact of technology on various social issues, including labour exploitation, inequality, poverty, and workers’ rights. It begins by discussing the prevalence of sweatshops in countries with less stringent labour laws, which has been exacerbated in the digital era. Digitisation and technology are argued to be catalysts for increased inequality, as the majority of global wealth is concentrated in the hands of a few wealthy individuals.

Policy mismanagement is highlighted as a contributing factor to labour exploitation and inequality. Ineffectively enforced tax policies and austerity measures are identified as direct triggers for human rights abuses, while lax tax policies have led to global inequalities. It is suggested that poverty is not a random occurrence but a result of deliberate labour and economic tax practices.

AI technology is scrutinised for its dependence on hidden human labour, particularly in the gig economy. AI startups in the EU were found to use human labour rather than fully automated tools. Furthermore, the development of AI models can infringe upon rights and ethical considerations, with vulnerable populations such as refugees and incarcerated individuals increasingly engaged in tasks such as image labelling.

The analysis argues for policy attention to protect vulnerable populations who may be targeted for AI development. Companies are accused of concealing the human labour involved in creating AI tools, leading to a lack of transparency regarding their methods. Desperate groups such as refugees and asylum seekers, lacking other employment options, may be exploited through digital piecework.

The future development of AI is seen as a potential exacerbator of labour exploitation and inequality. The pressure for companies to assert their relevance in the market may lead to a “race to the bottom” where marketing overshadows the actual functionality of AI tools.

The presence of hidden human labour in AI technologies is viewed as problematic by Eliza, who argues that companies maintain this secrecy to preserve the illusion of advanced technology. Many AI tools and technologies lack transparency, making it difficult to determine the extent of human involvement.

The analysis also emphasises the importance of broadening the scope of the digital rights and tech policy community by promoting interdisciplinary collaboration with fields such as unionisation and climate change. It is believed that the tech sector has significant potential in addressing wealth and racial inequalities.

In conclusion, the analysis highlights the negative impact of technology on various social issues, including labour exploitation, inequality, poverty, and workers’ rights. It emphasises the need for policy reform, transparency in AI development, and interdisciplinary collaboration to address these concerns. Additionally, it underscores the role of the tech sector in tackling inequality.

Wilneida Negron

The digital transformation of work has led to the development and proliferation of intrusive products that collect sensitive data on workers. This data is collected at various stages of work, including biometric data, sentiment analysis, and productivity monitoring. The collected data is then used for purposes such as surveillance, training AI models, and making predictive analyses on workforce risk. However, the increasing influence of ‘Little Tech’ – smaller technology companies – has resulted in the fragmentation and diversification of industries, making it challenging to implement effective privacy regulations.

There is a pressing need for stronger privacy regulations in the workplace due to the growing invasion of technology into workers’ privacy. An extensive and fragmented ecosystem of workplace tech tools collects sensitive data points on workers, with many workers not being aware of these tools or understanding their privacy implications. This lack of awareness highlights the urgency for policy and regulatory action to establish basic protections for workers facing increasing workplace surveillance.

Algorithmic tools also require regulation and impact assessments to prevent the misuse of sensitive data. Legislation is being introduced in the US that focuses on auditing the use of such tools for hiring and recruitment. It is essential to scrutinize the use of sensitive types of data, such as biometric data, to ensure it is handled appropriately.

The rapid pace of market dynamics in technology and work poses a significant challenge for legislation. With hundreds of new products being introduced each year, it becomes increasingly difficult for regulations to keep up. This highlights the need to address and tackle the market dynamics driving technology and work trends to ensure fair and ethical practices.

Private markets, including venture capital, lack accountability and require greater transparency. Companies like Facebook gather capital in private markets before going public, making early-stage intervention necessary. Greater oversight and transparency in these markets can help address issues related to accountability and fairness.

Furthermore, there is an untapped opportunity in building cross-class power with workers across different regions and industries to foster equitable workplaces and reduce inequalities. This involves encouraging shared analysis and collaboration among workers from diverse backgrounds. By creating connections and solidarity, it becomes possible to work towards more equitable and inclusive work environments.

In conclusion, the digital transformation of work has raised concerns about privacy and the collection of sensitive data on workers. Stronger privacy regulations, policy and regulatory action, and impact assessments are needed to protect workers in the face of increasing workplace surveillance. Addressing market dynamics and ensuring transparency in private markets is also essential. Furthermore, fostering cross-class power and building connections among workers can help create more equitable workplaces and reduce inequalities.

Raashi Saxena

The analysis focuses on several topics related to the gig economy, employee rights protection measures, the gender digital divide, online violence against women, digital inclusion for differently-abled women, support for female founders in the venture capital ecosystem, and the need for holding companies accountable.

In the context of the gig economy, the analysis highlights the challenges faced by workers in India, such as low pay, lack of job security, long working hours, and the absence of social protections like healthcare or pension benefits. Additionally, it mentions the efforts made by kaali-peeli taxi drivers in Mumbai, who formed their own app to negotiate better wages. Furthermore, a nationwide strike organized by the Indian Federation of App-based Transport Workers in 2020 demonstrated workers' demand for improved pay and conditions.

With regards to employee rights protection measures, the analysis takes a supportive stance. It mentions the legislation introduced by the Rajasthan government to provide basic pension and social scheme benefits for gig workers, as well as the labor code implemented by the Indian government aiming to provide social security benefits to gig economy workers. These measures are seen as positive steps towards ensuring decent work and economic growth.

In contrast, the analysis criticizes the implementation of labor protection measures, arguing that issues of exploitation and unfair treatment of workers persist despite the introduction of labor codes. It also notes that during the COVID-19 pandemic, many workers received inadequate support when factories shut down, indicating a potentially insufficient safety net for workers.

The analysis also explores the gender digital divide and the potential solutions offered by the WSIS+20 renewal process and the Global Digital Compact. It mentions that these initiatives can help identify barriers preventing women from accessing technology, boost digital literacy skills among women, promote locally relevant content and services, and ensure equal opportunities for women in the digital revolution.

Another important issue highlighted is online violence and harassment against women. The analysis states that with the advancement of artificial intelligence, the phenomenon of online violence has increased. However, it also suggests that robust policies and collaborations with organizations such as T-RIC can help combat online violence, promote digital safety, and provide effective support to victims.

Furthermore, the analysis emphasizes the need for digital inclusion for differently-abled women. It states that women with disabilities face disproportionate challenges and social stigma, and access to digital devices and platforms can facilitate their social and cultural participation, leading to dignified livelihoods.

In the context of venture capital, the analysis notes that only 7% of female founders globally are backed by VC firms, indicating a lack of support for female entrepreneurs. It also states that the venture capital ecosystem is often insular and favors those from established networks, further hindering female founders’ access to funding and resources.

Lastly, the analysis highlights the importance of holding companies accountable. It suggests that more conversations and information-sharing are needed to effectively band together and hold companies accountable for their actions. This aligns with the goals of reducing inequalities and promoting peace, justice, and strong institutions.

Overall, the analysis provides valuable insights into various issues surrounding the gig economy, employee rights, the gender digital divide, online violence against women, digital inclusion, support for female founders, and corporate accountability. It highlights both positive measures and persistent challenges, offering a comprehensive overview of these topics and emphasizing the need for continued efforts to create a more inclusive and fair society.

Eduardo Correggio

The implementation of digital technologies in Paraguay exacerbates poverty and inequality by amplifying historical surveillance and control of workers. Workers’ exploitation is further intensified as the current capitalist system evolves to maximize surplus and exploit them. The gig economy in Paraguay reflects a prevailing normalization of precarious work, which limits economic opportunities for workers.

One of the contributing factors to the vulnerability of workers in Paraguay is the lack of personal data protection and anti-discrimination regulations. Paraguay does not have a personal data protection law or a law against all forms of discrimination, leaving workers without essential regulatory protection for their rights. The absence of these regulations, when combined with digital technologies, expands structural injustices faced by workers.

To address these pressing issues, collective organization among workers is crucial. The struggle for a fair digital future necessitates workers coming together to exercise their rights to freedom of expression, association, and autonomy in a surveillance-free environment. However, Paraguay faces a significant challenge in this regard, as it has one of the lowest rates of unionization in Latin America. This lack of organized resistance hampers worker organization in the gig economy.

Another important aspect is the need for workers to understand the complexities of digital technologies and their intersection with labor rights. Workers who attempt to organize are often fired before they can form a union, creating a culture of fear and uncertainty. Overcoming this requires a cultural shift and an understanding of the risks and challenges posed by digital technologies.

Concerns also arise regarding the implementation of complex governance systems for shared data access, particularly in the global south where the gig economy is still being understood. It is a challenging task to develop effective governance systems that account for the nuances and specific context of the gig economy in the global south.

Recognizing worker dependency is crucial for pursuing worker rights in the digital economy. Most regulations and organizations are yet to fully acknowledge the dependency of gig economy workers, which hinders their ability to access proper protections and benefits.

The concentration of the digital economy is another prominent issue. Currently, it is highly concentrated, limiting competition and negatively impacting working conditions. Encouraging competition in the digital economy is essential to improving working conditions and creating a more equitable environment.

Furthermore, the failure to recognize "ghost sweatshops" means that the exploitation of workers behind them goes unaddressed. These hidden exploitative practices remain unnoticed and unacknowledged by platforms, perpetuating the cycle of worker exploitation.

One potential solution that offers hope is platform cooperativism. This emerging concept promotes the exploration and growth of work platforms that are more autonomous and fairer. It involves workers building their own digital infrastructure, giving them greater control over their work and ensuring fairer conditions.

In conclusion, the implementation of digital technologies in Paraguay exacerbates the existing problems of poverty and inequality. The normalization of precarious work in the gig economy, coupled with the lack of personal data protection and anti-discrimination regulations, further accentuates the vulnerability of workers. Workers need to collectively organize and strive for a surveillance-free work environment to address these issues. Additionally, recognizing worker dependency, promoting competition, and exploring platform cooperativism can contribute to creating a fairer digital future. However, challenges such as the low rate of unionization and the need for a cultural shift must be addressed to effectively protect workers’ rights in the digital economy.

Moderator – Ayden Ferdeline

The analysis focuses on several topics related to SDG 8: Decent work and economic growth. One significant issue highlighted in the analysis is the lack of accountability in venture capital. It calls for greater disclosure in this sector, pointing out that private markets, where companies like Facebook go before launching IPOs, currently have very little accountability. This raises concerns about transparency and potential risks for investors and the wider market.

Another area of concern is the market dynamics in the data brokerage industry. The analysis notes that mergers and acquisitions frequently occur in this industry, with smaller data brokers that collect sensitive employment data being acquired by larger data brokers. This concentration of power raises questions about fair competition and potential impacts on the privacy and security of employment data.

The analysis also emphasizes the value of collective rights to worker data. It suggests that the collective use of such data could have significant benefits for both workers and their employers. This argument highlights the importance of establishing mechanisms that allow workers to have control over their data while also enabling its responsible and ethical use in decision-making processes.

Advocating for intervention in the early stages of private markets, the analysis highlights that intervening during the early stages can shape the future trajectory of companies like Facebook while they are still in the private market space. This argument underscores the potential positive impact of early-stage intervention in influencing the direction and practices of companies in terms of decent work and economic growth.

Another concern raised in the analysis is the intricate issue of data governance in the Global South. It points out that people in the Global South are still learning how the digital ecosystem works, and complex data governance systems could potentially be misused by some companies in this region. This observation highlights the need for carefully designed and well-implemented data governance frameworks that protect against exploitation and ensure fair and equitable outcomes.

Shifting focus to the gig economy in Latin America, the analysis reveals significant issues regarding worker dependency. It highlights that none of the gig platforms in Latin America currently recognize worker dependency, which raises concerns about workers’ rights and socio-economic stability. Governments are considering alternative approaches, with a potential hybrid model being pondered to address these challenges.

The growth of the workers’ rights movement through platform cooperatives is presented as a positive development. The analysis notes that workers are attempting to build their own digital infrastructure, and platform cooperatives provide an opportunity for workers to design their own working platforms. This empowerment of workers in the digital economy aligns with the goal of achieving decent work and economic growth.

Lastly, the analysis explores the potential of UN instruments like WSIS+20 and the Global Digital Compact in aiding female workers. It highlights the need to bridge the gender digital divide and empower women in the workforce. However, no specific arguments or evidence are provided in this aspect of the analysis.

In conclusion, the analysis offers insights into various aspects related to SDG 8: Decent work and economic growth. It calls for greater accountability in venture capital, highlights concerns about market dynamics in the data brokerage industry, advocates for collective rights to worker data, emphasizes the benefits of early-stage intervention in private markets, raises concerns about complex data governance systems in the Global South, sheds light on challenges in the gig economy in Latin America, outlines the growth of the workers’ rights movement through platform cooperatives, and explores the potential of UN instruments in empowering female workers. Overall, the analysis provides valuable perspectives on promoting decent work and economic growth and calls for measures to address the challenges and opportunities identified.

Session transcript

Moderator – Ayden Ferdeline:
a.m. on Thursday, October 12th. My name is Ayden Ferdeline. I am a public interest technologist and a Landecker Democracy Fellow, and I am the moderator of today's discussion on strengthening worker autonomy in the modern workplace. I am pleased to be joined today by five esteemed panelists as we discuss the digital transformation of work and how we are redefining the relationships between employers and employees. I won't introduce each of our panelists just yet, but I'm going to call upon our first speaker, who is Wilneida Negron, the Director of Policy and Research at Coworker. Wilneida has been very instrumental in this space in providing terminology to describe a lot of what we are seeing when it comes to the evolutions in workplace surveillance, new ways of measuring productivity, and other forms of datafication that are happening in workplaces. Wilneida, good morning, or good evening for you. The question I have for you builds off of a report that Coworker published in 2021 examining the impact of technologies on workers in the US. You're now taking your research on little tech, as you term it, global. Can you maybe just give us a little background as to what little tech is and how it differs from big tech? And now that this research is being taken global, what are some of the preliminary findings that you're seeing?

Wilneida Negron:
Yes, good morning to you in Kyoto as well, and I'm happy to kick-start this conversation. As Ayden mentioned, I'm the Director of Research and Policy at Coworker. Coworker is considered to be the welcome mat to the labor movement. Because we're not a union, we're agnostic, so we get to talk to workers from every industry, everything from tech workers in the bigger tech companies to workers in retail, workers in manufacturing, workers in hospitality. And so through conversations with a broad set of workers across many industries, we started collecting what was literally just a list of different apps and vendors that different workers were coming to us about, asking, do I have to download this app now to sign into work, like Starbucks workers being tracked, and I'm not really sure what the privacy issues are. And so we really started collecting the different products that we were hearing about from workers across different industries. At the time, I was also in the consumer privacy, big tech policy conversations. And obviously, those conversations were focused on the five big tech companies, which play an outsized role in society, so not to be ignored. But it was five companies, the result of 20 years of innovation that gave us the Googles and the Facebooks and the Amazons, etc. What we discovered from the conversations with workers, and from being in the consumer privacy space, was that there was less attention on worker privacy and surveillance, and that the ecosystem and marketplace that workers are encountering is much more fragmented and much more expansive. What began as a short list of 10 apps and vendors grew into hundreds; by the time we stopped researching, it was hundreds of different apps.
Yes, there were some big tech products; Amazon has a lot of apps that it designed, so big tech was playing a role. But there were also a lot of startups, a lot of platforms, a lot of apps. We wanted to quantify that by creating a database and getting a sense of what the ecosystem looks like. When we began, we used the word little tech, which was sort of ironic, because while the individual vendors were "little," smaller tech, it was actually thousands of products: you could visualize the five companies that dominate consumer privacy conversations against the thousands that dominate worker privacy conversations. And because they're little, sometimes smaller, unknown vendors that just tailor to different sectors, workers don't have familiarity with them. They were oftentimes very confused, and very concerned, I should say more concerned than confused: what is this new technology? Who is this vendor? Are they trustworthy? Are they using my information? And so, out of quantifying the ecosystem, and trying to compete for airtime with the big tech world by calling it little tech, we developed three hypotheses that contributed to expanding this into the question of what a global little tech of workplace technologies looks like for workers across different regions. The first hypothesis was that this is an unregulated marketplace of different products and vendors, and we wanted to see whether different countries have these vast ecosystems of different workplace technologies that are being integrated and touching on every part of the labor process.
And that's really key, because what we learned is that while there was obviously a lot of focus on the gig economy, and then on bossware and surveillance, the suite of intrusive products actually spans everything from workplace benefits to workplace safety during COVID, to other labor optimization products like automation and productivity monitoring, to hiring and recruitment, and so we created a taxonomy. In other words, it's an unregulated marketplace, and these technologies are touching on every part of the labor process, from hiring and recruitment to productivity monitoring, which includes surveillance, to workplace benefits; they have a lot of touch points in workers' lives. And then the third hypothesis we wanted to test globally was that this expansive ecosystem is collecting a lot of sensitive data points on workers. We saw with little tech in the US that all of these products, at every step of the labor process, are collecting an increasing amount of really sensitive data. In the consumer privacy space, we're just starting to come to grips with how much is being collected. For workers, that ecosystem of awareness and political education is not as strong, but we're starting to uncover just how many sensitive data points there are. And we need to focus on the increasing amount of data points, which range from biometrics, to sentiment analysis, to productivity monitoring outputs, to time and attendance. What we are seeing now, which is problematic, is that in the wave of AI, these data points are being used to train AI models, as we've seen in call center work. And so workers' data is being collected.
Everything from sentiment to biometrics to productivity is being collected to train AI models in particular sectors without workers' awareness. It's being collected for surveillance, just like traditional privacy issues. And it's being used increasingly to make predictions, not just about which workers will be a cultural fit, but about which workers pose a risk of everything from organizing to stealing sensitive data. A lot of employers are really worried about the whistleblowing that's been going on, about industry secrets being released, so there's a lot of risk analysis happening, a lot of predictive elements about which workers are going to go rogue, which workers are really a risk. So again, the collection of sensitive data points, combined with the sophistication of technology to make predictions that can affect workers, with very limited recourse, has been problematic. Those are the three things that we went out to see. And just to wrap up, our global research focused on Nigeria, Kenya, Colombia, and Brazil. We zeroed in on those countries because we were looking at this next wave of tech innovation that was unleashed because of COVID. These particular countries had received a large share of venture capital money for tech innovation in the past four or five years; at the global level, they were in the top 10. So we wanted to see what the innovation space looks like, and what types of technologies are coming out.
And what we're seeing is that the ecosystem of products in the marketplace is mostly still dominated by the gig economy, but there is an increasing amount of products and companies, not necessarily based in those particular countries, sometimes global North companies selling to global majority employers, that are being sold as kind of traditional business tools to fulfill business outcomes: everything from processing payroll to timekeeping and some low-tech things. But again, it's not only the business function that we're looking at, it's also the types of data that are being collected. Those have been the same patterns that we saw in the US. And in these global majority countries, there are also no consumer data privacy laws, so for workers, the awareness of what to do is a lot more limited. I'll just stop there to give other folks a chance to weigh in.

Moderator – Ayden Ferdeline:
Thank you so much, Wilneida, for that excellent introduction. I'm going to bring Eliza into the conversation now, because, Eliza, you've just published an excellent report, Digitally Divided, and some of the comments that Wilneida was just making about what you term ghost work really stood out to me. So perhaps, Eliza, you could comment on why Amnesty International has long been researching the intersection of technology and global inequality, but it is a bit newer for you to be investigating the intersection of labor rights and technology. What changed? What sparked the need for your Digitally Divided report? And the case study that you introduced in the report on ghost work, maybe you can briefly summarize that for everyone in the room today.

Eliza:
Sure, yeah, thank you so much, and thank you to Wilneida for that really great introduction. I can't follow that with quite the same level of specificity, because the report covers a much broader set of issues at the intersection of inequality. To back out and start from a more high-level approach, I will just say, and again, I'm representing my views here and not those of Amnesty, because a lot of our views on these tech issues are still in flux, that the human rights community doesn't really have a strong history when it comes to talking about issues of economic inequality writ large. And that includes issues of labor exploitation, which is a bit of a problem, because it has been the case for quite some time, even prior to the advent of the digital era and the rollout of the more app-based or data-based technologies that Wilneida was describing, that the concept of the sweatshop, the idea of workers in the global majority, or in a country where labor laws are much more lax, creating a lot of the value for companies based in the global North, has always existed. And what I try to show in this report, and the case that I try to make, is that the digital sweatshop, to use a term others have used, is basically the same practice, just applied to a different case. And I think what's useful, as another note and point of reference as to why this work is coming out of Amnesty right now: my work is part of a fellowship that's specifically focused on the intersection of technology and inequality. And I think it does come out of this issue within the human rights community more broadly.
And I think within policy circles in general, you see this buzzword everywhere now, inequality, in all these grant-making schemes and in different human development reports. And I do take issue with that term a little bit, because I think it anonymizes the issue and makes it out to be a kind of mystery that dropped out of nowhere. But poverty doesn't come out of nowhere; it comes out of explicit and deliberate labor practices, and explicit and deliberate economic and tax practices. Amnesty also has some really interesting new work coming out right now about tax policy, and about how the lax or ineffectively enforced tax policies of countries around the world have made it possible for austerity measures, with cuts to social programs, to lead directly to enormous human rights abuses. And so this is all part of a larger ecosystem. And that's what we try to show in the report: that all of this is coming in the context of two issues that I think Wilneida really nicely laid out for me, which I also think are essential context. One, this is happening in the context of an unprecedented state of global wealth inequality. I think the latest number I've seen is that the world's poorest own just 2% of global wealth and the world's richest own 75% of wealth, which is a staggering inequality that's really hard to fathom. And what's even harder to believe is that over the four years since the outbreak of the pandemic, this has really, really accelerated. And that's happened in tandem with the rollout of the different kinds of government and public sector applications of technologies that Wilneida alluded to.
And so what we lay out in the report is three areas of concern for policymakers who are trying to understand the impact of technology on inequality in a very broad way, because there’s just so much to cover within that. So we try to narrow it down to a couple of core populations of concern. One of them is labor, and I think we’ll continue to talk about why that’s a core area that I laid out. The other two are migration and borders, so the movement of people and the right to asylum, and the last one is criminal justice and policing. And the last thing I’ll say before I finish, and this is a long answer to your question, is that it’s interesting that this work is coming at this moment. Amnesty Tech specifically has pre-existing work that focuses primarily on issues of surveillance, by which we mean spyware, different kinds of predictive policing, and then different issues around digitization and automated decision-making in the public sector. So in some ways that already sets up the framework to talk about labor. Because when you think about criminal justice and predictive policing, labor, and migration, it’s easy, if you look closely, to see how these things are related, and how data sharing between employers, between schools, between local law enforcement agencies is going to be increasingly a practice, particularly for populations for whom there are few or less enforced legal protections, especially for more vulnerable people. And so what I show, and this is just my last point, is that technology in some ways has become an accelerator and a facilitator of inequality, or it’s become the helpful cover story for why we let inequality persist and become exacerbated. So that’s a very long answer, but hopefully it’s an introduction to the report, and I’m really happy to go into more detail and

Moderator – Ayden Ferdeline:
talk more about that. Thank you so much, Eliza. And I would love to go into a little more detail in the second half of our session. But for now, I’m going to bring in Eduardo Correggio, who is the co-director of the Paraguayan nonprofit TEDIC. And Eduardo, maybe I can ask what your reaction is to what Eliza and Wilneida have just said. The comment that really struck me just then, Eliza, was that poverty doesn’t come out of nowhere; it comes out of deliberate policy interventions. What do you think about that, Eduardo? You have been doing research on the impact of algorithmic management on low-paid workers in Paraguay. Do you agree with Eliza’s statement? And in the context of the low-paid workers in the transport and delivery space in Paraguay that you have been researching, how does that ring true?

Eduardo Correggio:
Is this working? Yes. Well, thank you so much, Ayden. And great to connect with Eliza and with Wilneida. I think definitely poverty doesn’t come out of nowhere. We are in a system of inequality that is now further perpetuated by the implementation of digital technologies. So I’m glad that this connection is happening, because I’m going to digress a bit in my presentation to talk about the context in Paraguay and the broader surveillance that workers traditionally already suffer. And it’s important to recognize that it’s not that technology creates new surveillance, or that before workers weren’t surveilled. This is a situation in which already existing surveillance is being augmented and improved by digital systems. So it’s important to situate ourselves historically and recognize that this is not happening out of the blue. It’s just a way in which our current capitalist system is reinventing itself to continue exploiting workers and extracting as much surplus as possible. So, sorry for that. For those who don’t know me, thank you so much for the introduction, Ayden. My name is Eduardo and I am co-director of TEDIC. We’re a digital rights organization based in Paraguay. As part of our broader efforts for a more just digital economy that includes workers’ rights, we partnered last year with the Fairwork project. Fairwork is an international action research project that evaluates working conditions in the platform economy in more than 30 countries. So for us it’s very important to generate as much data as possible in order to compare the different platforms that we rate across the globe, which most of the time repeat themselves in different contexts and in different countries. And I’m going to come back to that, because there’s an element of transnationality that I think is useful to reflect upon in this particular panel. 
So in this project we score the platforms against the five principles of the Fairwork methodology: fair pay, fair conditions, fair contracts, fair management and fair representation. Now, getting into the core of my presentation and why we focused on transport and delivery apps within the gig economy, I think it is important to highlight that in Paraguay there have been traditional ways in which workers are surveilled in the workplace that are not per se technology-dependent but reflect a complicated reality for workers. For example, we researched a few years ago how companies, when they hire workers, ask for more health data than what is required by law. We’re talking, for instance, about HIV status. So year after year we hear how workers have been fired because their employers have unlawfully accessed HIV status information. And what I want to say here more than anything is that workers’ rights to privacy and data protection have been historically violated, and this acquires a new dimension in the digital economy, particularly the gig economy. For the past years we have seen an exponential growth of the platform economy in Paraguay, and this is why we partnered with the Fairwork network to evaluate six ride-hailing and delivery platforms operating in Paraguay. Some of the platforms that we evaluated are probably familiar to you all; we’re talking about the Ubers and the Bolts of the world, and Pedidos Ya, which is a transnational delivery and transport app that is very dominant in Latin America. And in general, and I don’t want to lose too much time talking about the findings, only two of the six studied platforms could score any points against the principles, out of a total of ten possible points. The platform that scored the most earned only two out of ten. 
So it is safe to affirm in general that gig economy workers had little to no possibility of meaningfully engaging with these platforms whenever they feel mistreated, nor do they have any real capacity to scrutinize the algorithms that surveil and govern their everyday lives. And this is what is reflected in the overall scoring that we were able to gather. And perhaps connecting these reflections with the broader topic of the panel, I think it is important to point out, and I’m going now to the issue of transnationality, that these platforms are transnational by nature. This transnationality poses an important data sovereignty question that we should also reflect upon when we’re thinking about workers’ rights in its different nuances and its intersection with data protection specifically. We’re currently in a highly digitally extractive scenario whereby global South workers are providing vast amounts of data of a sensitive nature, for instance biometric data, that is then used to train the algorithms of these platforms, so that in the not-so-distant future they can create technology that will make the platforms less dependent on workers’ services, quote-unquote. So it’s a circle that never ends, this one of exploitation, of how exploitation can reinvent itself. One thing that I would like to leave perhaps as a final reflection is that in Latin America in general, and I’m based in Paraguay but we also try to see this situation through a regional lens, when we think about workplace surveillance that is augmented or improved by digital systems, it is important to remember that such surveillance inserts itself into already highly precarious work environments where workers are normally excluded from reparation mechanisms in general. 
And in this scenario, digital interfaces that either intermediate work or surveil the workforce tend to go unnoticed until they become difficult to roll back. More importantly, and specifically in relation to the gig economy in global South countries, there is a normalization of work precarity in contexts where there is a lack of economic opportunities in general. So there is a sort of take-it-or-leave-it work philosophy installed, which has evident counterproductive effects on the full enjoyment of workers’ rights. And lastly, I don’t know if I’m past my time, but just two final reflections. We also have a complicated scenario that is cut through by the historic regulatory debts of our country: Paraguay doesn’t have a personal data protection law, and we don’t have a law against all forms of discrimination. That lack of regulatory certainty for traditional rights, when it intersects with digital technologies and workers’ rights, poses an additional situation that tends to expand structural injustices for workers. So I think that the future, or any attempt to build a fair digital future, will only happen through workers’ collective organization. We need to fight for a truly surveillance-free workplace so that workers can exercise their rights to freedom of expression, association and autonomy, and collectively organize. So thank you very much. Thank you so much

Moderator – Ayden Ferdeline:
Eduardo. Can I ask a super quick follow-up question on the very last point that you made, which gets to the heart of what this discussion is about: resistance and pushing back. What can workers actually do in Paraguay? Are there tactics that workers have been able to use against these platforms? I’m losing my train of thought now. What can workers do? Yeah, that was a general question. Resistance, what does that look like in the Paraguayan context? Well, it’s a complicated reality, I know I’m

Eduardo Correggio:
using a lot of that catchphrase, but the thing is that Paraguay, curiously or not so curiously, is one of the countries with the lowest unionization rates in Latin America. So that’s already a lot in a context of general precarity. The private sector workforce has a unionization rate of less than 1%. So the traditional way in which workers could organize, which is indeed being in a union, is not something that is very common in the country, and I think that there needs to be a shift in that understanding among workers. But it’s a difficult cultural shift that is also associated with a lot of uncertainty, because most of the time, whenever workers do try to organize in the country, they tend to be fired before they can form the union. We don’t have data that this is happening currently in the actual gig economy environment, but it is definitely a cultural perception among people and workers in general, and I think this is the first step that we need to overcome to start generating these organized spaces that also understand digital technologies and their complexities at the intersection with

Moderator – Ayden Ferdeline:
workers’ rights. Sure, thank you, Eduardo. And Raashi. Raashi Saxena is a social innovation practitioner based in India. Does the situation that Eduardo has just described in Paraguay sound familiar? I imagine the answer is yes: gig economy platforms are being rolled out in India, treating workers similarly, resistance is futile, and you’re probably an independent contractor rather than a worker. What does the situation look like on the ground for you? I do feel like there are a lot of

Raashi Saxena:
we’ve historically had a lot of short-term contracts, quote-unquote gig economy platforms. We have a very large informal sector, and since we define it or call it an informal sector, it’s usually dubbed as not contributing to the economy. But yes, there have been various instances of workers and policymakers pushing back against this exploitation, and a lot of this exploitation has also been exacerbated in India with the emergence of a lot of tech-based apps. It could be cab aggregators like Uber or the local Ola; Zomato and Swiggy, which are food delivery companies; or Urban Company, which offers home services. They offer a wide range of services, as you know: ordering food, e-commerce, home services and more. And a lot of concerns have been raised about the working conditions of these gig economy workers, right from issues of low pay, especially with the pullback in the VC ecosystem, to job security, ridiculously long working hours and of course the lack of general social protections such as health care or pension benefits. And in response to that, a lot of gig economy workers have banded together and formed unions for better working conditions. Some of these efforts are very inspiring to me personally. In Mumbai we have local taxis, the Kaali Peeli or yellow-black taxis, that have banded together and formed their own local app with slightly better wages. Back in Bangalore, where I reside, we have an application of local transport workers coming together, Namma Yatri, which offers fair pay and is also, surprisingly, more reliable, among those of us who are avid users. And in 2020 the Indian Federation of App-based Transport Workers organized a nationwide strike for better pay conditions on a lot of these ride-hailing platforms. 
Recently the local government in Rajasthan also came up with legislation under which gig workers are to be given basic pension and social scheme benefits. And yes, there have been significant steps, I would say, by the Indian government, where they introduced a labor code that aims to provide social security benefits to gig economy workers. But again, a lot of it is an implementation issue in my country. There might be a lot of things that are well defined on paper, but implementation-wise, I would still say there have been a lot of instances and opportunities where things have been exploited, and most employers do the bare minimum when it comes to addressing these issues, of course leading to workers being treated unfairly. We’ve seen several instances, even during COVID, where most factories were shut down but there was no direct support to many industries. I hope that helps. Absolutely

Moderator – Ayden Ferdeline:
thank you, Raashi. And I’m going to bring Eliza back into the conversation, because as I said before, we were going to dive into your report. You introduced three case studies in the report, which are examples beyond the ones that Eduardo and Raashi have highlighted, and one of them concerned the availability of generative AI tools, most of which rely upon hidden human labor, trained on the labor and data of people around the world. These are points that Wilneida also raised earlier today: how this poses risks to labor security and workers’ rights in the gig economy, and how it seems likely that this is only going to grow in scope and significance in the future. I’m curious what you think the future looks like over the next five to ten years or even further. How do you see workplace surveillance technologies evolving, particularly with advancements in AI and other emerging technologies? Is it as dystopian to you as it feels to me? Yeah, I think we’re probably being kept

Eliza:
up at night by the same vision of the next five to ten years, if I’m honest. But to speak to that very good question and some of the points that have been raised: I keep thinking about a finding from a London-based venture capital firm, I think it’s MMC Ventures, that surveyed about 2,800 startups in the EU purporting to be AI-first companies and found that more than 40% of them actually weren’t using AI in any meaningful way. It was just sort of a branding exercise. And so I think about that as the touch point for how I see the next phase of how this is going to be rolled out and how it’s going to impact people across the global economy. Even though there are remarkable advances in some of these generative AI tools, and I don’t necessarily agree with calling them that, but in some of the newer versions of automation for image or text generation, it is the case that we have a remarkable lack of clarity about exactly how and in what ways those tools are developed, and with what models. We know with some certainty that a lot of the models these tools are being trained on are either committing plagiarism at a huge rate or are being trained on data sets that don’t take into consideration the vast amounts of inaccurate data and hate speech that they might be absorbing. And it’s also the case that a lot of workplaces and companies are now going to be under pressure, as AI becomes, like I said, a very trendy marketing ploy, to assert themselves as being relevant in the market. 
There’s going to be, I think, a race to the bottom in which the marketing outpaces the tools themselves, and at the same time a lot of these companies are actually being forced to rely upon what others have termed ghost work, or the digital sweatshop that I mentioned earlier. There have been cases shown in Finland, where a company got a significant round of venture capital funding to build a model using incarcerated people to do image labeling, the kind of digital piecework that makes the tool possible. There have been cases of humanitarian organizations, or companies that purport to be assisting refugees or asylum seekers, using those populations, again people who are in a very precarious situation and don’t have a lot of choices for work, as image labelers doing the digital piecework that’s required to make these tools possible. And so my fear is that we’re going to see an increasingly bottomless need, not just for data but for the capacity to do what AI purports to do, which does require, at least as of now, a tremendous amount of actual human labor that most companies want to keep hidden. And so that’s where I think policymakers should be putting their attention: looking carefully to see where there are going to be populations of very precarious people, who are desperate and have very few options for how they’re going to make their living, and to look at where and in what ways they might be fed into the global supply chain of a lot of these companies and technologies. And I think that’s something I will watch with a little bit of fear, but I’ll watch closely.

Moderator – Ayden Ferdeline:
Thank you and can I ask a really basic follow-up question which is why do companies want to keep that labor hidden?

Eliza:
Yeah, that’s a good question. I mean, I think it’s embarrassing to them; they don’t want to have to admit that the tool they developed isn’t as advanced as they purport it to be. One of the first cases I saw that was the most impactful was Meta. And we still don’t know, again, because a lot of these tools and a lot of the evidence about them is kept very non-transparent, but we have it on pretty decent authority that a lot of the content moderation tools that Meta purported to be AI-based were actually just humans in the loop, basically. So when I think about the motivation for a company like that in that particular instance, I think it makes people feel safe and better about the tool that they’re using, and people naturally have a bias towards respecting something if it’s tech-powered, frankly. So I think that may be part of it. And I also think, again, people just don’t want to have to consider the human labor that goes into the goods that we consume, whether it’s generative AI or the clothes that we wear; it’s easier not to think about where they

Moderator – Ayden Ferdeline:
came from, how they were made. Thank you. Wilneida, we have an opportunity here at the IGF to provide some core actions to policymakers and to put some core actions in the key messages coming out of the IGF. What thoughts come to your mind, given the increasing prevalence of workplace surveillance, the trends that you have seen, and the trajectory that Eliza has outlined? What steps or strategies would you recommend that policymakers take to mitigate the impacts of this change in work that is happening around us? Yeah, I think just

Wilneida Negron:
from the conversations in the U.S. and what we’ve been seeing in other regions, it’s really going to require a mix of policy and regulatory action. Some of the policy work needs to be focused on establishing some basic protections for workers. A lot of the legislation coming out of the U.S., at the national and at the state levels, is focused on just disclosure, requiring employers to disclose in a timely manner that they are using these technologies. From GDPR, we have known that the consent model is problematic. And when dealing with not one but a whole collection of big tech companies, the ability of workers to meaningfully consent to the different products that they’re interacting with is highly problematic. So we know that workers need a basic level of protection that goes beyond consent and disclosure requirements. The policy solutions are really fragmented right now. We’re seeing a lot of focus on regulating how algorithmic tools are being used for hiring and recruitment, requiring vendors to undergo particular audits and impact assessments. And there’s a whole ecosystem of just who audits the auditors that are now being required by some agencies. And I think these are all worthwhile conversations: looking at particularly invasive, algorithmically driven uses, or those that collect highly sensitive data, like biometric data. So zeroing in on particularly sensitive types of data and really sensitive types of uses, and maybe moving toward a GDPR model, where you regulate by risk and use case. But again, not forgetting that workers need a basic level of protection. In the U.S., unfortunately, what we have is consumer privacy law, so it’s hard right now to encroach on private actors, employers in this case, and impose requirements on them. 
So there’s a whole lot happening around self-regulation, because in the U.S., and this goes to the economic dynamics that Eliza is talking about, we work in an economic system where the state still does not feel it can intrude in the private matters of business in the private sector. And so you’re seeing a lot, as policymakers and regulators try to figure out which particular aspects of these technologies can be regulated, where you can provide protections or require employers to do some due diligence. There’s not much intrusion into the economic private matters of companies in terms of requiring them to provide protections to workers. And that goes to the reality of the market, the economy, and to the last set of solutions: you have to address the market dynamics, the drivers of these trends, where we spend a lot of time as well, not only just organizing workers. We cannot continue to fight after the fact, when hundreds of products are coming out every year and there’s still no legislation, as I mentioned. So how can we tackle the market dynamics that have created that inequality? In the U.S. that looks like antimonopoly work; it looks like mergers and acquisitions, which happen a lot in the data brokerage industry. We’re seeing a lot of these smaller data brokers collecting sensitive employment data and then being acquired by bigger data brokers. So can we use the power of merger review to, for lack of a better word, tone down that market dynamic? Another thing would be looking at the private capital space, the venture capital space, and requiring greater disclosures; right now, there’s very little accountability in the private markets. 
The private markets are where these companies, the Facebooks of the world, go before they IPO and hit the public markets. And so it’s really critical to try to intervene at those early stages, when the future Facebooks of the world are in the private market space. And there is a lot happening there, everything from ESG to other kinds of disclosures about what types of companies are being invested in. So, yeah, a lot of market and industry focused dynamics: we cannot continue to fight this battle under the existing market conditions that drive this kind of innovation and these products, while the state struggles to intervene, in addition again to all the policy and multi-agency work that’s needed to regulate particular harmful technologies and provide some kind of protection for workers beyond just disclosure and consent. So there’s a lot of work to be done. And to the extent that there are spaces, to what Eduardo was saying, for us to strategize at the national and regional level, I think that’s really very much needed.

Moderator – Ayden Ferdeline:
Thank you so much, Wilneida. And Eduardo, I’d love for you to react to Wilneida’s comments: do we need to tone down that market dynamic? Do we need greater disclosure when it comes to venture capital? And if I can throw in an extra point that I have heard others raise, there is an argument that the value of worker data lies in its collective use by both workers and employers. If we were to think about what it would look like if workers and/or their representatives, who advocate for their legitimate interests, were to have collective rights to worker data, and I don’t mean health data, for example, I mean other forms of data, whether that is around injury rates or other metrics, what might those rights look like? Would that be helpful? What do you think? What else should we be asking for?

Eduardo Correggio:
I’ll start from the last question and then connect with some of the points that Wilneida and Eliza made that I think were quite interesting. My worry with proposing those very complex ways of governing shared access to data in a public interest way, in benefit of workers, is that I feel it’s a bit difficult in global South contexts, where people are still learning how this ecosystem even works. So sometimes I feel that if that becomes a kind of whitewashing that some companies could potentially do, it could be dangerous and without any true meaning. So I feel that at this point, we need to go back to the basics of what the fight for workers’ rights is, at least in global South contexts. That means fighting for companies to truly allow worker organization and, in the context of gig economy platforms, for them to actually recognize that these are workers, so that those other more complex discussions can come to be. Right now in Latin America, in general, none of the platforms recognize workers’ dependency, and most of the regulatory efforts at the moment, because there are regulatory efforts from governments that are trying to understand how the gig economy works, are starting to pose this question: are they dependent workers or not? And what kinds of models, or hybrid models of dependent and non-dependent workers, could coexist that are interesting and could potentially pave the way for more safeguards for these workers? And perhaps connecting with some of the things that Eliza and Wilneida were saying, I think that another tool that we perhaps don’t think about a lot is how competition can help us improve the current digital economy as it exists, which, as we know, is very concentrated. And competition at its intersection with data protection and privacy is pretty much a novelty at the moment. 
And I think that we should fight for competition to also help in the fight to create better working conditions, and to understand that if there are very unequal ways of treating workers, or very predatory ways in which data is exploited and so on, that can also be considered a competition parameter. I know there’s a lot of resistance to trying to expand how competition law currently works, but I think it’s a conversation worth having, because we need to use as many elements as we have to improve the current digital economy ecosystem, and a lot of the problems we have right now stem from it being highly concentrated. That, of course, has an impact on the way data protection is enforced and on how people interact with platforms. And I’m very happy that the issue of the digital sweatshops was mentioned by Eliza. I think that one of the other reasons why platforms perhaps don’t recognize this is that if you don’t recognize the problem, then it’s not a problem, right? So you won’t address those issues, and you won’t address the inequality and the exploitation that these workers are currently facing, because you just don’t acknowledge that it is a problem in its own right. And that is also connected with a lot of interest from governments that want these companies to install those sweatshops in their countries in order to create jobs that, although precarious, are still jobs in a context of a lot of inequality. Lastly, in terms of the future, which was a question you asked me at the beginning, I think that a lot of the future, and what else workers could do, is related to platform cooperativism. I think that’s an interesting concept that is starting to become more present in different discussions among workers who are trying to build their own digital infrastructure and have more autonomy in the way they design the platforms they will work on. 
But I think that those discussions have to be strongly supported by national governments, which should invest in those kinds of programs and allow this kind of exploration to happen, in order to build other sorts of business models that are more cooperative and just at their roots. So, yeah.

Moderator – Ayden Ferdeline:
Thank you, Eduardo, thank you. And Raashi Saxena, I’ll bring you back into the conversation. Of course, you have the opportunity to respond to Eduardo’s comments as well as the comments from Eliza and Wilneida. I’ll also give you your own question to answer, which is that we are in a UN forum. You have been following the WSIS plus 20 renewal process and the Global Digital Compact and how they can potentially contribute to bridging the gender digital divide and promoting the empowerment of women and girls in the digital sphere. Is there a labor connection here? Are these instruments which can potentially help uplift workers, and particularly female workers?

Raashi Saxena:
No, there definitely is. And I feel like a lot of my responses will also speak to the aspects that were brought out about the VC ecosystem and how fractured it is when it comes to providing support. Globally, I think only 7% of female founders are actually backed by VCs. And especially in the Indian ecosystem, you need to be from an ABC ecosystem to be able to capture that funding. So I feel like it's very insular in many ways. Also, to come back to Eliza's point, I have seen that the labor practices of a lot of content moderators are, in general, very hidden. And in the VC ecosystem, I've also seen that promoting something as AI can gather or harness more money. So it's more of a PR exercise, one that also propagates the idea that AI is a magic wand that will magically wash away a lot of inequality. But talking about the WSIS Plus 20 renewal process and the Global Digital Compact, I do feel like they have the potential to significantly bridge the gender digital divide: first, by helping identify the barriers that prevent women from accessing digital technologies. In India, in most local households the device is shared among an entire family. So having agency over your own device could help in improving digital literacy skills, and in cutting across the social and cultural barriers that women face when it comes to mobility. India also has very cheap internet tariff rates, so having access to these devices and the internet could also help in promoting locally relevant content and services, and also employment. And one aspect that often gets missed is that a lot of people with disabilities, especially women, are disproportionately impacted and face more social stigma.
So having proper access would also help them participate in social and cultural settings and give them a dignified livelihood. The second is that the Global Digital Compact could also ensure that women have access to equal opportunities in the digital revolution, whether it's an initiative that they want to promote on a small or medium scale, to support women in the startup ecosystem and help gender representation in digital leadership roles. And one of the very important ones is also addressing online violence: the growing phenomenon that you have, especially with generative AI, of doctored videos and synthetic videos, which for the longest time used to affect women in public life, but of which a lot of women, such as myself and Wilneida and so many others, could be targets. And I do feel like having robust policies around this could help develop responses on how to combat it, promote digital safety and security, perhaps collaborating with T-RIC to ensure that victims have access to effective support and redress mechanisms. And lastly, of course, the Global Digital Compact and WSIS process could also encourage governments and other stakeholders, given that we're at the IGF, to take up the specific needs and priorities of women in the digital sphere, to increase their participation in decision-making processes and help in the development, implementation and initiation of policies and programs.

Moderator – Ayden Ferdeline:
Thank you so much, Raashi. We are nearing the top of the hour, but before we close, I would like to give each speaker just 30 seconds for very brief closing remarks on how we can collectively develop strategies that ensure fair and equitable workplace practices in this new era. I know 30 seconds is not enough to actually answer that question, but we'll have to make do, please.

Wilneida Negron:
Thinking about building cross-class power with workers across regions, across classes, across industries. There is a lot of connective tissue there, a lot of shared analysis that could be connected, and I think there's an opportunity there that we're not tapping into.

Moderator – Ayden Ferdeline:
Thank you. Eliza.

Eliza:
Yeah, I guess I'll just say, as a final wrap-up, that our community of people who work on digital rights and tech policy needs to do a lot more to expand the way we think across different sectors of the policy community, and to work with people who are working on unionization, people who are working on climate change. There are increasing numbers of climate issues in the application of AI that we didn't even get into. We should think about our sector as part of the global set of issues that are creating and exacerbating wealth inequality and racial inequality. And yeah, I'll stop there.

Moderator – Ayden Ferdeline:
Thank you, Eliza. Eduardo.

Eduardo Correggio:
I would say, definitely in the context of the economy, we need more regulation that is collectively built with the voices of workers, not from a top-down approach. And last but not least, more worker ownership of the digital infrastructure is something that is also important and should be part of these discussions as well.

Moderator – Ayden Ferdeline:
And you get the last word, Raashi.

Raashi Saxena:
Hi, yeah. I think there need to be more conversations around these issues. There's a lot of cultural stigma around speaking up. We need to stop being in silos, start acting, have more conversations and share information around how we could effectively band together in places like this and hold companies accountable.

Moderator – Ayden Ferdeline:
Thank you, everyone. It has been a pleasure being in your company today. I hope we can continue this discussion intersessionally, and also at next year's Internet Governance Forum in Riyadh. For now, we can adjourn this session at 9.30 a.m. Thank you.

Eduardo Correggio
Speech speed: 169 words per minute
Speech length: 2403 words
Speech time: 854 secs

Eliza
Speech speed: 205 words per minute
Speech length: 2171 words
Speech time: 634 secs

Moderator – Ayden Ferdeline
Speech speed: 160 words per minute
Speech length: 1513 words
Speech time: 568 secs

Raashi Saxena
Speech speed: 147 words per minute
Speech length: 1252 words
Speech time: 509 secs

Wilneida Negron
Speech speed: 161 words per minute
Speech length: 2555 words
Speech time: 951 secs