Data as a stock-in-trade in global trade: Looking beyond outdated notions around data flows

1 Oct 2021 06:00h

Event report

Moderator Ms Gita Sen (General Coordinator, DAWN) pointed out that the data flow discussion today is polarised into a global stalemate between ‘good’ and ‘bad’. In the developing world, researchers depend on access to social and professional networks, institutional support, and finances, all of which are in short supply. The commons discussion arose as a remedy: treating data not only in terms of an individual’s fundamental right to privacy, but also as a global economic resource. Sen drew attention to two points: the need for a new economic rights regime around data, and the socialisation of data value to enable new approaches to data production beyond technological feudalism.

The idea of technological feudalism was echoed by Ms Vahini Naidu (Coordinator, Trade for Development Program, South Centre). In the past, the free flow of data has meant liberalisation through trade agreements aligned with the goals of large technology companies and of developed countries, primarily those with established data centres and infrastructure. ‘Essentially, it is a one way flow from the developing into the developed countries which have the infrastructure, regulation and the know-how to extract knowledge from data’, Naidu stressed. Local and regional value chains are shunted aside for the sake of global chains. Existing language in current negotiations and trade agreements prohibits data localisation and server localisation requirements, concentrating innovation through data centres in the United States. Naidu saw the UNCTAD Digital Economy Report 2021 as a first sign of hope. It recommends taking data flows out of trade negotiations, drawing on the distinction between data flows and economic data flows. It also calls for establishing data as a global public good, which would shift the governance of data flows to UN institutions and give a stronger voice to developing countries.

Data is a fundamental resource in the digital capitalist economy; the main concern is whether it will remain in the hands of a few corporations. Ms Sofia Scasserra (Researcher, Transnational Institute) said that the current neoliberal approach allows for data value extraction without generating value for the individuals whose data is processed. According to Scasserra, the extractivist perspective makes it difficult to obtain data that is relevant to defining public policies and to driving socially just and fair innovation, for example in the public sector. Scasserra proposes governing data as a common good, owned neither by the state nor by the private sector. The developing world should drive data flow regulation first at the local, contextual level and only then move to the supranational level, not the other way around. Finally, data flow negotiations should be taken out of WTO trade agreements and brought under a UN mandate.

From a European perspective, Ms Ingrid Schneider (Professor, Department of Political Science, University of Hamburg) clarified that ‘the North’ is not a homogeneous entity, and that Europe also recognises the large power asymmetries in global data flow regulation. Her concern is that technology companies are entering the public sector, making heavy investments in health, education, and mobility, which can lead to the privatisation and commodification of services that were previously public. The narrative that ‘free data flows are good’ obscures the real issues behind data localisation. Schneider gave the example of the US-Mexico-Canada trade agreement, which made it impossible for Mexico to require that data be stored locally. All data extracted from Mexico is stored in the US, making digital sovereignty and locally appropriate regulation difficult. However, data sovereignty is a contested and complex notion, and what it really means should be clarified, because in authoritarian countries it can stifle freedom of speech.

Like data sovereignty, data flow is a fuzzy concept. Mr Parminder Jeet Singh (Executive Director, IT for Change) explained that data itself is not a final product. Data is an economic resource both for the company and for the individual; however, patterns in aggregated data comprise an even greater resource, and the whole sector is driven by this systemic intelligence. The usual argument for free data flows is that data cannot be governed in its entirety; yet knowledge is also a resource, and it is governed. The entire intellectual property regulatory system was created so that knowledge could be placed within a rights and ownership framework, and the same can be done with data flows. Additionally, community governance can be strengthened by following up on the UNCTAD report’s recommendation to take data flows out of WTO trade agreements, as other discussants also pointed out.
