IGF 2020 WS #304 Reaffirming human rights in company responses to crisis

Session ID: WS#304

Resource type: Event reports

Author: Aida Mahmutovic

The session, moderated by Mr Jan Rydzak (Research lead, Ranking Digital Rights), looked at company conduct in situations of crisis (political crises, outright conflict, network shutdowns, health emergencies). The leading question for the workshop concerned the shortcomings of company accountability in times of crisis: what can the human rights community do to make companies more accountable? The focus was not only on global crises, but also on localised emergencies that can consume the lives of the people who suffer their impact the most.

States are dependent on the cooperation of companies for rolling out technical solutions for managing the pandemic. These challenges for the business and human rights nexus are addressed in the UN Guiding Principles on Business and Human Rights, reminded Ms Isabel Ebert (Research and Policy, B-Tech Project, OHCHR). States also have a strong role in supporting companies in carrying out their responsibility to respect human rights. Ebert highlighted three main points: transparent governance and accountability structures, the necessity of carrying out human rights due diligence, and the engagement of each stakeholder.

When it comes to human rights due diligence, businesses that had robust structures in place to respond to crisis situations have been better able to manage them.

Ms Dorota Głowacka (Lawyer, Panoptykon Foundation) discussed the implications of the pandemic for freedom of expression in relation to big Internet platforms. Once the pandemic started, big tech companies introduced changes to their content moderation policies, although algorithms were already being used for content moderation purposes. For example, Facebook temporarily suspended the reinstatement of content after a user contested a decision to take it down.

What was the effect of this? Studies suggest that algorithms used for content moderation are not 100% effective: they often carry a risk of over-removal and fail to identify truly illegal material. Appeals processes and transparency mechanisms should be introduced to mitigate the risks associated with using algorithms or, more generally, with the arbitrary exercise of platforms' powers to moderate content.

To mitigate the negative effects of using algorithms to moderate content, three things are needed: (a) proper transparency mechanisms, including transparency about the role and functioning of the automated systems used for content moderation, (b) effective appeals processes on the platforms (due process), and (c) external independent oversight of final platform decisions.

Platforms can change their policies overnight. This means we are at the stage where we need a binding regulation instead, said Głowacka.

Companies have a major role to play in what becomes the new normal, according to Mr 'Gbenga Sesan (Executive Director, Paradigm Initiative). Countries often shut down the Internet during different types of crises, including elections. Businesses do not ask critical questions when they are faced with shutdown demands. Some countries introduce emergency laws whose provisions are not aimed at the actual crisis, but address matters of national security and national morality, and serve as a path for governments to shut down the Internet. ‘That is based on a political agenda. Companies need to be more like activists,’ said Sesan. In these cases, it is important to keep proportionality and necessity in mind. Transparency reports are key when it comes to companies pushing back. The one element that companies thrive on is trust: there would be no digital economy without it.

The session broke into groups to discuss several issues; their conclusions follow.

Group 1 discussed governance-focused solutions and the role of due diligence, including human rights due diligence and potentially mandatory human rights due diligence frameworks. There is a clear role for mandatory human rights due diligence, not necessarily at the country level but at the regional level; the EU is spearheading many of those efforts. Localised problems also require a specific, localised approach.

Group 2 discussed privacy issues. The majority of people in the group were not familiar with the data processing practices of apps. Companies are not transparent about their data collection practices. Cultural aspects are also important for trust across different cultures, including trust in company owners; governments and private corporations are perceived differently in different countries. The best current practice is to place opt-in and opt-out controls for data collection and data processing in the hands of users.

Group 3 discussed issues related to freedom of expression. Internet access is important for telecommunication companies, so they should ensure access and good connectivity, and try to push back against shutdowns. States and companies interact around requests for user data. It is critical for companies, both big and small, to engage with a wide range of human rights standards and to coordinate around these issues. Content moderation, including the use of algorithms, can be detrimental to freedom of expression. It is important for companies to have human oversight of content moderation that involves algorithms, particularly for hate speech, because in these cases understanding context is crucial. There is a need to invest in content moderators who work in different languages and who understand the context.