6th United Nations Forum on Business and Human Rights

27 Nov 2017 to 29 Nov 2017
Geneva, Switzerland


Event reports:
MacOswald Jumali

Mr Dunstan Allison-Hope, Managing Director at Business for Social Responsibility (BSR), gave background on disruptive technologies as machines that have been programmed to perform tasks, and discussed the need to work out how remedy can be provided when decisions are made by machines.

Mr Amol Mehra, Executive Director at the International Corporate Accountability Roundtable (ICAR), said that the discussion should focus on the impact machines have on humans, and on the impact of the mechanisation of less skilled labour.

Mr Steve Crown, Vice President and Deputy General Counsel at Microsoft, commented that it is the responsibility of businesses to respect human rights, and that there are potential risks in the evolution of artificial intelligence (AI). Large amounts of data are fed into a machine to instruct it what to do by identifying patterns and correlations. But machines have no empathy or emotions, and the quality of the data input affects the effectiveness of the machine. Human errors and prejudices can thus be fed into machines, resulting in disastrous consequences.
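Crown's description of how machines learn from data can be illustrated with a minimal, hypothetical sketch (not part of the session): a toy 'model' that simply learns approval rates from past decisions will reproduce whatever prejudice those decisions contain. The group labels, figures, and threshold below are invented for illustration.

from collections import defaultdict

# Hypothetical historical decisions carrying a human bias:
# group A applicants were approved far more often than group B.
training_data = [
    ("A", 1), ("A", 1), ("A", 1), ("A", 0),
    ("B", 0), ("B", 0), ("B", 0), ("B", 1),
]

def train(records):
    # Learn the approval rate per group from past decisions.
    totals, approvals = defaultdict(int), defaultdict(int)
    for group, approved in records:
        totals[group] += 1
        approvals[group] += approved
    return {g: approvals[g] / totals[g] for g in totals}

def predict(model, group, threshold=0.5):
    # Approve only if the group's historical approval rate clears the threshold.
    return model[group] >= threshold

model = train(training_data)       # {'A': 0.75, 'B': 0.25}
print(predict(model, "A"))         # True: the learned pattern favours group A
print(predict(model, "B"))         # False: the same bias is applied to group B

The sketch shows why, as Crown noted, the quality of the data input determines the quality of the machine's decisions.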

Crown proposed that, as a remedy to such challenges, scientists must strive to programme machines to help humans and to ensure the transparency of data input in order to uphold people's integrity.

Dr Sandra Wachter, Researcher in Data Ethics at the University of Oxford and Turing Research Fellow at the Alan Turing Institute, commented on the need for accountability for decisions made by machines. Individuals have a right to know what data about them is held by machines. To achieve this, companies must update their privacy policies to inform individuals about the data they collect and how that data may be used. According to Wachter, this would need to be guided by domestic legislation with regulatory mechanisms.

Ms Alex Walden, Counsel for Free Expression and Human Rights at Google, stated that a billion people use Google services every day, and a billion people's new data is added every day. Walden said that Google is able to redress data protection violations through the applicable jurisdiction, and that its technology is constantly being improved to recognise democratic principles. Walden pointed out that Google has policies that prohibit violence, extremism, and terrorism, and that it has teams reviewing material in different languages. Exceptions apply to educational and artistic material. In collaboration with civil society organisations, Google is helping inform companies on how they can respond to human rights violations through technology.

Ms Cindy Woods, Legal Policy Associate at the International Corporate Accountability Roundtable (ICAR), highlighted that the increasing displacement of humans by machines is a human rights concern, and that the figures on workers replaced by machines are alarming. Woods pointed out that robots are another example of disruptive technology, and that it is projected that by 2020, using a robot will be four times cheaper than human labour. The International Labour Organisation (ILO) projects that two-thirds of humans working in the garment industry can be replaced by machines, and yet in countries such as Cambodia, the garment industry constitutes 80% of the total labour force.

Mr Theodore Roos, Project Collaborator for the Future of Work at the World Economic Forum (WEF), stated that the WEF has a project on preparing for the future of work. Roos noted that developed and developing countries require different solutions, as do countries within the same group. One solution is education, not just in schools, but lifelong education that allows people to gain new skills and adapt to new work.

Roos also proposed social services, for instance, compensating people who are not working, allowing people to move to countries where work is available, and encouraging and rewarding people working in human capital sectors such as education and health.

MacOswald Jumali

The moderator, Ms Leslie Johnston, Executive Director at the C&A Foundation, introduced the panellists and mentioned that they are technologists who can explain how technology can promote human rights.

Ms Jessi Baker, Founder of Provenance, commented that Provenance's vision is to use blockchain technology to improve access to information about businesses and human rights. Baker explained that blockchain is a new type of database that facilitates the exchange of information at a global level. Baker gave the example of cryptocurrency, and explained that blockchain is a decentralised network with pieces of data coming from different sources, and that it is beyond government control. She argued that this allows data to flow down the market through the supply chain, and that there is a need to digitalise data to empower individuals along the supply chain, rather than have top-down solutions. Baker gave the example of a fishing industry project in South East Asia, in which blockchain connected fishermen and end users, helping to reduce price abuse along the supply chain.
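Baker's description of blockchain as a decentralised, tamper-evident record of supply-chain data can be illustrated with a minimal sketch, which assumes nothing about Provenance's actual system; the record fields and values below are invented for illustration. Each record is linked to the previous one by a cryptographic hash, so any later change to a record breaks the chain and becomes detectable.

import hashlib
import json

def make_block(record, prev_hash):
    # Wrap a supply-chain record in a block linked to the previous block's hash.
    body = {"record": record, "prev_hash": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": digest}

def verify(chain):
    # Recompute every hash and check that each block points to its predecessor.
    prev_hash = "0" * 64
    for block in chain:
        body = {"record": block["record"], "prev_hash": block["prev_hash"]}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["prev_hash"] != prev_hash or block["hash"] != digest:
            return False
        prev_hash = block["hash"]
    return True

# Hypothetical catch-to-consumer records for a single shipment of fish.
chain, prev = [], "0" * 64
for record in [
    {"step": "catch", "vessel": "FV-112", "port": "Songkhla"},
    {"step": "processing", "plant": "P-7"},
    {"step": "retail", "batch": "B-2041"},
]:
    block = make_block(record, prev)
    chain.append(block)
    prev = block["hash"]

print(verify(chain))                    # True
chain[1]["record"]["plant"] = "P-9"     # tamper with a middle record
print(verify(chain))                    # False: the broken hash exposes the change

In a real deployment the chain would be replicated across many independent parties rather than held in one place, which is what puts the data beyond the control of any single actor, as Baker described.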

Ms Beth Holzman, Director for Worker Engagement at Laborlink, observed that technology can unlock the voice of workers. Laborlink technology enables worker engagement and the collection of unfiltered feedback, which provides a better understanding of labour issues; quality data, in turn, improves working conditions. Holzman said that in China, Laborlink collected 32 000 survey responses from across 20 factories, and in Bangladesh it collected over 32 000 survey responses from workers in 40 factories. This has given a platform to 47% of workers to report grievances, and 38% of them have been effectively assisted.

Holzman said that to achieve meaningful results, there is a need to take necessary action:

  • Share worker survey results with middle management to improve awareness of worker and management relationships
  • Initiate more trainings for workers on grievances

Furthermore, Holzman pointed out that their goal is to put workers at the centre, that worker engagement must be at the core of all factory engagements, and that this should enhance companies' responsible sourcing programmes.

In connection with remedy, she noted that worker perspectives are key to risk assessment, mitigation, and prevention, and that companies need to focus on remedy criteria. Holzman concluded by saying that in the future, there must be consent on access to individual information and clearly defined pathways for remedy.

Dr Venkat Maroju, Chief Executive Officer at Source Trace, commented that smallholder farmers are the backbone of agricultural productivity and that farmers should be engaged to improve that productivity. Using the digital power of mobile technology brings change through digital transactions. However, most farmers are in rural areas with poor connectivity and are mostly offline.

Maroju stressed that quality infrastructure, digital payments, training in certification, and training in modern and digital financing through co-operatives are helping to improve farmers' activity. Companies need to work on social accountability, and Source Trace is helping companies and co-operatives translate their policies into practice in the areas of social enterprise and social audit through the use of technology.

The amount of money at the grassroots level of the value chain is small, making it difficult to attract information and communication technology (ICT). Farmers' illiteracy and the lack of support in rural areas are key challenges in digitalising rural farming.

Mr Kenton Harmer, Certification Director at Equitable Food Initiative (EFI), explained that the EFI is a skills-building and standards-setting organisation. EFI innovation provides relationship building, leadership, and team training, and enhances companies’ compliance with standards, audit, and certification. The initiative uses digital technologies in these activities, as they help to identify and come up with remedies to work-related problems. 

MacOswald Jumali

The moderator, Mr Jean Yves Art, Senior Director, Strategic Partnerships, Microsoft, introduced the panellists and said that Microsoft is proposing a Digital Geneva Convention:

1. To protect civilians against state-sponsored cyber-attacks

2. To assist the private sector to detect and respond to cyber-attacks on companies’ infrastructure

3. To protect companies from states launching cyber-attacks using the companies’ infrastructure

4. To set up institutions to identify the sources of cyber-attacks

H.E. Monique TG van Daalen, Ambassador Extraordinary and Plenipotentiary, Permanent Representative of the Kingdom of the Netherlands to the United Nations and other international organisations in Geneva, gave a state perspective on the Digital Geneva Convention. The economies of states rely more and more on the Internet, and highly digitalised countries want to keep the Internet open; the Netherlands wants to enhance security on the Internet through international cyber diplomacy. Van Daalen said that Microsoft's efforts in the Digital Geneva Convention debate are greatly appreciated, but pointed out that the name could cause confusion, since to some it could suggest that the 1949 Geneva Conventions are no longer valid, and noted that debating such a convention will be a cumbersome process. She also pointed out that the Netherlands remains committed to the principle that the rights people enjoy offline must also apply online.

Mr Laurent Gisel, Legal Advisor at the International Committee of the Red Cross (ICRC), highlighted that the ICRC is responsible for the development of international humanitarian law. The ICRC wishes to see emerging issues captured in international law in order to reduce suffering, since new weapons of warfare are based on technology.

Cyber-attacks used today are criminal acts, and cyber warfare is as much of a concern as any other attack on humanity. The use of cyber-attacks on transportation systems, hospitals, and other critical infrastructure can result in great human casualties. Cyber operations can endanger humans, and the ICRC backs Microsoft's proposal for international law.

MacOswald Jumali

The moderator, Prof. Sheldon Leader, Director of the Essex Business and Human Rights Project and member of the Human Rights Centre at the University of Essex, introduced the panellists and stated that the session would discuss the transfer of data between businesses and government agencies, and the implications that has on human rights.

Leader said that despite business-to-business transactions, the state remains at the centre as the regulator. In light of this, Leader stressed the need to ensure access to remedy in the context of data-sharing in response to requests by public authorities, justified on grounds of national security or law enforcement interests.

Prof. Changrok Soh, Director of the Human Rights Centre at Korea University and member of the Advisory Committee of the Human Rights Council, commented that technology is vital to modern humanity, but that it poses a threat when personal data is misused. Soh said that safeguarding privacy can help to protect data from hacking. He said that technology and the industrial revolution have expanded human rights concerns because of the changes in the way humans interact, and this has had an impact on the power structure of society.

Soh argued that state agencies need to agree on which data can be shared with them in order to avoid human rights violations. He pointed out that encryption offers some relief, since its use encourages free opinion and self-expression. New forms of human rights abuse in the digital age could be avoided if platforms were created where companies, governments, and civil society work together towards this goal.

Soh also mentioned that another solution could rest in international guidance on data regulation, and that there is a need to define new forms of protection when it comes to government intrusions on privacy.

Dr Krisztina Huszti-Orban, Senior Researcher with the Human Rights, Big Data and Technology Project at the Human Rights Centre, University of Essex, noted that there are difficulties in regulating data usage in the technology era. Huszti-Orban pointed to the need to establish the relationship between privacy and data rights, and the ability to regulate the transfer of data. For example, the General Data Protection Regulation (GDPR) is able to regulate which data can be shared, but are such policies adequate to protect data in the current age of technology? Regulation demands that individuals are responsible and assume control and autonomy, yet it is difficult for individuals to track their data. She stated that harm to individuals comes after the aggregation of pieces of data over a long period of time, which is difficult to control because it cannot be traced to a single source.

Huszti-Orban explained that the situation is complex because the regulatory instruments available cannot fully control all sources of data. There is a need for domestic policies that are in line with international law. The basic principle of the UN is respect for human rights, which can only be enforced if states have the legislation to do so. Data has cross-border effects and as such is regulated by bilateral arrangements between states, but this can only be effective if there are both domestic and international laws to regulate data violations.

Huszti-Orban stated that informal data sharing between businesses poses a threat to human rights. Such arrangements can speed up processes, but they can also have harmful effects.

Different types of data are given different levels of protection, but there is a need to manage all data with sensitivity, and this poses a challenge. It is important to recognise that human rights violations occur whenever data is not protected.

Companies can do a lot to protect data, for example, by challenging a government's request if they see that there are violations. However, the rubber-stamping of data requests by judges is a problem, and there is a need to find other means of protecting human rights. Companies normally do not have full knowledge of international human rights law, nor of how best to handle human rights issues when faced with questions of national security.

Sharing data with governments on human rights defenders, the opposition, and critics can have serious human rights implications. Huszti-Orban pointed out that companies have no excuse not to protect data, because data shared with governments can sometimes lead to crimes against humanity.

Mr Bernard Shen, Assistant General Counsel at Microsoft, highlighted that Microsoft is always eager to share information on data management. Shen stated that Microsoft's core responsibility is to maintain trust in data usage. Control demands that meaningful information on privacy policies is provided to end users so that they can give meaningful consent, and users' reasonable expectations must be taken into consideration in protecting privacy. Customers must also be provided with channels for feedback. Microsoft handles complaints by referring them to its complaints desk, where they are managed according to the laws of the country they come from.

According to Shen, there is no direct access to Microsoft data, and any government access is by request. Requests from governments often come with a secrecy order not to inform the customer that their data is being accessed. In 2016, Microsoft filed a lawsuit in the US District Court challenging the US government's use of indefinite and overly broad secrecy orders, arguing that they should only be used in state security cases and that customers must be informed. Government cross-border requests also need to be clarified by the courts. Data is also protected through encryption.

Governments and stakeholders need to work together on the legislation needed to best protect data.

Ms Nighat Dad, Founder and Executive Director of the Digital Rights Foundation, commented that there is a lack of safeguards and remedy in developing countries. For example, Pakistan has no data protection law, and a culture of data protection is missing. Technology companies tend to respect the culture of data protection when operating in developed countries, but not in developing countries. Governments also lack knowledge of companies' data mining capacities, and hence regulation is lacking. Companies in the global south have a free hand in managing data. Big tech companies need to establish the same practices there as they do in the global north.

In many instances where confidentiality is required, national security agencies have become assertive, demanding data in the name of national security and the fight against terrorism. States cite many laws, and companies fear repression. National security does not mean violating people's rights. Moreover, companies can and should be held accountable for their sharing of data with governments.

The sixth edition of the United Nations Forum on Business and Human Rights will be held on 27–29 November 2017, in Geneva, Switzerland.

The Forum, established in 2011 by the UN Human Rights Council, is the global platform for yearly stock-taking and lesson-sharing on efforts to implement the UN Guiding Principles on Business and Human Rights. It brings together UN member states, UN agencies, intergovernmental and regional organisations, businesses, labour unions, national human rights institutions, non-governmental organisations, and other interested stakeholders, to discuss major human rights issues in the global economy.

Under the overarching theme 'Realising Access to Effective Remedy', the 2017 Forum will feature discussions on policy trends that consider the role of the business and human rights movement in today's political and social contexts around the world. The programme will include over 60 sessions on current business-related human rights issues; some of the sessions will be led by the Office of the UN High Commissioner for Human Rights (OHCHR), while others will be organised by external partners.

In order to inform preparations for the event, the Forum Secretariat is mapping ongoing research and projects linked to this topic. Stakeholders can submit brief information about ongoing or planned research and projects for 2017–18 related to the issue of access to remedy in the context of business and human rights.

For more information, visit the event webpage.

 
