Mr Dunstan Allison-Hope, Managing Director at Business for Social Responsibility (BSR), gave background on destructive machines as technologies programmed to perform tasks, and discussed the need to determine how to provide remedy when decisions are made by machines.
Mr Amol Mehra, Executive Director at the International Corporate Accountability Roundtable (ICAR), said that the discussion should focus on the impact machines have on humans, and on the mechanisation of less-skilled labour.
Mr Steve Crown, Vice President and Deputy General Counsel at Microsoft, commented that businesses have a responsibility to respect human rights, and that the evolution of artificial intelligence (AI) carries potential risks. Large amounts of data are fed into a machine to instruct it what to do, by identifying patterns and correlations. But machines have no empathy or emotions, and the quality of the input data affects the machine's effectiveness. Human errors and prejudices can therefore be fed into machines, resulting in disastrous consequences.
Crown proposed that, as a remedy to such challenges, scientists must strive to programme machines to help humans and ensure the transparency of data input, so as to uphold people's integrity.
Dr Sandra Wachter, Researcher in Data Ethics at the University of Oxford and Turing Research Fellow at the Alan Turing Institute, commented on the need for accountability for decisions made by machines. Individuals have a right to know what data about them is held by machines. To achieve this, companies must update their privacy policies to inform individuals of the data they collect and how that data may be used. According to Wachter, this would need to be guided by domestic legislation with regulation mechanisms.
Ms Alex Walden, Counsel for Free Expression and Human Rights at Google, stated that a billion people use Google services every day, and a billion people's new data is added every day. Walden said that Google is able to redress data protection violations through the applicable jurisdiction, and that its technology is continually being improved to recognise democratic principles. Walden pointed out that Google has policies prohibiting violence, extremism, and terrorism, and has teams reviewing materials in different languages. Exceptions apply to educational and artistic materials. In collaboration with civil society organisations, Google is helping inform companies on how they can respond to human rights violations through technology.
Ms Cindy Woods, Legal Policy Associate at the International Corporate Accountability Roundtable (ICAR), highlighted that the increasing displacement of humans by machines is a human rights concern, with alarming numbers of workers already replaced by machines. Woods pointed out that robots are another example of destructive technology, and that it is projected that by 2020, using a robot will be four times cheaper than human labour. The International Labour Organisation (ILO) projects that two-thirds of humans working in the garment industry could be replaced by machines, yet in countries such as Cambodia the garment industry constitutes 80% of the total labour force.
Mr Theodore Roos, Project Collaborator for Future of Work at the World Economic Forum (WEF), stated that the WEF has a project on preparing for the future of work. Roos stated that different solutions are required in developed and developing countries, and even among countries within the same group. One solution is education: not just in schools, but lifelong education that enables people to gain new skills and adapt to new work.
Roos also proposed social services, for instance, compensating people who are out of work, allowing people to move to countries where work is available, and encouraging and rewarding people working in human capital sectors such as education and health.