[Read more session reports and live updates from the 13th Internet Governance Forum]
The session covered the brief history of the Best Practice Forum (BPF), which brings together specialists and experts on artificial intelligence (AI), big data, and the Internet of Things (IoT). It also presented the BPF's outcome document and addressed questions on the development and implementation of these technologies and their impact.
Mr Wim Degezelle, Consultant, Duermovo, explained that the BPF has identified a number of practices, set out in a draft document, which would be discussed in more detail during the session. One of the best practices mentioned was remaining technology and time neutral, considering the pace of change in new technologies and their applications. Others included using ethics and human rights as guiding principles, using them to support transparency, and supporting small businesses so that there is healthy competition between new and established businesses. A report will be published as an output before the end of the year. The purpose of the report is not just to document what happened at the IGF, but to serve as a background information source for further policy and other related work.
The moderator, Mr Alex Comninos, Association for Progressive Communications (APC), introduced the panel which had representatives from government, civil society, academia, and the private sector.
Mr Nobu Nishigata, Policy Analyst, Directorate for Science, Technology and Innovation, the Organisation for Economic Co-operation and Development (OECD), introduced his prior work with the Japanese government on developing AI principles for research and development. At the OECD, he is now working on identifying best practices that will lead to increased trust and adoption. Challenges include identifying opportunities alongside potentially unethical uses, especially in fields such as biotech, which makes risk management a very important aspect of approaching new technologies.
Mr Taylor Bentley, Innovation, Science and Economic Development Canada (ISED), introduced ISED's work in this area, which began in 2016 when the largest distributed denial-of-service (DDoS) attack in history took place, leveraging insecure IoT devices used in homes and businesses. Canada's best practice is to take a light-handed approach, aiming at developing a framework rather than specific legislation. ISED joined a process that brought together business, civil society, government, and academia. It served as a forum for all Canadian expertise; in this way the government co-develops approaches without monopolising the space.
Mr Imani Bellow, Sciences Po, argued that AI literacy needs to advance dramatically, as the blanket use of the term AI is hurting these efforts. Transparency is crucial, since these systems have, and will continue to have, a direct and significant impact on people's lives. Data bias is an issue, but data can also be a tool to detect discrimination.
Mr Peter Micek, General Counsel, Access Now, talked of how IoT devices create and collect massive amounts of sensitive personal data. Access Now collaborated on an IoT bill, recently signed by the governor of California, that requires manufacturers to implement reasonable security measures. It is currently binding only for manufacturers in California, but it is an important step considering how many tech companies are based there. On AI and fairness, he noted that the human rights community has only recently started to consider the full range of risks, citing new machine-enabled enforcement that caused the removal of hundreds of thousands of videos and photos documenting atrocities in Syria.
Mr Mike Nelson, Tech Strategy, Cloudflare, talked about myths surrounding AI and the IoT. He stated that it does not help to focus only on 'things'; focus must be put on the networks, gateways, and programmes that connect these things. The second myth is that this is only about big tech companies, when even the smallest companies have an impact. His third point was to stop using general terms such as AI, as there are at least 15 definitions covering different technologies such as big data and machine learning. Finally, he argued that it is possible to identify risks and standardise best practices.
The moderator turned to the audience for questions.
Comments included the importance of trust, and how to trust sensors, applications, and big data; and the importance of including youth in these discussions, as many small businesses and startups that use new technologies are run by young people.
Nelson talked of competition and choice as ways to bolster trust, as people are more inclined to trust a service if they have the choice to leave it for an alternative. These options are not always available at the moment; cloud services, for example, should work well across different providers. The importance of building people's knowledge of and capacity in new technologies was stressed. Bentley mentioned that technology can be perceived as moving too fast, but it should also be kept in mind that the technical community has been working on some of these questions for 20 years. Stakeholders talking to each other and keeping community channels open help tremendously in fostering trust.
A remote question concerned the challenges unique to Africa and how they can be addressed. The moderator, who is from South Africa, answered the question, citing as a challenge whether cloud services can be hosted and housed in Africa soon, and mentioning Amazon cloud services as a provider of collective computing power. Micek mentioned the large amounts of electricity and water consumed by data centres. Data protection was cited as another key challenge in Africa: although there is legislation, it is not widely implemented, and the issue of privacy will grow in importance as the IoT and other technologies are widely adopted. Capacity building, within and outside the workforce, was highlighted as the major priority related to new technologies.
Comments and questions from the floor included: the progress in building devices with safety by design; the observation that tech decision-making is hetero-normative and western-centric, and what efforts are being put on paper to make the processes more inclusive; and why the community members who develop these technologies, and consumer organisations, are not more active and involved in these discussions.
By Su Sonia Herring