Children and AI: Securing Child Rights for the AI Generation

Session: OF 17

13 Nov 2018 - 11:15 to 12:15

#IGF2018, #OF17

Report


The introduction of AI creates both opportunities and risks for children. While some guidelines are in place, such as the Convention on the Rights of the Child, there is a need to harness the creativity and free thinking inherent to children, and to create frameworks that involve them in the design, development, deployment, and evaluation of artificial intelligence (AI).

Ms Jasmina Byrne, UNICEF Division of Data, Research and Policy, moderated the session. She introduced the first speaker, Ms Jennie Bernstein, UNICEF Office of Innovation. Bernstein explained that UNICEF's mandate stems from the Convention on the Rights of the Child, which is holistic in nature, encompassing physical, social, political, and emotional rights, among others. It contains rights and principles that are key to discussions on AI, such as respect for the views of the child, non-discrimination, the right to survival and development, dignity, and the best interests of the child. UNICEF's work on children and AI is in its exploratory phase and aims to map out research that can help identify the most pressing opportunities and risks related to the introduction of AI. On the one hand, AI can create opportunities, such as tailoring education to the specific needs of children, including children who are differently abled. On the other hand, it can also create privacy, safety, and security risks. The project aims to produce actionable recommendations for governments, companies, caregivers, and others, to move children's rights to the centre of the AI debate.

Ms Sandra Cortesi, Berkman Klein Center for Internet & Society, Harvard University, mentioned some findings of the report 'Artificial Intelligence and Human Rights: Opportunities and Risks'. Determining the impact of AI on human rights is not easy, as these technologies are being introduced in areas that are not rights-neutral, such as schools, the entertainment industry, and healthcare systems. Moreover, some rights sometimes clash with each other, such as privacy and safety. Young people are considered users of technologies, but they are not included in the design, development, deployment, and evaluation of new technologies, and this needs to change.

Mr Steven Vosloo, UNICEF, highlighted the need to work with the private sector and governments to ensure that children's rights are at the heart of the products and services being developed. He also stressed the importance of building children's digital skills, critical thinking, and literacy in a way that helps them survive and thrive in an AI world, and that could assist the next generation in finding new jobs in the AI economy.

Mr Jonnie Penn, University of Cambridge, warned about the dangers of introducing AI in education. AI could be used to predict which professional path a child should follow, based, for example, on indicators of their aptitudes. Children should be agents in the process of defining the future. There is a need to break the myth that children are incapable of contributing to important decisions. There is evidence that young people think better than AI, because they are capable, for example, of asking 'what if?', whereas AI can only analyse patterns. Children question reality, while AI conforms to the parameters it is given. Young people need to be part of the co-design of technology and policy. When it comes to rights, it is important to guarantee that children can erase their profiles or data as they grow up, and to prevent data analytics from classifying them into categories that they will not be able to escape throughout life.

Questions and comments from the audience covered a wide array of topics. Participants wondered whether, and why, it is better for humans rather than AI to make decisions, since humans are often biased and sometimes more fallible. The role of parents was also discussed, and there were views that the private sector has a responsibility to help parents deal with technology and the way it is affecting their children. This would mitigate parents' emotional responses and probably curb the need for regulation in this area.

 

By Marilia Maciel

 
