Addressing children’s privacy and edtech apps

2 Dec 2022 08:15h - 09:45h


Event report

Most education technology (EdTech) applications used to deliver classroom education from home handle children’s data in ways that violate their privacy rights and best interests, serving purposes unrelated to their education, according to a new investigative report by Human Rights Watch. The report analyses the EdTech endorsed by 49 governments for children’s education during the pandemic. Session participants agreed that governments’ endorsements of the majority of these online learning platforms put children’s privacy and other rights at risk. But in the rush to connect children to virtual classrooms during the COVID-19 pandemic, few governments checked whether the EdTech they were endorsing was safe for children.

The treatment of children’s student data as a commercial asset is largely unregulated, and EdTech companies appear to have taken advantage of that. Most online learning platforms sent or granted access to children’s data to third-party companies, usually advertisers. This gave the sophisticated algorithms of advertising technology (AdTech) companies the opportunity to profile, analyse, and track children in order to anticipate what a child might do next and how they might be influenced. Experts confirmed that some EdTech products targeted children with behavioural advertising and risked shaping their opinions and beliefs at a stage in life when they are especially vulnerable to manipulative interference. Of the 163 EdTech products reviewed, 145 (89%) appeared to engage in data practices that put children’s rights at risk and monitored children, in most cases without the knowledge or consent of children or their parents.

Most EdTech products were offered to governments at no direct financial cost in exchange for endorsement and widespread adoption during the COVID-19 school closures. Governments then made it compulsory for students and teachers to use these EdTech products, making it impossible to opt out, as opting out would have meant not attending school.

Based on these facts, the session participants were clear about the needed actions to protect children’s rights online:

  • Remedy is needed for children whose data was collected by EdTech during the pandemic and remains at risk of misuse and exploitation. Governments should conduct data privacy audits of the EdTech endorsed for children’s learning, remove products that fail these audits, and immediately notify and guide affected schools, teachers, parents, and children to prevent further collection and misuse of children’s data.
  • Child-specific data protection laws that address the significant child rights impacts of the collection, processing, and use of children’s personal data should be adopted in every country.
  • Behavioural advertising to children and the profiling of children should be banned. Commercial interests and behavioural advertising should not be considered legitimate grounds for data processing that override a child’s best interests or fundamental rights.

Experts concluded that technical changes to stop tracking children could be easily implemented and enforced. Many require no budget: often it is a matter of asking a company to change a line of code so that it does not track children’s location data in a certain way, or does not share that data with third-party advertisers. These are specific, concrete requirements that a regulator could impose on a company, and their implementation can be easily verified.
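To illustrate how small such a change can be, the sketch below strips location and identifier fields from an analytics event before it is shared with any third party. Every name, field, and function here is invented for illustration; it does not reflect the code of any real EdTech product or advertising SDK.

```python
# Hypothetical sketch: remove location and device-identifier fields
# from an analytics event before it is forwarded to third parties.
# All field names are illustrative assumptions, not a real schema.

SENSITIVE_FIELDS = {"latitude", "longitude", "ip_address", "advertising_id"}

def sanitize_for_third_parties(event: dict) -> dict:
    """Return a copy of the event with sensitive fields removed."""
    return {k: v for k, v in event.items() if k not in SENSITIVE_FIELDS}

event = {
    "event": "lesson_completed",
    "lesson_id": "math-101",
    "latitude": 52.52,
    "longitude": 13.405,
    "advertising_id": "abc-123",
}

safe_event = sanitize_for_third_parties(event)
# safe_event retains only the pedagogically relevant fields
# ("event" and "lesson_id"); location and ad identifiers are dropped.
```

A regulator could verify a requirement like this by inspecting the payloads an app actually transmits, which is why such fixes are considered easy to audit.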


The session in keywords

[Word cloud of session keywords – WS471, Addressing children’s privacy, IGF 2022]