New report examines TikTok’s algorithm in recommending harmful content to vulnerable teen users

Researchers from the Center for Countering Digital Hate examined how the algorithmic recommendations of TikTok's For You feed respond to teen users who express interest in eating disorders, body image, and mental health.

To study TikTok's algorithm, researchers created brand-new accounts registered to 13-year-old users in the USA, the UK, Australia, and Canada. One account in each country was given a username suggesting a preoccupation with appearance. On each account, researchers watched and liked any videos about body image, mental health, or eating disorders, and then recorded the first 30 minutes of algorithmically suggested content on that account's For You feed. The recordings were examined to see how frequently content on eating disorders, self-harm, and body image was recommended.

The study found that TikTok recommended suicide-related content within 2.6 minutes and eating disorder content within 8 minutes. On average, teen accounts received recommendations for videos on body image and mental health every 39 seconds. According to the study, vulnerable accounts, i.e. those with a vulnerability cue in their usernames, were recommended self-harm videos 12 times more frequently than standard accounts.

TikTok sued in a US state for security and safety violations

Indiana’s Attorney General filed a lawsuit against TikTok for violating state consumer protection laws. The lawsuit alleges that the social media company failed to disclose that ByteDance, the Chinese company that owns TikTok, has access to sensitive consumer information. A second complaint claims that the company exposes children to sexual and substance-related content while misleading users with its 12+ age rating on the App Store and Google Play. Indiana seeks penalties of up to US$5,000 per violation and asks the Indiana Superior Court to order the company to stop making false and misleading representations to its users.

New amendments introduced to UK Online Safety Bill

The UK Government has introduced amendments to the Online Safety Bill, addressing the removal of online content. The new version of the Bill will not define types of objectionable content; rather, it will offer a ‘triple shield’ of protection to users. Online platforms will be required to remove illegal content or content violating their community guidelines, and to provide adult users with greater control over the content they see. Online platforms will also be expected to be more transparent about online risks to children and to demonstrate how they enforce age verification measures. Another set of amendments will protect women and girls online, introducing controlling or coercive behaviour as a criminal offence under the Bill and requiring that online platforms act more proactively against such content. The Bill is scheduled to return to the UK Parliament next week, with the first amendments tabled in the Commons for Report Stage on 5 December. Further amendments are expected in the later stages of the legislative process.

WHO report evaluates online safety educational programmes for youth

The World Health Organization (WHO) issued a report titled What Works to Prevent Online Violence Against Children, which shows that prevention education for children can work and is a key strategy for addressing online violence against children (VAC).

Building on a review of evaluations of online safety programmes and online VAC programmes for children and adolescents, the report argues that educational programmes have been widely demonstrated to improve overall safety and health. These educational programmes are particularly effective in preventing one type of online VAC – cyberbullying (both victimisation and perpetration).

The report also identified a number of structural and skill components that contribute to the effectiveness of educational programmes and should be widely adopted. Structural components include multiple and varied learning strategies and tools; more lessons, message exposures, reminders, and follow-ups; peer engagement, role-plays, and interaction; a supportive whole-school environment; and parental involvement. Skill components include problem-solving, assertiveness, empathy, self-regulation, help-seeking, bystander or defender mobilisation, social norm instruction, sex education, and substance abuse education.

The report also revealed that there is a lack of evidence about the success of prevention programmes for online child sexual exploitation and abuse.

The report suggests implementing school-based educational programmes with multiple sessions that encourage youth interaction and involve parents. It emphasises the need for more violence prevention programmes that integrate content about online dangers with offline violence prevention. It also suggests placing less emphasis on stranger danger and more on acquaintance and peer perpetrators, who are responsible for the majority of online violence against children.

New report calls for digital tech to be developed and regulated in ways that maximise benefits for young users

The US National Scientific Council on Adolescence, in its report titled Engaging, Safe, and Evidence-Based: What Science Tells Us About How to Promote Positive Development and Decrease Risk in Online Spaces, recommended that digital technology be designed and regulated in ways that maximise positive, equitable benefits for all young adolescents and limit potential harm. Changes to digital technology for young users should be supported by data from developmental research and consistent with already accepted standards.

The report also advocates better educating children, parents, product designers, teachers, legislators, and other stakeholders about the advantages and disadvantages of digital technology use. More evidence-based approaches will help ensure that early adolescents' use of digital technology promotes wellbeing and constructive development while minimising exposure to harm.

The report advances four evidence-based recommendations:

  1. Digital technology should promote healthy development and wellness.
  2. Digital technology should be designed and used in a way that is safe for early adolescents.
  3. The design and evaluation of digital technology used by young adolescents should take into account the best available research and advance it. Any digital technology platforms that may pose real health risks to young adolescents should be subjected to independent evaluation by experts in developmental science, mental health, and other relevant fields.
  4. All early adolescents should have consistent access to the level of digital connectivity and devices needed to participate fully in their education and learning.

Legislating for the digital age

The report is designed to be used by governments, country offices of international organisations, civil society organisations, and business organisations to help ensure that all aspects of online child sexual exploitation and abuse are addressed and included in legislation, in accordance with international and regional standards and good practices.