Meta faces lawsuits over teen mental health concerns

A federal judge in California has ruled that Meta must face lawsuits from several US states alleging that Facebook and Instagram contribute to mental health problems among teenagers. The states argue that Meta’s platforms are deliberately designed to be addictive, harming young users. Over 30 states, including California, New York, and Florida, filed these lawsuits last year.

Judge Yvonne Gonzalez Rogers rejected Meta’s attempt to dismiss the cases, though she did limit some claims. Section 230 of the US Communications Decency Act, which offers online platforms legal protections, shields Meta from certain accusations. However, the judge found enough evidence to allow the lawsuits to proceed, enabling the plaintiffs to gather further evidence and pursue a potential trial.

The decision also impacts personal injury cases filed by individual users against Meta, TikTok, YouTube, and Snapchat. Meta is the only company named in the state lawsuits, with plaintiffs seeking damages and changes to allegedly harmful business practices. California Attorney General Rob Bonta welcomed the ruling, stating that Meta should be held accountable for the harm it has caused to young people.

Meta disagrees with the decision, insisting it has developed tools to support parents and teenagers, such as new Teen Accounts on Instagram. Google also rejected the allegations, saying its efforts to create a safer online experience for young people remain a priority. Many other lawsuits across the US accuse social media platforms of fuelling anxiety, depression, and body-image concerns through addictive algorithms.

Oman’s TRA to safeguard children online

The Telecommunications Regulatory Authority (TRA) in Oman has launched several initiatives to protect children online, responding to statistics showing that nearly 86% of children in the Sultanate use the internet. Recognising that a substantial portion of this demographic spends considerable time online, with 43.5% using it for information searches and 34% for entertainment and communication, the authority is actively pursuing a proposed law to regulate children’s internet activities.

These initiatives align with the ITU’s definition of a child and with Oman’s Child Protection Law No. 22/2014, which defines children as individuals under 18. They include the ‘Be Aware’ national awareness campaign, aimed at educating families on safe internet practices; the Secure Net program, developed in partnership with Omantel and UNICEF to offer parental control features; and the Safe Net service, designed to protect users from online threats such as viruses and phishing attacks.

Through these efforts, the TRA is committed to promoting a safe and responsible digital environment for children in Oman. By addressing the growing challenges of internet usage among minors, the authority aims to foster a culture of awareness and security that empowers families and protects the well-being of the younger generation in the digital landscape.

TikTok faces legal challenges from 13 US states over youth safety concerns

TikTok is facing multiple lawsuits from 13 US states and the District of Columbia, accusing the platform of harming and failing to protect young users. The lawsuits, filed in New York, California, and other states, allege that TikTok uses intentionally addictive software to maximise user engagement and profits, particularly targeting children who lack the ability to set healthy boundaries around screen time.

California Attorney General Rob Bonta condemned TikTok for fostering social media addiction to boost corporate profits, while New York Attorney General Letitia James connected the platform to mental health issues among young users. Washington D.C. Attorney General Brian Schwalb further accused TikTok of operating an unlicensed money transmission service through its live streaming and virtual currency features and claimed that the platform enables the sexual exploitation of minors.

TikTok, in response, denied the allegations and expressed disappointment in the legal action taken, arguing that the states should collaborate on solutions instead. The company pointed to safety measures, such as screen time limits and privacy settings for users under 16.

These lawsuits are part of a broader set of legal challenges TikTok is facing, including a prior lawsuit from the US Justice Department over children’s privacy violations. The company is also dealing with efforts to ban the app in the US due to concerns about its Chinese ownership.

Australian court upholds fine against X for noncompliance with child protection laws

An Australian court upheld an order on Friday requiring Elon Musk’s X to pay a fine of A$610,500 ($418,000) for not cooperating with a regulator’s request regarding anti-child-abuse practices. X had contested the fine, but the Federal Court of Australia determined that the company was obligated to respond to a notice from the eSafety Commissioner, which sought information about measures to combat child sexual exploitation material on the platform.

Musk’s company claimed it was not obligated to respond to the notice due to its integration into a new corporate entity under his control, which it argued eliminated its liability. However, eSafety Commissioner Julie Inman Grant cautioned that accepting this argument could set a troubling precedent, enabling foreign companies to evade regulatory responsibilities in Australia through corporate restructuring. Alongside the fine, eSafety has also launched civil proceedings against X for noncompliance.

This is not the first confrontation between Musk and Australia’s internet safety regulator. Earlier this year, the eSafety Commissioner ordered X to take down posts showing a bishop being stabbed during a sermon. X contested the order in court, claiming that a regulator in one country should not control global content visibility. Ultimately, X retained the posts after the Australian regulator withdrew its case. Musk labelled the order as censorship and claimed it was part of a larger agenda by the World Economic Forum to impose global eSafety regulations.

TikTok faces lawsuit in Texas over child privacy breach

Texas Attorney General Ken Paxton has filed a lawsuit against TikTok, accusing the platform of violating children’s privacy laws. The lawsuit alleges that TikTok shared personal information of minors without parental consent, in breach of Texas’s Securing Children Online through Parental Empowerment Act (SCOPE Act).

The legal action seeks an injunction and civil penalties, with fines up to $10,000 per violation. Paxton claims TikTok failed to provide adequate privacy tools for children and allowed data to be shared from accounts set to private. Targeted advertising to children was also a concern raised in the lawsuit.

TikTok’s parent company, ByteDance, is being held responsible for allegedly prioritising profits over child safety. Paxton stressed the importance of holding large tech companies accountable for their role in protecting minors online.

The case was filed in Galveston County court, with TikTok yet to comment on the matter. The lawsuit represents a broader concern about the protection of children’s online privacy in the digital age.

AV1 robot bridges gap for children unable to attend school

Children who are chronically ill and unable to attend school can now stay connected to the classroom using the AV1 robot, developed by the Norwegian company No Isolation. This innovative technology serves as their eyes and ears, allowing them to engage with lessons and interact with friends remotely. Controlled via an app, the robot sits on a classroom desk, enabling students to rotate its view, speak to classmates, and even signal when they want to participate.

The AV1 has been especially valuable for children undergoing long-term treatment or experiencing mental health challenges, helping them maintain a connection with their peers and stay socially included. Schools can rent or purchase the AV1, which has been widely adopted, particularly in the UK and Germany, where over 1,000 units are active. For many students, the robot has become a lifeline during extended absences from school.

Though widely praised, there are logistical challenges in introducing the AV1 to schools and hospitals, including administrative hurdles and technical issues like weak Wi-Fi. Despite these obstacles, teachers and families have found the robot to be highly effective, with privacy protections and features tailored to students’ needs, including the option to avoid showing their face on screen.

Research has highlighted the AV1’s potential to keep children both socially and academically connected, and No Isolation has rolled out a training resource, AV1 Academy, to support teachers and schools in using the technology effectively. With its user-friendly design and robust privacy features, the AV1 continues to make a positive impact on the lives of children facing illness and long absences from school.

EU questions YouTube, TikTok, and Snapchat over algorithms

The European Commission has requested information from YouTube, Snapchat, and TikTok regarding the algorithms used to recommend content to users. Concerns have been raised about the influence of these systems on issues like elections, mental health, and protecting minors. The inquiry falls under the Digital Services Act (DSA), aiming to address potential systemic risks, including the spread of illegal content such as hate speech and drug promotion.

TikTok faces additional scrutiny about measures to prevent bad actors from manipulating the platform, especially during elections. These platforms must provide detailed information on their systems by 15 November. Failure to comply could result in further action, including potential fines.

The DSA mandates that major tech companies take more responsibility in tackling illegal and harmful content. In the past, the EU has initiated similar non-compliance proceedings with other tech giants like Meta, AliExpress, and TikTok over content regulation.

This latest request reflects the EU’s ongoing efforts to ensure greater accountability from social media platforms. The focus remains on protecting users and maintaining a fair and safe digital environment.

Ello’s new AI tool lets kids create their own stories

Ello, an AI reading companion designed to help children struggling with reading, has introduced a new feature called ‘Storytime’. This feature enables kids to create their own stories by choosing from a range of settings, characters, and plots. Story options are tailored to the child’s reading level and current lessons, helping them practise essential reading skills.

Ello’s AI, represented by a bright blue elephant, listens to children as they read aloud and helps correct mispronunciations. The tool uses phonics-based strategies to adapt stories based on the child’s responses, ensuring personalised and engaging experiences. It also offers two reading modes: one where the child and Ello take turns reading and another, more supportive mode for younger readers.

The Storytime feature distinguishes itself from other AI-assisted story creation tools by focusing on reading development. The technology has been tested with teachers and children, and includes safeguards to ensure age-appropriate content. Future versions of the product may allow even more creative input from children, while maintaining helpful structure to avoid overwhelming them.

Ello’s subscription costs $14.99 per month, with discounted pricing for low-income families. The company also partners with schools to offer its services for free, and has recently made its collection of decodable children’s books available online at no cost.

Cloudflare partners with ISPs to enhance internet security and privacy for users worldwide

Cloudflare, internet service providers, and network equipment providers have partnered to enhance the safety and privacy of internet users globally. By offering Cloudflare’s DNS resolvers at no cost, these providers can deliver advanced security features crucial in today’s digital landscape.

The partnership empowers ISPs and equipment manufacturers to improve their service offerings and ensures that users can enjoy a safer browsing experience without additional costs. With children spending more time online, particularly during the COVID-19 pandemic, the demand for protective measures has never been greater.

Cloudflare’s initiatives, such as the launch of 1.1.1.1 for Families, allow these partners to implement content filtering and security features tailored specifically for households. The strategic alignment ensures that families can confidently navigate the internet, knowing that harmful content is being filtered and their online activities are shielded from threats.
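As a concrete illustration, the filtering described above works at the DNS level: a household or router simply points its DNS settings at Cloudflare’s published 1.1.1.1 for Families resolver addresses. The sketch below, using the standard `dig` utility, shows how a lookup through a filtered resolver compares with an unfiltered one; the addresses are those documented by Cloudflare, but the exact behaviour for any given domain depends on Cloudflare’s current blocklists.

```shell
# Cloudflare 1.1.1.1 for Families resolver addresses:
#   1.1.1.2 / 1.0.0.2  - blocks known malware domains
#   1.1.1.3 / 1.0.0.3  - blocks malware and adult content

# Ordinary, unfiltered lookup via the standard 1.1.1.1 resolver:
dig +short example.com @1.1.1.1

# Same lookup via the family-filtered resolver; domains on the
# blocklist are answered with 0.0.0.0 instead of their real address:
dig +short example.com @1.1.1.3
```

In practice, ISPs and router manufacturers embed these resolver addresses in their firmware, so subscribers get the filtering by default without ever editing DNS settings themselves.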

Furthermore, Cloudflare, alongside ISPs and network equipment providers, addresses the challenges users face in setting up effective online protections. Many consumers find configuring DNS settings and implementing security features daunting. To tackle this issue, Cloudflare is working with its partners to simplify the setup process.

By integrating Cloudflare’s services directly into their platforms, ISPs can provide a seamless user experience that encourages the adoption of these important safety measures. This collaborative approach ensures that even the least tech-savvy users can benefit from enhanced security without feeling overwhelmed.

Why does this matter?

Cloudflare, internet service providers, and network equipment providers understand the need for flexible, customisable solutions to meet diverse user needs. With Cloudflare’s Gateway product, ISPs can offer advanced filtering options that let users tailor their online experience, including content restrictions and scheduling, such as limiting social media access. These customisable options empower users to control their online safety while boosting customer satisfaction and loyalty.