Oman’s TRA to safeguard children online

The Telecommunications Regulatory Authority (TRA) in Oman has launched several initiatives to protect children online, responding to alarming statistics showing that nearly 86% of children in the Sultanate use the internet. Recognising that a substantial portion of this demographic spends considerable time online, with 43.5% using it for information searches and 34% for entertainment and communication, the authority is also pursuing a proposed law to regulate children’s internet activities.

In line with the ITU’s approach, the initiative adopts the definition in Oman’s Child Protection Law No. 22/2014, which defines a child as an individual under 18. Among the initiatives are the ‘Be Aware’ national awareness campaign, aimed at educating families on safe internet practices; the Secure Net programme, developed in partnership with Omantel and UNICEF to offer parental control features; and the Safe Net service, designed to protect users from online threats such as viruses and phishing attacks.

Through these efforts, the TRA is committed to promoting a safe and responsible digital environment for children in Oman. By addressing the growing challenges of internet usage among minors, the authority aims to foster a culture of awareness and security that empowers families and protects the well-being of the younger generation in the digital landscape.

Discord banned in Turkey following court ruling

Turkey has blocked access to the messaging platform Discord after the company refused to share information requested by the government. A court in Ankara issued the decision, citing concerns over child sexual abuse and obscene content being shared by users on the platform. The Information and Communication Technologies Authority (BTK) confirmed the ban.

The action follows outrage after a 19-year-old in Istanbul murdered two women, with Discord users allegedly praising the incident online. Justice Minister Yilmaz Tunc explained that there was sufficient suspicion of illegal activity linked to the platform, which prompted the court to intervene.

Transport Minister Abdulkadir Uraloglu added that monitoring platforms like Discord is difficult, as security forces can only act when users report content. Discord’s refusal to provide data, such as IP addresses, further complicated the situation, leading to the decision to block the service.

The ban in Turkey coincides with a similar action in Russia, where Discord was blocked for violating local laws after failing to remove prohibited content. The platform has faced growing scrutiny over its handling of illegal activity.

TikTok faces legal challenges from 13 US states over youth safety concerns

TikTok is facing multiple lawsuits from 13 US states and the District of Columbia, accusing the platform of harming and failing to protect young users. The lawsuits, filed in New York, California, and other states, allege that TikTok uses intentionally addictive software to maximise user engagement and profits, particularly targeting children who lack the ability to set healthy boundaries around screen time.

California Attorney General Rob Bonta condemned TikTok for fostering social media addiction to boost corporate profits, while New York Attorney General Letitia James connected the platform to mental health issues among young users. Washington D.C. Attorney General Brian Schwalb further accused TikTok of operating an unlicensed money transmission service through its live streaming and virtual currency features and claimed that the platform enables the sexual exploitation of minors.

TikTok, in response, denied the allegations and expressed disappointment in the legal action taken, arguing that the states should collaborate on solutions instead. The company pointed to safety measures, such as screen time limits and privacy settings for users under 16.

These lawsuits are part of a broader set of legal challenges TikTok is facing, including a prior lawsuit from the US Justice Department over children’s privacy violations. The company is also contending with efforts to ban the app in the US over concerns about its Chinese ownership.

Australian court upholds fine against X for noncompliance with child protection laws

An Australian court upheld an order on Friday requiring Elon Musk’s X to pay a fine of A$610,500 ($418,000) for not cooperating with a regulator’s request regarding anti-child-abuse practices. X had contested the fine, but the Federal Court of Australia determined that the company was obligated to respond to a notice from the eSafety Commissioner, which sought information about measures to combat child sexual exploitation material on the platform.

Musk’s company claimed it was not obligated to respond to the notice, arguing that its merger into a new corporate entity under Musk’s control had extinguished its liability. However, eSafety Commissioner Julie Inman Grant cautioned that accepting this argument could set a troubling precedent, enabling foreign companies to evade regulatory responsibilities in Australia through corporate restructuring. Alongside the fine, eSafety has also launched civil proceedings against X for noncompliance.

This is not the first confrontation between Musk and Australia’s internet safety regulator. Earlier this year, the eSafety Commissioner ordered X to take down posts showing a bishop being stabbed during a sermon. X contested the order in court, claiming that a regulator in one country should not control global content visibility. Ultimately, X retained the posts after the Australian regulator withdrew its case. Musk labelled the order as censorship and claimed it was part of a larger agenda by the World Economic Forum to impose global eSafety regulations.

TikTok faces lawsuit in Texas over child privacy breach

Texas Attorney General Ken Paxton has filed a lawsuit against TikTok, accusing the platform of violating children’s privacy laws. The lawsuit alleges that TikTok shared personal information of minors without parental consent, in breach of Texas’s Securing Children Online through Parental Empowerment Act (SCOPE Act).

The legal action seeks an injunction and civil penalties, with fines up to $10,000 per violation. Paxton claims TikTok failed to provide adequate privacy tools for children and allowed data to be shared from accounts set to private. Targeted advertising to children was also a concern raised in the lawsuit.

TikTok’s parent company, ByteDance, is also accused of prioritising profits over child safety. Paxton stressed the importance of holding large tech companies accountable for their role in protecting minors online.

The case was filed in Galveston County court, with TikTok yet to comment on the matter. The lawsuit represents a broader concern about the protection of children’s online privacy in the digital age.

EU questions YouTube, TikTok, and Snapchat over algorithms

The European Commission has requested information from YouTube, Snapchat, and TikTok regarding the algorithms used to recommend content to users. Concerns have been raised about the influence of these systems on issues like elections, mental health, and protecting minors. The inquiry falls under the Digital Services Act (DSA), aiming to address potential systemic risks, including the spread of illegal content such as hate speech and drug promotion.

TikTok faces additional scrutiny over measures to prevent bad actors from manipulating the platform, especially during elections. The platforms must provide detailed information on their recommender systems by 15 November; failure to comply could result in further action, including potential fines.

The DSA requires major tech companies to take greater responsibility for tackling illegal and harmful content. The EU has previously opened similar non-compliance proceedings against other tech giants, including Meta, AliExpress, and TikTok, over content regulation.

This latest request reflects the EU’s ongoing efforts to ensure greater accountability from social media platforms. The focus remains on protecting users and maintaining a fair and safe digital environment.

Cloudflare partners with ISPs to enhance internet security and privacy for users worldwide

Cloudflare has partnered with internet service providers (ISPs) and network equipment makers to improve the safety and privacy of internet users worldwide. By offering Cloudflare’s DNS resolvers at no cost, these providers can deliver advanced security features that are crucial in today’s digital landscape.

The partnership enables ISPs and equipment manufacturers to improve their service offerings while ensuring that users enjoy a safer browsing experience at no additional cost. With children spending more time online, particularly since the COVID-19 pandemic, the demand for protective measures has never been greater.

Cloudflare’s initiatives, such as the launch of 1.1.1.1 for Families, allow these partners to implement content filtering and security features tailored specifically for households. The strategic alignment ensures that families can confidently navigate the internet, knowing that harmful content is being filtered and their online activities are shielded from threats.
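As an illustration of how simple such a setup can be: Cloudflare publicly documents the resolver addresses behind 1.1.1.1 for Families, and a router or ISP-provisioned device filters content simply by pointing its upstream DNS at the tier matching the desired level of protection. The sketch below is illustrative only; the IP addresses are Cloudflare’s published ones, but the tier names and helper function are our own, not an official API.

```python
# Cloudflare's published "1.1.1.1 for Families" resolver tiers.
# (Addresses are from Cloudflare's public documentation; the tier names
# and helper function here are illustrative, not an official API.)
FAMILY_RESOLVERS = {
    "standard": ("1.1.1.1", "1.0.0.1"),  # no filtering
    "malware": ("1.1.1.2", "1.0.0.2"),   # blocks known malware domains
    "family": ("1.1.1.3", "1.0.0.3"),    # blocks malware and adult content
}


def upstream_dns(filter_level: str) -> tuple:
    """Return the (primary, secondary) resolver pair for a filtering tier."""
    try:
        return FAMILY_RESOLVERS[filter_level]
    except KeyError:
        raise ValueError(f"unknown filtering tier: {filter_level!r}") from None
```

A router firmware or parental-control app could use a lookup like this when a parent selects a filtering level; the actual blocking happens at Cloudflare’s resolver, so no client-side blocklist needs to be maintained.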

Furthermore, Cloudflare, alongside ISPs and network equipment providers, addresses the challenges users face in setting up effective online protections. Many consumers find configuring DNS settings and implementing security features daunting. To tackle this issue, Cloudflare is working with its partners to simplify the setup process.

By integrating Cloudflare’s services directly into their platforms, ISPs can provide a seamless user experience that encourages adoption of these safety measures. This collaborative approach ensures that even the least tech-savvy users benefit from enhanced security without feeling overwhelmed.

Why does this matter?

Cloudflare, internet service providers, and network equipment providers understand the need for flexible, customisable solutions to meet diverse user needs. With Cloudflare’s Gateway product, ISPs can offer advanced filtering options that let users tailor their online experience, including content restrictions and scheduling, such as limiting social media access. These customisable options empower users to control their online safety while boosting customer satisfaction and loyalty.

New law targets excessive phone use in California schools

California has introduced a new law requiring schools to limit or ban the use of smartphones to combat rising concerns about their impact on mental health and education. Governor Gavin Newsom signed the bill following increasing evidence linking excessive phone use with anxiety, depression, and learning difficulties.

California joins 13 other states, including Florida, which introduced a similar ban last year. Los Angeles County schools, the state’s largest district, already prohibited phones for its 429,000 students earlier this year. The law, aimed at promoting student focus and social development, reflects a broader national movement to reduce smartphone use among young people.

Surgeon General Vivek Murthy has warned of the growing mental health crisis associated with social media, comparing it to the dangers of smoking. Studies in the US suggest that teenagers spending more than three hours a day on social media are at increased risk of mental illness, with average usage exceeding four hours daily.

School boards across California will be required to implement policies limiting phone use by July 2026, with updates every five years. Newsom stressed the importance of addressing the issue early to improve students’ wellbeing and academic focus.

Snapchat’s balance between user safety and growth remains a challenge

Snapchat is positioning itself as a healthier social media alternative for teens, with CEO Evan Spiegel emphasising the platform’s different approach at the company’s annual conference. Recent research from the University of Amsterdam supports this view, showing that while platforms like TikTok and Instagram negatively affect youth mental health, Snapchat use appears to have positive effects on friendships and well-being.

However, critics argue that Snapchat’s disappearing messages feature can facilitate illegal activities. Matthew Bergman, an advocate for social media victims, claimed the platform has been used by drug dealers, citing instances of children dying from fentanyl poisoning after buying drugs via the app. Despite these concerns, Snapchat remains popular, particularly with younger users.

Industry analysts recognise the platform’s efforts but highlight its ongoing challenges. As Snapchat continues to grow its user base, balancing privacy and safety with revenue generation remains a key issue, especially as it struggles to compete with bigger players like TikTok, Meta, and Google for advertising.

Snapchat’s appeal lies in its low-pressure environment, with features like disappearing stories and augmented reality filters. Young users, like 14-year-old Lily, appreciate the casual nature of communication on the platform, while content creators praise its ability to offer more freedom and reduce social pressure compared to other social media platforms.