Australia plans to ban social media for children under 16

The Australian government has announced plans to ban social media access for children under 16, with the ban expected to take effect late next year. Prime Minister Anthony Albanese described the move as part of a world-leading initiative to combat the harms social media inflicts on children, particularly the damage to their mental and physical health. He highlighted concerns over harmful body image content aimed at girls and misogynistic material directed at boys.

Australia is also testing age-verification systems, such as biometrics and government ID checks, to ensure that children cannot access social media platforms. The new legislation will allow no exemptions, including for children with parental consent or those with pre-existing accounts. Social media platforms, rather than parents or children, will be held responsible for preventing minors from accessing their services.

The proposed ban includes major platforms such as Meta’s Instagram and Facebook, TikTok, YouTube, and X (formerly Twitter). While some digital industry representatives, like the Digital Industry Group, have criticised the plan, arguing it could push young people toward unregulated parts of the internet, Australian officials stand by the measure, emphasising the need for strong protections against online harm.

This move positions Australia as a leader in regulating children’s access to social media; no other country has implemented such stringent age-verification requirements. The new rules will be introduced into parliament this year and are set to take effect 12 months after the legislation is passed.

TikTok faces lawsuit in France after teen suicides linked to platform

Seven families in France are suing TikTok, alleging that the platform’s algorithm exposed their teenage children to harmful content, with tragic consequences including the suicides of two 15-year-olds. Filed at the Créteil judicial court, the joint case seeks to hold TikTok accountable for what the families describe as dangerous content promoting self-harm, eating disorders, and suicide.

The families’ lawyer, Laure Boutron-Marmion, argues that TikTok, as a company offering its services to minors, must answer for its platform’s risks and shortcomings. She emphasised the need for TikTok’s legal liability to be recognised, given that its algorithm is frequently blamed for pushing disturbing content. Like Meta’s Facebook and Instagram, TikTok faces multiple lawsuits worldwide accusing it of targeting minors in ways that harm their mental health.

TikTok has previously stated it is committed to protecting young users’ mental well-being and has invested in safety measures, according to CEO Shou Zi Chew’s remarks to US lawmakers earlier this year.

New video app Loops aims to compete with TikTok

A new app called Loops is aiming to be the TikTok of the fediverse, an open-source social network ecosystem. Loops, which just opened for signups, will feature short, looping videos similar to TikTok’s format. Although still in development, the platform plans to be open-source and integrate with ActivityPub, the protocol that powers other federated apps like Mastodon and Pixelfed.

Loops is the latest project from Daniel Supernault, creator of Pixelfed, and will operate under the Pixelfed umbrella. Unlike mainstream social media, Loops promises not to sell user data to advertisers, nor will it use content to train AI models. Users will retain full ownership of their videos, granting Loops only limited permissions for use.

Like other fediverse platforms, Loops will rely on user donations for funding rather than investor support, with plans to accept contributions through Patreon and similar platforms. The app will also allow users on other federated networks, like Mastodon, to interact with Loops content seamlessly. Loops is currently seeking community input on its policies and looking for moderators to guide the platform’s early stages.
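
To illustrate the federation Loops is aiming for, here is a minimal sketch of how a Loops video might travel over ActivityPub as an ActivityStreams ‘Video’ object, the wire format federated servers exchange. The domain, IDs, and account below are hypothetical, and Loops has not published its object schema; the sketch only shows the general mechanism by which a Mastodon server could receive and display a Loops post.

```python
# Minimal sketch (hypothetical domain, IDs, and actor) of a Loops video
# represented as an ActivityStreams "Video" object, the vocabulary that
# ActivityPub servers exchange.
import json

video = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Video",
    "id": "https://loops.example/videos/42",              # hypothetical object URL
    "attributedTo": "https://loops.example/users/alice",  # hypothetical author actor
    "name": "My first loop",
    "url": "https://loops.example/media/42.mp4",
    "to": ["https://www.w3.org/ns/activitystreams#Public"],
}

# New posts are wrapped in a "Create" activity and delivered to followers'
# server inboxes; a Mastodon instance receiving this could render the video
# and let its users reply to, favourite, or boost it.
create = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Create",
    "actor": video["attributedTo"],
    "object": video,
}

print(json.dumps(create, indent=2))
```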

Brazil’s Collective Defense Institute sues Meta, TikTok, Kwai over youth safety

Brazil’s Collective Defense Institute, a consumer rights organisation, has launched two lawsuits against the Brazilian divisions of TikTok, Kwai, and Meta Platforms, seeking damages of 3 billion reais ($525 million). The lawsuits accuse the companies of failing to implement adequate protections against excessive social media use by young people, which can harm children’s mental health.

The lawsuits highlight a growing debate over social media regulation in Brazil, especially after a high-profile legal dispute between Elon Musk’s X platform and a Brazilian Supreme Court justice led to significant fines. The consumer rights group is pushing for these platforms to establish clear data protection protocols and issue stronger warnings about the risks of social media addiction for minors.

Based on research into the effects of unregulated social media usage, particularly among teenagers, the lawsuits argue for urgent changes. Attorney Lillian Salgado, representing the plaintiffs, stressed the need for Brazil to adopt safety measures similar to those used in developed countries, including modifying algorithms, managing user data for those under 18, and enhancing account oversight for minors.

In response, Meta stated it has prioritised youth safety for over a decade and has developed more than 50 tools to protect teens. Meta also announced that its new ‘Teen Account’ feature on Instagram will soon launch in Brazil, automatically limiting what teenagers see and controlling who can contact them. TikTok said it had not received notice of the case, while Kwai emphasised that user safety, particularly for minors, is a primary focus.

ByteDance fires intern for disrupting AI training

ByteDance, the parent company of TikTok, has dismissed an intern for what it described as “maliciously interfering” with the training of one of its AI models. The Chinese tech giant clarified that the intern worked on the advertising technology team and had no involvement with ByteDance’s AI Lab, and said that some reports circulating on social media and other platforms have exaggerated the incident’s impact.

ByteDance stated that the interference did not disrupt its commercial operations or its large language models. It also denied claims that the damage exceeded $10 million or affected an AI training system powered by thousands of graphics processing units (GPUs). The company noted that the intern was fired in August and that it has since notified the intern’s university and relevant industry bodies.

As one of the leading tech firms in AI development, ByteDance operates popular platforms like TikTok and Douyin. The company continues to invest heavily in AI, with applications including its Doubao chatbot and a text-to-video tool named Jimeng.

Hundreds lose jobs as TikTok focuses on AI moderation

TikTok, owned by ByteDance, is cutting hundreds of jobs globally as it pivots towards greater use of AI in content moderation. Among the hardest hit is Malaysia, where fewer than 500 employees were affected, mostly involved in moderation roles. The layoffs come as TikTok seeks to improve the efficiency of its moderation system, relying more heavily on automated detection technologies.

The firm’s spokesperson explained that the move is part of a broader plan to optimise its global content moderation model, aiming for more streamlined operations. TikTok has announced plans to invest $2 billion in global trust and safety measures, with 80% of harmful content already being removed by AI.

The layoffs in Malaysia follow increased regulatory pressure on technology companies operating in the region. Malaysia’s government recently urged social media platforms, including TikTok, to enhance their monitoring systems and apply for operating licences to combat rising cybercrime.

ByteDance, which employs over 110,000 people worldwide, is expected to continue restructuring next month as it consolidates some of its regional operations. These changes highlight the company’s ongoing shift towards automation in its content management strategy.

TikTok faces legal challenges from 13 US states over youth safety concerns

TikTok is facing multiple lawsuits from 13 US states and the District of Columbia, accusing the platform of harming and failing to protect young users. The lawsuits, filed in New York, California, and other states, allege that TikTok uses intentionally addictive software to maximise user engagement and profits, particularly targeting children who lack the ability to set healthy boundaries around screen time.

California Attorney General Rob Bonta condemned TikTok for fostering social media addiction to boost corporate profits, while New York Attorney General Letitia James connected the platform to mental health issues among young users. Washington D.C. Attorney General Brian Schwalb further accused TikTok of operating an unlicensed money transmission service through its live streaming and virtual currency features and claimed that the platform enables the sexual exploitation of minors.

TikTok, in response, denied the allegations and expressed disappointment in the legal action taken, arguing that the states should collaborate on solutions instead. The company pointed to safety measures, such as screen time limits and privacy settings for users under 16.

These lawsuits are part of a broader set of legal challenges TikTok is facing, including a prior lawsuit from the US Justice Department over children’s privacy violations. The company is also contending with efforts to ban the app in the US over concerns about its Chinese ownership.

Meta revamps Facebook to engage young adults

Facebook, once the go-to platform for connecting with family and friends, is shifting its focus to attract younger users, according to Tom Alison, head of Facebook at Meta. With younger generations favouring apps like Instagram and TikTok, Meta aims to revitalise Facebook by helping users expand their networks and make new connections, aligning with how young adults use the platform today.

To achieve this, Facebook is testing two new tabs, Local and Explore, designed to help users find nearby events, community groups, and content tailored to their interests. The push builds on Meta’s efforts to compete with TikTok, which has 150 million US users; Meta introduced its own short-form video feature, Reels, in 2021. Meta’s data shows that young adults on Facebook spend 60% of their time watching videos, with over half engaging with Reels daily.

Facebook also reported a 24% increase in conversations initiated through its dating feature among young adults in the US and Canada. At a recent event in Austin, Texas, the platform promoted its new direction with the slogan ‘Not your mom’s Facebook,’ emphasising its push to attract a younger audience.

TikTok faces lawsuit in Texas over child privacy breach

Texas Attorney General Ken Paxton has filed a lawsuit against TikTok, accusing the platform of violating children’s privacy laws. The lawsuit alleges that TikTok shared minors’ personal information without parental consent, in breach of Texas’s Securing Children Online through Parental Empowerment (SCOPE) Act.

The legal action seeks an injunction and civil penalties, with fines up to $10,000 per violation. Paxton claims TikTok failed to provide adequate privacy tools for children and allowed data to be shared from accounts set to private. Targeted advertising to children was also a concern raised in the lawsuit.

TikTok’s parent company, ByteDance, stands accused of prioritising profits over child safety. Paxton stressed the importance of holding large tech companies accountable for their role in protecting minors online.

The case was filed in Galveston County court, with TikTok yet to comment on the matter. The lawsuit represents a broader concern about the protection of children’s online privacy in the digital age.

EU questions YouTube, TikTok, and Snapchat over algorithms

The European Commission has requested information from YouTube, Snapchat, and TikTok regarding the algorithms used to recommend content to users. Concerns have been raised about the influence of these systems on issues like elections, mental health, and protecting minors. The inquiry falls under the Digital Services Act (DSA), aiming to address potential systemic risks, including the spread of illegal content such as hate speech and drug promotion.

TikTok faces additional scrutiny over its measures to prevent bad actors from manipulating the platform, especially during elections. The platforms must provide detailed information on their systems by 15 November; failure to comply could result in further action, including potential fines.

The DSA requires major tech companies to take greater responsibility for tackling illegal and harmful content. The EU has previously opened similar non-compliance proceedings against other tech giants, including Meta, AliExpress, and TikTok, over content regulation.

This latest request reflects the EU’s ongoing efforts to ensure greater accountability from social media platforms. The focus remains on protecting users and maintaining a fair and safe digital environment.