Senator Richard Blumenthal has reaffirmed that ByteDance must divest TikTok’s US operations by January 19 or risk a ban. The measure, driven by security concerns over potential Chinese surveillance, was signed into law in April. A one-time extension of 90 days is available if significant progress is made, but Blumenthal emphasised that laws cannot be disregarded.
Blumenthal also raised alarms over China’s influence on US technology companies. Tesla’s production in China and the US military’s reliance on SpaceX were flagged as security risks. He pointed to Elon Musk’s economic ties with China as a potential vulnerability, warning that such dependencies could compromise national interests.
Apple faced criticism for complying with Chinese censorship and surveillance demands while generating significant revenue from the country. Concerns were voiced that major tech companies might prioritise profits over US security. Neither Apple nor Tesla has commented on these claims.
TikTok and ByteDance are challenging the divestment law in court. A ruling is expected soon; if ByteDance does not comply by the deadline, app stores and internet hosting services will face restrictions on distributing and supporting the app. The Biden administration has clarified that it supports ending Chinese ownership of TikTok rather than an outright ban.
TikTok has rolled out Symphony Creative Studios worldwide, a generative AI video creation platform designed for advertisers. The platform aims to simplify the creation of tailored, high-quality content for businesses, creators, and agencies.
Unveiled earlier this year at the TikTok World Product Summit, Symphony is part of a broader suite of tools. These include Symphony Assistant, Symphony Digital Avatars, and the TikTok Ads Manager, all focused on enhancing creative capabilities on the platform.
Symphony Creative Studios offers features like automated content generation from text, video previews, remixing, and digital avatar creation. Users can also access tools for translation and customisation, making it easier to adapt content for diverse audiences.
AI-powered tools have become essential in attracting brands, with TikTok joining other tech companies in integrating these technologies to strengthen its advertising business. Symphony aims to position the platform as a leader in digital marketing innovation.
The Irish media regulator, Coimisiún na Meán, has ordered TikTok, X, and Meta to take decisive steps to prevent the spread of terrorist content on their services, giving the platforms three months to report on their progress.
This action follows notifications from EU authorities under the Terrorist Content Online Regulation. If the platforms fail to comply, the regulator can impose fines of up to four percent of their global revenue.
This decision aligns with Ireland’s broader enforcement of digital laws, including the Digital Services Act (DSA) and a new online safety code. The DSA has already prompted investigations, such as the European Commission’s probe into X last December, and Ireland’s new safety code will impose binding content moderation rules for video-sharing platforms with European headquarters in Ireland. These initiatives aim to curb the spread of harmful and illegal content on major social media platforms.
The Canadian government has ordered TikTok’s Canadian business to shut down, citing national security concerns over the app’s Chinese ownership. The decision, announced Wednesday, affects the operations of TikTok’s parent company, ByteDance, but does not block Canadians from accessing the app or creating content on it. According to Canadian Innovation Minister François-Philippe Champagne, the shutdown aims to address specific security risks posed by ByteDance’s activities in Canada.
This action comes after Canada’s year-long review of TikTok’s investment plans in the country. Canadian law allows the government to scrutinise foreign investments for potential risks, though details of these assessments are confidential. In response, TikTok has announced plans to contest the order in court, citing concerns about job losses for local employees impacted by the decision.
While Canada has already banned TikTok on government-issued devices, the shutdown of ByteDance’s Canadian operations reflects mounting pressure on TikTok in North America. The United States has set a January deadline for ByteDance to divest its US TikTok assets or face a ban. Both countries point to national security risks associated with TikTok’s ownership and data practices as key reasons for these measures.
The Australian government has announced plans to introduce a ban on social media access for children under 16, with the restrictions expected to take effect late next year. Prime Minister Anthony Albanese described the move as part of a world-leading initiative to combat the harms social media inflicts on children, particularly the damage to their mental and physical health. He highlighted concerns over harmful body image content shown to girls and misogynistic material directed at boys.
Australia is also testing age-verification systems, such as biometrics and government ID checks, to keep children off social media platforms. The legislation will allow no exemptions, including for children with parental consent or those with pre-existing accounts. Responsibility for blocking access by minors will fall on the platforms themselves, rather than on parents or children.
The proposed ban includes major platforms such as Meta’s Instagram and Facebook, TikTok, YouTube, and X (formerly Twitter). While some digital industry representatives, like the Digital Industry Group, have criticised the plan, arguing it could push young people toward unregulated parts of the internet, Australian officials stand by the measure, emphasising the need for strong protections against online harm.
This move positions Australia as a leader in regulating children’s access to social media, with no other country having implemented such stringent age-verification requirements. The new rules will be introduced into parliament this year and are set to take effect 12 months after the legislation is passed.
Seven families in France are suing TikTok, alleging that the platform’s algorithm exposed their teenage children to harmful content, leading to tragic consequences, including the suicides of two 15-year-olds. Filed at the Créteil judicial court, the joint case seeks to hold TikTok accountable for what the families describe as dangerous content promoting self-harm, eating disorders, and suicide.
The families’ lawyer, Laure Boutron-Marmion, argues that TikTok, as a company offering its services to minors, must address its platform’s risks and shortcomings. She emphasised the need for TikTok’s legal liability to be recognised, especially given that its algorithm is often blamed for pushing disturbing content. TikTok, like Meta’s Facebook and Instagram, faces multiple lawsuits worldwide accusing these platforms of targeting minors in ways that harm their mental health.
TikTok has previously stated it is committed to protecting young users’ mental well-being and has invested in safety measures, according to CEO Shou Zi Chew’s remarks to US lawmakers earlier this year.
A new app called Loops is aiming to be the TikTok of the fediverse, an open-source social network ecosystem. Loops, which just opened for signups, will feature short, looping videos similar to TikTok’s format. Although still in development, the platform plans to be open-source and integrate with ActivityPub, the protocol that powers other federated apps like Mastodon and Pixelfed.
Loops is the latest project from Daniel Supernault, creator of Pixelfed, and will operate under the Pixelfed umbrella. Unlike mainstream social media, Loops promises not to sell user data to advertisers, nor will it use content to train AI models. Users will retain full ownership of their videos, granting Loops only limited permissions for use.
Like other fediverse platforms, Loops will rely on user donations for funding rather than investor support, with plans to accept contributions through Patreon and similar platforms. The app will also allow users on other federated networks, like Mastodon, to interact with Loops content seamlessly. Loops is currently seeking community input on its policies and looking for moderators to guide the platform’s early stages.
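For readers curious about what that federation looks like under the hood, the sketch below is a minimal, hypothetical illustration (in Python, standard library only) of the kind of ActivityPub ‘Create’ activity a short-video server could publish so that a Mastodon server can render the post. The object fields follow the ActivityStreams vocabulary, but the actor URLs, IDs, and endpoints are invented for illustration and do not reflect Loops’ actual implementation.

```python
# Illustrative sketch only: a minimal ActivityPub "Create" activity wrapping a
# Video object, of the kind a federated short-video server could deliver to a
# follower's inbox. Actor names, IDs, and media URLs are hypothetical.
import json

ACTIVITYSTREAMS_CONTEXT = "https://www.w3.org/ns/activitystreams"


def build_video_create_activity(actor: str, video_url: str, caption: str) -> dict:
    """Wrap a short looping video in an ActivityPub Create activity."""
    video_object = {
        "type": "Video",                                # ActivityStreams object type
        "id": f"{actor}/videos/1",                      # hypothetical object ID
        "attributedTo": actor,
        "name": caption,
        "url": [{"type": "Link", "mediaType": "video/mp4", "href": video_url}],
        "to": [f"{ACTIVITYSTREAMS_CONTEXT}#Public"],    # public addressing collection
    }
    return {
        "@context": ACTIVITYSTREAMS_CONTEXT,
        "type": "Create",
        "id": f"{actor}/activities/1",                  # hypothetical activity ID
        "actor": actor,
        "object": video_object,
        "to": video_object["to"],
    }


if __name__ == "__main__":
    activity = build_video_create_activity(
        actor="https://loops.example/users/alice",         # placeholder actor URL
        video_url="https://loops.example/media/clip.mp4",  # placeholder media URL
        caption="My first loop",
    )
    # A real server would sign this payload (HTTP Signatures) and POST it to each
    # follower's inbox; here we simply print the JSON for illustration.
    print(json.dumps(activity, indent=2))
```

Because both servers speak the same vocabulary, a Mastodon instance can display such a post and route replies or likes back without any Loops-specific code, which is what makes the cross-network interaction described above possible.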
Brazil’s Collective Defense Institute, a consumer rights organisation, has launched two lawsuits against the Brazilian divisions of TikTok, Kwai, and Meta Platforms, seeking damages of 3 billion reais ($525 million). The lawsuits accuse the companies of failing to put adequate safeguards in place against excessive social media use by young people, which they argue can harm children’s mental health.
The lawsuits highlight a growing debate over social media regulation in Brazil, especially after a high-profile legal dispute between Elon Musk’s X platform and a Brazilian Supreme Court justice led to significant fines. The consumer rights group is pushing for these platforms to establish clear data protection protocols and issue stronger warnings about the risks of social media addiction for minors.
Based on research into the effects of unregulated social media usage, particularly among teenagers, the lawsuits argue for urgent changes. Attorney Lillian Salgado, representing the plaintiffs, stressed the need for Brazil to adopt safety measures similar to those used in developed countries, including modifying algorithms, managing user data for those under 18, and enhancing account oversight for minors.
In response, Meta stated it has prioritised youth safety for over a decade, creating over 50 tools to protect teens. Meta also announced that a new ‘Teen Account’ feature on Instagram will soon launch in Brazil, automatically limiting what teenagers see and controlling who can contact them. TikTok said it had not received notice of the case, while Kwai emphasised that user safety, particularly for minors, is a primary focus.
ByteDance, the parent company of TikTok, has dismissed an intern for what it described as “maliciously interfering” with the training of one of its AI models. The Chinese tech giant said the intern worked on the advertising technology team and had no experience with ByteDance’s AI Lab, adding that some reports circulating on social media and other platforms have exaggerated the incident’s impact.
ByteDance stated that the interference did not disrupt its commercial operations or its large language AI models. It also denied claims that the damage exceeded $10 million or affected an AI training system powered by thousands of graphics processing units (GPUs). The company highlighted that the intern was fired in August, and it has since notified their university and relevant industry bodies.
As one of the leading tech firms in AI development, ByteDance operates popular platforms like TikTok and Douyin. The company continues to invest heavily in AI, with applications including its Doubao chatbot and a text-to-video tool named Jimeng.
TikTok, owned by ByteDance, is cutting hundreds of jobs globally as it pivots towards greater use of AI in content moderation. Among the hardest hit is Malaysia, where fewer than 500 employees were affected, mostly involved in moderation roles. The layoffs come as TikTok seeks to improve the efficiency of its moderation system, relying more heavily on automated detection technologies.
The firm’s spokesperson explained that the move is part of a broader plan to optimise its global content moderation model and streamline operations. TikTok has said it plans to invest $2 billion in trust and safety globally, with automated technologies already removing 80% of content that violates its guidelines.
The layoffs in Malaysia follow increased regulatory pressure on technology companies operating in the region. Malaysia’s government recently urged social media platforms, including TikTok, to enhance their monitoring systems and apply for operating licences to combat rising cybercrime.
ByteDance, which employs over 110,000 people worldwide, is expected to continue restructuring next month as it consolidates some of its regional operations. These changes highlight the company’s ongoing shift towards automation in its content management strategy.