TikTok faces legal challenges from 13 US states over youth safety concerns

TikTok is facing lawsuits from 13 US states and the District of Columbia, which accuse the platform of harming young users and failing to protect them. The suits, filed in New York, California, and other states, allege that TikTok uses intentionally addictive software to maximise user engagement and profits, particularly targeting children who lack the ability to set healthy boundaries around screen time.

California Attorney General Rob Bonta condemned TikTok for fostering social media addiction to boost corporate profits, while New York Attorney General Letitia James connected the platform to mental health issues among young users. Washington D.C. Attorney General Brian Schwalb further accused TikTok of operating an unlicensed money transmission service through its live streaming and virtual currency features and claimed that the platform enables the sexual exploitation of minors.

TikTok denied the allegations and said it was disappointed that the states had chosen litigation, arguing that they should instead collaborate with the company on solutions. The company pointed to safety measures such as screen time limits and privacy settings for users under 16.

These lawsuits are part of a broader set of legal challenges TikTok is facing, including a prior lawsuit from the U.S. Justice Department over children’s privacy violations. The company is also dealing with efforts to ban the app in the US due to concerns about its Chinese ownership.

Meta revamps Facebook to engage young adults

Facebook, once the go-to platform for connecting with family and friends, is shifting its focus to attract younger users, according to Tom Alison, head of Facebook at Meta. With younger generations favouring apps like Instagram and TikTok, Meta aims to revitalise Facebook by helping users expand their networks and make new connections, aligning with how young adults use the platform today.

To achieve this, Facebook is testing two new tabs, Local and Explore, aimed at helping users find nearby events, community groups, and content tailored to their interests. The push builds on Meta’s earlier efforts to compete with TikTok, which has 150 million US users, including the 2021 introduction of its short-form video feature, Reels. Data reveals that young adults on Facebook spend 60% of their time watching videos, with over half engaging with Reels daily.

Facebook also reported a 24% increase in conversations initiated through its dating feature among young adults in the US and Canada. At a recent event in Austin, Texas, the platform promoted its new direction with the slogan ‘Not your mom’s Facebook,’ emphasising its push to attract a younger audience.

TikTok faces lawsuit in Texas over child privacy breach

Texas Attorney General Ken Paxton has filed a lawsuit against TikTok, accusing the platform of violating children’s privacy laws. The lawsuit alleges that TikTok shared personal information of minors without parental consent, in breach of Texas’s Securing Children Online through Parental Empowerment Act (SCOPE Act).

The legal action seeks an injunction and civil penalties, with fines up to $10,000 per violation. Paxton claims TikTok failed to provide adequate privacy tools for children and allowed data to be shared from accounts set to private. Targeted advertising to children was also a concern raised in the lawsuit.

The lawsuit also names TikTok’s parent company, ByteDance, accusing it of prioritising profits over child safety. Paxton stressed the importance of holding large tech companies accountable for their role in protecting minors online.

The case was filed in Galveston County court, with TikTok yet to comment on the matter. The lawsuit represents a broader concern about the protection of children’s online privacy in the digital age.

EU questions YouTube, TikTok, and Snapchat over algorithms

The European Commission has requested information from YouTube, Snapchat, and TikTok regarding the algorithms used to recommend content to users. Concerns have been raised about the influence of these systems on issues like elections, mental health, and protecting minors. The inquiry falls under the Digital Services Act (DSA), aiming to address potential systemic risks, including the spread of illegal content such as hate speech and drug promotion.

TikTok faces additional scrutiny over its measures to prevent bad actors from manipulating the platform, especially during elections. The platforms must provide detailed information on their systems by 15 November. Failure to comply could result in further action, including potential fines.

The DSA mandates that major tech companies take more responsibility for tackling illegal and harmful content. The EU has previously opened similar non-compliance proceedings against other tech giants, including Meta, AliExpress, and TikTok itself, over content regulation.

This latest request reflects the EU’s ongoing efforts to ensure greater accountability from social media platforms. The focus remains on protecting users and maintaining a fair and safe digital environment.

ByteDance moves towards Huawei chips for AI

ByteDance, the parent company of TikTok, is reportedly developing a new AI model using chips from Chinese tech giant Huawei. The move comes as US restrictions on advanced AI chips, such as those from Nvidia, have led the company to look for domestic alternatives. Sources suggest ByteDance will use Huawei’s Ascend 910B chip to power a new large-language AI model.

Huawei’s Ascend 910B chip has already been used by ByteDance for less demanding AI tasks, but training a new AI model requires a higher level of computational power. While ByteDance continues to order significant quantities of Huawei’s chips, supply shortages are reportedly slowing down their efforts, with only a fraction of the requested units received so far.

Industry experts say AI has become essential for a range of sectors, from gaming to e-commerce, where businesses are developing custom AI models to stay competitive. ByteDance’s decision to turn to Huawei reflects the increasing importance of AI, particularly as global supply chains face challenges.

Both ByteDance and Huawei have remained tight-lipped regarding specific details of this development. A spokesperson from ByteDance denied the existence of a new AI model in progress, while Huawei did not provide any comment on the situation.

TikTok faces legal battle over potential US ban

TikTok and its parent company ByteDance are locked in a high-stakes legal battle with the US government to prevent a looming ban on the app, used by 170 million Americans. The legal confrontation revolves around a US law that mandates ByteDance divest its US assets by 19 January or face a complete ban. Lawyers for TikTok argue that the law violates free speech and is an unprecedented move that contradicts America’s tradition of fostering an open internet. A federal appeals court in Washington recently heard arguments from both sides, with TikTok’s legal team pushing for an injunction to halt the law’s implementation.

The US government, represented by the Justice Department, contends that TikTok’s Chinese ownership poses a significant national security threat, citing the potential for China to access American user data or manipulate the flow of information. This concern is at the core of the new legislation passed by Congress earlier this year, highlighting the risks of having a popular social media platform under foreign control. The White House, while supportive of curbing Chinese influence, has stopped short of advocating for an outright ban.

ByteDance maintains that divesting TikTok is neither technologically nor commercially feasible, casting uncertainty over the app’s future as it faces potentially severe consequences amid a politically charged environment.

The case comes at a pivotal moment in the US political landscape, with both presidential candidates, Donald Trump and Kamala Harris, actively using TikTok to engage younger voters. The judges expressed concerns over the complexities involved, especially with monitoring the massive codebase that powers TikTok, making it difficult to assess risks in real-time. As the legal wrangling continues, a ruling is expected by 6 December, and the case may eventually reach the US Supreme Court.

Legal showdown could decide TikTok ban in US

TikTok is facing a critical legal battle that could determine the future of the app in the US. On Monday, the US Court of Appeals in Washington, DC, will hear arguments from TikTok and its parent company, ByteDance, as they seek to block a new law that threatens to ban the app by 19 January 2025. The fate of the platform, which has around 170 million US users, hangs in the balance just as the presidential election ramps up.

Donald Trump, the Republican candidate, and Vice President Kamala Harris are both using TikTok to engage with younger voters, underscoring the app’s significant political and social influence. However, the US government remains concerned about national security risks, particularly the potential for China to access American user data through the app. Citing fears of surveillance, lawmakers passed the measure requiring ByteDance to divest TikTok.

ByteDance argues that the law violates free speech and insists that divesting from TikTok is not feasible. With a looming January deadline for a sale or a potential ban, TikTok’s legal team is seeking a ruling by early December. This would allow the US Supreme Court time to consider the case before any decision takes effect. President Joe Biden, who signed the law in April, holds the power to extend the deadline if ByteDance shows progress toward selling TikTok.

While the White House maintains that the move is about national security, not eliminating TikTok, the upcoming court ruling will be pivotal in shaping the app’s future in the US and possibly beyond.

Nepal lifts TikTok ban after ten months

The Nepalese government has lifted the ban on TikTok after nearly ten months, following a cabinet meeting on 22 August 2024. This decision came after discussions with ByteDance representatives, who agreed to several conditions for TikTok’s operation in Nepal. These conditions include registering as a business, appointing a local contact, promoting tourism, supporting digital literacy, and moderating content in Nepali languages.

The Nepal Telecommunications Authority (NTA) has directed all Internet Service Providers (ISPs) to lift the ban, citing Section 15 of the Telecommunications Act. TikTok has three months to meet the government’s conditions and will collaborate with local authorities to ensure compliance with the new regulations.

The ban was initially imposed in November 2023 due to concerns about social harmony and inappropriate content, leading to criticism regarding freedom of expression. The recent decision to lift the ban has been positively received by TikTok, which is committed to fostering creativity and free expression among Nepali users, reflecting a balance between regulation and digital innovation.

TikTok faces lawsuit over viral challenge death

A US appeals court has recently revived a lawsuit against TikTok, filed by the mother of a 10-year-old girl who tragically died after participating in a dangerous viral challenge on the platform. The blackout challenge, which involved users choking themselves until they lost consciousness, led to the death of Nylah Anderson in 2021.

The case hinges on the argument that TikTok’s algorithm recommended the harmful challenge to Nylah despite federal protections typically shielding internet companies from liability for user-generated content. The 3rd US Circuit Court of Appeals in Philadelphia ruled that Section 230 of the Communications Decency Act, which generally protects online platforms from such lawsuits, does not apply to algorithmic recommendations made by the company itself.

Judge Patty Shwartz, writing for the panel, explained that while Section 230 covers third-party content, it does not extend to the platform’s content curation decisions. This ruling marks a substantial shift from previous cases where courts had upheld Section 230 to shield platforms from liability related to harmful user-generated content.

The court’s decision reflects a broader interpretation of a recent US Supreme Court ruling, which recognised that algorithms used by platforms represent editorial judgments by the companies themselves. According to this view, TikTok’s algorithm-driven recommendations are considered the company’s speech, not protected by Section 230.

The lawsuit, brought by Tawainna Anderson against TikTok and its parent company ByteDance, was initially dismissed by a lower court, but the appeals court has now allowed the case to proceed. Anderson’s lawyer, Jeffrey Goodman, called the ruling a significant blow to Big Tech’s immunity protections. Meanwhile, Judge Paul Matey criticised TikTok for prioritising profits over safety, underscoring that the platform cannot claim immunity beyond what Congress has granted.

TikTok faces new challenges as key leader exits

Nicole Iacopetti, TikTok’s head of content strategy and policy, is set to leave the company in September, marking a significant change in the platform’s leadership. Her departure follows the earlier exit of former COO V Pappas and comes amid an ongoing reorganisation led by current COO Adam Presser.

TikTok’s strategy is evolving as the platform grows, aiming to cater to an older audience. According to industry insights, content is becoming more complex and engaging, with a notable trend toward interactive elements like online games, which have gained popularity among users over 30.

The platform has faced severe scrutiny from US lawmakers, who have raised concerns over data privacy and its connections to China, leading to discussions of a potential ban. Despite these challenges, TikTok remains a powerful tool for reaching younger audiences, particularly in the political sphere, where it engages younger voters.

As TikTok navigates these changes, its influence on the political landscape is expected to grow, and the next US president will likely have to reckon with its power to connect with voters in a more personal and dynamic way.