Meta Platforms is facing a lawsuit in Massachusetts for allegedly designing Instagram features to exploit teenagers’ vulnerabilities, causing addiction and harming their mental health. A Suffolk County judge rejected Meta’s attempt to dismiss the case, ruling that the state’s claims under its consumer protection law can proceed.
The company argued for immunity under Section 230 of the Communications Decency Act, which shields internet firms from liability for user-generated content. However, the judge ruled that this protection does not extend to Meta’s own business conduct or misleading statements about Instagram’s safety measures.
Massachusetts Attorney General Andrea Joy Campbell emphasised that the ruling allows the state to push for accountability and meaningful changes to safeguard young users. Meta expressed disagreement, maintaining that its efforts demonstrate a commitment to supporting young people.
The lawsuit highlights internal data suggesting Instagram’s addictive design, driven by features like push notifications and endless scrolling. It also claims that Meta executives, including CEO Mark Zuckerberg, dismissed internal research indicating that changes were needed to improve teenage users’ well-being.
US federal prosecutors are ramping up efforts to tackle the use of AI tools in creating child sexual abuse images, as they fear the technology could lead to a rise in illegal content. The Justice Department has already pursued two cases this year against individuals accused of using generative AI to produce explicit images of minors. James Silver, chief of the Department’s Computer Crime and Intellectual Property Section, anticipates more cases, cautioning against the normalisation of AI-generated abuse material.
Child safety advocates and prosecutors worry that AI systems can alter ordinary photos of children to produce abusive content, making it more challenging to identify and protect actual victims. The National Center for Missing and Exploited Children reports approximately 450 cases each month involving AI-generated abuse. While this number is small compared to the millions of online child exploitation reports received, it represents a concerning trend in the misuse of technology.
The legal framework is still evolving regarding cases involving AI-generated abuse, particularly when identifiable children are not depicted. Prosecutors are resorting to obscenity charges when traditional child pornography laws do not apply. This is evident in the case of Steven Anderegg, accused of using Stable Diffusion to create explicit images of minors. Similarly, US Army soldier Seth Herrera faces child pornography charges for allegedly using AI chatbots to alter innocent photos into abusive content. Both defendants have pleaded not guilty.
Nonprofit groups like Thorn and All Tech Is Human are working with major tech companies, including Google, Amazon, Meta, OpenAI, and Stability AI, to prevent AI models from generating abusive content and to monitor their platforms. Thorn’s vice president, Rebecca Portnoff, emphasised that the issue is not just a future risk but a current problem, urging action during this critical period to prevent its escalation.
Republican presidential candidate Donald Trump revealed that he spoke with Apple CEO Tim Cook about the financial penalties imposed on the tech giant by the European Union. Trump claimed that Cook informed him about a recent $15 billion fine from the EU, along with an additional $2 billion penalty, although Apple has not confirmed the details of the call.
The EU is investigating major tech companies to limit their influence and promote fair competition for smaller businesses. Recently, Apple encountered major challenges, including a court ruling that required the company to pay about $14 billion in back taxes to Ireland. Additionally, Apple was hit with a $2 billion antitrust fine for allegedly restricting competition in the music streaming sector via its App Store.
During a podcast interview with Patrick Bet-David, Trump expressed his commitment to protecting American companies from what he described as unfair treatment. He stated, ‘Tim, I got to get elected first. But I’m not going to let them take advantage of our companies.’ Trump and Democrat Kamala Harris are currently in a tight race ahead of the 5 November presidential election.
X (formerly Twitter) has updated its terms of service, requiring users to file any lawsuits against the company in the US District Court for the Northern District of Texas, a court known for conservative rulings. The change, effective 15 November, appears to align with owner Elon Musk’s increasing support for conservative causes, including backing Donald Trump’s 2024 presidential campaign. Critics argue the move is an attempt to ‘judge-shop’, as the Northern District has become a popular destination for right-leaning litigants seeking to block parts of President Biden’s agenda.
X’s headquarters are in Bastrop, Texas, which lies in the Western District, yet the company has chosen the Northern District for legal disputes. That district already hosts two lawsuits filed by X, including one against Media Matters after the watchdog group published a report linking ads on the platform to posts promoting Nazism. Steering legal cases to this particular court underscores the company’s effort to benefit from a legal environment more favourable to conservative causes.
A 25-year-old man from Alabama has been arrested for hacking the US Securities and Exchange Commission’s X account in a scheme to manipulate Bitcoin prices. The incident, which occurred in January, involved a false post on the SEC’s account claiming that Bitcoin exchange-traded funds had been approved, briefly causing Bitcoin’s price to rise by $1,000. The SEC swiftly deleted the post and disavowed the message, but the hack sparked criticism over security vulnerabilities on X.
The suspect, Eric Council Jr., used a SIM-swapping technique to access the account and later received Bitcoin as payment for his involvement in the hack. Following the incident, he reportedly searched online for information on how to avoid FBI detection. Council now faces charges of conspiracy to commit aggravated identity theft and access device fraud.
The SEC expressed its gratitude to law enforcement for their prompt action in the case, while the incident reignited concerns over the security of social media platforms, particularly since X’s acquisition by Elon Musk.
The European Commission has determined that X, Elon Musk’s social media platform, does not qualify as a ‘gatekeeper’ under the Digital Markets Act (DMA), exempting it from additional compliance obligations. The Commission’s decision follows a May investigation initiated after X asserted it was not a key intermediary between businesses and consumers. While X meets user thresholds and turnover criteria, the Commission clarified that it does not significantly connect business users with end consumers.
Under the DMA, which took effect in 2023, companies must have at least 45 million monthly end users and 10,000 business users in the EU, along with an annual turnover of €7.5 billion over the last three years, to be classified as gatekeepers. Major tech firms like Google, Amazon, Apple, Meta, Microsoft, and TikTok’s parent company ByteDance have already received gatekeeper status, subjecting them to strict obligations designed to ensure fair competition and consumer choice.
Apple has faced penalties under the DMA, with the European Commission ruling in June that its App Store practices violated the regulations. While several companies, including Apple and Meta, have appealed their gatekeeper designations, X remains unaffected by these rules for now. This decision allows X more operational flexibility compared to its competitors, although it indicates that the Commission is closely monitoring the interactions between large platforms, businesses, and consumers in the digital marketplace.
The US Department of Justice (DOJ) has released a significant Statement of Interest, urging scrutiny of surveys and information exchanges managed by trade associations. The DOJ expressed concerns that such exchanges may create unique risks to competition, particularly when competitors share sensitive information exclusively among themselves.
According to the DOJ, antitrust laws will evaluate the context of any information exchange to determine its potential impact on competition. Sharing competitively sensitive information could disproportionately benefit participating companies at the expense of consumers, workers, and other stakeholders. The department noted that advancements in AI technology have intensified these concerns, allowing large amounts of detailed information to be exchanged quickly, potentially heightening the risk of anticompetitive behaviour.
This guidance follows the DOJ’s withdrawal of long-standing rules that established ‘safety zones’ for information exchanges, which previously indicated that certain types of sharing were presumed lawful. By retracting this guidance, the DOJ signals a shift toward a more cautious, case-by-case approach, urging businesses to prioritise proactive risk management.
The DOJ’s statement, made in relation to an antitrust case in the pork industry, has wider implications for various sectors, including real estate. It highlights the need for organisations, such as Multiple Listing Services (MLS) and trade associations, to evaluate their practices and avoid environments that could lead to price-fixing or other anticompetitive behaviours. The DOJ encourages trade association executives to review their information-sharing protocols, educate members on legal risks, and monitor practices to ensure compliance with antitrust laws.
James Howells, a software engineer from Wales, has taken legal action against Newport City Council to recover a hard drive containing around 8,000 Bitcoin. The hard drive, which was accidentally discarded, is now worth approximately $514 million.
Howells has been repeatedly denied permission to excavate the landfill where the drive is believed to be located. In response, he filed a lawsuit seeking damages of £495 million, aiming to pressure the council into allowing the search. Howells has offered the council 10% of the recovered Bitcoin’s value if successful.
Despite these efforts, Newport Council remains firm in its refusal, citing potential environmental risks, and has dismissed the lawsuit as weak. The case is expected to be heard in December.
The US Supreme Court declined to hear an appeal from Uber Technologies Inc. and its subsidiary Postmates regarding California’s Assembly Bill 5 (AB5), effectively upholding a lower court ruling that mandates stricter worker classification standards. AB5 requires companies to classify their drivers as employees instead of independent contractors, which would significantly increase labour costs for these companies.
The Supreme Court’s decision upholds a ruling from the 9th US Circuit Court of Appeals, which determined that Uber and Postmates failed to demonstrate that AB5 unfairly targeted their services while exempting other industries. Although California voters approved Proposition 22 in 2020, allowing gig economy companies to classify drivers as independent contractors, this measure does not completely exempt them from AB5’s requirements. Recently, the California Supreme Court upheld Proposition 22, rejecting labour union claims that it violated the state constitution.
Theane Evangelis, an attorney for Uber, reiterated the company’s position, stating that Proposition 22 ensures drivers retain independence while receiving certain benefits. Critics argue that classifying workers as independent contractors allows companies to avoid providing essential protections, such as minimum wage and overtime pay. As debates over gig worker classification continue, the US Department of Labor has proposed a federal rule to tighten criteria for independent contractor status, which is also being challenged in court by business groups.
Coinbase has filed a motion seeking partial summary judgment in its ongoing legal battle against the US Securities and Exchange Commission (SEC). The cryptocurrency exchange aims to access internal SEC documents, hoping to gain insight into the regulator’s approach toward the crypto industry. This stems from the SEC’s decision to deny requests under the Freedom of Information Act (FOIA) for crucial records on its enforcement strategies.
Coinbase, working through the research firm History Associates, has been attempting to understand the SEC’s stance on digital assets, especially concerning the regulation of cryptocurrencies as securities. The SEC initially withheld documents under law enforcement exemptions but later acknowledged that these protections might no longer apply. Despite this, the regulator has sought to delay the document review process by three years, which Coinbase argues is unwarranted.
This motion is part of Coinbase’s broader efforts to challenge the SEC’s regulatory approach to the crypto sector, which many believe lacks clear guidelines. The case highlights the need for transparency regarding how the SEC enforces securities laws in the rapidly growing digital asset space.