Silent album released to challenge UK AI copyright reforms

More than 1,000 musicians have joined forces to release a silent album as part of a protest against the UK government’s proposed changes to copyright law. The changes would allow AI companies to use artists’ work to train models without needing permission, a move critics argue would undermine creators’ rights. The silent album, titled ‘Is This What We Want?’, features recordings of empty studios and performance spaces, symbolising the control over their work that musicians stand to lose.

The changes have sparked outrage from high-profile artists such as Kate Bush, who warned that this could lead to the exploitation of musicians by tech companies. The protest album, which includes contributions from other major artists like Ed Sheeran and Dua Lipa, aims to highlight the negative impact of such reforms on the livelihoods of creators.

The UK government argues that these changes will help boost the AI and creative industries, allowing them to reach their full potential. However, the controversy over copyright law is growing, with many in the music industry urging a rethink before any new regulations are finalised.

For more information on these topics, visit diplomacy.edu.

Bluesky teams up with IWF to tackle harmful content

Bluesky, the rapidly growing decentralised social media platform, has partnered with the UK-based Internet Watch Foundation (IWF) to combat the spread of child sexual abuse material (CSAM). As part of the collaboration, Bluesky will gain access to the IWF’s tools, which include a list of websites containing CSAM and a catalogue of digital fingerprints, or ‘hashes,’ that identify abusive images. This partnership aims to reduce the risk of users encountering illegal content while helping to keep the platform safe from such material.
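In principle, a hash-list check works by fingerprinting known illegal images once and then comparing every upload against that catalogue. The sketch below is a hypothetical illustration, not IWF or Bluesky code; it uses SHA-256 for simplicity, whereas real deployments rely on perceptual hashes (such as PhotoDNA) that survive resizing and re-encoding:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a hex digest acting as the image's fingerprint."""
    return hashlib.sha256(data).hexdigest()

def is_blocked(upload: bytes, blocklist: set[str]) -> bool:
    """True if the upload's fingerprint matches a known-bad hash."""
    return fingerprint(upload) in blocklist

# Illustrative blocklist built from one known-bad file.
blocklist = {fingerprint(b"known-abusive-image-bytes")}

assert is_blocked(b"known-abusive-image-bytes", blocklist)
assert not is_blocked(b"harmless-holiday-photo", blocklist)
```

The key property is that the platform never needs to store the abusive images themselves: matching against the catalogue of fingerprints is enough to refuse an upload.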

Bluesky’s head of trust and safety, Aaron Rodericks, welcomed the partnership as a significant step in protecting users from harmful content. With the platform’s rapid growth—reaching over 30 million users by the end of last month—the move comes at a crucial time. In November, Bluesky announced plans to expand its moderation team to address the rise in harmful material following the influx of new users.

The partnership also highlights the growing concern over online child sexual abuse material. The IWF reported record levels of harmful content last year, with over 291,000 web pages removed from the internet. The foundation’s CEO, Derek Ray-Hill, stressed the urgency of tackling the crisis, calling for a collective effort from governments, tech companies, and society.

UK users face reduced cloud security as Apple responds to government pressure

Apple has withdrawn its Advanced Data Protection (ADP) feature for cloud backups in Britain, citing government requirements.

Users attempting to enable the encryption service now receive an error message, while existing users will eventually have to deactivate it. The move weakens iCloud security in the country, allowing authorities access to data that would otherwise be encrypted.

Experts warn that the change compromises user privacy and exposes data to potential cyber threats. Apple has insisted it will not create a backdoor for encrypted services, as doing so would increase security risks.

The UK government has not confirmed whether it issued a Technical Capability Notice, which could mandate such access.

Apple’s decision highlights ongoing tensions between tech companies and governments over encryption policies. Similar legal frameworks exist in countries like Australia, raising concerns that other nations could follow suit.

Security advocates argue that strong encryption is essential for protecting user privacy and safeguarding sensitive information from cybercriminals.

Apple rejects UK plans for mobile browser controls

Apple has pushed back against proposed remedies from the UK’s competition watchdog, arguing they could hinder innovation in the mobile browser market. The Competition and Markets Authority (CMA) is investigating Apple and Google’s dominance in browser engines and cloud gaming distribution through app stores, with potential regulatory measures under consideration.

In its response, Apple stated that mandating free access to future WebKit updates or iOS features used by Safari would be unfair, given the significant resources required to develop them. The company warned this could lead to ‘free-riding’ by third parties and discourage further investment in browser technologies.

The UK CMA’s investigation aims to increase competition in the mobile browser space, where Apple’s WebKit engine is a key player. However, Apple insists that the proposed changes would harm its ability to innovate and could ultimately reduce the quality of browser experiences for users. The regulator is expected to continue assessing industry feedback before making a final decision.

Former GCHQ chief calls for transparency amid UK’s attempt to access encrypted iCloud accounts

A controversy has emerged over the British government’s reported attempt to compel Apple to grant authorities access to encrypted iCloud accounts, leading to calls for increased transparency from intelligence agencies. Sir Jeremy Fleming, the former head of the UK’s GCHQ from 2017 to 2023, addressed this issue at the Munich Cyber Security Conference, highlighting the need for public understanding and trust in intelligence operations. He emphasised that an agency’s ‘license to operate’ should be grounded in transparency.

The UK government has contested the description of a ‘back door’ in relation to the notice, clarifying that it seeks to ensure Apple maintains the capability to provide iCloud data in response to lawful warrants, a function that existed prior to the introduction of end-to-end encryption for iCloud in December 2022.

Since 2020, Apple has provided iCloud data to UK authorities in response to four of more than 6,000 legal requests for customer information under non-IPA laws. However, this data excludes requests made under the Investigatory Powers Act (IPA), the UK’s primary law for accessing tech company data.

Fleming emphasised the importance of intelligence agencies providing clear explanations of their operations, particularly in relation to new technologies. He pointed out the need for a better understanding of how intelligence agencies operate in practice, particularly as technological advancements change their methods.

Study warns of AI’s role in fuelling bank runs

A new study from the UK has raised concerns about the risks of bank runs fuelled by AI-generated fake news spread on social media. The research, published by Say No to Disinfo and Fenimore Harper, highlights how generative AI can create false stories or memes suggesting that bank deposits are at risk, leading to panic withdrawals. The study found that a significant portion of UK bank customers would consider moving their money after seeing such disinformation, especially with the speed at which funds can be transferred through online banking.

The issue is gaining traction globally, with regulators and banks worried about the growing role of AI in spreading malicious content. Following the collapse of Silicon Valley Bank in 2023, which saw $42 billion in withdrawals within a day, financial institutions are increasingly focused on detecting disinformation that could trigger similar crises. The study estimates that a small investment in social media ads promoting fake content could cause millions in deposit withdrawals.

The report calls for banks to enhance their monitoring systems, integrating social media tracking with withdrawal monitoring to better identify when disinformation is impacting customer behaviour. Revolut, a UK fintech, has already implemented real-time monitoring for emerging threats, urging financial institutions to be prepared for potential risks. While banks remain optimistic about AI’s potential, the financial stability challenges it poses are still a growing concern for regulators.
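One hypothetical way to combine the two signals the report describes is to flag time windows where rumour mentions and outflows both spike above their recent baselines. None of the names, data, or thresholds below come from the study; they are illustrative assumptions:

```python
def moving_average(series, window=3):
    """Trailing mean over up to `window` points, one value per input point."""
    return [sum(series[max(0, i - window + 1): i + 1]) /
            len(series[max(0, i - window + 1): i + 1])
            for i in range(len(series))]

def flag_windows(mentions, withdrawals, mention_mult=3.0, outflow_mult=2.0):
    """Return indices where both signals exceed their recent baselines."""
    m_base = moving_average(mentions)
    w_base = moving_average(withdrawals)
    return [i for i in range(1, len(mentions))
            if mentions[i] > mention_mult * m_base[i - 1]
            and withdrawals[i] > outflow_mult * w_base[i - 1]]

mentions    = [10, 12, 11, 90, 95]   # rumour posts per hour (made up)
withdrawals = [5, 6, 5, 20, 30]      # outflows per hour (made up)
print(flag_windows(mentions, withdrawals))  # → [3]
```

Requiring both signals to spike is the point of the integration the report urges: a mentions spike alone may be noise, and an outflow spike alone may be seasonal, but the coincidence is what suggests disinformation is moving customer behaviour.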

As financial institutions work to mitigate AI-related risks, the broader industry is also grappling with how to balance the benefits of AI with the threats it may pose. UK Finance, the industry body, emphasised that banks are making efforts to manage these risks, while regulators continue to monitor the situation closely.

Anthropic’s Claude tested as UK explores AI chatbot for public services

The UK government has partnered with AI startup Anthropic to explore the use of its chatbot, Claude, in public services. The collaboration aims to improve access to public information and streamline interactions for citizens.

Anthropic, a rival of ChatGPT creator OpenAI that is backed by tech giants Google and Amazon, signed a memorandum of understanding with the government.

The initiative aligns with Prime Minister Keir Starmer’s ambition to establish the UK as a leader in AI and enhance public service efficiency through innovative technologies.

Technology minister Peter Kyle highlighted the importance of this partnership, emphasising its role in positioning the UK as a hub for advanced AI development.

Claude has already been employed by the European Parliament to simplify access to its archives, cutting the time needed for document retrieval and analysis.

This step underscores Britain’s commitment to leveraging cutting-edge AI for the benefit of individuals and businesses nationwide.

AI development is outpacing our understanding, says expert

Dario Amodei, CEO of AI firm Anthropic, has warned that the race to develop AI is moving faster than efforts to fully understand it. Speaking at an event in Paris, he stressed the need for deeper research into AI models, describing it as a race between expanding capabilities and improving transparency. ‘We can’t slow down development, but our understanding must match our ability to build,’ he said.

Amodei rejected the notion that AI safety measures hinder progress, arguing instead that they help refine and improve models. He pointed to earlier discussions at the UK’s Bletchley Summit, where risk assessment strategies were introduced, and insisted they had not slowed technological growth. ‘Better testing and measurement actually lead to better models,’ he said.

The Anthropic CEO also discussed the evolving AI market, including competition from Chinese firm DeepSeek, whose claims of dramatically lower training costs he dismissed as ‘not based on facts.’ Looking ahead, he hinted at upcoming improvements in AI reasoning, with a focus on creating more seamless transitions between different types of models. He remains optimistic, predicting that AI will drive innovation across industries, from healthcare to finance and energy.

Apple granted UK authorities iCloud data in just 4 of 6,000 requests since 2020—excluding Investigatory Powers Act cases

Since 2020, Apple has provided iCloud data to UK authorities in response to four of more than 6,000 legal requests for customer information under non-IPA laws. This data excludes requests made under the Investigatory Powers Act (IPA), the UK’s primary law for accessing tech company data.

Apple reports IPA-related request figures only in bands of 500: for the first half of 2023, the most recent period covered by its transparency reporting (which runs from January 2020 to June 2023), it received between 0 and 499 such requests. Due to legal limitations, Apple cannot disclose further details about these requests.

Earlier reporting linked the low number of content disclosures to efforts by the UK government to force Apple to provide encrypted iCloud data. However, due to the data’s lack of detail, no direct connection can be made.

The UK government previously stated that it has made over 10,000 requests to US companies since the US-UK Data Access Agreement began, providing crucial data for law enforcement in cases related to terrorism, organised crime, and other serious offences.

Apple’s transparency reports suggest that content data is shared more frequently in other countries, such as the US, where it responded to 22,306 requests in 2020-2023. In comparison, most countries see lower content disclosures due to restrictions on sharing with foreign governments.

The British government’s Technical Capability Notice (TCN), revealed by The Washington Post, follows Apple’s 2022 introduction of optional end-to-end encryption (E2EE) for iCloud. While the UK government did not characterise it as such, critics see the TCN as a potential ‘back door’ to Apple’s encrypted data. Apple has declined comment, while the UK government refrains from discussing operational matters.

The controversy reflects ongoing debates about the balance between encryption, privacy, and law enforcement access to encrypted data.

Motorola loses appeal over UK emergency services contract

Motorola has been denied permission to appeal against the UK competition regulator’s ruling that it was making excessive profits from its contract to provide communications for Britain’s emergency services. The Court of Appeal unanimously dismissed the company’s application, upholding the Competition and Markets Authority’s (CMA) decision to impose a price cap on Motorola’s Airwave network.

The CMA introduced the cap in July 2023, reducing the cost of the Airwave service to reflect a competitive market, cutting an estimated £200 million in annual charges. Motorola had previously challenged the regulator’s findings at a tribunal but was unsuccessful. CMA Executive Director George Lusty welcomed the court’s decision, stating it ensures fair pricing for emergency services and marks the end of the legal dispute.

A Motorola spokesperson defended the company’s role, emphasising that Airwave remains essential for UK public safety communications. Despite disagreeing with the CMA’s ruling, Motorola said it is focused on continuing to provide high-quality emergency communication services.