UK users face reduced cloud security as Apple responds to government pressure

Apple has withdrawn its Advanced Data Protection (ADP) feature for cloud backups in Britain, citing government requirements.

Users attempting to enable the encryption service now receive an error message, while existing users will eventually have to deactivate it. The move weakens iCloud security in the country, allowing authorities access to data that would otherwise be encrypted.

Experts warn that the change compromises user privacy and exposes data to potential cyber threats. Apple has insisted it will not create a backdoor for encrypted services, as doing so would increase security risks.

The UK government has not confirmed whether it issued a Technical Capability Notice, which could mandate such access.

Apple’s decision highlights ongoing tensions between tech companies and governments over encryption policies. Similar legal frameworks exist in countries like Australia, raising concerns that other nations could follow suit.

Security advocates argue that strong encryption is essential for protecting user privacy and safeguarding sensitive information from cybercriminals.

For more information on these topics, visit diplomacy.edu.

Quantum computing could render today’s encryption obsolete

The rise of quantum computing poses a serious threat to modern encryption systems, with experts warning that critical digital infrastructure could become vulnerable once quantum devices reach sufficient power.

Unlike classical computers that process binary bits, quantum computers use qubits, allowing them to perform vast numbers of calculations simultaneously.

This capability could make breaking widely used encryption methods, like RSA, possible in minutes—something that would take today’s computers thousands of years.
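The asymmetry here comes from factoring: RSA's security rests on the classical difficulty of splitting a large number into its prime factors. As a rough illustration (not part of the article's reporting), the brute-force sketch below shows why: classical trial division must test candidates up to the square root of the modulus, which becomes astronomically slow as key sizes grow, whereas Shor's algorithm on a sufficiently large quantum computer factors in polynomial time. The primes used are toy values chosen for this example.

```python
import math

def smallest_factor(n: int) -> int:
    """Classical trial division: test every candidate up to sqrt(n)."""
    for p in range(2, math.isqrt(n) + 1):
        if n % p == 0:
            return p
    return n  # n itself is prime

# A toy 'RSA-style' modulus built from two primes. Real RSA moduli use
# primes of roughly 1024 bits each, far beyond brute-force reach.
p, q = 1000003, 1000033
n = p * q
print(smallest_factor(n))  # recovers the smaller prime factor, 1000003
```

Even this million-fold search finishes quickly, but each additional bit of key length doubles the work for a classical attacker, while it adds only polynomial cost for a quantum one.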

Although quantum systems powerful enough to crack encryption may still be years away, there is growing concern that hackers could already be collecting encrypted data to decode it once the technology catches up.

Sensitive information—such as national security data, intellectual property, and personal records—could be at risk. In response, the US National Institute of Standards and Technology has introduced new post-quantum encryption standards and is encouraging organisations to transition swiftly, though the scale of the upgrade needed across global infrastructure remains immense.

Updating web browsers and modern devices may be straightforward, but older systems, critical infrastructure, and the growing number of Internet of Things (IoT) devices pose significant challenges.

Satellites, for instance, vary in how easily they can be upgraded, with remote sensing satellites often requiring full replacements. Cybersecurity experts stress the need for ‘crypto agility’ to make the transition manageable, aiming to avoid a chaotic scramble once quantum threats materialise.

Microsoft’s new quantum chip sparks fresh debate over Bitcoin’s security

According to Bitcoin exchange River, Microsoft’s latest quantum computing chip, Majorana 1, could accelerate the timeline for making Bitcoin resistant to quantum threats. While the risk of a quantum attack remains distant, experts warn that preparations must begin now. The chip, launched on 19 February, is part of a growing race in quantum technology, with Google’s Willow chip also making headlines in December.

River suggests that if quantum computers reach one million qubits by 2027-2029, they could crack Bitcoin addresses in long-range attacks. Though some argue such a scenario is still decades away, River insists early action is key. The potential threat has reignited discussions on BIP-360, a proposed upgrade to strengthen Bitcoin’s defences against future quantum advancements.

Critics remain sceptical, arguing that quantum computing is still in its infancy, with major technical challenges to overcome. Some believe traditional banking systems, which hold far greater assets than Bitcoin, would be targeted first. Others see quantum developments as an opportunity, suggesting they could help fortify Bitcoin’s security rather than weaken it.

Microsoft unveils groundbreaking quantum computing chip

Microsoft has announced a groundbreaking quantum computing chip, Majorana 1, which it claims could make useful quantum computers a reality within years. The company believes this innovation puts it ahead in the race to unlock quantum computing’s vast potential.

Unlike classical computers, quantum systems could perform calculations in fields like medicine and chemistry that would otherwise take millions of years, although they also pose risks to current encryption standards.

The Majorana 1 chip relies on a particle called the Majorana fermion, theorised in the 1930s. Microsoft says its unique design makes the chip less error-prone than its competitors.

Despite having fewer qubits than chips from Google and IBM, the company argues that the lower error rates mean fewer qubits are needed for practical applications.

Microsoft’s development of Majorana 1 combines advanced materials like indium arsenide and aluminium, using a superconducting nanowire to observe and control the Majorana particles.

Fabricated at its labs in Washington and Denmark, the chip was described as a ‘high risk, high reward’ endeavour by Jason Zander, a senior Microsoft executive.

Quantum physicist Philip Kim from Harvard University praised the innovation, calling it an exciting step forward. While scaling up the technology remains a challenge, experts suggest Microsoft’s approach could lead to significant advancements in quantum computing.

Europol chief warns trust in law enforcement at risk

Law enforcement agencies must ensure public understanding of the need for expanded investigative powers to effectively combat the increasing scale and complexity of cybercrime, Europol’s chief Catherine De Bolle stated at the Munich Cyber Security Conference.

De Bolle emphasised that cybercriminal activity is not only growing in volume but also evolving in sophistication, leveraging both traditional telecom infrastructure and advanced digital tools, including dark web marketplaces. In response, she underscored the necessity for law enforcement agencies to strengthen their technical capabilities. However, she noted that implementing large-scale investigative measures must be balanced with maintaining public confidence in state institutions.

Her remarks followed those of Sir Jeremy Fleming, former director of the UK’s cyber intelligence agency GCHQ, who spoke about the importance of maintaining public trust in intelligence operations.

De Bolle further stressed the need for stronger collaboration between government agencies, private sector entities, and international organisations to address cyber threats effectively. As cybercrime and state-sponsored cyber activities increasingly overlap, she advocated for a shift away from fragmented approaches, calling for ‘multilateral responses’ to improve collective cybersecurity readiness.

Former GCHQ chief calls for transparency amid UK’s attempt to access encrypted iCloud accounts

A controversy has emerged over the British government’s reported attempt to compel Apple to grant authorities access to encrypted iCloud accounts, leading to calls for increased transparency from intelligence agencies. Sir Jeremy Fleming, the former head of the UK’s GCHQ from 2017 to 2023, addressed this issue at the Munich Cyber Security Conference, highlighting the need for public understanding and trust in intelligence operations. He emphasised that an agency’s ‘license to operate’ should be grounded in transparency.

The UK government has contested the description of a ‘back door’ in relation to the notice, clarifying that it seeks to ensure Apple maintains the capability to provide iCloud data in response to lawful warrants, a function that existed prior to the introduction of end-to-end encryption for iCloud in December 2022.

Since 2020, Apple has provided iCloud data to UK authorities in response to four of more than 6,000 legal requests for customer information under non-IPA laws. However, this data excludes requests made under the Investigatory Powers Act (IPA), the UK’s primary law for accessing tech company data.

Fleming emphasised the importance of intelligence agencies providing clear explanations of their operations, particularly in relation to new technologies. He pointed out the need for a better understanding of how intelligence agencies operate in practice, particularly as technological advancements change their methods.

Apple granted UK authorities iCloud data in just 4 of 6,000 requests since 2020—excluding Investigatory Powers Act cases

Since 2020, Apple has provided iCloud data to UK authorities in response to four of more than 6,000 legal requests for customer information under non-IPA laws. This data excludes requests made under the Investigatory Powers Act (IPA), the UK’s primary law for accessing tech company data.

Between January 2020 and June 2023, Apple received between 0 and 499 IPA-related requests in each half-year reporting period, as figures are disclosed only in bands of 500. Due to legal limitations, Apple cannot disclose details about these requests.

Earlier reporting linked the low number of content disclosures to efforts by the UK government to force Apple to provide encrypted iCloud data. However, due to the data’s lack of detail, no direct connection can be made.

The UK government previously stated that it has made over 10,000 requests to US companies since the US-UK Data Access Agreement took effect, providing crucial data for law enforcement in cases involving terrorism, organised crime, and other serious offences.

Apple’s transparency reports suggest that content data is shared more frequently in other countries, such as the US, where it responded to 22,306 requests in 2020-2023. In comparison, most countries see lower content disclosures due to restrictions on sharing with foreign governments.

The British government’s Technical Capability Notice (TCN), revealed by The Washington Post, follows Apple’s 2022 introduction of optional end-to-end encryption (E2EE) for iCloud. While the UK government did not characterise it as such, critics see the TCN as a potential ‘back door’ to Apple’s encrypted data. Apple has declined comment, while the UK government refrains from discussing operational matters.

The controversy reflects ongoing debates about the balance between encryption, privacy, and law enforcement access to encrypted data.

Data Protection Day 2025: A new mandate for data protection

This analysis provides a detailed summary of Data Protection Day, covering the most relevant aspects of each session. The event welcomed participants to Brussels, in person and virtually, to celebrate Data Protection Day 2025 together.

The tightly scheduled programme kicked off with opening remarks by the Secretary General of the European Data Protection Supervisor (EDPS), followed by a day of panels, speeches and side sessions from some of the brightest minds in the data protection field.

Keynote speech by Leonardo Cervera Navas

Given the recent political turmoil in the EU, specifically the annulment of the Romanian elections a few months ago, it was no surprise that the first keynote speech addressed how algorithms are used to destabilise and threaten democracies. Navas explained how third-country algorithms are deployed against EU democracies to target their values.

He went on to discuss the significant power imbalance that arises when a handful of wealthy individuals and their companies dominate the tech world and end up violating our privacy. However, he struck a hopeful note, suggesting that the crisis in Europe is making Europeans stronger. ‘Our values are what unite us, and part of them are the data protection values the EDPB strongly upholds’, he emphasised.

He acknowledged the evident overlap of rules and regulations between different legal instruments but also highlighted the creation of tools that can help uphold our privacy, such as the Digital Clearing House 2.0.

Organiser’s panel moderated by Kait Bolongaro

This panel discussed a wide variety of data protection topics, such as the developments on the ground, how international cooperation played a role in the fight against privacy violations, and what each panellist’s priorities were for the upcoming years. That last question was especially interesting to hear given the professional affiliations of each panellist.

What stands out about these panels is that the organisers spent considerable time curating a diverse line-up, with speakers from academia, private industry, public bodies, and the EDPS itself. This ensures that each topic is discussed from more than one point of view, which makes for a far more engaging debate.

Wojciech Wiewiorowski, the current European Data Protection Supervisor, reminded us of the important role that data protection authorities (DPAs) play in the effective enforcement of the GDPR. Matthias Kloth, Head of Digital Governance and Sport at the CoE, offered a broader perspective. As his work centres on the updated Convention 108, now known as Convention 108+, he shed some light on the process of modernising past laws for today’s digital age.

Regarding international cooperation, each panellist had their own take on how to facilitate and streamline it. Wiewiorowski rightly stated that data has no borders and that cooperation must be a global effort. However, he cautioned that, in the age of cooperation, we cannot settle for the ‘lowest common denominator’ level of protection.

Jo Pierson, Professor at the Vrije Universiteit Brussel and Hasselt University, observed that international cooperation is very challenging, noting that a country’s values may change overnight, as Trump’s recent re-election victory illustrates.

Audience questions

A member of the audience posed a very relevant question about the legal field as a whole. He asked the panellists what they thought of the fact that enforcing one’s rights is a difficult and costly process: to litigate or file an appeal, a person must be legally literate and bear their own litigation costs.

Wiewiorowski of the EDPS pointed out that reopening the GDPR’s procedural rules is not a feasible way to tackle this issue. Small-scale procedural amendments remain an option, but he does not foresee the GDPR being reopened in the coming years.

Pierson, however, offered a more practical take, suggesting that this is where individuals and civil society organisations can join forces: individuals can approach organisations such as noyb, Privacy International, and EDRi for help or advice. But that raises the question: on whose shoulders should this burden rest?

One last question from the audience concerned DeepSeek, the bombshell new Chinese AI recently dropped onto the market. The panellists were asked whether this new AI is an enemy or a friend to Europeans. Each avoided calling it either, but they found common ground on the need for international cooperation and on the view that an open-source AI is not a bad thing if it can be trained by Europeans.

The final remark on this panel was Wiewiorowski’s comparison of Chinese AI to ‘Sputnik Day’ (recalling the 1950s space race between the United States and the USSR). Are we facing a new technological gap? Will non-Western allies and foes beat us in this digital arms race?

Data protection in a changing world: What lies ahead? Moderated by Anna Buchta

This session also posed a series of interesting questions to high-profile panellists. The range of the panel was impressive, bringing together views from the European Commission, the Polish Minister of Digital Affairs, the European Parliament, the UK’s Information Commissioner, and DIGITALEUROPE.

Marina Kaljurand of the LIBE committee stood out for her passion for cyber matters. She revealed that many people in the European Parliament are not tech literate, while others are extremely well versed in how the technology is used. This significant information asymmetry within the European Parliament needs to be addressed if its members are to vote on digital regulations.

She gave an important overview of the state of data transfers with the UK and the USA. The UK’s adequacy decision has raised multiple red flags in the European Parliament and is set to expire in June 2025.

The future of data transfers with the UK is therefore very uncertain. As for the USA, she anticipated difficult times, as the actions of the recently re-elected President Trump are degrading US-EU relations. On the child sexual abuse material regulation, she stressed how important it is to protect children: the debate is not about whether to protect them, but about how.

The currently proposed regulation risks intruding too far on privacy, yet alternatives for protecting children are difficult to find. This reflects how hard regulating can be even when everyone at the table shares the same goals.

Irena Moozova, Deputy Director-General of DG JUST at the European Commission, said that her priorities for the coming years are to cut red tape, simplify guidelines for businesses, and support the compliance efforts of small and medium-sized enterprises. She also mentioned the public consultation phases for the upcoming Digital Fairness Act, to be held this summer.

John Edwards, the UK Information Commissioner, highlighted the transformative impact of emerging technologies, particularly Chinese AI, and how disruptive innovations can rapidly reshape markets. He discussed the ICO’s evolving strategies, noting their alignment with ideas shared by other experts. The organisation’s focus for the next two years includes key areas such as AI’s role in biometrics and tracking, as well as safeguarding children’s privacy. To address these priorities, the ICO has published an online tracking strategy and conducted research on children’s data privacy, including the development of systems tailored to protect young users.

Alberto Di Felice, Legal Counsel to DIGITALEUROPE, stressed the importance of simplifying regulations, repeatedly arguing that there is too much bureaucracy and that too many actors are involved. For example, a company wanting to operate in the EU market may have to deal with DPAs, the AI Act authorities, public-sector data authorities under the Data Governance Act, authorities overseeing manufacturers of digital products, and financial sector regulators.

He advocated a single regulator and changes to streamline legal compliance. He also argued that the quality of regulation in Europe is poor and that some provisions are simply too long: certain AI Act articles run to 17 lines, with exceptions and sub-exceptions that even lawyers struggle to make sense of.

Keynote speech by Beatriz de Anchorena on global data protection

Beatriz de Anchorena, Head of Argentina’s DPA and current Chair of the Convention 108+ Committee, delivered a compelling address on the importance of global collaboration in data protection. Representing a non-European perspective, she emphasised Argentina’s unique contribution to the Council of Europe (CoE).

Argentina was the first country outside Europe to receive an EU adequacy decision, which has since been renewed. Despite having data protection laws originating in the 2000s, Argentina remains a leader in promoting modernised frameworks.

Anchorena highlighted Argentina’s role as the 23rd state to ratify Convention 108+, noting that only seven more ratifications are needed for it to enter fully into force. She advocated Convention 108+ as a global standard for data protection, capable of raising current standards without demanding complete homogeneity; instead, it offers common ground for nations to align on privacy matters.

What’s on your mind: Neuroscience and data protection moderated by Ella Mein

Marcello Ienca, a Professor of Ethics of AI and Neuroscience at the University of Munich, gave everyone in the audience a breakdown of how data and neuroscience intersect and the real-world implications for people’s privacy.

The brain, often described as the largest data repository in the world, presents a vast opportunity for exploration, and AI is acting as a catalyst in this process. Large language models are helping researchers decode the brain’s ‘hardware’ and ‘software’, although the full ‘language of thought’ remains unclear.

Neurotechnology raises real privacy and ethical concerns. For instance, the ability to identify biomarkers for conditions like schizophrenia or dementia introduces new vulnerabilities, such as the risk of ‘neuro discrimination’, where predicting a person’s illness might lead to stigmatisation or unequal treatment.

However, it is argued that understanding and predicting neurological conditions is important, as nearly every individual is expected to experience at least one neurological condition in their lifetime. As one panellist put it, ‘We cannot cure what we don’t understand, and we cannot understand what we don’t measure.’

This field also poses questions about data ownership and access. Who should have the ‘right to read brains’, and how can we ensure that access to such sensitive data, particularly emotions and memories unrelated to clinical goals, is tightly controlled? With the data economy in an ‘arms race’, there is a push to extract information directly from its source: the human brain.

As neurotechnology advances, balancing its potential benefits with safeguards will be important to ensure that innovation does not come at the cost of individual privacy and autonomy as mandated by law.

In addition to this breakdown, Jurisconsult Anna Austin explained the ECtHR’s legal background on the matter. A jurisconsult plays a key role in keeping the court informed by maintaining a network that monitors relevant case law from member states. Central to this discussion are questions of consent and waiver.

Current ECtHR case law holds that any waiver must be unequivocal and fully informed, with the person fully understanding its consequences, a standard that can be challenging to meet. This high bar exists to safeguard fundamental rights, such as protection from torture and inhumane treatment, and to ensure the right to a fair trial. As it stands, she stated, there is no fully comprehensive waiver mechanism.

The right to a fair trial is an absolute right that needs to be understood in this context. One nuance is therapeutic necessity, under which forced medical interventions can be justified in strict conditions, with safeguards to ensure proportionality.

Yet concerns remain regarding self-incrimination under Article 6, particularly in scenarios where reading a person’s mind could improperly compel evidence, raising questions about the potential abuse of such technologies.

Alessandra Pierucci of the Italian DPA made a relevant case for considering whether new laws are needed for this matter or whether existing ones suffice. Within the context of her work, her authority is developing a mental privacy risk assessment.

Beyond privacy: unveiling the true stakes of data protection. Moderated by Romain Robert

Nathalie Laneret, Vice President of Government Affairs and Public Policy at Criteo, presented her viewpoint on the role of AI and data protection. Addressing the balance between data protection and innovation, Laneret explained that these areas must work together.

She stressed the importance of finding ways to use pseudonymised data, and of clear codes of conduct for businesses, pointing out that innovation is high on the European Commission’s political agenda.

Laneret addressed concerns about sensitive data, such as children’s data, highlighting Criteo’s proactive approach. With an internal ethics team, the company anticipated potential regulatory challenges around sensitive data, ensuring it stayed ahead of ethical and compliance issues.

In contrast, Max Schrems, Chair of noyb, offered a more critical perspective on data practices. He pointed out the economic disparity in the advertising model, explaining that while advertisers generate only minimal revenue per user annually, users often pay a steep price with their data. Schrems highlighted the importance of individuals having the right to give up their privacy freely if they choose, provided that consent is genuinely voluntary.

Forging the future: reinventing data protection? Moderated by Gabriela Zanfir-Fortuna

In this last panel, Johnny Ryan from the Irish Council for Civil Liberties painted a stark picture of the societal challenges tied to data misuse. He described a crisis fuelled by external influence, misunderstandings, and data being weaponised against individuals.

However, Ryan argued that the core issue is not merely the problems themselves but the EU’s lack of an effective and immediate response strategy. He stressed the need for swift protective measures, criticising the current underuse of interim tools that could mitigate harm in real time.

Nora Ni Loideain, Lecturer and Director at the University of London’s Information Law and Policy Centre, discussed the GDPR’s impact on data protection enforcement. DPAs had limited powers in the past: in the Cambridge Analytica scandal, for example, the UK’s Data Protection Authority could only fine Facebook £500,000, owing to a lack of resources and authority.

The GDPR has since allowed DPAs to step up, with independence, greater resources, and stronger enforcement capabilities, significantly improving their ability to hold companies accountable for privacy violations.

Happy Data Protection Day 2025!

UK officials push Apple to unlock cloud data, The Washington Post reports

Britain’s security officials have reportedly ordered Apple to create a so-called ‘back door’ to access all content uploaded to the cloud by its users worldwide. The demand, revealed by The Washington Post, could force Apple to compromise its security promises to customers. Sources suggest the company may opt to stop offering encrypted storage in the UK rather than comply with the order.

Apple did not immediately respond to a request for comment made outside regular business hours. The Home Office has reportedly served Apple with a technical capability notice, which would require the company to grant access to the requested data. However, a Home Office spokesperson declined to confirm or deny the existence of such a notice.

In January, Britain initiated an investigation into the operating systems of Apple and Google, as well as their app stores and browsers. The ongoing regulatory scrutiny highlights growing tensions between tech giants and governments over privacy and security concerns.

Italian activist targeted by spyware, Meta warns

Luca Casarini, a prominent Italian migrant rescue activist, was warned by Meta that his phone had been targeted with spyware. The alert arrived via WhatsApp on the same day Meta accused surveillance firm Paragon Solutions of using advanced hacking methods to steal user data. Paragon, reportedly American-owned, has not responded to the allegations.

Casarini, who co-founded the Mediterranea Saving Humans charity, has faced legal action in Italy over his rescue work. He has also been a target of anti-migrant media and previously had his communications intercepted in a case related to alleged illegal immigration. He remains unaware of who attempted to hack his device or whether the attack had judicial approval.

The revelation follows a similar warning issued to Italian journalist Francesco Cancellato, whose investigative news outlet, Fanpage, recently exposed far-right sympathies within Prime Minister Giorgia Meloni’s political youth wing. Italy’s interior ministry has yet to comment on the situation.