A wave of reconsideration is sweeping across UK businesses as they reassess the use of facial recognition technology and fingerprint scanning for staff attendance monitoring. This shift comes in response to a clampdown by the Information Commissioner’s Office (ICO), which recently ordered a Serco subsidiary to cease using biometrics for attendance tracking at leisure centres it manages.
The ICO’s directive followed its discovery that over 2,000 employees’ biometric data had been unlawfully processed across 38 Serco-managed leisure centres. As a result, Serco has been granted a three-month window to align its systems with the ICO’s compliance standards.
In the wake of the ICO’s ruling, various leisure centre operators and corporations are either reviewing or halting the use of similar biometric technologies. Notable among them is Virgin Active, which has removed biometric scanners from 32 sites and is actively seeking alternative attendance monitoring solutions for its staff.
Why does it matter?
The ICO’s intervention reflects broader concerns about the increasing prevalence of facial recognition and surveillance tools in employment contexts. The scrutiny extends beyond leisure centres, as highlighted by a recent case in which an Uber Eats driver received a financial settlement over allegations of racially discriminatory facial recognition checks. These developments underscore the urgent need for robust regulations to safeguard workers’ rights in the age of AI and automated processes.
The United Kingdom Treasury is set to introduce a regulatory framework for crypto assets and stablecoins by July, with the aim of promoting local innovation in digital assets and blockchain technology. Bim Afolami, the UK’s economic secretary to the Treasury, emphasized the importance of crypto regulations in maintaining global competitiveness in fintech. The regulatory framework seeks to strike a balance between fostering innovation and safeguarding consumers.
The Treasury is currently finalizing proposals for regulations on stablecoins and crypto staking, which are expected to be delivered in June or July. Once implemented, various activities involving crypto assets, such as operating exchanges and taking custody of customer assets, will come under regulatory oversight for the first time.
In addition to the regulatory framework, the UK is enacting a new law that grants authorities the power to confiscate crypto assets directly from exchanges and custodian wallet providers. This measure, effective from April 26, aims to address economic crime and illicit activities. The law is an amendment to the Economic Crime and Corporate Transparency Act 2023. It does not specify how a confiscated crypto token should be destroyed, although burning, which permanently removes a token from circulation, is a common approach.
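For readers less familiar with the term, ‘burning’ simply means destroying tokens so they can never circulate again, typically by removing them from the ledger or sending them to an unspendable address. The sketch below is a minimal, hypothetical illustration of that idea in Python; the TokenLedger class and its names are assumptions made for this example and do not describe how any particular blockchain, exchange, or the new law would actually implement the destruction of seized assets.

```python
# Minimal, hypothetical ledger sketch: "burning" tokens removes them
# permanently from the circulating supply. Illustrative only; not a
# representation of any real blockchain or of the UK confiscation law.

class TokenLedger:
    def __init__(self, initial_balances):
        self.balances = dict(initial_balances)           # holder -> token amount
        self.total_supply = sum(self.balances.values())  # tokens in circulation

    def burn(self, holder, amount):
        """Destroy `amount` tokens held by `holder`, shrinking total supply."""
        if self.balances.get(holder, 0) < amount:
            raise ValueError("insufficient balance to burn")
        self.balances[holder] -= amount
        self.total_supply -= amount


ledger = TokenLedger({"seized_wallet": 100})
ledger.burn("seized_wallet", 100)
print(ledger.total_supply)  # 0 -- burned tokens are gone from circulation
```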
UK Prime Minister Rishi Sunak has announced a substantial investment of £55.5 million over four years in facial recognition technology, which aims to combat retail crime by identifying repeat shoplifters.
The initiative, part of a broader crackdown on theft, includes deploying bespoke mobile units equipped with live facial recognition capabilities across high streets nationwide. While controversial, its deployment has resulted in numerous arrests, primarily for offences ranging from theft to assault. However, concerns persist regarding privacy and false positives.
Despite criticism from privacy advocates like Big Brother Watch, Home Secretary James Cleverly emphasises the technology’s preventative nature, while the Metropolitan Police views it as a transformative tool in law enforcement. The Office of the Scottish Biometrics Commissioner noted that careful deployment is needed to maintain public confidence.
Why does it matter?
The development has emerged months after Scotland’s biometrics commissioner, Brian Plastow, raised concerns about a trajectory towards autocracy driven by inappropriate use of biometric surveillance in the UK. While supporting specific biometric surveillance applications, such as live facial recognition, he critiques government overreach and highlights risks such as database misuse and privacy erosion. Plastow’s concerns are exemplified by incidents like the arrest of a woman who was eight months pregnant for failing to report for community service. While Scotland may resist England’s path towards a surveillance state, the stance of Wales remains uncertain.
The US and UK have announced a partnership on the science of AI safety, with a particular focus on developing tests for the most advanced AI models.
US Commerce Secretary Gina Raimondo and British Technology Secretary Michelle Donelan signed a memorandum of understanding in Washington to collaborate on advanced AI model testing after agreements during the AI Safety Summit at Bletchley Park last November. The joint program will involve the UK’s and US’s AI Safety Institutes working together on research, safety evaluations, and guidance for AI safety.
Why does it matter?
The partnership aims to accelerate the work of both institutes across the full spectrum of AI risks, from national security concerns to broader societal issues. The UK and US plan to conduct at least one joint testing exercise on a publicly accessible model and are considering staff exchanges between the institutes. The two partners are among several countries that have created public AI safety institutions.
In October, British Prime Minister Rishi Sunak said that the UK’s AI Safety Institute would investigate and test new AI models. The US announced in November that it was establishing its own institute to assess threats from frontier AI models, and in February, Secretary Raimondo launched the AI Safety Institute Consortium (AISIC) to partner with 200 firms and organisations. The US-UK partnership is intended to strengthen the special relationship between the two countries and contribute to the global effort to ensure the safe development of AI.
During a crypto event in London, the UK’s Economic Secretary to the Treasury, Bim Afolami, stated that the government is working intensively to deliver new legislation regulating stablecoins and crypto staking. However, no specific details about the regulations were provided, as work in the field is still ongoing.
In 2022, UK Prime Minister Rishi Sunak pledged to establish the country as a global crypto hub, emphasizing the need for crypto firms to be able to invest, innovate, and scale up within the UK. Progress on implementing clearer regulations has been slow, despite calls from cryptocurrency firms for more concise rules.
The UK Law Commission published recommendations in July 2023 suggesting a common law analysis of crypto assets and the establishment of an industry-specific panel of technical experts, academics, and legal practitioners to advise courts on crypto-related legal matters.
On October 30, 2023, the UK government announced plans to introduce more crypto-specific regulations in 2024. This includes bringing the regulation of fiat-backed stablecoins under the purview of the Financial Conduct Authority (FCA).
Shoplifting of high-value items, such as alcohol, steak, and cosmetics, continues to be a significant problem. Thieves target these items due to their value and demand on the market. Goss’s call for Meta to enforce identity and location verification aims to deter potential shoplifters and make it harder for them to anonymously sell stolen goods on the platform.
Under the plan, police forces are now prioritising shoplifting incidents and attending the location where a suspect is being held by store staff. This indicates that shoplifting has become a growing concern, requiring immediate attention and stronger preventive measures.
Currently, there is a self-regulatory system for online advertising overseen by the Advertising Standards Authority (ASA), but it lacks the power to address illegal harms as effectively as it handles harmful advertising by legitimate businesses. The government plans to introduce statutory regulation to tackle illegal paid-for online adverts and enhance child protection. This new regulation will extend responsibilities to major players across the online advertising supply chain, including ad tech intermediary services and promotional posts by social media influencers.
The government will launch a consultation on the specifics of potential legislation, including the preferred choice for a regulator to oversee the new rules. The task force will gather evidence on illegal advertising and collaborate with industry initiatives to protect children and address harmful practices. The proposed regulations aim to strike a balance between internet safety and supporting innovation in online advertising while ensuring transparency, consumer trust, and industry growth.
UK Prime Minister Rishi Sunak has announced that the UK will host the world’s first major global summit on AI safety in autumn 2023. The summit aims to address the risks associated with the rapid development of AI and discuss international cooperation to ensure its safe and responsible use. The announcement coincides with the Prime Minister’s visit to Washington DC to discuss the UK-US approach to opportunities and challenges of emerging technologies.
The summit will focus on examining the potential risks associated with AI, including frontier systems, and will discuss ways to address these risks through collaborative efforts on a global scale. It will build on recent discussions at the G7, OECD, and Global Partnership on AI. Additionally, technology companies, including DeepMind, Anthropic, and Palantir, have expressed support for the summit and highlighted the importance of international collaboration in developing AI safely and ethically.
In parallel, the UK government plans to increase the number of scholarships for post-graduate STEM studies at UK and US universities to enhance expertise in these fields. The increased scholarships include a rise in Marshall scholarships and funding for five new Fulbright scholarships annually, focusing on STEM-related subjects, aiming to strengthen the mutual expertise of the UK and the US in future technologies.
The UK is examining a potential full stack of crypto regulations to be introduced within the next 12 months. The government is aiming to establish the country as a global crypto asset hub and has taken a different approach to crypto regulation compared with other countries. Primarily, the UK seeks to position itself as a destination for crypto innovation and attract companies operating in the cryptocurrency sphere.
The UK is moving fast to create a clear regulatory framework for crypto activity. Several countries are competing to establish themselves as crypto-friendly destinations, while the United States has taken a stricter stance on cryptocurrencies. The UK now has the ability to control its own legislation and make important decisions on crypto regulation. The government has said it will approach this ‘in an agile and proportionate way’, and the proposed new law will focus on key areas such as exchange, custody, and lending activities.
The period for government consultation on the regulations will conclude on April 30. Former UK finance minister and now Prime Minister, Rishi Sunak, expressed his desire last year to position Britain as a leading destination for crypto asset technology. The US SEC and CFTC have been pressing charges against US crypto companies and crypto exchanges.
The Trades Union Congress (TUC) in the UK has shared its opinion on the new UK initiative to establish a single AI watchdog institution to oversee future AI development. The TUC argues that the new UK law will dilute the rights of human workers and has called for stronger protections against AI that makes decisions about workers’ lives and employment. In particular, AI is used in facial recognition technology (to analyse expressions, accents, and tone of voice), in devices that record data about worker activity for later analysis, and in recruiting systems that carry an inherent bias towards certain groups of workers. In the TUC’s view, the UK government has not done enough to ensure AI is used ethically in the workplace, and it has urged that workers be included in discussions around AI regulation.
In its response, the UK government claimed that safeguards will remain in place and that AI development will bring more jobs to the market, helping the economy grow and, consequently, improving working conditions.