Britain’s antitrust regulator, the Competition and Markets Authority (CMA), has launched an investigation into Google’s search operations to assess their impact on consumers, businesses, and competition. With Google handling 90% of UK online searches and supporting over 200,000 businesses through advertising, CMA chief executive Sarah Cardell said the regulator aims to ensure fair competition and innovation in search services.
The probe will evaluate whether Google’s dominant position restricts market entry and innovation, as well as whether it provides preferential treatment to its own services. The CMA will also investigate the company’s extensive collection and use of consumer data, including its role in AI services. The findings, expected within nine months, could lead to measures such as requiring Google to share data with rivals or giving publishers more control over their content.
Google has defended its role, stating that its search services foster innovation and help UK businesses grow. The company pledged to work constructively with the CMA to create rules that benefit both businesses and users. The investigation follows similar scrutiny in the US, where prosecutors have pushed for major reforms to curb Google’s dominance in online search.
Ian Russell, father of Molly Russell, has called on the UK government to take stronger action on online safety, warning that delays in regulation are putting children at risk. In a letter to Prime Minister Sir Keir Starmer, he criticised Ofcom’s approach to enforcing the Online Safety Act, describing it as a “disaster.” Russell accused tech firms, including Meta and X, of prioritising profits over safety and moving towards a more dangerous, unregulated online environment.
Campaigners argue that Ofcom’s guidelines contain major loopholes, particularly in addressing harmful content such as live-streamed material that promotes self-harm and suicide. While the government insists that tech companies must act responsibly, the slow progress of new regulations has raised concerns. Ministers acknowledge that additional legislation may be required as AI technology evolves, introducing new risks that could further undermine online safety.
Russell has been a prominent campaigner for stricter online regulations since his daughter’s death in 2017. Despite the Online Safety Act granting Ofcom the power to fine tech firms, critics believe enforcement remains weak. With concerns growing over the effectiveness of current safeguards, pressure is mounting on the government to act decisively and ensure platforms take greater responsibility in protecting children from harmful content.
British Prime Minister Keir Starmer has announced an ambitious plan to position the UK as a global leader in AI. In a speech on Monday, Starmer outlined proposals to establish specialised zones for data centres and incentivise technology-focused education, aiming to boost economic growth and innovation. According to the government, fully adopting AI could increase productivity by 1.5% annually, adding £47 billion ($57 billion) to the economy each year over the next decade.
Central to the plan is the adoption of recommendations from the “AI Opportunities Action Plan,” authored by venture capitalist Matt Clifford. Measures include fast-tracking planning permissions for data centres and ensuring energy connections, with the first such centre to be built in Culham, Oxfordshire. Starmer emphasised the potential for AI to create jobs, attract investment, and improve lives by streamlining processes like planning consultations and reducing administrative burdens for teachers.
The UK, currently the third-largest AI market behind the US and China, faces stiff global competition in establishing itself as an AI hub. While Starmer pledged swift action to maintain competitiveness, challenges persist. The Labour government’s recent high-tax budget has dampened some business confidence, and the Bank of England reported stagnation in economic growth last quarter. However, Starmer remains optimistic, declaring, “We must move fast and take action.”
By integrating AI into its economic strategy, the UK hopes to capitalise on technological advancements, balancing innovation with regulatory oversight in an increasingly competitive global landscape.
A new app designed to help children aged seven to twelve manage anxiety through gaming is being launched in Lincolnshire, UK. The app, called Lumi Nova, combines cognitive behavioural therapy (CBT) techniques with personalised quests to gently expose children to their fears in a safe and interactive way.
The digital game has been created by BFB Labs, a social enterprise focused on digital therapy, in collaboration with children, parents, and mental health experts. The app aims to make mental health support more accessible, particularly in rural areas, where traditional services may be harder to reach.
Families in Lincolnshire can download the app for free without needing a prescription or referral. Councillor Patricia Bradwell from Lincolnshire County Council highlighted the importance of flexible mental health services, saying: ‘We want to ensure children and young people have easy access to support that suits their needs.’
By using immersive videos and creative tasks, Lumi Nova allows children to confront their worries at their own pace from the comfort of home, making mental health care more engaging and approachable. The year-long pilot aims to assess the app’s impact on childhood anxiety in the region.
The UK’s Competition and Markets Authority (CMA) has announced plans to begin two investigations this month under its new digital markets powers. The new regime is intended to encourage investment, innovation, and growth while targeting only the largest tech firms.
Only companies designated with ‘Strategic Market Status’ (SMS) will face these investigations, with the bar for SMS status set high. Apple and Google were previously identified for potentially limiting competition in mobile ecosystems. Further details on the investigations will be revealed soon, with a third inquiry expected in about six months.
The regulator, which has gained greater merger control powers post-Brexit, was urged by Prime Minister Keir Starmer to focus more on growth. The new regime seeks to balance market competitiveness with the UK’s appeal for tech investment.
Faculty AI, a consultancy with significant expertise in AI, has been developing AI technologies for both civilian and military applications. Known for its close work with the UK government on AI safety, the NHS, and education, Faculty is also exploring the use of AI in military drones. The company has been involved in testing AI models for the UK’s AI Safety Institute (AISI), which was established to study the safety implications of AI.
While Faculty has worked extensively with AI in non-lethal areas, its military work raises concerns about the potential for autonomous weapons systems, including drones. Faculty has not disclosed whether its AI work extends to lethal drones, and it continues to face scrutiny over its dual role of advising the government on AI safety while working with defence clients.
The company has also generated controversy because of its growing influence in both the public and private sectors. Critics, including Green Party politicians, have raised concerns about potential conflicts of interest arising from Faculty’s widespread government contracts and its private-sector AI work, such as its collaborations with OpenAI and defence firms. Faculty’s work on AI safety is seen as crucial, but critics argue that its broad portfolio risks biasing the advice it provides.
Despite these concerns, Faculty maintains that its work is guided by strict ethical policies, and it has emphasised its commitment to ensuring AI is used safely and responsibly, especially in defence applications. As AI continues to evolve, experts call for caution, with discussions about the need for human oversight in the development of autonomous weapons systems growing more urgent.
The UK’s competition regulator, the Competition and Markets Authority (CMA), announced it may accept remedies proposed by Synopsys and Ansys to address concerns over their $35 billion merger. The deal, announced in January of last year, involves Synopsys acquiring Ansys, a company known for its software used in industries like aerospace and sports equipment manufacturing.
The CMA outlined the proposed remedies, which include the sale of Ansys’ power consumption analysis product for digital chips and Synopsys’ global optics and photonics software business. The regulator has until 5 March to decide whether to accept these remedies, though it can extend the deadline to 6 May.
Synopsys expressed satisfaction with the CMA’s progress and reiterated its commitment to working closely with the authority. The outcome of the regulator’s review could significantly impact the completion of the merger, which aims to enhance the companies’ capabilities in chip design software.
The Ministry of Defence announced that the UK is developing its first quantum clock, a cutting-edge device designed to enhance military intelligence and reconnaissance. Created by the Defence Science and Technology Laboratory, the clock boasts unparalleled precision, losing less than one second over billions of years.
By leveraging quantum mechanics to measure atomic energy fluctuations, the technology reduces reliance on vulnerable GPS systems, offering greater resilience against disruption by adversaries. The project marks the first time the UK has built such a device, with deployment anticipated within five years.
While not the world’s first quantum clock (similar technology was pioneered in the US 15 years ago), the UK effort highlights a growing global race in quantum advancements. Quantum clocks hold potential beyond military applications, impacting satellite navigation, telecommunications, and scientific research.
Countries like the United States and China are heavily investing in quantum technology, seeing its transformative potential. Future UK research aims to miniaturise the quantum clock for broader applications, including integration into military vehicles and aircraft, underscoring its strategic importance in defence and industry.
The UK government is intensifying efforts to safeguard children online, with new measures requiring social media platforms to implement robust age verification and protect young users from harmful content. Technology Secretary Peter Kyle highlighted the importance of ‘watertight’ systems, warning that companies failing to comply could face significant fines or even prison terms for executives.
The measures, part of the Online Safety Act passed in 2023, will see platforms penalised for failing to address issues such as bullying, violent content, and risky stunts. Ofcom, the UK’s communications regulator, is set to outline further obligations in January, including stricter ID verification for adult-only apps.
Debate continues over the balance between safety and accessibility. While some advocate for bans similar to Australia’s under-16 restrictions, teenagers consulted by Kyle emphasised the positive aspects of social media, including learning opportunities and community connections. Research into the impact of screen time on mental health is ongoing, with new findings expected next year.
The UK’s Competition and Markets Authority (CMA) has voiced concerns over Synopsys’ proposed $35 billion acquisition of Ansys, claiming the deal could harm innovation, reduce product quality, and increase costs in the semiconductor design and light-simulation software markets. The regulator fears diminished competition could negatively impact UK businesses and consumers, particularly in sectors such as artificial intelligence and cloud computing, which rely heavily on semiconductor technology.
Synopsys, a leader in chip design software, announced the acquisition in January, aiming to combine its tools with Ansys’ diverse software offerings, used in industries ranging from aerospace to consumer goods. However, the CMA has highlighted risks of reduced consumer choice and a potential stifling of advancements in the sector. If these concerns are not adequately addressed, the regulator may initiate an in-depth investigation into the merger.
In response, Synopsys has proposed selling its optical solutions business to Keysight Technologies, a move it believes will satisfy the CMA’s concerns. A company spokesperson expressed confidence in resolving the regulatory hurdles and expects the deal to close in the first half of 2025. The CMA’s final decision could shape the future landscape of competition in the semiconductor and simulation software industries, as global demand for advanced technologies continues to grow.