AI boom drives massive surge in data centre power demand

According to Goldman Sachs, the surge in AI is set to transform global energy markets, with data centres expected to consume 165% more electricity by 2030 compared to 2023. The bank reports that US spending on data centre construction has tripled in just three years, while occupancy rates at existing facilities remain close to record highs.

The demand is driven by hyperscale operators like Amazon Web Services, Microsoft Azure, and Google Cloud, which are rapidly expanding their infrastructure to meet the power-hungry needs of AI systems.

Global data centres currently draw about 55 gigawatts of power, more than half of which supports cloud computing. Traditional workloads such as email and storage still account for a third, while AI represents just 14%.

However, Goldman Sachs projects that by 2027, overall consumption could rise to 84 gigawatts, with AI’s share growing to over a quarter. That shift is straining grids and pushing operators toward new solutions, as a rack of AI servers can consume ten times more electricity than a traditional one.

Meeting this demand will require massive investment. Goldman Sachs estimates that global grid upgrades could cost as much as US$720 billion by 2030, with US utilities alone needing an additional US$50 billion in new generation capacity for data centres.

While renewables like wind and solar are increasingly cost-competitive, their intermittent output means operators lean on hybrid models with backup gas and battery storage. At the same time, technology companies are reviving interest in nuclear power, with contracts for over 10 gigawatts of new capacity signed in the US last year.

The expansion is most evident in Europe and North America, with Nordic countries, Spain, and France attracting investment due to their renewable energy resources. At the same time, hubs like Germany, Britain, and Ireland rely on incentives and established ecosystems. Yet uncertainty remains.

Advances like DeepSeek, a Chinese AI model reportedly as capable as US systems but more efficient, could temper power demand growth. For now, however, the trajectory is clear: AI is reshaping the data centre industry and the global energy landscape.

Would you like to learn more about AI, tech and digital diplomacy? If so, ask our Diplo chatbot!

Beijing seeks to curb excess AI investment while sustaining growth

China has pledged to rein in excessive competition in AI, signalling Beijing’s desire to avoid wasteful investment while keeping the technology central to its economic strategy.

The National Development and Reform Commission stated that provinces should develop AI in a coordinated manner, leveraging local strengths to prevent duplication and overlap. Officials in China emphasised the importance of orderly flows of talent, capital, and resources.

The move follows President Xi Jinping’s warnings about unchecked local investment. Authorities aim to prevent overcapacity problems, such as those seen in electric vehicles, which have fuelled deflationary pressures in other industries.

While global investment in data centres has surged, Beijing is adopting a calibrated approach. The state also vowed stronger national planning and support for private firms, aiming to nurture new domestic leaders in AI.

At the same time, policymakers are pushing to attract private capital into traditional sectors, while considering more central spending on social projects to ease local government debt burdens and stimulate long-term consumption.

Meta faces turmoil as AI hiring spree backfires

Mark Zuckerberg’s ambitious plan to assemble a dream team of AI researchers at Meta has instead created internal instability.

High-profile recruits poached from rival firms have begun leaving within weeks of joining, citing cultural clashes and frustration with the company’s working style. Their departures have disrupted projects and unsettled long-time executives.

Meta had hoped its aggressive hiring spree would help the company rival OpenAI, Google, and Anthropic in developing advanced AI systems.

Instead of strengthening the company’s position, the strategy has led to delays in projects and uncertainty about whether Meta can deliver on its promises of achieving superintelligence.

The new arrivals were given extensive autonomy, fuelling tensions with existing teams and creating leadership friction. Some staff viewed the hires as destabilising, while others expressed concern about the direction of the AI division.

The resulting turnover has left Meta struggling to maintain momentum in its most critical area of research.

As Meta faces mounting pressure to demonstrate progress in AI, the setbacks highlight the difficulty of retaining elite talent in a fiercely competitive field.

Zuckerberg’s recruitment drive, rather than propelling Meta ahead, risks slowing down the company’s ability to compete at the highest level of AI development.

Stethoscope with AI identifies heart issues in seconds

A new stethoscope powered by AI could enable doctors to identify three serious heart conditions in just seconds, according to UK researchers.

The device replaces the traditional chest piece with a small sensor that records both electrical signals from the heart and the sound of blood flow, which are then analysed in the cloud by AI trained on large datasets.

The AI tool has shown strong results in trials across more than 200 GP practices: patients tested with the stethoscope were more than twice as likely to be diagnosed with heart failure within 12 months as those assessed through usual care.

Patients were also 3.45 times more likely to be diagnosed with atrial fibrillation and almost twice as likely to have heart valve disease identified.

Researchers from Imperial College London and Imperial College Healthcare NHS Trust said the technology could help doctors provide treatment at an earlier stage instead of waiting until patients present in hospital with advanced symptoms.

The findings, from a trial known as Tricorder, will be presented at the European Society of Cardiology Congress in Madrid.

The project, supported by the National Institute for Health and Care Research, is now preparing for further rollouts in Wales, south London and Sussex. Experts described the innovation as a significant step in updating a medical tool that has remained largely unchanged for over 200 years.

How local LLMs are changing AI access

As AI adoption rises, more users explore running large language models (LLMs) locally instead of relying on cloud providers.

Local deployment gives individuals control over data, reduces costs, and avoids limits imposed by AI-as-a-service companies. Advances in software tooling and consumer hardware now make it practical for users to experiment with AI on their own machines.

Concerns over privacy and data sovereignty are driving interest. Many cloud AI services retain user data for years, even when privacy assurances are offered.

By running models locally, companies and hobbyists can ensure compliance with GDPR and maintain control over sensitive information while leveraging high-performance AI tools.

Hardware considerations like GPU memory and processing power are central to local LLM performance. Quantisation techniques allow models to run efficiently with reduced precision, enabling use on consumer-grade machines or enterprise hardware.
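As a minimal illustration of the idea (a sketch of the general technique, not any specific framework’s implementation), symmetric 8-bit quantisation stores each floating-point weight as a small integer plus one shared scale factor:

```python
def quantize_int8(weights):
    """Symmetric int8 quantisation: map each float weight to an
    integer in [-127, 127], sharing a single float scale factor."""
    scale = max(abs(w) for w in weights) / 127 or 1.0  # avoid zero scale
    return [round(w / scale) for w in weights], scale

def dequantize(q, scale):
    """Recover approximate float weights from the stored integers."""
    return [v * scale for v in q]

weights = [0.12, -0.5, 0.33, 0.9, -0.07]
q, scale = quantize_int8(weights)
approx = dequantize(q, scale)
# each recovered value differs from the original by at most
# half a quantisation step (scale / 2)
```

Each weight now occupies one byte instead of four, which is roughly how 8-bit (and, with extra bookkeeping, 4-bit) formats shrink a model enough to fit in consumer GPU or CPU memory at the cost of a small precision loss.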

Software frameworks like llama.cpp, Jan, and LM Studio simplify deployment, making local AI accessible to non-engineers and professionals across industries.

Local models are suitable for personalised tasks, learning, coding assistance, and experimentation, although cloud models remain stronger for large-scale enterprise applications.

As tools and model quality improve, running AI on personal devices may become a standard alternative, giving users more control over cost, privacy, and performance.

India to host OpenAI’s new Stargate data centre

OpenAI is preparing to build a significant new data centre in India as part of its Stargate AI infrastructure initiative. The move will expand the company’s presence in Asia and strengthen its operations in its second-largest market by user base.

OpenAI has already registered as a legal entity in India and begun assembling a local team.

The company plans to open its first office in New Delhi later this year. Details regarding the exact location and timeline of the proposed data centre remain unclear, though CEO Sam Altman may provide further information during his upcoming visit to India.

The project represents a strategic step to support the company’s growing regional AI ambitions.

OpenAI’s Stargate initiative, announced by US President Donald Trump in January, involves private sector investment of up to US$500 billion for AI infrastructure, backed by SoftBank, OpenAI, and Oracle.

The initiative seeks to develop large-scale AI capabilities across major markets worldwide, with the India data centre potentially playing a key role in the efforts.

The expansion highlights OpenAI’s focus on scaling its AI infrastructure while meeting regional demand. The company intends to strengthen operational efficiency, improve service reliability, and support its long-term growth in Asia by establishing local offices and a significant data centre.

Schneider joins SK Telecom on new AI data centre project in Ulsan

SK Telecom has expanded its partnership with Schneider Electric to develop an AI Data Centre (AIDC) in Ulsan.

Under the deal, Schneider Electric will supply mechanical, electrical and plumbing equipment, such as switchgear, transformers, automated control systems and Uninterruptible Power Supply units.

The agreement builds on a partnership announced at Mobile World Congress 2025 and includes using Schneider’s Electrical Transient Analyser Program within SK Telecom’s data centre management system.

It will allow operations to be optimised through a digital twin model instead of relying only on traditional monitoring tools.

Both companies have also agreed on prefabricated solutions to shorten construction times, reference designs for new facilities, and joint efforts to grow the Energy-as-a-Service business.

A Memorandum of Understanding extends the partnership to other SK Group affiliates, combining battery technologies with Uninterruptible Power Supply and Energy Storage Systems.

Executives said the collaboration would help set new standards for AI data centres and create synergies across the SK Group. It is also expected to support SK Telecom’s broader AI strategy while contributing to sustainable and efficient infrastructure development.

Econet brings smart tech to Zimbabwe Agricultural Show to support farmers

Econet Wireless Zimbabwe is showcasing its latest technologies at the 2025 Zimbabwe Agricultural Show under the theme ‘Building Bridges: Connecting Agriculture, Industry & Community’.

The company is engaging thousands of visitors, including farmers and policymakers, by spotlighting digital inclusive finance, insurance and smart infrastructure innovations.

The display features EcoCash mobile payments, Moovah Insurance for agricultural and business risks, and digital entertainment platforms. A standout addition is Econet’s smart water meters, which provide real-time monitoring to help farmers and utilities manage water use, minimise waste and support sustainable development in agriculture.

Econet emphasises that these solutions reinforce its vision of empowering communities through accessible technology. Smart infrastructure and financial tools are presented as vital enablers for productivity, resilience and economic inclusion in Zimbabwe’s agricultural sector.

ENISA takes charge of new EU Cybersecurity Reserve operations with €36 million in funding

The European Commission has signed a contribution agreement with the European Union Agency for Cybersecurity (ENISA), assigning the agency responsibility for operating and administering the EU Cybersecurity Reserve.

The arrangement includes a €36 million allocation over three years, complementing ENISA’s existing budget.

The EU Cybersecurity Reserve, established under the EU Cyber Solidarity Act, will provide incident response services through trusted managed security providers.

The services are designed to support EU Member States, institutions, and critical sectors in responding to large-scale cybersecurity incidents, with access also available to third countries associated with the Digital Europe Programme.

ENISA will oversee the procurement of these services and assess requests from national authorities and EU bodies, while also working with the Commission and EU-CyCLONe to coordinate crisis response.

If not activated for incident response, the pre-committed services may be redirected towards prevention and preparedness measures.

The reserve is expected to become fully operational by the end of 2025, aligning with the planned conclusion of ENISA’s existing Cybersecurity Support Action in 2026.

ENISA is also preparing a candidate certification scheme for Managed Security Services, with a focus on incident response, in line with the Cyber Solidarity Act.

Quantum computing production expands with Shenzhen’s factory project in China

China has begun construction on its first facility dedicated to the production of photonic quantum computers in Shenzhen, Guangdong Province. The project marks a step toward the development of large-scale quantum computing capabilities in the country.

The factory, led by Beijing-based quantum computing company QBoson, is expected to manufacture several dozen photonic quantum computers each year once operations begin.

QBoson’s founder, Wen Kai, explained that photonic quantum computing uses the quantum properties of light and is viewed as a promising path in the field.

Compared with other approaches, it does not require extremely low temperatures to function and offers advantages such as stable operation at room temperature, a higher number of qubits, and longer coherence times.

The upcoming facility will be divided into three core areas: module development, full-system production, and quality testing. Construction is already underway, and equipment installation is scheduled to begin by the end of October.
