Russia’s state communications watchdog has intensified its campaign against major foreign platforms by blocking Snapchat and restricting FaceTime calls.
The move follows earlier reports of disrupted Apple services inside the country, though users could still connect through VPNs. Roskomnadzor accused Snapchat of enabling criminal activity and repeated earlier claims targeting Apple’s service.
The decision marks the authorities’ first formal confirmation of limits on both platforms. It arrives as pressure increases on WhatsApp, which remains Russia’s most popular messenger, with officials warning that a full block is possible.
Meta is accused of failing to meet data-localisation rules and of what the authorities describe as repeated violations linked to terrorism and fraud.
Digital rights groups argue that technical restrictions are designed to push citizens toward Max, a government-backed messenger that activists say grants officials sweeping access to private conversations, rather than protecting user privacy.
These measures coincide with wider crackdowns, including the recent blocking of the Roblox gaming platform over allegations of extremist content and harmful influence on children.
The tightening of controls reflects a broader effort to regulate online communication as Russia seeks stronger oversight of digital platforms. The latest blocks add further uncertainty for millions of users who depend on familiar services instead of switching to state-supported alternatives.
Would you like to learn more about AI, tech and digital diplomacy? If so, ask our Diplo chatbot!
A UK pornographic website has been fined £1m by Ofcom for failing to comply with mandatory age verification under the Online Safety Act. The company, AVS Group Ltd, did not respond to repeated contact from the regulator, prompting an additional £50,000 penalty.
The Act requires websites hosting adult content to implement ‘highly effective age assurance’ to prevent children from accessing explicit material. Ofcom has ordered the company to comply within 72 hours or face further daily fines.
Other tech platforms are also under scrutiny, with one unnamed major social media company undergoing compliance checks. Regulators warn that non-compliance will result in formal action, highlighting the growing enforcement of child safety online.
Critics argue the law must be tougher to ensure real protection, particularly for minors and women online. While age checks have reduced UK traffic to some sites, loopholes like VPNs remain a concern, and regulators are pushing for stricter adherence.
Would you like to learn more about AI, tech, and digital diplomacy? If so, ask our Diplo chatbot!
The US tech giant, Google, has announced a $2.1 million Google.org commitment to support Nigeria’s AI-powered future, aiming to strengthen local talent and improve digital safety nationwide.
The initiative supports Nigeria’s National AI Strategy and its ambition to create one million digital jobs, recognising the economic potential of AI, which could add $15 billion to the country’s economy by 2030.
The investment focuses on developing advanced AI skills among students and developers instead of limiting progress to short-term training schemes.
The programme will introduce advanced AI curricula into universities and provide developers with structured, practical routes from training to building real-world products.
The commitment also expands digital safety initiatives so communities can participate securely in the digital economy.
Junior Achievement Africa will scale Google’s ‘Be Internet Awesome’ curriculum to help families understand safe online behaviour, while the CyberSafe Foundation will deliver cybersecurity training and technical assistance to public institutions, strengthening national digital resilience.
Google points to Nigerian learners who have already used digital skills to secure full-time careers rather than remaining excluded from the digital economy, and aims to create more such opportunities.
By combining advanced AI training with improved digital safety, the company intends to support inclusive growth and build long-term capacity across Nigeria.
Would you like to learn more about AI, tech and digital diplomacy? If so, ask our Diplo chatbot!
AI has pushed customer support into a new era, where anticipation replaces reaction. SAP has built a proactive model that predicts issues, prevents failures and keeps critical systems running smoothly instead of relying on queues and manual intervention.
Major sales events, such as Cyber Week and Singles Day, demonstrated the impact of this shift, with uninterrupted service and significant growth in transaction volumes and order numbers.
Self-service now resolves most issues before they reach an engineer, as structured knowledge supports AI agents that respond instantly with a confidence level that matches human performance.
Tools such as the Auto Response Agent and Incident Solution Matching enable customers to retrieve solutions without having to search through lengthy documentation.
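The confidence-gated routing described above can be sketched in a few lines. This is a hypothetical illustration, not SAP's actual implementation: the class names, threshold value, and routing labels are invented for the example.

```python
from dataclasses import dataclass

# Assumed cut-off approximating human-level accuracy; a real system would
# calibrate this against historical resolution data.
CONFIDENCE_THRESHOLD = 0.85

@dataclass
class AgentAnswer:
    text: str
    confidence: float  # model's self-reported score in [0, 1]

def route(answer: AgentAnswer) -> str:
    """Return 'auto' to reply instantly, or 'escalate' to hand off to an engineer."""
    return "auto" if answer.confidence >= CONFIDENCE_THRESHOLD else "escalate"

print(route(AgentAnswer("Clear the cache and retry the job.", 0.92)))  # auto
print(route(AgentAnswer("Possibly a kernel-level issue?", 0.40)))      # escalate
```

The design choice here is the one the article implies: only answers that clear a calibrated confidence bar are sent without human review, so self-service quality stays comparable to an engineer's.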
SAP has also prepared support systems tailored for organisations in the early stages of scaling AI.
Engineers have benefited from AI as much as customers. Routine tasks are handled automatically, allowing experts to focus on problems that demand insight instead of administration.
Language optimisation, routing suggestions, and automatic error categorisation support faster and more accurate resolutions. SAP validates every AI tool internally before release, which it views as a safeguard for responsible adoption.
The company maintains that AI will augment staff rather than replace them. Creative and analytical work becomes increasingly important as automation handles repetitive tasks, and new roles emerge in areas such as AI training and data stewardship.
SAP argues that progress relies on a balanced relationship between human judgement and machine intelligence, strengthened by partnerships that turn enterprise data into measurable outcomes.
Would you like to learn more about AI, tech and digital diplomacy? If so, ask our Diplo chatbot!
Video games have long since outgrown their roots as niche entertainment. What used to be arcades and casual play is now a global cultural phenomenon.
A recent systematic review of research argues that video games play a powerful role in cultural transmission. They allow players worldwide, regardless of language or origin, to absorb cultural, social, and historical references embedded in game narratives.
Importantly, games are not passive media. Their interactivity gives them unique persuasive power. As one academic work on ‘gaming in diplomacy’ puts it, video games stand out among cultural media because they allow for procedural rhetoric, meaning that players learn values, norms, and worldviews not just by watching or hearing, but by actively engaging with them.
As such, gaming has the capacity to transcend borders, languages and traditional media’s constraints. For many young players around the world, including those in developing regions, gaming has become a shared language, a means of connecting across cultures, geographies, and generations.
Esports as soft power and public diplomacy
Nation branding, cultural export and global influence
Several countries have recognised the diplomatic potential of esports and gaming. Waseda University researchers emphasise that esports can be systematically used to project soft power, engaging foreign publics, shaping favourable perceptions, and building cultural influence, rather than being mere entertainment or economic ventures.
A 2025 study shows that the use of ‘game-based cultural diplomacy’ is increasingly common. Countries such as Japan, Poland, and China are utilising video games and associated media to promote their national identity, cultural narratives, and values.
An article about the games Honor of Kings and Black Myth: Wukong describes how the state-backed Chinese gaming industry incorporates traditional Chinese cultural elements (myth, history, aesthetics) into globally consumed games, thereby reaching millions internationally and strengthening China’s soft-power footprint.
For governments seeking to diversify their diplomatic tools beyond traditional media (film, music, diplomatic campaigns), esports offers persistent, globally accessible, and youth-oriented engagement, particularly as global demographics shift toward younger, digital-native generations.
Esports diplomacy in practice: People-to-people exchange
Cross-cultural understanding, community, identity
In bilateral diplomacy, esports has already been proposed as a vehicle for ‘people-to-people exchange.’ For example, a commentary on US–South Korea relations argues that annual esports competitions between the two countries’ top players could serve as a modern, interactive form of public diplomacy, fostering mutual cultural exchange beyond the formalities of traditional diplomacy.
On the grassroots level, esports communities, being global, multilingual and cross-cultural, foster friendships, shared experiences, and identities that transcend geography. That democratises participation: no diplomatic credentials or state backing are needed, only access and engagement.
Some analyses emphasise how digital competition and community-building in esports ‘bridge cultural differences, foster international collaboration and cultural diversity through shared language and competition.’
From a theoretical perspective, applying frameworks from sports diplomacy to esports, supported by academic proposals, offers a path to sustainable and legitimate global engagement through gaming, if regulatory, equality and governance challenges are addressed.
Tensions & challenges: Not just a soft-power fairy tale
Risk of ‘techno-nationalism’ and propaganda
The use of video games in diplomacy is not purely benign. Some scholars warn of ‘digital nationalism’ or ‘techno-nationalism,’ where games become tools for propagating state narratives, shaping collective memory, and exporting political or ideological agendas.
The embedding of cultural or historical motifs in games (mythology, national heritage, symbols) can blur the line between cultural exchange and political messaging. While this can foster appreciation for a culture, it may also serve more strategic geopolitical or soft-power aims.
From a governance perspective, the rapid growth of esports raises legitimate concerns about inequality (access, digital divide), regulation, legitimacy of representation (who speaks for ‘a nation’), and possible exploitation of youth. Some academic literature argues that without proper regulation and institutionalisation, the ‘esports diplomacy gold rush’ risks being unsustainable.
Why this matters and what it means for Africa and the Global South
For regions such as Africa, gaming and esports represent not only recreation but potential platforms for youth empowerment, cultural expression, and international engagement. Even where traditional media or sports infrastructure may be limited, digital games can provide global reach and visibility. That aligns with the idea of ‘future pathways’ for youth, which includes creativity, community-building and cross-cultural exchange.
Because games can transcend language and geography, they offer a unique medium for diaspora communities, marginalised youth, and underrepresented cultures to project identity, share stories, and engage with global audiences. In that sense, gaming democratises cultural participation and soft-power capabilities.
On a geopolitical level, as game-based diplomacy becomes more recognised, Global South countries may leverage it to assert soft power, attract investment, and promote tourism or cultural heritage, provided they build local capacity (developers, esports infrastructure, regulation) and ensure inclusive access.
Gaming & esports as emerging diplomatic infrastructure
The trend suggests that video games and esports are steadily being institutionalised as instruments of digital diplomacy, soft power, and cultural diplomacy, not only by private companies, but increasingly by states and policymakers. Academic bibliometric analysis shows a growing number of studies (2015–2024) dedicated to ‘game-based cultural diplomacy.’
As esports ecosystems grow, with tournaments, global fan bases and cultural exports, we may see a shift from occasional ‘cultural-diplomacy events’ to sustained, long-term strategies employing gaming to shape international perceptions, build transnational communities, and influence foreign publics.
However, for this potential to be realised responsibly, key challenges must be addressed. Those challenges include inequality of access (digital divide), transparency over cultural or political messaging, fair regulation, and safeguarding inclusivity.
Would you like to learn more about AI, tech and digital diplomacy? If so, ask our Diplo chatbot!
Quantum computing is advancing as governments and industry pursue new frontiers beyond AI. The UK benefits from strong research traditions and skilled talent. Policymakers see early planning as vital for long-term competitiveness.
Companies across finance, energy and logistics are testing quantum methods for optimisation and modelling. Early pilots suggest that quantum techniques may offer advantages where classical approaches slow down or fail to scale. Interest in practical applications is rising across Europe.
The UK benefits from strong university spinouts and deep industrial partnerships. Joint programmes are accelerating work on molecular modelling and drug discovery. Many researchers argue that early experimentation helps build a more resilient quantum workforce.
New processors promise higher connectivity and lower error rates as the field moves closer to quantum advantage. Research teams are refining designs for future error-corrected systems. Hardware roadmaps indicate steady progress towards more reliable architectures.
Policy support will shape how quickly the UK can translate research into real-world capability. Long-term investments, open scientific collaboration and predictable regulation will be critical. Momentum suggests a decisive period for the country’s quantum ambitions.
Would you like to learn more about AI, tech, and digital diplomacy? If so, ask our Diplo chatbot!
Meta has begun removing Australian users under 16 from Facebook, Instagram and Threads ahead of a national ban taking effect on 10 December. Canberra requires major platforms to block younger users or face substantial financial penalties.
Meta says it is deleting accounts it reasonably believes belong to users under 16, while allowing them to download their data. Authorities expect hundreds of thousands of adolescents to be affected, given Instagram’s large cohort of 13 to 15 year olds.
Regulators argue the law addresses harmful recommendation systems and exploitative content, though YouTube has warned that safety filters will weaken for unregistered viewers. The Australian communications minister has insisted platforms must strengthen their own protections.
Rights groups have challenged the law in court, claiming unjust limits on expression. Officials concede teenagers may try using fake identification or AI-altered images, yet still expect platforms to deploy strong countermeasures.
Would you like to learn more about AI, tech and digital diplomacy? If so, ask our Diplo chatbot!
EU regulators are preparing to enforce the Cyber Resilience Act, setting core security requirements for digital products in the European market. The law spans software, hardware, and firmware, establishing shared expectations for secure development and maintenance.
Scope captures apps, embedded systems, and cloud-linked features. Risk classes run from default to critical, directing firms to self-assess or undergo third-party checks. Any product placed on the market after December 2027 must align with the regulation.
Obligations apply to manufacturers, importers, distributors, and developers. Duties include secure-by-design practices, documented risk analysis, disclosure procedures, and long-term support. Firms must notify ENISA within 24 hours of active exploitation and provide follow-up reports on a strict timeline.
Compliance requires technical files covering threat assessments, update plans, and software bills of materials. High-risk categories demand third-party evaluation, while lower-risk segments may rely on internal checks. Existing certifications help, but cannot replace CRA-specific conformity work.
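One artefact from those technical files, a software bill of materials, can be sketched as data. The field names below follow the public CycloneDX JSON format; the component entry itself is invented for illustration, and a real SBOM would typically be generated by build tooling rather than written by hand.

```python
import json

# Minimal, hypothetical CycloneDX-style SBOM fragment: a declaration of
# what third-party components ship inside a product, one input to
# CRA-style conformity documentation.
sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.5",
    "version": 1,
    "components": [
        {
            "type": "library",
            "name": "openssl",          # example dependency, invented entry
            "version": "3.0.13",
            "purl": "pkg:generic/openssl@3.0.13",
        },
    ],
}

print(json.dumps(sbom, indent=2))
```

Keeping such an inventory machine-readable is what makes the CRA's other duties (risk analysis, update plans, vulnerability disclosure) tractable, since affected components can be matched against advisories automatically.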
Non-compliance risks fines, market restrictions, and reputational damage. Organisations preparing early are urged to classify products, run gap assessments, build structured roadmaps, and align development cycles with CRA guidance. EU authorities plan to provide templates and support as firms transition.
Would you like to learn more about AI, tech, and digital diplomacy? If so, ask our Diplo chatbot!
Yesterday, Canada released the CAN-ASC-6.2 – Accessible and Equitable Artificial Intelligence Systems standard, marking the first national standard focused specifically on accessible AI.
The framework ensures AI systems are inclusive, fair, and accessible from design through deployment. Its release coincides with the International Day of Persons with Disabilities, emphasising Canada’s commitment to accessibility and inclusion.
The standard guides organisations and developers in creating AI that accommodates people with disabilities, promotes fairness, prevents exclusion, and maintains accessibility throughout the AI lifecycle.
It provides practical processes for equity in AI development and encourages education on accessible AI practices.
The standard was developed by a technical committee composed largely of people with disabilities and members of equity-deserving groups, incorporating public feedback from Canadians of diverse backgrounds.
Approved by the Standards Council of Canada, CAN-ASC-6.2 meets national requirements for standards development and aligns with international best practices.
Moreover, the standard is available for free in both official languages and accessible formats, including plain language, American Sign Language and Langue des signes québécoise.
By setting clear guidelines, Canada aims to ensure AI serves all citizens equitably and strengthens workforce inclusion, societal participation, and technological fairness.
The initiative highlights Canada’s leadership in accessible technology and provides a practical tool for organisations implementing inclusive AI systems.
Would you like to learn more about AI, tech and digital diplomacy? If so, ask our Diplo chatbot!
OpenAI Foundation has named the first recipients of the People-First AI Fund, awarding $40.5 million to 208 community groups across the United States. The grants will be disbursed by the end of the year, with a further $9.5 million in Board-directed funding to follow.
Applications were shaped by nationwide listening sessions and recommendations from an independent Nonprofit Commission. Nearly 3,000 organisations applied, underscoring strong demand for support across US communities. Final selections were made following a multi-stage human review involving external experts.
Grantees span digital literacy programmes, rural health initiatives and Indigenous media networks. Many operate with limited exposure to AI, reflecting the fund’s commitment to trusted, community-centred groups. California features prominently, consistent with the Foundation’s ties to its home state.
Funded projects span primary care, youth training in agricultural areas, and Tribal AI literacy work. Groups are also applying AI to food networks, disability education, arts and local business support. Each organisation sets priorities through flexible grants.
The programme focuses on AI literacy, community innovation and economic opportunity, with further grants targeting sector-level transformation. OpenAI Foundation says it will continue learning alongside grantees and supporting efforts that broaden opportunity while grounding AI adoption in local US needs.
Would you like to learn more about AI, tech, and digital diplomacy? If so, ask our Diplo chatbot!