Russia’s state communications watchdog has intensified its campaign against major foreign platforms by blocking Snapchat and restricting FaceTime calls.
The move follows earlier reports of disrupted Apple services inside the country, although users could still connect through VPNs rather than direct access. Roskomnadzor accused Snapchat of enabling criminal activity and repeated earlier claims targeting Apple’s service.
The decision marks the authorities’ first formal confirmation of limits on both platforms. It arrives as pressure mounts on WhatsApp, which remains Russia’s most popular messenger, with officials warning that a full block is possible.
Meta is accused of failing to meet data-localisation rules and of what the authorities describe as repeated violations linked to terrorism and fraud.
Digital rights groups argue that technical restrictions are designed to push citizens toward Max, a government-backed messenger that activists say grants officials sweeping access to private conversations, rather than protecting user privacy.
These measures coincide with wider crackdowns, including the recent blocking of the Roblox gaming platform over allegations of extremist content and harmful influence on children.
The tightening of controls reflects a broader effort to regulate online communication as Russia seeks stronger oversight of digital platforms. The latest blocks add further uncertainty for millions of users who depend on familiar services instead of switching to state-supported alternatives.
Would you like to learn more about AI, tech and digital diplomacy? If so, ask our Diplo chatbot!
Google has made Workspace Studio generally available, allowing employees to design, manage, and share AI agents directly within Workspace. Powered by Gemini 3, these agents automate tasks ranging from simple routines to complex business workflows, all without coding.
The platform aims to save time on repetitive work, freeing employees to focus on higher-value activities.
Agents can understand context, reason through problems, and integrate with core Workspace apps such as Gmail, Drive, and Chat, as well as enterprise platforms like Asana, Jira, Mailchimp, and Salesforce.
Early adopters, including cleaning solutions leader Kärcher, have used Workspace Studio to streamline workflows, reducing planning time by up to 90% and completing what were once multiple separate tasks in a single minute.
Workspace Studio allows users to build agents using templates or natural language prompts, making automation accessible to non-specialists. Agents can manage status reports, reminders, email triage, and critical tasks, such as legal notices or travel requests.
Teams can also easily share agents, ensuring collaboration and consistency across workflows.
The rollout to business customers will continue over the coming weeks. Users can start creating agents immediately, explore templates, use prompts for automations, and join the Gemini Alpha program to test early features and controls.
Generative AI is rapidly altering the political campaign landscape, argues the ORF article, which outlines how election teams worldwide are adopting AI tools for persuasion, outreach and content creation.
Campaigns can now generate customised messages for different voter groups, produce multilingual content at scale, and automate much of the traditional grunt work of campaigning.
On one hand, proponents say the technology makes campaigning more efficient and accessible, particularly in multilingual or resource-constrained settings. But the ease and speed with which content can be generated also lowers the barrier for misuse: AI-driven deepfakes, synthetic voices and disinformation campaigns can be deployed to mislead voters or distort public discourse.
Recent research supports these worries. For example, a large-scale study published in Science and Nature demonstrated that AI chatbots can influence voter opinions, swaying a non-trivial share of undecided voters toward a target candidate simply by presenting persuasive content.
Meanwhile, independent analyses show that during the 2024 US election campaign, a noticeable fraction of content on social media was AI-generated, sometimes used to spread misleading narratives or exaggerate support for certain candidates.
For democracy and governance, the shift poses thorny challenges. AI-driven campaigns risk eroding public trust, exacerbating polarisation and undermining electoral legitimacy. Regulators and policymakers now face pressure to devise new safeguards, such as transparency requirements around AI usage in political advertising, stronger fact-checking, and clearer accountability for misuse.
The ORF article argues these debates should start now, before AI becomes so entrenched that rollback is impossible.
Authorities in Tokyo have issued an arrest warrant for a 17-year-old boy from Osaka on suspicion of orchestrating a large-scale cyberattack using artificial intelligence. The alleged target was the operator of the Kaikatsu Club internet-café chain (along with a related fitness-gym business), and the attack may have exposed the personal data of about 7.3 million customers.
According to investigators, the suspect used a computer programme, reportedly built with help from an AI chatbot, to send unauthorised commands around 7.24 million times to the company’s servers in order to extract membership information. The teenager was previously arrested in November in connection with a separate fraud case involving credit-card misuse.
Police have charged him under Japan’s law against unauthorised computer access and for obstructing business, though so far no evidence has emerged of misuse (for example, resale or public leaks) of the stolen data.
In his statement to investigators, the suspect reportedly said he carried out the hack simply because he found it fun to probe system vulnerabilities.
This case is the latest in a growing pattern of so-called AI-enabled cyber crimes in Japan, from fraudulent subscription schemes to ransomware generation. Experts warn that generative AI is lowering the barrier to entry for complex attacks, enabling individuals with limited technical training to carry out large-scale hacking or fraud.
AI has pushed customer support into a new era, where anticipation replaces reaction. SAP has built a proactive model that predicts issues, prevents failures and keeps critical systems running smoothly instead of relying on queues and manual intervention.
Major sales events, such as Cyber Week and Singles Day, demonstrated the impact of this shift, with uninterrupted service and significant growth in transaction volumes and order numbers.
Self-service now resolves most issues before they reach an engineer, as structured knowledge supports AI agents that respond instantly with a confidence level that matches human performance.
Tools such as the Auto Response Agent and Incident Solution Matching enable customers to retrieve solutions without having to search through lengthy documentation.
SAP has also built support systems tailored for early deployment, helping organisations scale AI.
Engineers have benefited from AI as much as customers. Routine tasks are handled automatically, allowing experts to focus on problems that demand insight instead of administration.
Language optimisation, routing suggestions, and automatic error categorisation support faster and more accurate resolutions. SAP validates every AI tool internally before release, which it views as a safeguard for responsible adoption.
The company maintains that AI will augment staff rather than replace them. Creative and analytical work becomes increasingly important as automation handles repetitive tasks, and new roles emerge in areas such as AI training and data stewardship.
SAP argues that progress relies on a balanced relationship between human judgement and machine intelligence, strengthened by partnerships that turn enterprise data into measurable outcomes.
Video games have long since outgrown their roots as niche entertainment. What used to be arcades and casual play is now a global cultural phenomenon.
A recent systematic review of research argues that video games play a powerful role in cultural transmission. They allow players worldwide, regardless of language or origin, to absorb cultural, social, and historical references embedded in game narratives.
Importantly, games are not passive media. Their interactivity gives them unique persuasive power. As one academic work on ‘gaming in diplomacy’ puts it, video games stand out among cultural media because they allow for procedural rhetoric, meaning that players learn values, norms, and worldviews not just by watching or hearing, but by actively engaging with them.
As such, gaming has the capacity to transcend borders, languages and traditional media’s constraints. For many young players around the world, including those in developing regions, gaming has become a shared language, a means of connecting across cultures, geographies, and generations.
Esports as soft power and public diplomacy
Nation branding, cultural export and global influence
Several countries have recognised the diplomatic potential of esports and gaming. Waseda University researchers emphasise that esports, rather than being mere entertainment or an economic venture, can be systematically used to project soft power: engaging foreign publics, shaping favourable perceptions, and building cultural influence.
A 2025 study shows that the use of ‘game-based cultural diplomacy’ is increasingly common. Countries such as Japan, Poland, and China are utilising video games and associated media to promote their national identity, cultural narratives, and values.
An article about the games Honor of Kings and Black Myth: Wukong describes how the state-backed Chinese gaming industry incorporates traditional Chinese cultural elements (myth, history, aesthetics) into globally consumed games, thereby reaching millions internationally and strengthening China’s soft-power footprint.
For governments seeking to diversify their diplomatic tools beyond traditional media (film, music, diplomatic campaigns), esports offers persistent, globally accessible, and youth-oriented engagement, particularly as global demographics shift toward younger, digital-native generations.
Esports diplomacy in practice: People-to-people exchange
Cross-cultural understanding, community, identity
In bilateral diplomacy, esports has already been proposed as a vehicle for ‘people-to-people exchange.’ For example, a commentary on US–South Korea relations argues that annual esports competitions between the two countries’ top players could serve as a modern, interactive form of public diplomacy, fostering mutual cultural exchange beyond the formalities of traditional diplomacy.
On the grassroots level, esports communities, being global, multilingual and cross-cultural, foster friendships, shared experiences, and identities that transcend geography. This democratises participation: no diplomatic credentials or state backing are required, only access and engagement.
Some analyses emphasise how digital competition and community-building in esports ‘bridge cultural differences, foster international collaboration and cultural diversity through shared language and competition.’
From a theoretical perspective, applying frameworks from sports diplomacy to esports, supported by academic proposals, offers a path to sustainable and legitimate global engagement through gaming, if regulatory, equality and governance challenges are addressed.
Tensions & challenges: Not just a soft-power fairy tale
Risk of ‘techno-nationalism’ and propaganda
The use of video games in diplomacy is not purely benign. Some scholars warn of ‘digital nationalism’ or ‘techno-nationalism,’ where games become tools for propagating state narratives, shaping collective memory, and exporting political or ideological agendas.
The embedding of cultural or historical motifs in games (mythology, national heritage, symbols) can blur the line between cultural exchange and political messaging. While this can foster appreciation for a culture, it may also serve more strategic geopolitical or soft-power aims.
From a governance perspective, the rapid growth of esports raises legitimate concerns about inequality (access, digital divide), regulation, legitimacy of representation (who speaks for ‘a nation’), and possible exploitation of youth. Some academic literature argues that without proper regulation and institutionalisation, the ‘esports diplomacy gold rush’ risks being unsustainable.
Why this matters and what it means for Africa and the Global South
For regions such as Africa, gaming and esports represent not only recreation but potential platforms for youth empowerment, cultural expression, and international engagement. Even where traditional media or sports infrastructure may be limited, digital games can provide global reach and visibility. That aligns with the idea of ‘future pathways’ for youth, which includes creativity, community-building and cross-cultural exchange.
Because games can transcend language and geography, they offer a unique medium for diaspora communities, marginalised youth, and underrepresented cultures to project identity, share stories, and engage with global audiences. In that sense, gaming democratises cultural participation and soft-power capabilities.
On a geopolitical level, as game-based diplomacy becomes more recognised, Global South countries may leverage it to assert soft power, attract investment, and promote tourism or cultural heritage, provided they build local capacity (developers, esports infrastructure, regulation) and ensure inclusive access.
Gaming & esports as emerging diplomatic infrastructure
The trend suggests that video games and esports are steadily being institutionalised as instruments of digital diplomacy, soft power, and cultural diplomacy, not only by private companies, but increasingly by states and policymakers. Academic bibliometric analysis shows a growing number of studies (2015–2024) dedicated to ‘game-based cultural diplomacy.’
As esports ecosystems grow, with tournaments, global fan bases and cultural exports, we may see a shift from occasional ‘cultural-diplomacy events’ to sustained, long-term strategies employing gaming to shape international perceptions, build transnational communities, and influence foreign publics.
However, for this potential to be realised responsibly, key challenges must be addressed. Those challenges include inequality of access (digital divide), transparency over cultural or political messaging, fair regulation, and safeguarding inclusivity.
Meta has begun removing Australian users under 16 from Facebook, Instagram and Threads ahead of a national ban taking effect on 10 December. Canberra requires major platforms to block younger users or face substantial financial penalties.
Meta says it is deleting accounts it reasonably believes belong to under-16 users while allowing them to download their data. Authorities expect hundreds of thousands of adolescents to be affected, given Instagram’s large cohort of 13-to-15-year-olds.
Regulators argue the law addresses harmful recommendation systems and exploitative content, though YouTube has warned that safety filters will weaken for unregistered viewers. The Australian communications minister has insisted platforms must strengthen their own protections.
Rights groups have challenged the law in court, claiming unjust limits on expression. Officials concede teenagers may try using fake identification or AI-altered images, yet still expect platforms to deploy strong countermeasures.
Yesterday, Canada released CAN-ASC-6.2 – Accessible and Equitable Artificial Intelligence Systems, the first national standard focused specifically on accessible AI.
The framework ensures AI systems are inclusive, fair, and accessible from design through deployment. Its release coincides with the International Day of Persons with Disabilities, underscoring Canada’s commitment to accessibility and inclusion.
The standard guides organisations and developers in creating AI that accommodates people with disabilities, promotes fairness, prevents exclusion, and maintains accessibility throughout the AI lifecycle.
It provides practical processes for equity in AI development and encourages education on accessible AI practices.
The standard was developed by a technical committee composed largely of people with disabilities and members of equity-deserving groups, incorporating public feedback from Canadians of diverse backgrounds.
Approved by the Standards Council of Canada, CAN-ASC-6.2 meets national requirements for standards development and aligns with international best practices.
Moreover, the standard is available for free in both official languages and accessible formats, including plain language, American Sign Language and Langue des signes québécoise.
By setting clear guidelines, Canada aims to ensure AI serves all citizens equitably and strengthens workforce inclusion, societal participation, and technological fairness.
The initiative highlights Canada’s leadership in accessible technology and gives organisations a practical tool for implementing inclusive AI systems.
EU judges have ruled that online marketplaces must verify advertisers’ identities before publishing personal data. The judgment arose from a Romanian case involving an abusive anonymous advertisement containing sensitive information.
The Court found that marketplace operators influence the purposes and means of processing and therefore act as joint controllers. They must identify sensitive data before publication and ensure that consent or another lawful basis exists.
Judges also held that anonymous users cannot lawfully publish sensitive personal data without proving the data subject’s explicit agreement. Platforms must refuse publication when identity checks fail or when no valid GDPR ground applies.
Operators must introduce safeguards to prevent unlawful copying of sensitive content across other websites. The Court confirmed that exemptions under E-commerce rules cannot override GDPR accountability duties.
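The ruling effectively describes a gatekeeping workflow for publication: verify the advertiser’s identity, check for sensitive data, and refuse when no valid GDPR ground applies. A minimal sketch of that logic, using a hypothetical `Advertisement` record and `may_publish` check (these names are illustrative, not from the judgment or any real platform):

```python
from dataclasses import dataclass

@dataclass
class Advertisement:
    advertiser_verified: bool      # did the identity check succeed?
    contains_sensitive_data: bool  # e.g. health, ethnicity, beliefs
    has_explicit_consent: bool     # data subject's explicit agreement
    has_other_lawful_basis: bool   # another valid GDPR ground

def may_publish(ad: Advertisement) -> bool:
    """Refuse publication when identity checks fail or when sensitive
    data lacks explicit consent or another lawful basis."""
    if not ad.advertiser_verified:
        return False
    if ad.contains_sensitive_data:
        return ad.has_explicit_consent or ad.has_other_lawful_basis
    return True
```

In this sketch, an anonymous (unverified) advertiser is always refused, mirroring the Court’s position that anonymous users cannot lawfully publish sensitive personal data.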
Leaders from academia and industry in Hyderabad, India, are stressing that humans must remain central in decision-making as AI and automation expand across society. Collaborative intelligence, combining AI experts, domain specialists and human judgement, is seen as essential for responsible adoption.
Universities are encouraged to treat students as primary stakeholders, adapting curricula to integrate AI responsibly and avoid obsolescence. Competency-based, values-driven learning models are being promoted to prepare students to question, shape and lead through digital transformation.
Experts highlighted that modern communication is co-produced by humans, machines and algorithms. Designing AI to augment human agency rather than replace it ensures a balance between technology and human decision-making across education and industry.