Qwen3-Next strengthens Alibaba’s position in global AI race
New Qwen3-Next release boosts performance and efficiency, marking Alibaba’s latest move to rival US AI developers.

Alibaba has open-sourced its latest AI model, Qwen3-Next, claiming it is ten times more powerful than its predecessor and cheaper to train.
Developed by Alibaba Cloud, the 80-billion-parameter model reportedly performs on par with the company’s flagship Qwen3-235B-A22B while remaining optimised for deployment on consumer-grade hardware.
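Since the weights are released openly, trying the model should follow the usual open-weights workflow. The snippet below is a minimal sketch using Hugging Face transformers; the hub ID, precision, and hardware fit are assumptions for illustration and are not confirmed by the article.

```python
# Illustrative sketch only: loading an open-weights Qwen3-Next checkpoint.
# The hub ID below is hypothetical; check the official Qwen organisation page.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen3-Next-80B-A3B-Instruct"  # hypothetical model ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half-precision weights to reduce memory
    device_map="auto",           # spread layers across available GPUs/CPU
)

messages = [{"role": "user", "content": "Explain mixture-of-experts models briefly."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```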
Qwen3-Next introduces innovations such as hybrid attention for long-text processing, a high-sparsity mixture-of-experts (MoE) architecture, and multi-token prediction strategies. These upgrades boost both efficiency and model stability during training.
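The article does not go into implementation detail, but the idea behind “high sparsity” in an MoE layer can be shown with a short PyTorch sketch: each token is routed to only a small top-k subset of expert networks, so only a fraction of the total parameters is active per token. The expert count, sizes, and routing here are invented for the example; this is not Alibaba’s code.

```python
# Rough illustration of high-sparsity mixture-of-experts routing (not Qwen3-Next's
# actual implementation; all sizes and the routing scheme are example values).
import torch
import torch.nn as nn
import torch.nn.functional as F


class SparseMoELayer(nn.Module):
    def __init__(self, d_model: int, d_ff: int, num_experts: int = 64, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, num_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        tokens = x.reshape(-1, x.shape[-1])               # (batch*seq, d_model)
        logits = self.router(tokens)                      # score every expert
        top_vals, top_idx = logits.topk(self.top_k, -1)   # keep only top_k experts per token
        gates = F.softmax(top_vals, dim=-1)               # mixing weights for chosen experts
        out = torch.zeros_like(tokens)
        for k in range(self.top_k):
            for e in top_idx[:, k].unique():              # run each selected expert once
                mask = top_idx[:, k] == e
                out[mask] += gates[mask, k].unsqueeze(-1) * self.experts[int(e)](tokens[mask])
        return out.reshape(x.shape)


if __name__ == "__main__":
    layer = SparseMoELayer(d_model=128, d_ff=256)
    print(layer(torch.randn(2, 16, 128)).shape)  # torch.Size([2, 16, 128])
```

With 64 experts and top_k=2, only about 3% of the expert parameters are exercised per token, which is the general mechanism that lets a large total parameter count coexist with a much smaller per-token compute cost.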
Alibaba also released Qwen3-Next-80B-A3B-Thinking, a reasoning-focused model that outperformed its own Qwen3-32B-Thinking and Google’s Gemini-2.5-Flash-Thinking in benchmark tests.
The release strengthens Alibaba’s position as a major player in open-source AI, following last week’s preview of its 1-trillion-parameter Qwen3-Max model, which ranked sixth on UC Berkeley’s LMArena leaderboard.