New TranslateGemma models support 55 languages efficiently
The new model suite gives researchers and developers open tools to expand language coverage and build custom translation models.
A new suite of open translation models, TranslateGemma, has been launched, bringing advanced multilingual capabilities to users worldwide. Built on the Gemma 3 architecture, the models support 55 languages and come in 4B, 12B, and 27B parameter sizes.
The release aims to make high-quality translation accessible across devices without compromising efficiency.
TranslateGemma delivers notable performance gains, with the 12B model surpassing the 27B Gemma 3 baseline on WMT24++ benchmarks. The models achieve higher accuracy while requiring fewer parameters, enabling lower-latency translation.
The 4B model also performs on par with larger models, making it ideal for mobile deployment.
Training combines supervised fine-tuning on diverse parallel datasets with reinforcement learning guided by automatic translation-quality metrics. TranslateGemma performs well in both high- and low-resource languages and can accurately translate text that appears within images.
Designed for flexible deployment, the models scale from mobile devices and consumer laptops to cloud environments. Researchers and developers can use TranslateGemma to build customised translation solutions and improve coverage for low-resource languages.
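For developers who want to experiment, loading an open Gemma-family model typically follows the standard Hugging Face transformers workflow. The sketch below is illustrative only: the checkpoint name `google/translategemma-4b-it` and the prompt wording are assumptions, so consult the official model card for the exact identifiers and prompt template.

```python
# A minimal sketch of running a TranslateGemma checkpoint with Hugging Face
# transformers. The model id and prompt format below are assumptions; check
# the official model card for the documented values.
from transformers import pipeline

translator = pipeline(
    "text-generation",
    model="google/translategemma-4b-it",  # hypothetical checkpoint name
    device_map="auto",                    # place weights on GPU if available
)

# Instruction-tuned Gemma models accept chat-style messages; the translation
# instruction here is illustrative, not the documented prompt template.
messages = [
    {
        "role": "user",
        "content": "Translate the following English text to French: "
                   "The weather is lovely today.",
    }
]

result = translator(messages, max_new_tokens=128)
# The pipeline returns the full conversation; the last message is the reply.
print(result[0]["generated_text"][-1]["content"])
```

Because the checkpoints come in 4B, 12B, and 27B sizes, the same code can target different hardware budgets simply by swapping the model id.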
