AI translation
The AI Translation feature lets you automatically generate translations for the open TS file using large language models (LLMs). You can use either a local LLM server (such as Ollama, LM Studio, or llama.cpp) or cloud-based APIs that support the OpenAI-compatible REST protocol (such as OpenAI, Groq, or Anthropic).
Setting up a local LLM server
To use AI Translation with a local server, install one of the following and download at least one model:
- Ollama - Easy to use, manages models automatically
- LM Studio - GUI application with model browser
- llama.cpp - Lightweight, runs GGUF models directly
For Ollama, pull a model and start the server from the command line:
ollama pull qwen3:14b
ollama serve
For LM Studio, download models through the application's interface and start the local server:
lms server start
For llama.cpp, you can either use one of the built-in model presets:
llama-server --fim-qwen-7b-default
Or download a GGUF model file and start the server manually:
llama-server -m model.gguf --port 8080
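Before opening the dialog, you can optionally confirm that the server is reachable by requesting its model list. This is only a sketch and assumes the default ports used above; LM Studio shows its own server address (typically http://localhost:1234) in its server view:
# Ollama's native REST API lists the installed models
curl http://localhost:11434/api/tags
# OpenAI-compatible servers such as llama.cpp and LM Studio expose /v1/models
curl http://localhost:8080/v1/models
If the command returns a JSON list of models, the server is ready; the base address (for example http://localhost:11434) is what goes into the Server URL field later.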
Using cloud APIs
To use cloud-based translation services, select OpenAI Compatible as the API type, enter the provider's API endpoint URL, and provide your API key. The API key field accepts authentication tokens for services like OpenAI, Anthropic, Groq, and other OpenAI-compatible providers.
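If you are unsure whether a provider is compatible, it should accept the standard chat-completions request sketched below. The endpoint, model name, and OPENAI_API_KEY environment variable are examples for OpenAI; other providers use the same request shape with their own base URL, key, and model names:
curl https://api.openai.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{"model": "gpt-4o-mini", "messages": [{"role": "user", "content": "Translate \"Open File\" into German."}]}'
In the OpenAI-compatible protocol the key travels in the Authorization header, as shown here; this is the value that belongs in the dialog's API Key field.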
Using the AI Translation dialog
In Linguist, choose Tools > AI Translation to open the AI Translation dialog.

The dialog provides:
- API Type: choose between Ollama for local Ollama servers or OpenAI Compatible for LM Studio, llama.cpp, or cloud APIs.
- Server URL: the REST endpoint where the server listens (default http://localhost:11434 for Ollama, http://localhost:8080 for OpenAI Compatible).
- API Key: authentication key for cloud APIs (optional for local servers).
- Model: drop-down list of available models.
- Context: optional application context to improve translation accuracy (e.g., "medical software", "video game", "financial application").
- File: the TS file to translate.
- Filter: limit translation to specific groups (contexts or labels).
- Translate: start the AI translation.
- Apply Translations: apply the translated items to the TS file.
During translation, progress messages appear in the Translation Log. When complete, review the translated texts in the log. Click Apply Translations to insert the AI-generated translations into the TS file.
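Entries that do not yet have an accepted translation are stored in the TS file with a type="unfinished" marker, so a quick way to see how much work remains after applying is to count them from the command line (the file name here is a placeholder for your own TS file):
grep -c 'type="unfinished"' myapp_de.ts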
Recommended models
The following models are recommended for translation tasks, balancing quality, speed, and resource usage. These models can be found on:
- Ollama - search by model name (e.g., ollama pull qwen3:14b)
- LM Studio - search in the model browser
- Hugging Face - download GGUF files for llama.cpp (see the example after the table)
| Model | Size | Notes |
|---|---|---|
| Mistral Small 24B | 14 GB | High translation quality with strong multilingual support. Requires >16 GB VRAM for optimal performance. |
| Qwen3 14B | 9 GB | Balance of quality and resource usage. Supports 100+ languages. |
| Qwen3 30B | 19 GB | High-quality translations. Uses MoE architecture for efficient inference. |
| Qwen2.5 14B | 9 GB | Strong multilingual support for 29+ languages including CJK languages. |
| Gemma 3 12B | 8 GB | Supports 140+ languages. Good for resource-constrained systems. |
| 7shi/llama-translate 8B | 5 GB | Specialized translation model for English, French, Chinese, and Japanese. Lightweight option for limited hardware. Available on Ollama only. |
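As an example of the Hugging Face route mentioned above, a GGUF build can be downloaded with the huggingface-cli tool and served with llama.cpp. The repository and file names below are placeholders; substitute a real GGUF repository for one of the models in the table:
# Download one quantized file from a GGUF repository (placeholder names)
huggingface-cli download <user>/<model>-GGUF <model>-Q4_K_M.gguf --local-dir .
llama-server -m <model>-Q4_K_M.gguf --port 8080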
For systems with limited resources, smaller variants like Qwen3 8B (5 GB) or Qwen2.5 7B (5 GB) provide reasonable translation quality while requiring less memory.
Note: Translation quality varies by language pair and model. Test different models to find the best combination of speed, quality, and resource usage for your specific translation needs.
© 2025 The Qt Company Ltd. Documentation contributions included herein are the copyrights of their respective owners. The documentation provided herein is licensed under the terms of the GNU Free Documentation License version 1.3 as published by the Free Software Foundation. Qt and respective logos are trademarks of The Qt Company Ltd. in Finland and/or other countries worldwide. All other trademarks are property of their respective owners.