AI translation
The AI Translation feature lets you automatically generate translations for the open TS file using a local LLM via Ollama.
To use AI Translation you must first install Ollama and pull at least one model, for example:
ollama pull gpt-oss:20b
ollama pull 7shi/llama-translate:8b-q4_K_M
Then start the Ollama server if it is not already running:
ollama serve
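To confirm that the server is reachable and see which models it has installed, you can query Ollama's /api/tags endpoint. The sketch below parses a sample response offline (the two model entries are example data, abbreviated to the one field relevant here):

```python
import json

# Abbreviated sample of a GET http://127.0.0.1:11434/api/tags response body
# (example data; the real response carries more fields per model).
sample_response = json.dumps({
    "models": [
        {"name": "gpt-oss:20b"},
        {"name": "7shi/llama-translate:8b-q4_K_M"},
    ]
})

def installed_models(api_tags_json: str) -> list[str]:
    """Extract the model names from an /api/tags response body."""
    return [m["name"] for m in json.loads(api_tags_json)["models"]]

print(installed_models(sample_response))
# → ['gpt-oss:20b', '7shi/llama-translate:8b-q4_K_M']
```

These names are what appears in the Model drop-down described below.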
In Linguist, choose Translation > AI Translation to open the AI Translation dialog:
The dialog provides:
- Ollama Server: the REST endpoint where Ollama listens (default http://127.0.0.1:11434).
- Model: a drop-down of locally installed models.
- File: the TS file to translate.
- Filter (optional): limit to strings in a specific group (context or label).
- Translate button: start the AI translation.
- Apply Translations button: apply the translated items into the TS file.
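Under the hood, Linguist drives the selected model through Ollama's REST API. Its actual prompt is internal, but a hypothetical request for one source string could be built against Ollama's /api/chat endpoint like this (the prompt wording and function name are illustrative assumptions):

```python
import json

def build_translate_request(model: str, source_text: str, target_lang: str) -> str:
    """Build a JSON body for POST /api/chat. The prompt below is
    illustrative, not Linguist's actual internal prompt."""
    return json.dumps({
        "model": model,
        "stream": False,  # request one complete reply instead of a token stream
        "messages": [
            {"role": "user",
             "content": f"Translate the following UI string into {target_lang}: {source_text}"}
        ],
    })

body = build_translate_request("gpt-oss:20b", "Open File", "German")
print(body)
```

Posting such a body to http://127.0.0.1:11434/api/chat returns the model's reply as JSON; Linguist handles this exchange for every string that matches the optional filter.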
During translation, progress messages appear in the status bar and in the Translation Log. When translation completes, you can review the translated texts in the log. Clicking Apply Translations inserts the AI-generated translations into the TS file.
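A TS file is plain XML, so applying a translation amounts to filling in the translation element of each message. The minimal sketch below (context name and strings are invented for illustration) shows one finished and one unfinished entry and lists the strings still awaiting a translation:

```python
import xml.etree.ElementTree as ET

# A minimal TS document (invented example): one translated message and
# one still marked "unfinished", as Qt Linguist stores them.
ts_xml = """<TS version="2.1" language="de_DE">
  <context>
    <name>MainWindow</name>
    <message>
      <source>Open File</source>
      <translation>Datei öffnen</translation>
    </message>
    <message>
      <source>Save As</source>
      <translation type="unfinished"></translation>
    </message>
  </context>
</TS>"""

root = ET.fromstring(ts_xml)
# Collect source strings whose translation is still marked unfinished.
unfinished = [m.find("source").text
              for m in root.iter("message")
              if m.find("translation").get("type") == "unfinished"]
print(unfinished)  # → ['Save As']
```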
Note: We suggest using one of OpenAI's open-weight models (e.g., gpt-oss:20b) or an LLM trained specifically for translation (e.g., 7shi/llama-translate:8b-q4_K_M). Feel free to try other models to find the best balance of speed, quality, and resource usage.
© 2025 The Qt Company Ltd. Documentation contributions included herein are the copyrights of their respective owners. The documentation provided herein is licensed under the terms of the GNU Free Documentation License version 1.3 as published by the Free Software Foundation. Qt and respective logos are trademarks of The Qt Company Ltd. in Finland and/or other countries worldwide. All other trademarks are property of their respective owners.