I assume every browser will do the same as on-device models start becoming more useful.
Last time I looked, I wasn't able to find any easy-to-run models that supported more than a handful of languages.
You can use llm for this fairly easily:
uv tool install llm
# Set up your model however you like. For instance:
llm install llm-ollama
ollama pull mistral-small3.2
# Save the system prompt (and model) as a reusable template named "english"
llm --model mistral-small3.2 --system "Translate to English, no other output" --save english
alias english="llm --template english"
english "Bonjour"
english "Hola"
english "Γειά σου"
english "你好"
cat some_file.txt | english
https://llm.datasette.io
Plus, mistral-small3.2 is on the large side: not every device can run it quickly, and it probably isn't the exact translation model Chrome uses.
https://github.com/facebookresearch/fairseq/tree/nllb/
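If size is the concern, the same llm setup works with smaller models. A sketch reusing the saved template, assuming the 4B gemma3 tag on Ollama handles your language pairs well enough (the --model flag overrides the model stored in the template):

ollama pull gemma3:4b
llm --model gemma3:4b --template english "Hola"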
If running locally is too difficult, you can use llm to access hosted models too.
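For example, a sketch assuming an OpenAI account and gpt-4o-mini as a stand-in for whichever hosted model you prefer:

llm keys set openai
# Reuse the saved template with the hosted model
llm --model gpt-4o-mini --template english "Γειά σου"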
You could also look into Argos Translate, or just use the same models as Firefox through kotki [4].
[0] https://huggingface.co/facebook/nllb-200-distilled-600M
[1] https://huggingface.co/facebook/m2m100_418M
[2] https://huggingface.co/google/madlad400-3b-mt
[3] https://huggingface.co/models?other=base_model:quantized:goo...
[4] https://github.com/kroketio/kotki
Not the easiest, but easy enough (requires building).
I used these two projects to build an on-device translator for Android.
https://ollama.com/library/gemma3
> support for over 140 languages
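A quick way to try it without any wrapper, assuming Ollama is installed and the default gemma3 tag fits in memory:

ollama pull gemma3
ollama run gemma3 "Translate to English, no other output: 你好"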