
crazygringo
That doesn't match my experience at all. Maybe it's something to do with what your prompts are asking for or the way you're passing translations? Or the size of chunks being translated?

I have been astounded at the sophistication of LLM translation, and haven't encountered a single false-friend example ever. Maybe it depends a lot on which models you're using? Or it thinks you're trying to have a conversation that code-switches mid-sentence, which is something LLMs can do if you ask for it?
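
For what it's worth, here's a rough sketch of what I mean by "the way you're passing translations": chunk the text and feed each chunk's translation back in as context for the next one, so terminology and register stay consistent. The OpenAI Python client, the model name, and the prompt wording here are purely illustrative, not a claim about how anyone's pipeline actually works.

    # Sketch: chunked translation with the previous chunk's translation
    # passed back as context. Model and prompts are illustrative only.
    from openai import OpenAI

    client = OpenAI()

    def translate_chunks(chunks, target_lang="German"):
        translations = []
        prev = ""  # last translated chunk, fed back for continuity
        for chunk in chunks:
            messages = [
                {"role": "system",
                 "content": (f"Translate into {target_lang}. Prefer natural, "
                             f"idiomatic phrasing over literal renderings; "
                             f"never calque English idioms.")},
            ]
            if prev:
                messages.append(
                    {"role": "user",
                     "content": f"Previously translated passage, for continuity:\n{prev}"})
            messages.append({"role": "user", "content": f"Translate:\n{chunk}"})
            resp = client.chat.completions.create(model="gpt-4o", messages=messages)
            prev = resp.choices[0].message.content
            translations.append(prev)
        return translations

Chunk size matters too: too small and the model loses the thread of the argument, too large and it starts paraphrasing rather than translating.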


f38zf5vdt
I'm using o3 and Gemini 2.5 Pro, paying for the high-tier subscriptions. The complaints I get are from native speakers -- editors and end consumers. The LLMs tend to overfit to the English language: they sometimes make up idioms that don't exist, use false-friend words (especially verbs), translate English idioms literally, and so on. I've translated several book-length texts now and I've seen it all.
