
> When LLMs came out, they started producing text that sounded like it was written by a native speaker (in major languages).

While the general syntax of the language now seems mostly correct, LLMs still don't actually know anything about those languages and keep mistranslating words due to their inherently insane design around English. A whole lot of concepts don't even exist in English, so these translation oracles can simply never render them successfully.

If I read a few minutes of LLM-translated text, there are always a couple of such errors.

I notice younger people don't catch these errors because their language skills are weaker, and the LLMs reinforce their incorrect understanding.

I don't think this problem will go away as long as we keep pushing this inferior tech; instead, the languages will devolve to "fix" it.

Languages will morph into a 1-to-1 mapping of English, and all the cultural nuances will be lost to time.
