Hacker News

> LLMs follow instructions.

No, they don't: they generate a statistically plausible continuation of a sequence of tokens.
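That claim can be made concrete with a toy sketch. The snippet below is not a real LLM; the "model" is a hypothetical hand-written probability table, but the loop has the same shape as autoregressive generation: condition on the tokens so far, sample the next one from a distribution, repeat.

```python
import random

# Hypothetical toy "model": maps a context (tuple of tokens) to a
# probability distribution over the next token. A real LLM computes
# this distribution with a neural network instead of a lookup table.
model = {
    ("the",): {"cat": 0.6, "dog": 0.4},
    ("the", "cat"): {"sat": 0.9, "ran": 0.1},
    ("the", "dog"): {"ran": 1.0},
}

def next_token(context):
    # Fall back to an end-of-sequence token for unseen contexts.
    dist = model.get(tuple(context), {"<eos>": 1.0})
    tokens, probs = zip(*dist.items())
    return random.choices(tokens, weights=probs)[0]

tokens = ["the"]
while tokens[-1] != "<eos>" and len(tokens) < 5:
    tokens.append(next_token(tokens))

print(" ".join(tokens))
```

Nothing in the loop "follows" an instruction; each step just samples whatever token is plausible given the context, which is the commenter's point.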
