
Someone somewhere has probably put this in better words already, but misaligned and hallucinating LLMs, or rather the coders who built them and get to review the interactions, learn a lot about how to 'properly' distrust users and humanity at large ...

