
Well, it just means the model was trained to follow instructions written that way. Since the results hold up, the model must have learned to handle them.

There isn't much research on what's actually going on here, mainly because nobody has access to the weights of the really good models.


