Because they are smart enough to realize that current LLM tech is nearing a dead end and cannot serve as full AGI without actual knowledge of the real world, even setting aside context and hallucination issues.


Most world models so far are based on transformers, no?



