But technically you can do that only because you recognize the pattern: the pattern (sequence) is there, you were taught that it's a pattern, and you were taught how to recognize it. Today's publicly available LLMs are taught different patterns, and they are also constrained by how they are built.

Maybe there's something in reflection and self-reference that has to be "taught" to LLMs (or unblocked, if it's already achieved somehow), and once that becomes a thing they will be "cognizant" in the way humans feel about their own cognition. Or maybe the way we wire LLMs today simply doesn't allow for that. Who knows.

Of course humans are wired differently, but the point I'm trying to make is that it's pattern recognition all the way down, for humans, LLMs, and whatnot.


