That Oxford definition highlights why work such as Nagel's is needed. It can plausibly be argued that LLMs or other AI systems can qualify on all those counts, but many (most?) people wouldn't consider them to have conscious experience.

Characterizing that distinction is surprisingly tricky. Nagel's "What is it like to be..." framing is one way to do it; David Chalmers' article on "the hard problem of consciousness" is another: https://consc.net/papers/facing.pdf
