
I would say a lot would need to be added. Given the same input, the Tetris game will respond exactly the same way each time. There is no awareness, no learning, no decisions made; it is purely a 100% predictable process.
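To make that concrete, here is a minimal sketch (in Python, with a toy game state rather than real Tetris rules; the names and simplified logic are illustrative assumptions) of a game modeled as a pure state-transition function. The point is just that identical inputs always produce identical outcomes:

    # Toy game modeled as a pure state-transition function (not real Tetris).
    # Output depends only on the inputs, so the process is fully deterministic:
    # no memory across runs, no learning, no choice point that could differ.

    def step(state, user_input):
        """Advance the toy game by one tick; a pure function of its arguments."""
        x, score = state
        if user_input == "left":
            x -= 1
        elif user_input == "right":
            x += 1
        elif user_input == "drop":
            score += 1
        return (x, score)

    def run(inputs):
        state = (0, 0)  # fixed starting state
        for user_input in inputs:
            state = step(state, user_input)
        return state

    inputs = ["left", "drop", "right", "right", "drop"]
    assert run(inputs) == run(inputs)  # identical inputs, identical outcome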

The Oxford Living Dictionary defines consciousness as "[t]he state of being aware of and responsive to one's surroundings", "[a] person's awareness or perception of something", and "[t]he fact of awareness by the mind of itself and the world".



That Oxford definition highlights why work such as Nagel's is needed. It can plausibly be argued that LLMs or other AI systems can qualify on all those counts, but many (most?) people wouldn't consider them to have conscious experience.

Characterizing that distinction is surprisingly tricky. "What is it like to be..." is one way to do that. David Chalmers' article about "the hard problem of consciousness" is another: https://consc.net/papers/facing.pdf





