
All current LLMs openly make simple mistakes (miscounting the letters in a common word, for instance) that are completely incompatible with true "reasoning" in the sense any human would have used that term years ago.

I feel like I'm taking crazy pills sometimes.



If you showed the raw output of, say, QwQ-32B to any engineer from 10 years ago, I suspect they would be astonished to hear that it doesn't count as "true reasoning".


Genuine question: what does "reasoning" mean to you?


How do you assess whether someone's reasoning counts as "true" reasoning?



