Hacker News new | past | comments | ask | show | jobs | submit login

I have a rather specialized interest in and obscure subject but one which has a physical aspect pretty much any person can relate to/reason about, and pretty much every time I try to "discuss" the specifics of it w/ an LLM, it tells me things which are blatantly false, or otherwise attempts to carry on a conversation in a way which no sane human being would.





Did you specifically prompt it to pretend to be a person with limbs and all?

No, I don't have to do that with a person, why would I need to do that with an LLM?

The LLM is not designed to pass the turing test. An application that suitably prompts the LLM can. It's like asking why can't I drive the nail with the handle of the hammer. That's not what it's for.



Consider applying for YC's Fall 2025 batch! Applications are open till Aug 4

Guidelines | FAQ | Lists | API | Security | Legal | Apply to YC | Contact

Search: