I have a rather specialized interest in an obscure subject, but one with a physical aspect pretty much any person can relate to and reason about. Pretty much every time I try to "discuss" the specifics of it with an LLM, it tells me things which are blatantly false, or otherwise carries on the conversation in a way no sane human being would.
The LLM is not designed to pass the Turing test. An application that suitably prompts the LLM can. It's like asking why I can't drive the nail with the handle of the hammer. That's not what it's for.