Lying requires intent to deceive.

If you ask me to remove whitespace from a string in Python and I mistakenly tell you to use ".trim()" (the Java method, a mistake I've made annoyingly often) instead of ".strip()", am I lying to you?

It's not a lie. It's just wrong.
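
Concretely, the mistake looks like this in Python (a quick sketch, using nothing beyond the two methods already mentioned):

    s = "  hello  "

    # The real Python method: removes leading/trailing whitespace.
    print(s.strip())    # -> hello

    # The Java habit: Python strings have no .trim(), so this raises.
    try:
        s.trim()
    except AttributeError as e:
        print(e)        # 'str' object has no attribute 'trim'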



You are correct that there is a difference between lying and making a mistake, however

> Lying requires intent to deceive

LLMs do have an intent to deceive, built in!

They have been built to never admit they don't know an answer, so they will invent answers even when the question rests on a faulty premise.

I agree that for a human mixing up ".trim()" and ".strip()" is an honest mistake

In the example I gave, you are asking for a function that does not exist. If the model invents one because it is designed to never say "you are wrong, that doesn't exist" or "I don't know the answer", that qualifies to me as "intent to deceive": it is built to invent something rather than give you a negative-sounding answer.
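
One way to make that concrete: you can check whether a suggested method actually exists before trusting it. A minimal Python sketch, reusing the ".trim()" example from upthread (hasattr is standard Python; nothing else is assumed):

    # Sanity-check a suggested method name against the str type --
    # here, the thread's invented ".trim()" versus the real ".strip()".
    for name in ("strip", "trim"):
        if hasattr(str, name):
            print(f"str.{name} exists")
        else:
            print(f"str.{name} does not exist")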


An LLM is not "just wrong" either. It's just bullshit.

The bullshitter doesn't care whether what they say is true or false, right or wrong. They just put out more bullshit.



