
Intent does matter if you want to classify things as lies.

If someone told you it's Thursday when it's really Wednesday, we would not necessarily say they lied. We would say they were mistaken, if their intent was to tell you the correct day of the week. If they intended to mislead you, then we would say they lied.

So intent does matter. AI isn't lying; it intends to provide you with accurate information.




The AI doesn't intend anything. It produces, without intent, something that would be called lies if it came from a human. It produces the industrial-scale mass-produced equivalent of lies – it's effectively an automated lying machine.

Maybe we should call the output "synthetic lies" to distinguish it from the natural lies produced by humans?


There is actually an acknowledged term of art for this: "bullshit".

Summary from Wikipedia: https://en.m.wikipedia.org/wiki/Bullshit

> statements produced without particular concern for truth, clarity, or meaning, distinguishing "bullshit" from a deliberate, manipulative lie intended to subvert the truth

It's a perfect fit for how LLMs treat "truth": they don't know, so they can't care.


I’m imagining your comment read by George Carlin … if only he were still here to play with this. You know he would.


Elwood: What was I gonna do? Take away your only hope? Take away the very thing that kept you going in there? I took the liberty of bullshitting you.

Jake: You lied to me.

Elwood: Wasn't lies, it was just... bullshit.


AI doesn’t have “intent” at all.


> So intent does matter. AI isn't lying; it intends to provide you with accurate information.

Why are we making excuses for machines?



