
If a car dealership had a parrot in their showroom named Rupert, and Rupert learned to repeat "that's a deal!", no judge would entertain the idea that someone hearing Rupert repeat the phrase amounted to any legally binding promise. It's just a bird.

It's a pet, a novelty, entertainment for the bored kids who are waiting on daddy to finish buying his mid-life crisis Corvette. It's not a company representative.

> If someone claims to be representing the company, and the company knows, and the interaction is reasonable,

A chatbot isn't "someone" though.

> Try convincing a judge that the above was on purpose, by a 62 year old farmer that's never heard of AI.

I don't think you know how judges think. That's ok. You should be proud of your lack of proximity to judges; it means you didn't do anything exceedingly stupid in your life. But it also makes you a very poor predictor of how they go about making judgments.



If the company is leading the customer to believe the chatbot is a person (e.g. by giving it a common name and not advertising that it is not a human), there could at least be a reasonable case for false advertising.


In this case the showroom had a sign saying "Please talk to Rupert the sales parrot for pricing."


That wouldn't change anything. The judge would rule that's clearly a joke, and the plaintiff would still lose.


> If a car dealership had a parrot in their showroom named Rupert, and Rupert learned to repeat "that's a deal!", no judge would entertain the idea that because someone heard Rupert repeat the phrase that it amounted to any legally binding promise. It's just a bird.

If the car dealership trained a parrot named Rupert and deployed it to the sales floor as a salesperson representing the dealership, however, that's a different situation.

> It's not a company representative.

But this chat bot is posturing itself as one. "Chevrolet of Watson Chat Team," its handle reads, and I'm assuming that Chevrolet of Watson is a dealership.

And you know, if their chat bot can be prompted to say it's down for selling an $80,000 truck for a buck, frankly, they should be held to that. That's ridiculously shitty engineering to be deployed to production, and maybe these companies would actually give a damn about their front-facing software quality if they were held accountable for its boneheaded actions.


"bots" can make legally binding trades on Wall Street, and have been for decades. Why should car dealers be held to a different standard? IMO whether or not you "present" it as a person, this is software deployed by the company, and any screwups are on them. If your grocery store's pricing gun is mis adjusted and the cans of soup are marked down by a dollar, they are obligated to honor that "incorrect" price. This is much the same, with the word "AI" thrown in as mystification.


And if a machine hurts an employee on a production line, the company is liable for their medical bills. Just because you've automated part of your business doesn't mean you get to wash your hands of the consequences of that automation with a shrug and a "well, the robot made a mistake" when it goes wrong. Yeah, it did. Probably wanna fix that. In the meantime, bring that truck around, Donnie, here's your dollar.


> they are obligated to honor that "incorrect" price.

Clearly false. If the store owner sees the incorrect price, he can say "that's incorrect, it costs more... do you still want it?". If you call the cops, they'll say "fuck off, this is civil, leave me alone or I'll make up a charge to arrest you with". And if you sue because the can of off-brand macaroni and hot dog snippets was mismarked, the judge will award the other guy legal costs because you were filing frivolous lawsuits.

> "bots" can make legally binding trades on Wall Street, and have been for decades.

Both parties want the trades to go through. No one contests a trade... even if their bot screwed up and lost them money, even if the courts would agree to reverse or remedy it, because contesting it shuts down bot trading, which costs them even more than just eating the one-time screwup.

This isn't analogous. The dealership doesn't want its chatbot to be able to make sales, not even good ones, so shutting that down doesn't concern them. It will be contested. And given that this wasn't the intent of the creator/operator of the chatbot, that letting the "sale" stand wouldn't be conducive to business in general, that there's no real injury to remedy, that buyers are supposed to exercise some minimum amount of sense in their dealings, and that they weren't relying on that promise, and even if they were, doing so caused them no harm...

The judge would likely excoriate any lawyer who brought that lawsuit to court. They tend not to put up with stupid shit.


I can assure you that, at least in the US, you can ask for a manager and start mentioning "attorney general" and you will get whatever price is on the cans of soup.


Perhaps true, but irrelevant. You're no longer talking about the point in question, but whether some other social interaction is likely.


> And you know, if their chat bot can be prompted to say it's down for selling an $80,000 truck for a buck, frankly, they should be held to that.

Your "should" is just your personal feelings. When it went to court, the judge would agree with me, because for one he's not supposed to have any personal feelings in the matter, and for two they've ruled repeatedly in the past that such frivolous notions as yours don't hold up... thus both precedence and rationale.

The courts simply aren't a mechanism for you to enforce your views on how important website engineering is.



