
Fun experiment, but it isn't as much of a gotcha as people here think. They could have verbally tricked a human customer service agent into promising them the car for $1 in the same way, and the end result would be the same: the agent (whether human or bot) doesn't have the authority to make that promise, so you walk away with nothing. I doubt the company is sweating over this hack.

Now if Chevrolet hooks their actual sales process to an LLM and has it sign contracts on their behalf... that'll be a sight to behold.
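
If they ever do, the sane design is to keep pricing authority out of the model entirely: the LLM can only call a tool, and the binding number comes from a server-side lookup, never from whatever text the model generates. A rough Python sketch (every name here is hypothetical, not any vendor's real API, and the figure is made up):

    # The model can *request* a quote, but the authoritative price
    # comes from this server-side table, not from the model's text.
    MSRP = {"2024 Chevy Tahoe": 58_195}  # illustrative figure only

    def quote_price(model_name: str) -> str:
        """Tool exposed to the LLM; returns the only price the bot may state."""
        price = MSRP.get(model_name)
        if price is None:
            return "No quote available for that model."
        return f"{model_name} lists at ${price:,}. Quotes are non-binding until signed."

    def handle_tool_call(name: str, args: dict) -> str:
        # Any "$1 deal" the model improvises on its own never passes
        # through this path, so it never reaches a contract pipeline.
        if name == "quote_price":
            return quote_price(args["model_name"])
        raise ValueError(f"unknown tool: {name}")

That way a jailbroken chatbot can at worst say silly things, not commit the company to them.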



> They could have verbally tricked a human customer service agent into promising them the car for $1 in the same way

When's the last time you spoke to a human?


When was the last time you spoke to a car salesman?


To add, it's not just a question of who has authority. If you trick someone into a contract, even someone who does have the authority to sign it, that contract is likely voidable on grounds of fraud.



