
> Or one that parses it into a 7 :)

If it's known and acceptable that LLMs can hallucinate arguments to an API, then I don't see how this isn't perfectly acceptable behavior either.



