LLMs are a nice UI to existing information.

They're like a card shark showing you how they can pull the red queen out from behind your ear after making it vanish from the table.

When that happens, you can enjoy the trick and wonder how it's done, but your response shouldn't be that we need to ban magic. What's going on isn't magic. Banning "magic" might stop the card shark (if that's the goal), but it fundamentally misunderstands what's going on.
