Hacker News

I didn't mean you were forcing them to approve it.

I meant that, over time, your system will be right more and more often, not by virtue of genuinely understanding the situations, but because it has more training data and stumbles onto correlations that happen to be correct.

Once it's mostly-kinda-reliable, your users will, over time, stop paying attention to it and just start trusting it, even if they don't mean to, because that's how humans work.

And then it becomes a rubber stamp, and decisions start going through that no human really thought through.


