
But would GPT-4 actually check something it had not checked the first time? Remember, telling the truth is not a consideration for it (and probably isn't even modeled); it just says something that would typically be said in similar circumstances.


Only inasmuch as there's an element of randomness in the way GPT responds to a prompt - so you can re-run effectively the same prompt and get a different result, depending on the outcome of several hundred billion floating-point calculations with a random seed thrown in.
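
To make that concrete, here's a minimal sketch of the sampling step where the randomness enters, assuming the common temperature-based softmax sampling over next-token logits (the logits and vocabulary here are made up for illustration):

  import numpy as np

  def sample_next_token(logits, temperature=1.0, seed=None):
      # Sample one token id from a logits vector, the way a decoder-only
      # LLM picks its next token. The seed is the only source of variation:
      # same logits + same seed -> same token every time.
      rng = np.random.default_rng(seed)
      # Temperature rescales the logits before the softmax; higher values
      # flatten the distribution, making unlikely tokens more probable.
      scaled = np.asarray(logits, dtype=np.float64) / temperature
      scaled -= scaled.max()          # subtract max for numerical stability
      probs = np.exp(scaled)
      probs /= probs.sum()
      return rng.choice(len(probs), p=probs)

  # Hypothetical logits for the next token over a 4-word vocabulary.
  logits = [2.0, 1.5, 0.3, -1.0]
  print([sample_next_token(logits, temperature=0.8, seed=s) for s in range(5)])
  # Different seeds can yield different tokens, even though the "prompt"
  # (the logits) is identical on every run.

So a re-run isn't the model "checking again" - it's the same distribution being sampled with different random draws.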



