
If you're talking about some hypothetical sci-fi future "AI" that isn't just log-likelihood optimization, then such a fantastical thing most likely wouldn't be running on NVIDIA GPUs with CUDA.

This hardware is only good for current-generation "AI".
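For context, a minimal sketch of what "log-likelihood optimization" means here, assuming a toy PyTorch setup (the model, data, and hyperparameters are placeholders, not anything specific): training amounts to minimizing the negative log-likelihood of the labels (cross-entropy) by gradient descent, which is exactly the dense matrix-multiply workload GPUs are built for.

    import torch
    import torch.nn as nn

    # Toy classifier: the "AI" is just parameters fit by maximizing the
    # log-likelihood of the training labels (equivalently, minimizing
    # cross-entropy), typically on a GPU via CUDA.
    model = nn.Linear(16, 4)                 # placeholder model
    opt = torch.optim.SGD(model.parameters(), lr=0.1)

    x = torch.randn(32, 16)                  # placeholder input batch
    y = torch.randint(0, 4, (32,))           # placeholder labels

    logits = model(x)
    loss = nn.functional.cross_entropy(logits, y)  # mean negative log-likelihood
    opt.zero_grad()
    loss.backward()                           # gradients of the log-likelihood
    opt.step()                                # one optimization step

Everything above reduces to matrix multiplications and elementwise ops, which is why this hardware maps so well onto current-generation models and not much else.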


