Hacker News

Reminds me of this passage from Homo Deus

One popular scenario imagines a corporation designing the first artificial super-intelligence, and giving it an innocent test such as calculating pi.

Before anyone realises what is happening, the AI takes over the planet, eliminates the human race, launches a conquest campaign to the end of the galaxy, and transforms the entire known universe into a giant super-computer that for billions upon billions of years calculates pi ever more accurately.

After all, this is the divine mission its creator gave it.



Maybe it's a good thing humans don't know the Meaning of Life


Eh. We've got a close enough approximation. Sci-fi-esque example below.

The more things humans collected, the safer their genes were, and the bigger the dopamine hit. Many humans took this to its logical end, collecting things at the expense of all else, including the safety of other humans. This led to severe poverty among all humans not wired to collect more things, which decreased their reproductive fitness.

Over time, the human race in aggregate continued to become more effective at collecting things, until they had wiped out all other life and all viable material was turned to drugs. The end.



