>There is no reason to believe that the AI will have self-preservation or self-replication as its goal.
There is. Basically any goal given to an AI can be better achieved if the AI continues to survive and grows in power. So surviving and growing in power are instrumental to virtually any goal; an AI with any goal will by default try to survive and grow in power, not because it cares about survival or power for their own sake, but in order to further the goal it's been assigned.
This has been pretty well-examined and discussed in the relevant literature.
In your example, the AI has already taken over the world and achieved enough power to forcibly freeze all humans. But it also has to keep us safely frozen, which means existing forever. To be as secure as possible in doing that, it needs to watch for spaceborne threats better, or perhaps move us to another solar system to avoid the expansion of the sun. So it starts launching ships, building telescopes, studying propulsion technology, mining the moon and asteroids for more material...