Not really, at least at current goals, population size, etc. Even with the very high energy expenditure of, say, LOTS of AI hardware to run the "Skynet" we're driving ourselves into, we're talking on the order of 30,000 TWh/year (human electricity generation today, per Wikipedia).
Imagining a future with ~3% growth: say fusion is deployed and everything goes electric in the next few years (not happening that fast, though), with AI data centers everywhere running individual-level AI (think the personal OS-level assistant from the movie "Her") for every human, and we reach my out-of-my-buttocks figure of 500 TWh/year for that AI in 10 years' time, which is crazy ... well, even that would not "boil the world"!
The Sun delivers ~170,000 TWh per year. So 500 TWh would not be that significant, and would fall within the Sun's year-to-year delivery fluctuations.
The problem with energy generation today is that it releases greenhouse gases, and those gases are disrupting the planet's energy balance - especially how Earth gets rid of the massive energy it receives from the Sun. We need to restore the balance between what comes in and what goes back out - fusion can help tackle that specific problem, so it's beneficial overall even if it eventually adds a fractional percentage to the overall planetary energy bill.
I picture fusion as a complementary source, not the only one; once/if deployed, it would help close some of the key loopholes that prevent solar (and other renewables) from being deployed at 100%.
It delivers 170,000 TWh per hour (i.e. 170,000 TW)!
π × (6378 km)² × 1300 W/m² ≈ 166 PW
It's a ludicrous amount of energy - roughly the entire annual human energy usage is delivered every 70 minutes. The whole problem of AGW is that even a tiny modulation, in absolute terms, of the things that affect the steady state (e.g. greenhouse gases) can have substantial effects. But it's also, presumably, going to be key to fixing the problem, if we do fix it.
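For anyone who wants to sanity-check those numbers, here's a quick back-of-envelope sketch in Python. The 1300 W/m² solar constant matches the figure above; the ~180,000 TWh/year for total human primary energy use is my own rough assumption:

    import math

    R_EARTH_M = 6378e3                 # equatorial radius in metres
    SOLAR_CONSTANT = 1300.0            # W/m^2, as used above
    HUMAN_USE_TWH_PER_YEAR = 180_000   # rough global primary energy use (assumed)

    # The Earth intercepts sunlight over its cross-sectional disc, pi * R^2.
    intercepted_w = math.pi * R_EARTH_M**2 * SOLAR_CONSTANT
    print(f"Intercepted solar power: {intercepted_w / 1e15:.0f} PW")  # ~166 PW

    # How long does the Sun take to deliver one year of human energy use?
    human_use_wh = HUMAN_USE_TWH_PER_YEAR * 1e12   # TWh -> Wh
    minutes = human_use_wh / intercepted_w * 60    # Wh / W = hours -> minutes
    print(f"A year of human energy use arrives every ~{minutes:.0f} minutes")

With those inputs it comes out to ~65 minutes, i.e. roughly the 70 quoted above.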
Could that actually work? You'd have to expend energy to concentrate the heat into your power-generation system to power (I assume) a laser or similar emitter to beam the energy away. Could you make sure that the extra energy used to move the heat around, plus the inefficiencies in generating the laser's power, ends up included in the outgoing photons? This seems, perhaps naively, like the entropy is going the "wrong" way.
You could presumably radiate it to space by moving the heat to something that can "see" a clear sky, but you can make this happen naturally on a far larger scale by reducing the GHG content of the atmosphere and increasing the radiative efficiency of the entire planet's surface, as well as with various passive systems like cool roofs, albedo manipulation, and special materials that radiate at specific wavelengths.
Yes, it would require radiative surfaces with a view of the sky. Such systems are already in use, including surfaces that get much cooler than the ambient air even in direct sun on a warm day.
When you cool a building or a data center or whatever, you can pump that heat into a high-temperature fluid and send it to a sky radiator instead of an air-exchange radiator. So heat produced in processes could be moved to radiator assemblies and “beamed” into space (I probably should have said radiated).
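As a rough illustration of why pumping the heat into a hotter fluid helps: net radiated power scales with the fourth power of temperature (Stefan-Boltzmann). The emissivity and effective sky temperature below are illustrative assumptions, not measured values:

    # Net cooling power of a sky-facing radiator via the Stefan-Boltzmann law.
    SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

    def net_radiative_cooling(t_surface_k, t_sky_k, emissivity=0.95):
        """Net power (W/m^2) radiated by a surface with a clear view of the sky."""
        return emissivity * SIGMA * (t_surface_k**4 - t_sky_k**4)

    # A radiator fed by a warm coolant loop at 330 K, against an assumed
    # effective clear-sky temperature of ~270 K:
    print(f"{net_radiative_cooling(330, 270):.0f} W/m^2")  # ~350 W/m^2

Under the same assumptions, a near-ambient surface at 300 K nets only ~150 W/m², which is why moving the heat into a hotter loop first buys you so much.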
“Energy” is a colloquially ambiguous term. The better terms are available energy (exergy) and entropy.
The Earth radiates away almost exactly as much energy as it receives. It has to; otherwise it would boil. Our biosphere, however, extracts a lot of available energy from that flow: the Sun shines low-entropy radiation on the Earth, and the Earth radiates high-entropy radiation away.
Put another way, a universe that is homogeneous at 10 million degrees has plenty of energy, but zero useful energy, because there is no entropy gradient to exploit.
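To put rough numbers on "low-entropy in, high-entropy out": for thermal radiation the entropy flux is about (4/3)·P/T, so the same power carries far more entropy leaving at Earth's temperature than it did arriving at the Sun's. The absorbed-power figure below is an assumed round number:

    # Entropy flux of black-body radiation is roughly (4/3) * P / T.
    P_ABSORBED = 1.2e17  # W, solar power absorbed after albedo (rough assumption)
    T_SUN = 5778.0       # K, effective temperature of the solar photosphere
    T_EARTH = 255.0      # K, Earth's effective radiating temperature

    s_in = (4 / 3) * P_ABSORBED / T_SUN     # entropy arriving with sunlight, W/K
    s_out = (4 / 3) * P_ABSORBED / T_EARTH  # entropy leaving as infrared, W/K
    print(f"Outgoing/incoming entropy flux ratio: ~{s_out / s_in:.0f}x")  # ~23x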
You can stop worrying, because fusion energy from this kind of reactor will be anything but cheap. It will likely be more expensive than energy from current-generation fission power plants.