This is actually a very good and creative idea, but it doesn't account for the cost of the computational capacity (specifically the chips). If chips become commoditized in the future, your solution could work, but not in the current chip environment.
Because amortised capital cost > operational (power) cost
It's cheaper to build a 1 GW baseload power plant to sit beside your existing GPUs than to buy another 1 GW worth of GPUs to place on the other side of the planet. Also, latency.
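Here's a back-of-the-envelope version of that comparison. All the inputs below (GPU price, lifetime, power draw, electricity rate) are illustrative assumptions, not sourced figures:

    # Rough check of "amortised capital cost > power cost".
    # All inputs are assumed, illustrative values.
    GPU_PRICE_USD = 30_000          # assumed price of one datacenter GPU
    AMORTISATION_YEARS = 4          # assumed useful life
    GPU_POWER_KW = 0.7              # assumed draw under load (~700 W)
    POWER_USD_PER_KWH = 0.05        # assumed industrial electricity rate

    hours = AMORTISATION_YEARS * 365 * 24
    capital_per_hour = GPU_PRICE_USD / hours           # ~$0.86/h
    power_per_hour = GPU_POWER_KW * POWER_USD_PER_KWH  # ~$0.035/h

    print(f"capital ${capital_per_hour:.3f}/h vs power ${power_per_hour:.3f}/h")
    print(f"ratio ~{capital_per_hour / power_per_hour:.0f}x")

Under those assumptions the amortised hardware costs roughly 24x the electricity, so leaving GPUs idle half the day to chase free solar roughly doubles the dominant cost term.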
Also, a large-scale rollout of nuclear would be cheaper than solar regardless. Most of the cost of recent nuclear builds is learning cost: we build them so infrequently that people retire or move on before the next one is started, so we have to relearn how to build them and redevelop the supply chains from scratch each time.
Unless, of course, it's the Vogtle plant or a similar Western build, which cost about $17 billion per GW and took 10 years to construct. In that case, the price is comparable once you factor in the cost of money.
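To put a number on the cost of money: assume the $17B/GW is spent evenly over the 10-year build and carries a 7% annual cost of capital (both assumed figures, with spending at the start of each year). Each year's spend accrues interest until the plant comes online:

    # Illustrative financing cost for a slow build; all inputs assumed.
    OVERNIGHT_COST_BN = 17.0   # overnight cost per GW, from the comment above
    BUILD_YEARS = 10
    RATE = 0.07                # assumed annual cost of capital

    annual_spend = OVERNIGHT_COST_BN / BUILD_YEARS
    capitalised = sum(
        annual_spend * (1 + RATE) ** (BUILD_YEARS - year)
        for year in range(BUILD_YEARS)  # year-start spending assumed
    )
    print(f"~${capitalised:.1f}B effective cost per GW")  # ~$25.1B

So a decade of construction turns $17B into roughly $25B per GW before the first kWh is sold.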
Why not? Just keep excess computational capacity in many different regions and compute wherever the sun is shining brightly at that moment.
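For what it's worth, the scheduling side of this is the easy part. A minimal sketch, assuming you only need to route work to the region closest to local solar noon (region names and longitudes here are made up):

    from datetime import datetime, timezone

    # Hypothetical regions mapped to longitude in degrees east.
    REGIONS = {"us-west": -120.0, "eu-central": 10.0, "ap-southeast": 105.0}

    def solar_score(longitude: float, now: datetime) -> float:
        """Crude sunlight proxy: 1.0 at local solar noon, 0.0 at midnight."""
        local_hour = (now.hour + now.minute / 60 + longitude / 15) % 24
        return 1 - abs(local_hour - 12) / 12

    def pick_region(now: datetime) -> str:
        # Send work to whichever region currently has the most sun.
        return max(REGIONS, key=lambda r: solar_score(REGIONS[r], now))

    print(pick_region(datetime.now(timezone.utc)))

The hard part isn't the dispatch logic; it's that, per the replies above, the idle GPUs in the dark regions still cost money.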