At present (or near-future) levels of technology, as far as I can tell, this idea makes no sense. I will talk about the physics reasons here, though there are also pragmatic economic reasons why this seems crazy. I've written before that I think some of the AI/data center evangelists are falling victim to magical thinking, because they come from the software world and don't, in their heart of hearts, appreciate that there are actual hardware constraints on things like chip manufacturing and energy production.
Others have written about this - see here, for example. The biggest physics challenges with this idea (beyond lofting millions of kg of cargo into orbit) are:
- While the cosmic microwave background is cold, cooling things in space is difficult, because vacuum is an excellent thermal insulator. On the ground, you can use conduction and convection to get rid of waste heat. In space, your only option (beyond throwing mass overboard, which is not readily replenishable) is radiative cooling. The key physics here is the Stefan-Boltzmann law, which is a triumph of statistical physics (and one of my favorite derivations to discuss in class - you combine the Planck result for the energy density of a "gas" of photons in thermal equilibrium at some temperature \(T\) with a basic kinetic-theory-of-gases result for the flux of particles out of a small hole). It tells you that the best you can ever do is that of an ideal black body: the total power radiated away is proportional to the area of the radiator and to \(T^{4}\), with fundamental constants making up the proportionality constant and zero adjustable parameters.
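For reference, that black-body result can be written out explicitly; the proportionality constant really is built entirely from fundamental constants (\(k_{B}\), \(h\), and \(c\)):
\[
P = \sigma A T^{4}, \qquad \sigma = \frac{2\pi^{5} k_{B}^{4}}{15\, h^{3} c^{2}} \approx 5.67 \times 10^{-8}\ \mathrm{W\,m^{-2}\,K^{-4}},
\]
where \(A\) is the radiator area and \(T\) its absolute temperature.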
Remember, data centers right now consume enormous amounts of power (and cooling water). While you can use heat pumps to try to get the radiators up to well above the operating temperatures of the electronics, that adds mass and wastes more power, and realistically the radiator temperature will top out somewhere below 1000 K. An ideal black body radiator at 1000 K puts out about 57 kW per square meter, and you probably need to get rid of tens of megawatts, necessitating hundreds to thousands of square meters of radiator area (a quick numerical check is sketched below the figure). There are clever ideas on how to try to do this. For example, in the liquid droplet radiator concept, you would spray a stream of hot droplets out into space, capitalizing on their large specific surface area. Of course, you'd need to recapture the cooled droplets, and the hot liquid needs to have a sufficiently low vapor pressure that you don't lose a lot of material. Still, as far as I am aware, to date no one has actually deployed a large-scale (10 kW-level, let alone MW-level) droplet radiator in space.
A liquid droplet radiator, from this excellent site
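To make those numbers concrete, here is a minimal back-of-the-envelope sketch in Python. The 1000 K radiator temperature is the practical ceiling quoted above; the 50 MW heat load and unit emissivity are my illustrative assumptions, not figures from anyone's actual design.

```python
# Back-of-the-envelope radiator sizing using the Stefan-Boltzmann law.
# Assumptions (illustrative): ideal black body (emissivity = 1), radiator
# at 1000 K, 50 MW of waste heat rejected purely by radiation to cold space.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def blackbody_flux(T_kelvin, emissivity=1.0):
    """Radiated power per unit area (W/m^2) for a gray body at temperature T."""
    return emissivity * SIGMA * T_kelvin**4

T_radiator = 1000.0   # K, roughly the practical upper limit discussed above
P_waste = 50e6        # W, assumed heat load (tens of megawatts)

flux = blackbody_flux(T_radiator)   # ~5.7e4 W/m^2, i.e. ~57 kW per square meter
area = P_waste / flux               # ~880 m^2 of ideal radiator

print(f"Flux at {T_radiator:.0f} K: {flux/1e3:.0f} kW/m^2")
print(f"Area needed for {P_waste/1e6:.0f} MW: {area:.0f} m^2")
```

Real hardware does worse than this ideal-case floor: emissivity below one, radiators that see the Sun and the Earth part of the time, and the mass of pumps and plumbing all push the required area (and launch mass) upward.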
- High-end computational hardware is vulnerable to radiation damage. There are no rad-hard GPUs. Low Earth orbit is a pretty serious radiation environment, with the flux of high-energy cosmic rays quite a bit higher than on the ground. While there are tests going on, and astronauts are going to bring smartphones on the next Artemis mission, it's a rough environment. Putting many thousands to millions of GPUs and huge quantities of memory in a harsh environment where they cannot be readily accessed or serviced seems unwise. (There are also serious questions of vulnerability to attack. Setting off a small nuclear warhead in LEO would inject energetic electrons into the lower radiation belts and would be a huge mess.)
I think we will be faaaaaaar better off in the long run if we take a fraction of the money that people want to invest in space-based data centers, and instead plow those resources into developing energy-efficient computing. Musk has popularized the engineering sentiment "The best part is no part". The best way to solve the problem of supplying and radiating away many GW of power for data centers is to make data centers that don't consume many GW of power.