
Sunday, February 08, 2026

Data centers in space make no sense to me

There seems to be a huge push lately in the tech world for the idea of placing data centers in space.  This is not just coming from Musk via the merger of SpaceX and xAI.  Google has an effort along these lines.  NVIDIA is thinking about it.  TED talks are being given by startup people in San Francisco on this topic, so you know we've reached some well-defined hype level.  Somehow the idea has enough traction that even the PRC is leaning in this direction.  The arguments seem to be that (1) there is abundant solar power in space; (2) environmental impact on the earth will be less, with no competition for local electricity, water, or real estate; (3) space is "cold", so cooling these things should be doable; (4) it's cool and sounds very sci-fi/high frontier.

At present (or near-future) levels of technology, as far as I can tell this idea makes no sense.  I will talk about physics reasons here, though there are also pragmatic economic reasons why this seems crazy.  I've written before that I think some of the AI/data center evangelists are falling victim to magical thinking, because they come from the software world and don't in their heart of hearts appreciate that there are actual hardware constraints on things like chip manufacturing and energy production.  

Others have written about this - see here for example.  The biggest physics challenges with this idea (beyond lofting millions of kg of cargo into orbit) are:
  • While the cosmic microwave background is cold, cooling things in space is difficult, because vacuum is an excellent thermal insulator.  On the ground, you can use conduction and convection to get rid of waste heat.  In space, your only option (beyond throwing mass overboard, which is not readily replenishable) is radiative cooling.  The key physics here is the Stefan-Boltzmann law, which is a triumph of statistical physics (and one of my favorite derivations to discuss in class - you combine the Planck result for the energy density of a "gas" of photons in thermal equilibrium at some temperature \(T\) with a basic kinetic theory of gases result for the flux of particles out of a small hole).  It tells you that the best you can ever do is an ideal black body, for which the total power radiated away is \(P = \sigma A T^{4}\), proportional to the radiator area \(A\) and to \(T^{4}\), with the constant \(\sigma\) built entirely from fundamental constants - zero adjustable parameters.
[Image: a liquid droplet radiator, from this excellent site]
Remember, data centers right now consume enormous amounts of power (and cooling water).  While you can use heat pumps to try to get the radiators up to well above the operating temperature of the electronics, that increases mass and waste power, and realistically the radiator temperature is capped somewhere below 1000 K.  An ideal black body radiator at 1000 K puts out about 57 kW per square meter, and you probably need to get rid of tens of megawatts, necessitating hundreds to thousands of square meters of radiator area.  There are clever ideas on how to try to do this.  For example, in the liquid droplet radiator, you could spray a bunch of hot droplets out into space, capitalizing on their large surface-to-volume ratio.  Of course, you'd need to recapture the cooled droplets, and the hot liquid needs to have a sufficiently low vapor pressure that you don't lose a lot of material.  Still, as far as I am aware, to date no one has actually deployed a large-scale (tens of kW, let alone MW, level) droplet radiator in space.

  • High-end computational hardware is vulnerable to radiation damage.  There are no rad-hard GPUs.  Low earth orbit is a pretty serious radiation environment, with fluxes of some high-energy particles quite a bit higher than on the ground.  While tests are going on, and astronauts are going to bring smartphones on the next Artemis mission, it's a rough environment for commodity electronics.  Putting many thousands to millions of GPUs and huge quantities of memory in a harsh environment where they cannot be readily accessed or serviced seems unwise.  (There are also serious questions of vulnerability to attack.  Setting off a small nuclear warhead in LEO injects energetic electrons into the lower radiation belts and would be a huge mess.)
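The radiator numbers in the first bullet are easy to check against the Stefan-Boltzmann law.  Here's a quick back-of-the-envelope sketch in Python; the 50 MW load, the 350 K electronics temperature, and the ideal Carnot heat pump are round-number assumptions of mine, not figures from any actual design:

```python
# Sanity-checking radiator sizes with the Stefan-Boltzmann law,
# P = sigma * A * T^4 for an ideal black body.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiated_flux(temp_k):
    """Power per unit area (W/m^2) radiated by an ideal black body at temp_k kelvin."""
    return SIGMA * temp_k**4

def radiator_area(power_w, temp_k):
    """Minimum ideal black-body radiator area (m^2) to reject power_w watts."""
    return power_w / radiated_flux(temp_k)

load = 50e6  # assumed 50 MW of waste heat - a round number, not a real design figure

flux_1000k = radiated_flux(1000.0)        # ~57 kW/m^2, matching the figure in the text
area_1000k = radiator_area(load, 1000.0)  # ~880 m^2 at a very optimistic 1000 K
area_400k = radiator_area(load, 400.0)    # ~34,000 m^2 at a more modest 400 K

# The heat-pump penalty: lifting heat from 350 K electronics to a 1000 K
# radiator costs work even in the ideal Carnot limit, W = Q*(Th/Tc - 1),
# and that work also has to be radiated away.
t_cold, t_hot = 350.0, 1000.0
pump_work = load * (t_hot / t_cold - 1.0)  # ~93 MW of extra input power
total_rejected = load + pump_work          # ~143 MW now hits the radiator

print(f"{flux_1000k/1e3:.1f} kW/m^2, {area_1000k:.0f} m^2 at 1000 K, "
      f"{area_400k:.0f} m^2 at 400 K, {total_rejected/1e6:.0f} MW total to reject")
```

Either way you slice it, the radiator is a structure measured in hundreds to thousands of square meters, and running it hot via heat pumps nearly triples the heat you have to reject in this idealized example.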

I think we will be faaaaaaar better off in the long run if we take a fraction of the money that people want to invest in space-based data centers, and instead plow those resources into developing energy-efficient computing.  Musk has popularized the engineering sentiment "The best part is no part".  The best way to solve the problem of supplying and radiating away many GW of power for data centers is to make data centers that don't consume many GW of power.
