Tuesday, November 22, 2022

The need for energy-efficient computing

[Figure: Computing is consuming a large and ever-growing fraction of the world's energy capacity.]
I've seen the essential data in this figure several times over the last few months, and it has convinced me that the need for energy-efficient computing hardware is genuinely pressing.  This is from a 2020 report by the Semiconductor Research Corporation.  It argues that if computing needs continue to grow at the present rate, then by the early 2030s something like 10% of the world's total energy production (and therefore something like 40% of the world's electricity production) will be tied up in computing hardware.  (ZIPs = \(10^{21}\) instructions per second)
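To make the extrapolation concrete, here is a minimal back-of-envelope sketch in Python. Every number in it is my own illustrative assumption, not a value from the SRC report, chosen so the crossover lands in the early 2030s as the figure suggests.

    # Back-of-envelope: when does computing hit 10% of world energy production?
    # All parameters are illustrative assumptions, not from the SRC report.
    share_2020 = 0.03        # assumed: computing's share of world energy in 2020
    compute_growth = 1.12    # assumed: computing energy demand grows ~12%/year
    energy_growth = 1.01     # assumed: world energy production grows ~1%/year

    share, year = share_2020, 2020
    while share < 0.10:
        share *= compute_growth / energy_growth
        year += 1
    print(f"Computing reaches 10% of world energy around {year} ({share:.1%})")

The point is not the precise year; with any plausible parameters, exponential demand measured against roughly flat supply crosses a fixed threshold within a decade or two.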

Now, we all know the dangers of extrapolation.  Still, this trend tells us that something is going to change drastically - either the rate at which computing power grows will slow dramatically, or we will be compelled to find a much more energy-efficient computational approach, or some intermediate situation will develop.  (Note: getting rid of cryptocurrencies sure wouldn't hurt, as they are incredibly energy-hungry and IMO make virtually zero positive contribution to the world, but that just slows the timeline.)

I've written before about neuromorphic computing as one approach to this problem.  Looking at neural nets as an architectural model is not crazy - your brain consumes about 12 W of power continuously, but it is far better at certain tasks (e.g. identifying cat pictures) than much more power-hungry setups.  Here is a nice article from Quanta on this, referencing a recent Nature paper.  Any big change will likely require the adoption of new materials and therefore new manufacturing processes.  Just something to bear in mind when people ask why anyone is studying the physics of electronic materials.
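For a sense of why the neuromorphic comparison is tempting, here is a rough energy-per-operation estimate. The synapse count, firing rate, and CMOS energy are commonly quoted order-of-magnitude figures that I'm assuming for illustration; they are not taken from the Quanta article or the Nature paper.

    # Order-of-magnitude energy per elementary operation: brain vs. digital logic.
    # All numbers are rough, commonly quoted estimates assumed for illustration.
    brain_power_w = 12.0         # continuous draw cited above
    synapses = 1e14              # assumed: ~10^14 synapses
    events_per_synapse_hz = 10   # assumed: ~10 synaptic events/s on average

    synaptic_ops = synapses * events_per_synapse_hz   # ~1e15 ops/s
    brain_j_per_op = brain_power_w / synaptic_ops     # ~1e-14 J (~10 fJ)
    cmos_j_per_op = 1e-12        # assumed: ~1 pJ/op for efficient digital logic,
                                 # often far more once memory traffic is included
    print(f"brain: {brain_j_per_op:.0e} J/op, CMOS: {cmos_j_per_op:.0e} J/op, "
          f"ratio ~{cmos_j_per_op / brain_j_per_op:.0f}x")

Even if these assumptions are off by an order of magnitude, the gap suggests real headroom for hardware that computes more like a brain.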

4 comments:

  1. Anonymous (4:48 PM)

    No, the solution is to produce more energy using nuclear, creating extreme energy abundance.

  2. I agree that we need more nuclear power, but look at the exponentials. Unless we start building a new 2 GWe plant every few days, the problem is still there.
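
    A quick sanity check on that arithmetic, with numbers that are my own rough assumptions: if computing draws about 1 TW on average near the projected crossover and demand keeps growing ~12%/year, covering just the annual growth already takes dozens of plants.

        # Sanity check: how often is a new 2 GWe plant needed just to cover
        # growth in computing demand?  Assumed numbers, illustration only.
        computing_tw = 1.0   # assumed: computing's average draw near the crossover
        growth = 0.12        # assumed: ~12%/year demand growth
        plant_gwe = 2.0      # plant size from the comment above

        added_gw = computing_tw * 1000 * growth   # ~120 GW of new demand per year
        plants = added_gw / plant_gwe             # ~60 plants/yr (capacity factor ignored)
        print(f"~{plants:.0f} plants/year, i.e. one every ~{365 / plants:.0f} days")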

  3. Considering the amount of work I do every day using a computer, it doesn't feel unnatural to me that computers take a large portion of energy consumption. I don't really understand what the issue is here. If this prediction means that in the future more and more tasks will be done on computers and more and more people in the world will have access to more computers, then I think it's fine.

    Of course, reducing energy consumption is good, but I think if we somehow achieve that, the energy consumption of computers will only increase, because people will try to use more energy in a given amount of time (power) to do even more things.

    Replies
    1. The issue is that we use electricity for many things besides computing, and if computing’s share grows so quickly that we are forced to decide between, e.g., computing and refrigeration, there will be problems. Clearly something will give. You are right, though, that if computing were energetically very cheap, we would end up doing more of it.
