Computing is consuming a large and ever-growing fraction of the world's energy capacity.
Now, we all know the dangers of extrapolation. Still, this trend tells us that something has to change drastically: either the rate at which computing power grows will slow dramatically, or we will be compelled to find a much more energy-efficient computational approach, or some intermediate situation will develop. (Note: getting rid of cryptocurrencies sure wouldn't hurt, as they are incredibly energy-hungry and IMO have virtually zero positive contributions to the world, but that would just slow the timeline.)
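To see why even a cautious extrapolation is alarming, here is a toy calculation. The starting share (~5% of world electricity) and growth rate (~10%/yr) are illustrative assumptions, not figures from the post; the point is only how quickly an exponentially growing share exhausts the whole pie.

```python
# Toy extrapolation: if computing's share of world electricity grows
# exponentially, when does it reach 100%? All numbers below are
# illustrative assumptions, not measured data.

def years_until_share(initial_share, annual_growth, target_share=1.0):
    """Years until an exponentially growing share reaches target_share."""
    share = initial_share
    years = 0
    while share < target_share:
        share *= (1 + annual_growth)
        years += 1
    return years

# Assume computing uses ~5% of electricity today and grows ~10% per year:
print(years_until_share(0.05, 0.10))  # → 32
```

With those (made-up) inputs, computing would nominally need *all* of the world's electricity in about three decades, which is the sense in which "something is going to change drastically."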
I've written before about neuromorphic computing as one approach to this problem. Looking at neural nets as an architectural model is not crazy - your brain consumes about 12 W of power continuously, but it is far better at certain tasks (e.g. identifying cat pictures) than much more power-hungry setups. Here is a nice article from Quanta on this, referencing a recent Nature paper. Any big change will likely require the adoption of new materials and therefore new manufacturing processes. Just something to bear in mind when people ask why anyone is studying the physics of electronic materials.
4 comments:
No, the solution is to produce more energy using nuclear, creating extreme energy abundance.
I agree that we need more nuclear power, but look at the exponentials. Unless we start building a new 2GWe plant every few days, the problem is still there.
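A quick sanity check of the "new 2 GWe plant every few days" remark, with loudly hypothetical inputs (the current computing demand and growth rate below are assumptions for illustration, not numbers from this thread):

```python
# Rough check of the "new 2 GWe plant every few days" claim.
# current_demand_gw and annual_growth are illustrative assumptions.

def days_per_plant(current_demand_gw, annual_growth, plant_gw=2.0):
    """Days between new plants needed to cover one year's demand growth."""
    added_gw = current_demand_gw * annual_growth  # extra GW needed this year
    plants_per_year = added_gw / plant_gw
    return 365.0 / plants_per_year

# Suppose computing draws ~500 GW worldwide and grows ~25% per year:
print(round(days_per_plant(500, 0.25), 1))  # → 5.8
```

Under those assumptions you would indeed need a new 2 GWe plant roughly every six days, and because the demand is exponential, the interval keeps shrinking each year.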
Considering the amount of work I do every day using a computer, it doesn't seem unnatural to me that computers take a large portion of energy consumption. I don't really understand what the issue is here. If this prediction means that in the future more and more tasks will be done on computers, and more and more people in the world will have access to them, then I think that's fine.
Of course, reducing energy consumption is good, but I think if we somehow achieve that, the energy consumed by computers will only increase, because people will try to use more energy in a given amount of time (power) to do even more things.
The issue is, we use electricity for many things besides computing, and if computing's share grows so quickly that we are forced to decide between, e.g., computing and refrigeration, there will be problems. Clearly something will give. You are right, though, that if computing were energetically very cheap, we would end up doing more of it.