Tuesday, November 22, 2022

The need for energy-efficient computing

Computing is consuming a large and ever-growing fraction of the world's energy capacity.
I've seen the essential data in this figure several times over the last few months, and it has convinced me that the need for energy-efficient computing hardware is genuinely pressing.  This is from a report by the Semiconductor Research Corporation from 2020.  It argues that if computing needs continue to grow at the present rate, then by the early 2030s something like 10% of all of the world's energy production (and therefore something like 40% of the world's electricity production) will be tied up in computing hardware.  (ZIPs = \(10^{21}\) instructions per second)

Now, we all know the dangers of extrapolation.  Still, this trend tells us that something is going to change drastically - either the rate at which computing power grows will slow dramatically, or we will be compelled to find a much more energy-efficient computational approach, or some intermediate situation will develop.  (Note:  getting rid of cryptocurrencies sure wouldn't hurt, as they are incredibly energy-hungry and IMO contribute virtually nothing positive to the world, but that would just slow the timeline.)

I've written before about neuromorphic computing as one approach to this problem.  Looking to the brain as an architectural model is not crazy - your brain consumes about 20 W of power continuously, yet it is far better at certain tasks (e.g. identifying cat pictures) than much more power-hungry setups.  Here is a nice article from Quanta on this, referencing a recent Nature paper.  Any big change will likely require the adoption of new materials and therefore new manufacturing processes.  Just something to bear in mind when people ask why anyone is studying the physics of electronic materials.

Saturday, November 12, 2022

Bob Curl - it is possible to be successful and also a good person

I went to a memorial service today at Rice for my late colleague Bob Curl, who died this past summer, and it was a really nice event.  I met Bob almost immediately upon my arrival at Rice back in 2000 (though I’d heard about him from my thesis advisor, who’d met him at the Nobel festivities in Stockholm in 1996).  As everyone who interacted with him for any length of time will tell you, he was simultaneously extremely smart and amazingly nice.  He was very welcoming to me, even though I was a new assistant professor not even in his department.  I’d see him at informal weekly lunch gatherings of some folks from what was then called the Rice Quantum Institute, and he was always interested in learning about what his colleagues were working on - he had a deep curiosity and an uncanny ability to ask insightful questions.  He was generous with his time and always concerned about students and the well-being of the university community.

A refrain that came up over and over at the service was that Bob listened.  He talked with you, not at you, whether you were an undergrad, a grad student, a postdoc, a professor, or a staff member.  I didn’t know him nearly as well as others, but in 22 years I never heard him say a cross word or saw him treat anyone with anything less than respect.  

His insatiable curiosity also came up repeatedly.  He kept learning new topics, right up to the end, and actually coauthored papers on economics, like this one.  By all accounts he was scientifically careful and rigorous.

Bob was a great example of how it is possible to be successful as an academic and a scientist while still being a nice person.  It’s important to be reminded of that sometimes.

Saturday, November 05, 2022

The 2022 Welch Conference

The last couple of weeks have been very full.  

One event was the annual Welch Foundation conference (program here).  The program chair for this one was W. E. Moerner, expert (and Nobel Laureate) on single-molecule spectroscopy, and it was really a great meeting.  I'm not just saying that because it's the first one in several years that was well aligned to my own research.  

The talks were all very good, and I was particularly impressed by the presentation by Yoav Shechtman, who spoke about the use of machine learning in super-resolution microscopy.  It had me convinced that machine learning (ML) can, under the right circumstances, basically be magic.   The key topic is discussed in this paper.  Some flavors of super-resolution microscopy rely on the fact that the fluorescence comes from individual, hopefully well-separated single emitters.  Diffraction limits the size of a spot, but if you know that the light is coming from one emitter, you can use statistics to pin down the x-y centroid position of that spot to much higher precision (a sketch of the idea is below).  That can be improved by ML methods, but there's more.  There are ways to get z information as well.  Xiaowei Zhuang's group had this paper in 2008 that's been cited 2000+ times, using a clever idea:  with a cylindrical lens in the beam path, a spot from an emitter above the focal plane is elongated along one axis, while a spot from an emitter below the focal plane is elongated along the orthogonal axis.  In the new work, Shechtman's folks have gone further, putting a phase mask into the path that produces more informative distortions along those lines.  They use ML trained on a detailed simulation of their microscope data to get improved z precision.  Moreover, they can then use ML to design an optimal version of that phase mask, to get even better precision.  Very impressive.

The other talk that really stuck out was the Welch award talk by Carolyn Bertozzi, one of this year's Nobel Laureates in Chemistry.  She gave a great presentation about the history of bioorthogonal chemistry, and it was genuinely inspiring, especially given the clinical treatment possibilities it's opened up.  Even though she must've given some version of that talk hundreds of times, her passion and excitement about the actual chemistry (e.g. see, these bonds here are really strained, so we know that the reaction has to happen here) was just palpable.