Saturday, December 03, 2022

The wormhole kerfuffle, ER=EPR, and all that

I was busy trying to finish off a grant proposal and paper revisions this week and didn't have the time to react in real time to the PR onslaught surrounding the recent Nature paper by the Google team.  There are many places to get caught up on this, but the short version:

  • Using their Sycamore processor, the experimentalists implemented a small-scale version of the SYK (Sachdev-Ye-Kitaev) model.  This model has many interesting properties, including serving as a testbed for holography, in which a bulk system may be understood through the degrees of freedom on its boundary.  For an infinitely large SYK system, there is a duality to a 2D gravitational system.  So, a protocol for moving entanglement among the qubits that make up the SYK system is equivalent to having a traversable wormhole in that 2D gravitational system.  
  • The actual experiment is very cool.
  • The coverage in the press was extensive (Quanta, NY Times, e.g.).  There was a lot of controversy (see Peter Woit's blog for a summary, and Scott Aaronson for a good take) surrounding this, because there was some initial language usage that implied to a lay-person that the team had actually created a traversable wormhole.  Quanta revised their headline and qualified their language, to their credit.  
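For context, here is the schematic form of the model (my paraphrase of the standard definition, not taken from the paper): the SYK model describes \(N\) Majorana fermions \(\chi_i\) with random all-to-all four-body couplings,

\[ H_{\mathrm{SYK}} = \sum_{i<j<k<l} J_{ijkl}\, \chi_i \chi_j \chi_k \chi_l , \]

where the couplings \(J_{ijkl}\) are drawn independently from a Gaussian distribution whose variance scales like \(J^{2}/N^{3}\).  It is the large-\(N\), strong-coupling limit of this model that has the (approximate) dual description as 2D gravity.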
Rather than dogpiling on the media coverage, there are two main points at issue here that I think are worthy of discussion:
  1.  What do we mean when we say that we have experimentally implemented a model of a system?  When atomic physicists use ultracold fermionic atoms in a 2D optical lattice to realize the Hubbard model (like here and here), we say that they have made a Mott insulator.  That same model is thought to be a good description of the copper oxide superconductors.  However, no one would say that an atomic lattice actually is a copper oxide superconductor.  When is a model of a thing actually the thing itself?  This is at the heart of the whole topic of quantum simulation, and the issue comes up in classical systems as well.  My two cents:  If system A and system B are modeled extremely well by the same mathematics, that can give us real insights, but it doesn't mean that system A is system B.  Better language might be to say that system A is an analog of system B.  Physicists can be sloppy with language, and certainly a short, punchy, bold description is much more attention-getting to editors of all stripes (be they journal editors or journalism editors).  Still, it's better to be careful.  
  2. What do theorists like Lenny Susskind truly mean when they claim that entanglement is genuinely equivalent to wormholes?  This is summarized by the schematic equation ER = EPR, where ER = Einstein-Rosen wormhole and EPR = Einstein-Podolsky-Rosen entanglement.  I think I get the core intellectual idea that, in quantum gravity, spacetime itself may be emergent from underlying degrees of freedom that may be modeled as something like qubits; and that one can come up with fascinating thought experiments about what happens when dropping one member of an entangled pair of particles into the event horizon of a black hole.  That being said, as an experimentalist, the idea that any kind of quantum entanglement involves actual Planck-scale wormholes just seems bonkers.  It would imply that sending a photon through a nonlinear crystal and producing two lower-energy entangled photons actually creates a Planck-scale change in the topology of spacetime.  Perhaps someone in the comments can explain this to me.  Again, maybe this is me not understanding people who are being imprecise with their word choice.

Tuesday, November 22, 2022

The need for energy-efficient computing

Computing is consuming a large and ever-growing fraction of the world's energy capacity.
I've seen the essential data in this figure several times over the last few months, and it has convinced me that the need for energy-efficient computing hardware is genuinely pressing.  This is from a report by the Semiconductor Research Corporation from 2020.  It argues that if computing needs continue to grow at the present rate, then by the early 2030s something like 10% of all of the world's energy production (and therefore something like 40% of the world's electricity production) will be tied up in computing hardware.  (ZIPs = \(10^{21}\) instructions per second)

Now, we all know the dangers of extrapolation.  Still, this trend tells us that something is going to change drastically - either the rate at which computing power grows will slow dramatically, or we will be compelled to find a much more energy-efficient computational approach, or some intermediate situation will develop.  (Note:  getting rid of cryptocurrencies sure wouldn't hurt, as they are incredibly energy-hungry and IMO have virtually zero positive contributions to the world, but that just slows the timeline.)
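Just to illustrate how unforgiving that kind of compound growth is, here's a toy calculation.  The doubling time, production growth rate, and starting fraction below are placeholder numbers chosen for illustration, not figures from the SRC report.

```python
# Toy compound-growth extrapolation.  All numbers are illustrative
# placeholders, not data from the SRC report.

def years_until_fraction(start_fraction, compute_doubling_years=3.0,
                         production_growth=0.02, target=0.10):
    """Years until computing consumes `target` of energy production,
    if compute demand doubles every `compute_doubling_years` years
    while total production grows by `production_growth` per year."""
    frac = start_fraction
    years = 0
    while frac < target:
        frac *= 2 ** (1 / compute_doubling_years)  # demand growth
        frac /= 1 + production_growth              # production growth
        years += 1
    return years

print(years_until_fraction(0.01))  # starting from 1% of production
```

Even from a 1% starting point, a roughly three-year doubling time closes the gap to 10% in about a decade, which is the qualitative point of the report's extrapolation.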

I've written before about neuromorphic computing as one approach to this problem.  Looking at neural nets as an architectural model is not crazy - your brain consumes about 12 W of power continuously, but it is far better at certain tasks (e.g. identifying cat pictures) than much more power-hungry setups.  Here is a nice article from Quanta on this, referencing a recent Nature paper.  Any big change will likely require the adoption of new materials and therefore new manufacturing processes.  Just something to bear in mind when people ask why anyone is studying the physics of electronic materials.

Saturday, November 12, 2022

Bob Curl - it is possible to be successful and also a good person

I went to a memorial service today at Rice for my late colleague Bob Curl, who died this past summer, and it was a really nice event.  I met Bob almost immediately upon my arrival at Rice back in 2000 (though I’d heard about him from my thesis advisor, who’d met him at the Nobel festivities in Stockholm in 1996).  As everyone who interacted with him for any length of time will tell you, he was simultaneously extremely smart and amazingly nice.  He was very welcoming to me, even though I was a new assistant professor not even in his department.  I’d see him at informal weekly lunch gatherings of some folks from what was then called the Rice Quantum Institute, and he was always interested in learning about what his colleagues were working on - he had a deep curiosity and an uncanny ability to ask insightful questions.  He was generous with his time and always concerned about students and the well-being of the university community.

A refrain that came up over and over at the service was that Bob listened.  He talked with you, not at you, whether you were an undergrad, a grad student, a postdoc, a professor, or a staff member.  I didn’t know him nearly as well as others, but in 22 years I never heard him say a cross word or treat anyone with less than respect.  

His insatiable curiosity also came up repeatedly.  He kept learning new topics, right up to the end, and actually coauthored papers on economics, like this one.  By all accounts he was scientifically careful and rigorous.

Bob was a great example of how it is possible to be successful as an academic and a scientist while still being a nice person.  It’s important to be reminded of that sometimes.

Saturday, November 05, 2022

The 2022 Welch Conference

The last couple of weeks have been very full.  

One event was the annual Welch Foundation conference (program here).  The program chair for this one was W. E. Moerner, expert (and Nobel Laureate) on single-molecule spectroscopy, and it was really a great meeting.  I'm not just saying that because it's the first one in several years that was well aligned to my own research.  

The talks were all very good, and I was particularly impressed by the presentation by Yoav Shechtman, who spoke about the use of machine learning in super-resolution microscopy.  It had me convinced that machine learning (ML) can, under the right circumstances, basically be magic.   The key topic is discussed in this paper.  Some flavors of super-resolution microscopy rely on the idea that fluorescence is coming from individual, hopefully well-separated single emitters.  Diffraction limits the size of a spot, but if you know that the light is coming from one emitter, you can use statistics to figure out the x-y centroid position of that spot to much higher precision.  That can be improved by ML methods, but there's more.  There are ways to get z information as well.  Xiaowei Zhuang's group had this paper in 2008 that's been cited 2000+ times, using a clever idea:  with a cylindrical lens in the beam path, a spot from an emitter above the focal plane is distorted along one axis, while a spot from an emitter below the focal plane is distorted along the orthogonal axis.  In the new work, Shechtman's group has gone further, putting a phase mask into the path that produces more interesting distortions along those lines.  They use ML trained on detailed simulations of their microscope's output to get improved z precision.  Moreover, they can also use ML to design an optimal version of that phase mask, to get even better precision.  Very impressive.
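The statistical trick at the bottom of this (the classic localization idea, separate from the ML machinery) is easy to sketch.  The PSF width and photon count below are made-up illustrative numbers:

```python
import numpy as np

rng = np.random.default_rng(0)

sigma = 250.0    # diffraction-limited PSF width in nm (illustrative)
N = 10_000       # photons detected from one emitter (illustrative)

# Each detected photon lands at the true emitter position plus
# PSF-scale noise, so a single photon only localizes to ~sigma
true_x = 0.0
photon_x = rng.normal(true_x, sigma, size=N)

# The centroid of N photons localizes the emitter to ~ sigma/sqrt(N)
estimate = photon_x.mean()
predicted_precision = sigma / np.sqrt(N)   # 2.5 nm here

print(f"centroid: {estimate:.2f} nm, predicted precision: "
      f"{predicted_precision:.1f} nm")
```

The point is that the precision of the centroid is set by \(\sigma/\sqrt{N}\), not by the diffraction-limited \(\sigma\) itself, so collecting more photons buys you localization far below the diffraction limit.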

The other talk that really stuck out was the Welch award talk by Carolyn Bertozzi, one of this year's Nobel Laureates in Chemistry.  She gave a great presentation about the history of bioorthogonal chemistry, and it was genuinely inspiring, especially given the clinical treatment possibilities it's opened up.  Even though she must've given some version of that talk hundreds of times, her passion and excitement about the actual chemistry (e.g., "see, these bonds here are really strained, so we know that the reaction has to happen here") was just palpable.  

Wednesday, October 26, 2022

Rice University Academy of Fellows postdoc opportunity, 2023

As I have posted in previous years, Rice has a university-wide endowed honorific postdoctoral program called the Rice Academy of Fellows.   Like all such things, it's very competitive. The new application listing has gone live here with a deadline of January 4, 2023. Applicants have to have a faculty mentor, so if someone is interested in working with me on this, please contact me via email. We've got some fun, exciting stuff going on!

Sunday, October 16, 2022

Materials labs of the future + cost

The NSF Division of Materials Research has been soliciting input from the community about both the biggest outstanding problems in condensed matter and materials science, and the future of materials labs - what kind of infrastructure, training, etc. will be needed to address those big problems.  In thinking about this, I want to throw out a stretch idea.  

I think it would have transformative impact on materials research and workforce development if there were fabrication and characterization tools that offered great performance at far lower prices than currently possible.  I'd mentioned the idea of developing a super-cheap SEM a while ago. I definitely worry that we are approaching a funding situation where the separation between top universities and everyone else will continue to widen rapidly.  The model of a network of user facilities seems to be how things have been trending (e.g. go to Harvard and use their high-res TEM, if your institution can't afford one).  However, if we really want to move the needle on access and training for a large, highly diverse workforce, it would be incredible to find a way to bring more capabilities to the broadest sweep of universities.   Maybe it's worth thinking hard about what could be possible to radically reduce hardware costs for the suite of materials characterization techniques that would be most important.

Saturday, October 08, 2022

Getting light out of plasmonic tunnel junctions - the sequel

A couple of years ago I wrote about our work on "above threshold" light emission in planar metal tunnel junctions.  In that work, we showed that in a planar tunnel junction, you can apply a bias voltage \(V\) and get lots of photons out at energies \(\hbar \omega\) quite a bit greater than \(eV\).  In the high current regime, when there are strong local plasmon resonances, it is possible to drive, in steady state, part of the electronic distribution to very high effective electron temperatures, and then observe radiation from the recombination of those hot carriers.  One neat thing about this is that by analyzing the spectra, it is possible to back out the actual plasmon-modified density of photonic states for emission to the far-field, \(\rho(\hbar \omega)\), of a particular junction.   
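Schematically (my shorthand for the idea, not a derivation from the paper), the hot-carrier emission looks like thermal radiation from electrons at an effective temperature \(T_{\mathrm{eff}}\), weighted by the photonic density of states:

\[ I(\hbar \omega) \propto \rho(\hbar \omega)\, e^{-\hbar \omega / k_{\mathrm{B}} T_{\mathrm{eff}}} , \]

which is why fitting the measured spectra lets you back out \(\rho(\hbar \omega)\) for a given junction.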

In our new paper published this week, we have been able to take this quite a bit further.  In the low current regime, with weaker local plasmon resonances, the energy deposited by tunneling electrons diffuses away rapidly compared to the arrival rate of new carriers, so the kind of carrier heating described above isn't important.  Instead, it's been known for a while that the right way to think about light emission in that case is as a process connected to fluctuations (shot noise) in the tunneling current, as demonstrated very prettily here.  Within that mechanism, it should be possible to predict precisely what the emission spectrum should look like, given the tunneling conductance, the bias voltage, and \(\rho(\hbar \omega)\).  As shown in the figure, we can now test this, and it works very well.  Take a planar aluminum tunnel junction made by electromigration, and in the high conductance/high current limit, use the hot carrier emission to determine \(\rho(\hbar \omega)\).  Then gently migrate the junction further to lower the conductance and fall out of the hot carrier emission regime.  Using the measured conductance and the previously found \(\rho(\hbar \omega)\), the theory (dashed lines in the right panel) agrees extremely well with the measured spectra (colored data points), with only two adjustable parameters (an overall prefactor, and a slightly elevated electronic temperature that accounts for the rounding of the emission at the \(eV\) cutoff, indicated by the arrows in the right panel).  
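Here is a minimal numerical sketch of that mechanism.  The noise factor below is the schematic finite-temperature form \((eV - \hbar\omega)/(1 - e^{-(eV - \hbar\omega)/k_{\mathrm{B}}T})\), with a flat \(\rho(\hbar\omega)\) thrown in for illustration; it captures the linear fall-off toward the \(eV\) cutoff and the thermal rounding there, but it is not the paper's exact fitting function.

```python
import numpy as np

def emission_spectrum(hbar_omega, eV, kT, rho):
    """Schematic shot-noise-driven emission intensity (arb. units):
    rho(hw) times a noise factor that is ~ (eV - hw) well below the
    cutoff and is thermally rounded near hw = eV.  Energies in eV."""
    x = eV - hbar_omega
    # x / (1 - exp(-x/kT)) -> x for x >> kT, -> kT as x -> 0,
    # and decays exponentially for x < 0 (photons above the cutoff)
    with np.errstate(over="ignore", invalid="ignore"):
        factor = np.where(np.isclose(x / kT, 0.0), kT,
                          x / (1.0 - np.exp(-x / kT)))
    return rho(hbar_omega) * factor

# Flat photonic density of states, just for illustration
flat_rho = lambda e: np.ones_like(e)

hw = np.linspace(0.8, 1.6, 9)            # photon energies (eV)
spec = emission_spectrum(hw, eV=1.2, kT=0.03, rho=flat_rho)
# spec falls roughly linearly toward hw = eV, with thermal
# rounding at the cutoff and an exponential tail above it
```

Lowering kT in this sketch sharpens the cutoff at \(\hbar\omega = eV\); that rounding is exactly the feature the slightly elevated electronic temperature accounts for in the fits.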

I think this agreement is pretty darn impressive.  It confirms that we have a quantitative understanding of how shot noise (due to the discreteness of charge!) affects light emission processes all the way up to optical frequencies.