Saturday, August 06, 2022

Brief items - talks, CHIPS, and a little reading

An administrative role I've taken on for the last month that will run through September has been eating quite a bit of my time, but I wanted to point out some interesting items:

Sunday, July 31, 2022

Indistinguishability

In thinking about presenting physics to a lay audience, I think we haven't sufficiently emphasized what we mean by particles or objects being "indistinguishable", and everything that idea touches in modern physics.

The vernacular meaning of "indistinguishable" is clear.  Two objects are indistinguishable if, when examined, they are identical in every way - there is no measurement or test you could do that would allow you to tell the difference between them.  Large objects can only approach this.  Two identical cue balls made at a billiard ball factory might be extremely similar (white, spherical, smooth, very close to the same diameter), but with sufficient tools you could find differences that would allow you to label the balls and tell them apart.  In the limit of small systems, though, as far as we know it is possible to have objects that are truly indistinguishable.  Two hydrogen atoms, for example, are believed to be truly indistinguishable, with exactly the same mass, exactly the same optical and magnetic properties, etc.  

This idea of indistinguishable particles has profound consequences.  In statistical mechanics, entropy is commonly given by \(k_{\mathrm{B}}\ln \Omega\), where \(k_{\mathrm{B}}\) is Boltzmann's constant, and \(\Omega\) is the number of microscopic ways to arrange a system.  If particles are indistinguishable, this greatly affects our counting of configurations.  (A classical example of this issue involves mixing entropy and the Gibbs Paradox.)  
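
To make the counting concrete, here is a minimal numerical sketch of the Gibbs paradox bookkeeping.  It uses only the configurational part of \(\Omega \sim V^{N}\) for an ideal gas, with the \(1/N!\) "Gibbs correction" applied when the particles are treated as indistinguishable (the particle number and volume below are arbitrary illustrative choices):

```python
import math

def ln_omega(N, V, distinguishable):
    """Configurational ln(Omega) for N ideal-gas particles in volume V
    (arbitrary units): Omega ~ V**N, divided by N! (the Gibbs
    correction) if the particles are indistinguishable."""
    lnw = N * math.log(V)
    if not distinguishable:
        lnw -= math.lgamma(N + 1)  # ln(N!)
    return lnw

N, V = 10_000, 1.0

# Entropy change (in units of k_B) on removing a partition between
# two boxes, each holding N particles of the same gas in volume V:
dS_dist = ln_omega(2 * N, 2 * V, True) - 2 * ln_omega(N, V, True)
dS_indist = ln_omega(2 * N, 2 * V, False) - 2 * ln_omega(N, V, False)

print(dS_dist)    # spurious "mixing entropy" of 2*N*ln(2) if distinguishable
print(dS_indist)  # essentially zero (only a sub-extensive remainder)
```

Treating identical gas particles as distinguishable predicts an entropy increase just from removing a partition between two boxes of the same gas; the \(1/N!\) correction makes that spurious mixing entropy vanish up to sub-extensive terms, which is the resolution of the paradox.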

Indistinguishability and quantum processes deeply bothered the founders of quantum theory.  For example, take a bunch of hydrogen atoms all in the 2p excited state.  Any measurement you could do on those atoms would show them to have exactly the same properties, without any way to distinguish the first atom from, say, the fifth atom selected.  Left alone, each of those atoms will decay to the 1s ground state and spit out a photon, but they will do so in a random order at random times, as far as we can tell.  We can talk about the average lifetime of the 2p state, but there doesn't seem to be any way to identify some internal clock that tells each excited atom when to decay, even though the atoms are indistinguishable. 
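
A toy Monte Carlo illustrates the point: with decay times drawn from an exponential distribution (the standard description of spontaneous emission; the ~1.6 ns scale below is roughly the hydrogen 2p lifetime, used purely for illustration), the sample mean converges to the lifetime, but nothing in the statistics picks out which atom decays when:

```python
import random

random.seed(0)
tau = 1.6e-9  # roughly the hydrogen 2p lifetime in seconds (illustrative)

# Each "atom" gets an independent, exponentially distributed decay time.
# The distribution is memoryless: an atom that has survived to time t is
# statistically identical to a freshly prepared one - no internal clock.
times = [random.expovariate(1 / tau) for _ in range(100_000)]
mean_time = sum(times) / len(times)

print(mean_time)  # close to tau, though individual times scatter widely
```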

It gets more unsettling.  Any two electrons are indistinguishable.  So, "common sense" says that swapping any two electrons should get you back to a state that is the same as the initial situation.  However, electrons are fermions and follow Fermi-Dirac statistics.  When swapping any two electrons, the quantum mechanical state, the mathematical object describing the whole system, has to pick up a factor of -1.   Even weirder, there can be interacting many-body situations when swapping nominally indistinguishable particles takes the system to an entirely different state (non-Abelian anyons).  The indistinguishability of electrons has prompted radical ideas in the past, like Wheeler suggesting that there really is only one electron.
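
The exchange sign is easy to see numerically.  Here's a small sketch that antisymmetrizes a two-particle amplitude built from two illustrative (made-up) orbitals and checks that swapping the particle labels multiplies the state by -1:

```python
import numpy as np

# Two single-particle orbitals sampled on a grid (illustrative choices):
x = np.linspace(-1, 1, 201)
phi_a = np.exp(-x**2)
phi_b = x * np.exp(-x**2)

# Antisymmetrized (fermionic) two-particle amplitude psi(x1, x2),
# i.e. a 2x2 Slater determinant evaluated on the grid:
psi = np.outer(phi_a, phi_b) - np.outer(phi_b, phi_a)

# Swapping the two electrons (x1 <-> x2 is the matrix transpose)
# flips the overall sign of the state:
print(np.allclose(psi.T, -psi))  # True
```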

TL;DR:  Indistinguishability of particles is intuitively weird, especially in the context of quantum mechanics.

Wednesday, July 13, 2022

A (very late) book review: The Cold Wars

On a recommendation, I just finished reading The Cold Wars:  A History of Superconductivity, by Jean Matricon and Georges Waysand.  This book was originally published in French in 1994, and then translated into English in 2003.  Unfortunately, the book appears to be out of print; I was lucky enough to pick up a used hardcover copy.

Beginning with the race to achieve very low temperatures through liquefaction of helium, this work tells a pretty compelling story of the discovery and understanding of superconductivity and superfluidity, ending in the comparatively early days of the high Tc cuprates.  Along the way, the authors introduce a cast of varyingly compelling characters and some fascinating stories that aren't all well-known.  I knew that Landau had been sent to a gulag for a time; I did not know that Kapitsa wrote personally to Stalin to argue that this had to be a mistake, and was probably a consequence of Landau being very abrasive to the wrong people rather than any actual counterrevolutionary beliefs.  (Spoiler:  Landau was abrasive, but he did also sign onto a letter that slammed Stalin hard.)  I did not know that Bardeen and Schrieffer were most concerned about Feynman possibly scooping them, thanks to Feynman's brilliance and expertise in both superfluidity and diagrammatic methods.  Also in there is the story about the initial papers on YBCO and how the chemical formula was "accidentally" wrong in the submitted manuscript, corrected only once the paper was in proofs.

The authors also do a good job of conveying the ebb and flow of science - from the sudden onset of a fashionable topic, to the transition toward more detail and a greater focus on applications.  The social dimensions come through as well, with the coming and going of particular great centers of excellence in the research, and approaches to "big" science.    

The style is accessible, and throughout there are indented sidebars meant to provide scientific context for readers who may not have a physics background.  If you're interested in the topic and the history of this part of condensed matter physics, I definitely recommend tracking down a copy.

Friday, July 08, 2022

More about the costs of doing research

This isn't physics, but it's still something that might be of interest to some readers.  There is still a great deal of mystery among many people about how university research is funded and supported.  Five years ago I wrote a bit about "indirect costs", more properly called facilities and administrative costs (F&A).  I recently had the opportunity to learn more about this, and I came across a very helpful document from the Council on Governmental Relations.  COGR is an association of organizations (universities, med centers, research institutes) that advocates to government policymakers about research administration and finances.

The document explains a lot about the history of how research finances are handled in the US.  One thing that struck me as I was reading is that the actual administrative costs that can be charged to grants (which pay for things like running the grants and contracts offices and the accountants who track the money) have been capped, only for universities, at 26% since 1991, even though more and more reporting and administrative requirements have been placed on universities ever since.   (If you want to know what happened in 1991 that led to this cap, you can read this [that might be paywalled] or this wiki article whose detailed accuracy I cannot confirm.)
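
To make the mechanics concrete, here is a purely illustrative sketch of how a capped F&A rate applies to a budget.  Every number below is made up; actual facilities rates are negotiated institution by institution, and only the administrative component is subject to the 26% cap:

```python
# Hypothetical numbers for illustration only.
direct_costs = 100_000      # modified total direct costs on a grant, $
facilities_rate = 0.30      # negotiated facilities component (uncapped)
admin_rate = 0.26           # administrative component, capped at 26%

fa_rate = facilities_rate + admin_rate
recovery = direct_costs * fa_rate       # indirect (F&A) cost recovery
total_award = direct_costs + recovery

print(fa_rate, recovery, total_award)
```

The point of the cap:  however large the real administrative burden grows, the university cannot charge more than the 26% component to grants; costs beyond that come out of institutional funds.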

As I wrote before, vice provosts/presidents/chancellors for research at major US universities would be happy to explain at length that F&A cost recovery doesn't come close to covering the actual costs associated with doing university-based research.  ("Universities lose money doing research.")  Obviously this is an oversimplification - if research were truly a large net financial negative, universities wouldn't do it.  Successful research universities accrue benefits from research in terms of stature and reputation that are worth enough to make the enterprise worthwhile.  Of course, the danger is that the balance will shift enough that only the wealthiest, most prestigious universities will be able to afford groundbreaking research in expensive fields (typically the physical sciences and engineering).  

Anyway, if you want to understand the issues better, I encourage reading that document.  I'll write more about physics soon.

Friday, June 24, 2022

Implementing a model of polyacetylene

An impressive paper was just published in Nature, in which atomically precise fabrication of structures in Si was used to build an analog model of a very famous problem in physics, the topological transition in trans-polyacetylene. 

Actual trans-polyacetylene is a conjugated organic chain molecule, consisting of sp2-hybridized carbons, as shown.  This is an interesting system, because you could imagine swapping the C-C and C=C bonds, and having domains where the (bottom-left to top-right) links are double bonds, and other domains where the (top-left to bottom-right) links are double bonds.  The boundaries between domains are topological defects ("solitons").  As was shown by Su, Schrieffer, and Heeger, these defects are spread out over a few bonds, are energetically cheap to form, and are mobile.  

(Adapted from Fig 1 here)
The Su-Schrieffer-Heeger model is a famous example of a model that shows a topological transition.  Label site-to-site hopping along those two bond directions as \(v\) and \(w\).  If you have a finite chain, as shown here, and \(v > w\), there are no special states at the ends of the chain.  However, if \(v < w\), as for the system as shown, it is favorable to nucleate two "surface states" at the chain ends, with the topological transition happening at \(v = w\).  
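
The end states are easy to see numerically.  Here's a minimal sketch: build the single-particle Hamiltonian for a finite open SSH chain, diagonalize it for \(v > w\) and \(v < w\), and count states near zero energy (the 20-cell length and the 0.1 energy window are arbitrary illustrative choices):

```python
import numpy as np

def ssh_chain(n_cells, v, w):
    """Single-particle Hamiltonian of an open SSH chain with n_cells
    two-site unit cells: hopping alternates between v (intracell)
    and w (intercell) along the chain."""
    n = 2 * n_cells
    H = np.zeros((n, n))
    for i in range(n - 1):
        t = v if i % 2 == 0 else w
        H[i, i + 1] = H[i + 1, i] = -t
    return H

for v, w in [(1.0, 0.5), (0.5, 1.0)]:
    energies = np.linalg.eigvalsh(ssh_chain(20, v, w))
    n_mid = int(np.sum(np.abs(energies) < 0.1))
    print(f"v={v}, w={w}: {n_mid} states near E=0")
# v > w: gapped spectrum, no midgap states.
# v < w: two end states exponentially close to E = 0.
```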

The new paper that's just been published takes advantage of the technical capabilities developed over the last two decades by the team of Michelle Simmons at UNSW.  I have written about this approach here.  They have developed and refined the ability to place individual phosphorus dopant atoms on Si with near-atomic precision, leading them to be able to fabricate "dots" (doped islands) and gate electrodes, and then wire these up and characterize them electrically.  The authors made two devices, each  a chain of islands analogous to the C atoms, and most importantly were able to use gate electrodes to tune the charge population on the islands.  One device was designed to be in the topologically trivial limit, and the other (when population-tuned) in the limit with topological end states.  Using electronic transport, they could perform spectroscopy and confirm that the energy level structure agrees with expectations for these two cases.

(Adapted from Fig 2 here)

This is quite a technical accomplishment.  Sure, we "knew" what should happen, but the level of control demonstrated in the fabrication and measurement is very impressive.  This bodes well for the future of using these tools to implement analog quantum simulators for more complicated, much harder to solve many-body systems.  

Sunday, June 12, 2022

Quasiparticles and what is "real"

This week a paper was published in Nature about the observation, via Raman scattering, of a particular excitation in the charge density wave materials RTe3 (R = La, Gd).  Mathematically, the excitation is an example of an "amplitude mode" that carries angular momentum, which the authors identify as an axial Higgs mode.  (I'm not going to get into the detailed physics of this.)

The coverage of this paper elicited a kerfuffle on blogs (e.g. here and here) for two main reasons that I can discern.  First, there is disagreement in the community about whether calling a mode like this "Higgs" is appropriate, given the lack of a gauge field in this system (this is in the comments on the second blog posting).  That usage has become common in the literature, but there are those who strongly disapprove.  Second, some people are upset because some of the press coverage of the paper, with references to dark matter, hyped up the result to make it sound like this was a particle physics discovery, or at least had implications for particle physics. 

This does give me the opportunity, though, to talk about an implication that I see sometimes from our high energy colleagues in discussions of condensed matter, that "quasiparticles" are somehow not "real" in the way of elementary particles.  

What are quasiparticles?  In systems with many degrees of freedom built out of large numbers of constituents, amazingly it is often possible to look at the low energy excitations above the ground state and find that those excitations look particle-like - that is, there are discrete excitations that, e.g., carry (crystal) momentum \(\hbar \mathbf{k}\), have an energy that depends on the momentum in a clear way \(\epsilon(\mathbf{k})\), and also carry spin, charge, etc.  These excitations are "long lived" in the sense that they propagate many of their wavelengths (\(2 \pi/|\mathbf{k}|\)) before scattering and have lifetimes \(\tau\) such that their uncertainty in energy is small compared to their energy above the ground state, \(\hbar/\tau \ll \epsilon(\mathbf{k})\).  The energy of the many-body system can be well approximated as the sum of the quasiparticle excitations:  \(E \approx \sum_{\mathbf{k}} n(\mathbf{k})\epsilon(\mathbf{k})\).  

There are many kinds of quasiparticles in condensed matter systems.  There are the basic ones like (quasi)electrons and (quasi)holes in metals and semiconductors, phonons, magnons, polarons, plasmons, etc.  While it is true that quasiparticles are inherently tied to their host medium, these excitations are "real" in all practical ways - they can be detected experimentally and their properties measured.  Indeed, I would argue that it's pretty incredible that complicated, many-body interacting systems so often host excitations that look so particle-like.  That doesn't seem at all obvious to me a priori.  
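
As a concrete example of a quasiparticle dispersion \(\epsilon(\mathbf{k})\), here is a minimal sketch for the textbook case of phonons on a 1D monatomic chain (spring constant, mass, and lattice constant all set to 1 for convenience), comparing the analytic dispersion \(\omega(k) = 2\sqrt{K/m}\,|\sin(ka/2)|\) with a direct diagonalization of the dynamical matrix:

```python
import numpy as np

K, m, a = 1.0, 1.0, 1.0   # spring constant, mass, lattice constant
N = 100                    # sites on a periodic chain

# Allowed crystal momenta for periodic boundary conditions:
k = 2 * np.pi * np.fft.fftfreq(N, d=a)
omega_analytic = 2 * np.sqrt(K / m) * np.abs(np.sin(k * a / 2))

# Dynamical matrix of the chain (nearest-neighbor springs, periodic):
D = np.zeros((N, N))
for i in range(N):
    D[i, i] = 2 * K / m
    D[i, (i + 1) % N] = D[i, (i - 1) % N] = -K / m
omega_numeric = np.sqrt(np.abs(np.linalg.eigvalsh(D)))

# The spectrum from brute-force diagonalization matches the textbook
# quasiparticle dispersion:
print(np.allclose(np.sort(omega_analytic), np.sort(omega_numeric), atol=1e-6))
```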

What has also become clear over the last couple of decades is that condensed matter systems can (at least in principle) play host to quasiparticles that act mathematically like a variety of ideas that have been proposed over the years in the particle physics world.  You want quasiparticles that mathematically look like massless fermions described by the Dirac equation?  Graphene can do that.  You want more exotic quasiparticles described by the Weyl equation?  TaAs can do that.  You want Majorana fermions?  These are expected to be possible, though challenging to distinguish unambiguously.  Remember, the Higgs mechanism started out in superconductors, and the fractional quantum Hall system supports fractionally charged quasiparticles.  (For a while it seemed like there was a cottage industry on the part of a couple of teams out there:  Identify a weird dispersion relation \(\epsilon(\mathbf{k})\) predicted in some other context; find a candidate material whose quasiparticles might show this according to modeling; take ARPES data and publish on the cover of a glossy journal.)
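
The graphene case is simple enough to check in a few lines.  Here's a sketch of the standard nearest-neighbor tight-binding dispersion (a hopping of 2.7 eV is a typical literature value; the C-C distance is set to 1 for convenience), showing the gap closing at the K point with a linear, Dirac-like slope \(v_F = 3ta/2\) (taking \(\hbar = 1\)):

```python
import numpy as np

t, a = 2.7, 1.0   # hopping (eV) and C-C distance (arbitrary units)

# Honeycomb lattice vectors and one of the Dirac (K) points:
a1 = a * np.array([1.5, np.sqrt(3) / 2])
a2 = a * np.array([1.5, -np.sqrt(3) / 2])
K = np.array([2 * np.pi / (3 * a), 2 * np.pi / (3 * np.sqrt(3) * a)])

def energy(k):
    """Upper band of E(k) = +/- t*|1 + exp(i k.a1) + exp(i k.a2)|."""
    f = 1 + np.exp(1j * (k @ a1)) + np.exp(1j * (k @ a2))
    return t * abs(f)

dk = 1e-3
slope = energy(K + np.array([dk, 0.0])) / dk

print(energy(K))           # ~0: the gap closes at the Dirac point
print(slope, 1.5 * t * a)  # linear dispersion, slope v_F = 3*t*a/2
```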

Why are quasiparticles present in condensed matter, and why do they "look like" some models of elementary particles?  Fundamentally, both crystalline solids and free space can be usefully described using the language of quantum field theory.  Crystalline solids have lower symmetry than free space (e.g. the lattice gives discrete rather than continuous translational symmetry), but the mathematical tools at work are closely related.  As Bob Laughlin pointed out in his book, given that quasiparticles in condensed matter can be described in very particle-like terms and can even show fractional charge, maybe it's worth wondering whether everything is in a sense quasiparticles.  

Saturday, May 28, 2022

Brief items - reviews, videos, history

Here are some links from the past week:

  • I spent a big portion of this week attending Spin Caloritronics XI at scenic UIUC, for my first in-person workshop in three years.  (The APS March Meeting this year was my first conference since 2019.)  It was fun and a great way to get to meet and hear from experts in a field where I'm a relative newbie.  While Zoom and recorded talks have many upsides, the in-person environment is still tough to beat when the meeting is not too huge.  
  • Topical to the meeting, this review came out on the arxiv this week, all about the spin Seebeck effect and how the thermally driven transport of angular momentum in magnetic insulators can give insights into all sorts of systems, including ones with exotic spin-carrying excitations.
  • Another article on a topic near to my heart is this new review (to appear in Science) about strange metals.  It makes clear the distinction between strange and bad metals and gives a good sense of why these systems are interesting.
  • On to videos.  While at the meeting, Fahad Mahmood introduced me to this outreach video, by and about women in condensed matter at UIUC.
  • On a completely unrelated note, I came across this short film from 1937 explaining how differential steering works in cars.  This video is apparently well known in car enthusiast circles, but it was new to me, and its clarity was impressive.  
  • Finally, here is the recording of the science communication symposium that I'd mentioned.  The keynote talk about covid by Peter Hotez starts at 1h49m, and it's really good. 
  • In terms of history (albeit not condensed matter), this article (written by the founding chair) describes the establishment of the first Space Science department anywhere, at Rice University.  In 1999 the SPAC department merged with Physics to become the Department of Physics and Astronomy, where I've been since 2000.