Sunday, December 31, 2023

Very brief end of the year round-up

It's hard to believe that it's already the end of 2023.  It's been a busy year for condensed matter; it's unfortunate that two of the biggest stories (problems with high pressure superconductivity papers; the brief excitement about LK99, the not-actually-a-superconductor) were probably the field's highest profile events.  Still, hopefully the latter at least had the effect of bringing to the public a little bit of the excitement and potential of how condensed matter and materials physics affects our lives.  Physics World summarizes some of their picks for big materials-related stories of 2023 here.  Similarly, here are Quanta's choices for biggest physics stories of the year, and these are the choices from the editors of APS's Physics.  

It's been a busy year personally, with lots going on and too much proposal writing, but at least my blog posting was more frequent than in 2022.  It's still surprising to me that I've been writing this since mid-2005, long enough to see almost the entire lifecycle of blogging.  Happy New Year to my readers, and if there are any particular topics about which you think I should write, please let me know in the comments.  I'm always looking for CM/materials concepts that I can try to explain at a level accessible to non-specialists.  Still looking for the time and appealing perspective to write that popular book....

Anyway, I hope you have a very happy new year, and best wishes for a great 2024.

Thursday, December 21, 2023

New paper - plasmons, excitons, and steering energy

We have a new paper out in Nano Letters (arxiv version here), and I wanted to explain a bit about it and why I think it's a really cool result.   

I've written before about the Purcell Effect.  When we study quantum mechanics, we learn that the rates of processes, like the spontaneous emission of light from an atom, are actually malleable.  The rate of a particular process is usually proportional to the number of ways that process can happen - this is quantified in something called Fermi's Golden Rule.  When we are talking about something like emission of light from an atom, the rate is proportional to the number of possible final states of the photon.  We know how to count those states in a given energy range in free space, and Purcell pointed out that by placing that atom in an optical cavity, we alter the density of final states as a function of frequency, \(\rho(\omega)\), from its empty-space value, and hence can change the rate of emission.  Pretty wild that placing a system in a cavity can alter the flow of energy in that system away from what it would otherwise be.
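To put a schematic equation to that:  the golden rule rate for going from an initial state \(|i\rangle\) to final states \(|f\rangle\) via a coupling \(H'\) is
\[
\Gamma_{i \rightarrow f} = \frac{2\pi}{\hbar} |\langle f | H' | i \rangle |^{2}\, \rho(\omega),
\]
and the standard Purcell result is that a cavity of quality factor \(Q\) and mode volume \(V\), resonant with an emitter at free-space wavelength \(\lambda\) in a medium of index \(n\), rescales the spontaneous emission rate by a factor of order
\[
F_{\mathrm{P}} = \frac{3}{4\pi^{2}}\left(\frac{\lambda}{n}\right)^{3}\frac{Q}{V}.
\]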

I've also written before about what happens when we take two resonators and couple them together - we get "hybridization" or "new normal modes".  If you take a mass on a spring (natural frequency \(\omega_0 = \sqrt{k/m}\)) and couple it mechanically to another identical mass on an identical spring, the coupled system will now have two resonances, one above and one below \(\omega_{0}\).  The chemistry analog of this:  bonding two hydrogen atoms (each with a 1s orbital) together leads to two \(\sigma\) orbitals, one bonding and one antibonding.
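In the simplest sketch (two identical resonators with a weak coupling \(g\), ignoring damping), the hybridized frequencies come from a \(2\times 2\) eigenvalue problem:
\[
\begin{pmatrix} \omega_{0} & g \\ g & \omega_{0} \end{pmatrix}\begin{pmatrix} a \\ b\end{pmatrix} = \omega \begin{pmatrix} a \\ b \end{pmatrix} \quad \Rightarrow \quad \omega_{\pm} = \omega_{0} \pm g,
\]
with the symmetric and antisymmetric combinations of the two original modes split by an amount set by the coupling.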

In the new paper, we start with a little metal tunnel junction that hosts plasmonic resonances, like the junctions I wrote about here.  We showed in that paper and subsequent work that it is possible to use an applied voltage and current to get some of the electrons, right near where the electrodes almost touch, to become effectively so hot that they glow (emitting light at photon energies larger than \(eV\), the electron charge times the applied voltage), while the atomic lattice itself remains cold.  The light emission process here is the radiative recombination of hot electrons and holes in the metal, where an electron drops down in energy to fill in a hole and spit out a photon.  The plasmon resonances of the bare metal act like a sort of cavity, shaping the density of photon states \(\rho(\omega)\), as we also showed here.  The plasmons, set by the metal shape and electronic properties, actually affect the rate at which the electrons and holes in that same metal radiatively recombine.

Left: A thin flake of WSe2 is placed on a plasmonic Au junction.  Right: Overbias light emission from the device at a particular emitted polarization shows a big peak splitting right around the exciton resonance energy of the WSe2 (orange curve).  Adapted from the SI of this paper.

The wrinkle in the new paper is that we couple that metal plasmonic junction with a thin (few nm) layer of 2D semiconductor by placing the semiconductor on top of the metal.  The semiconductor can host excitons, bound electron-hole pairs, and if the semiconductor is excited with enough energy to create them, the excitons can radiatively annihilate, leading to a comparatively narrow resonance at an energy that overlaps the plasmon resonances of the metal junction.  Thanks to hybridization between the plasmons in the metal and the excitons in the semiconductor, the photon density of states now has a split peak structure ("upper and lower plexciton polariton resonances" if you are an expert).  Light emission in this device is still due to recombination of electrons and holes in the metal, but now the recombination dynamics of those electrons "feels" the strong coupling between the excitons and plasmons.  (The polarization of the emitted light is rather complicated because of the polarization properties of the plasmon resonances).  
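If you'd like a cartoon of how this hybridization reshapes the photon density of states, a purely classical model of two coupled, damped oscillators - a broad "plasmon" and a narrow "exciton" - already produces the split-peak structure.  Here is a minimal Python sketch (all parameter values are invented for illustration, not taken from our device):

import numpy as np

# Two coupled, damped classical oscillators as a cartoon of
# plasmon-exciton hybridization.  All parameters are illustrative only.
omega = np.linspace(1.0, 2.4, 4000)   # probe frequency (arbitrary units)
w0    = 1.70                          # plasmon and exciton tuned together
gam_p = 0.15                          # broad "plasmon" damping
gam_x = 0.03                          # narrow "exciton" damping
Om    = 0.60                          # coupling; splitting ~ Om**2 / w0

D_p = w0**2 - omega**2 - 1j * gam_p * omega   # plasmon denominator
D_x = w0**2 - omega**2 - 1j * gam_x * omega   # exciton denominator

# Driven plasmon response, "dressed" by the exciton: x_p = F / (D_p - Om^4 / D_x)
chi = 1.0 / (D_p - Om**4 / D_x)
power = omega * np.imag(chi)          # dissipated power: two "plexciton" peaks

# Crude local-maximum finder to report the peak positions:
interior = (power[1:-1] > power[:-2]) & (power[1:-1] > power[2:])
print(omega[1:-1][interior])          # two peaks, roughly symmetric about w0

Instead of a single resonance at \(\omega_{0}\), the dissipated power shows two peaks, qualitatively like the split emission spectrum in the figure above.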

There are a lot of interesting possibilities on where to go from here, but it's always amazing to me to see how this physics comes together.  In this case, by changing the optical environment of a metal structure, we can alter the fate of energy stored in the electrons of that metal.  Really neat.

Tuesday, December 12, 2023

AI/ML and condensed matter + materials science

Materials define the way we live.  That may sound like an exaggeration that I like to spout because I'm a condensed matter physicist, but it's demonstrably true.  Remember, past historians have given us terms like "Stone Age", "Bronze Age", and "Iron Age", and the "Information Age" has also been called the "Silicon Age".  (And who could forget plastics.)

Perhaps it's not surprising, then, that some of the biggest, wealthiest companies in the world are turning their attention to materials and the possibility that AI approaches could lead to disruptive changes.  As I mentioned last week, there have been recent papers (back to back in Nature) by the Google DeepMind group on this topic.  The idea is to use their particular flavor of AI/machine learning to identify potential new compounds/solids that should be thermodynamically stable and synthesizable, and make predictions about their structures and properties.  This is not a new idea, in that the Materials Genome Initiative (started in 2011) has been working in this direction, compiling large amounts of data about solid materials and their properties, and the Materials Project has been pushing on efficient computational methods with the modest goal of computing "the properties of all inorganic materials and provid[ing] the data and associated analysis algorithms for every materials researcher free of charge".

In addition to the Google work, Microsoft has released on the arxiv their effort, MatterGen, which uses a generative AI approach to try to predict new stable materials with desirable properties, such as a target symmetry or chemical composition or mechanical/electronic/magnetic response.  An example from their paper is to try to find new magnetic materials that have industrially useful properties but do not involve rare earths.  

There is a long way to go on any of these projects, but it's easy to see why the approach is enticing.  Imagine saying, I want a material that's as electrically conductive and mechanically strong and workable as aluminum, but transparent in the visible, and having software give you a credible approach likely to succeed (rather than having to rely on a time-traveling Mr. Scott).  

I'd be curious to know readers' opinions of what constitute the biggest obstacles on this path.  Is it the reliability of computational methods at predicting formation energies and structures?  Is it the lack of rapid yet robust experimental screening approaches?  Is it that the way generative AI and related tools work is just not well-suited to finding truly new systems beyond their training sets?

Friday, December 01, 2023

Intriguing papers - exquisite thermal measurements + automated materials discovery/synthesis

It's a busy time, but I wanted to point out a couple of papers from this past week.

First, I want to point to this preprint on the arxiv, where the Weizmann folks do an incredibly technically impressive thing.  I'd written recently about the thermal Hall effect, when a longitudinal heat current (and temperature gradient) in the presence of a magnetic field results in a transverse temperature gradient as well as the usual longitudinal one.  One of the most interesting ways this can happen is if there are edge modes, excitations that propagate around the perimeter of a 2D system and can carry heat (even if they are neutral and don't carry charge).  Unsurprisingly, to measure thermal transport requires putting thermometers at different places on the sample and carefully measuring temperature differences.  Well, these folks have done just exquisitely nice measurements of Johnson-Nyquist noise in particular contacts for thermometry, and they can see the incredibly tiny heat currents carried by rather exotic edge modes in some unusual fractional quantum Hall states.  It's just a technical tour de force.
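(For the uninitiated:  in equilibrium, a resistor \(R\) at temperature \(T\) exhibits white voltage noise with spectral density
\[
S_{V} = 4 k_{\mathrm{B}} T R ,
\]
so carefully measuring the broadband voltage fluctuations across a contact of known resistance amounts to a primary thermometer, \(T = S_{V}/(4 k_{\mathrm{B}} R)\), with no calibration constants beyond \(k_{\mathrm{B}}\) and \(R\).)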

Second, on a completely unrelated note, there are back to back papers in Nature this week from the Google DeepMind folks - their own write-up is here.  The first paper uses their methods to predict a large number of what are expected to be new stable crystal structures.  The second paper talks about how they used an automated/robot-driven lab to try to synthesize a bunch of these in an automated way and characterize the resulting material.  This is certainly thought-provoking.  It is worth noting that detailed characterization (including confirming that you've made what you were trying to make) and optimized synthesis of new materials are very challenging, and both are concerns here.  Update:  there is further discussion of the characterization here (on LinkedIn by the authors) as well, and more on Twitter here and here.

Third, this paper looks extremely interesting.  It’s long been a staple of condensed matter theory to try to capture complex materials with effective low energy models, like suggesting the Hubbard model as a treatment of the essential physics of the cuprate superconductors.  The authors here report that they’ve done a more orbital-based/ab initio version of this, solved these models numerically, and state that they can reproduce details of the phase diagram of four of the cuprates spanning a big range of superconducting transition temperatures.  Seems like this may bode well for gaining insights into these systems.
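For reference, the single-band Hubbard model mentioned above is deceptively simple to write down:
\[
H = -t \sum_{\langle i j \rangle, \sigma} \left( c^{\dagger}_{i\sigma} c_{j\sigma} + \mathrm{h.c.} \right) + U \sum_{i} n_{i\uparrow} n_{i\downarrow},
\]
just hopping (\(t\)) between neighboring lattice sites competing with an on-site repulsion (\(U\)) - the hard part is solving it (and its more realistic multi-orbital cousins) in two dimensions.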

Monday, November 27, 2023

Noise in a strange metal - pushing techniques into new systems

Over the holiday weekend, we had a paper come out in which we report the results of measuring charge shot noise (see here also) in a strange metal.   Other write-ups of the work (here and especially this nice article in Quanta here) do a good job of explaining what we saw, but I wanted to highlight a couple of specific points that I think deserve emphasis.  

In thermal equilibrium at some temperature \(T\), there are current and voltage fluctuations in a conductor - this is called Johnson-Nyquist noise - and it is unavoidable.  Shot noise in electrical current results from the granularity of charge and, as shown in its original incarnation (pdf is in German), from the statistical variation in the arrival times of electrons.  Shot noise is an "excess" noise that appears in addition to the Johnson-Nyquist contribution, and only when a conductor is driven out of equilibrium by an applied voltage and carries a net current.
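Quantitatively, Schottky's result for Poissonian (uncorrelated) arrival of discrete charges \(e\) making up an average current \(I\) is a current noise spectral density
\[
S_{I} = 2 e I ,
\]
to be compared with the equilibrium Johnson-Nyquist result \(S_{I} = 4 k_{\mathrm{B}} T / R\) for a resistance \(R\).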

While the physics of shot noise in tunnel junctions and vacuum tubes had been worked out a long time ago (see the above 1918 paper by Schottky), it was in the 1990s when people really turned to the question of what one should see in noise measurements in small metal or semiconductor wires.  Why don't we see shot noise in macroscopic conductors like your house wiring?  Well, shot noise requires some deviation of the electrons from their thermal equilibrium response - otherwise you would just have Johnson-Nyquist noise.  The electrons in a metal or semiconductor are coupled to the vibrations of the atoms (phonons) - the clearest evidence for this is that the decrease in scattering of the electrons by the phonons explains why metals become more conductive as temperature is decreased.  In conductors large compared to the (temperature-dependent) electron-phonon scattering length, the electrons should basically be in good thermal equilibrium with the lattice at temperature \(T\), so all that should be detected is Johnson-Nyquist noise.  To see shot noise in a wire, you'd need the wire to be small compared to that e-ph length, typically on the order of a micron at low temperatures.  By the 1980s and 1990s, it had become possible to make structures on that scale.

Fig. 4 from the paper
The theory of what should be seen was worked out in a couple of different ways, initially assuming that it is safe to describe the conductor as a Fermi gas (ignoring electron-electron interactions).  One approach started from the conduction-as-wave-transmission picture of Landauer (see here and here for two examples).  A complementary approach (see here) calculated noise from the electronic distribution functions and got the same answer for non-interacting electrons, that the current noise should be 1/3 of the classic Schottky result.  That factor of 1/3 is called the Fano factor, \(F\).   If electron-electron interactions are "turned on", allowing the electrons to exchange energy amongst themselves but not lose energy to the lattice, the noise is actually a bit larger, \(F \rightarrow \sqrt{3}/4\).   It turns out that these values were verified in experiments in gold wires (see here and here, though one has to be careful in experimental design to see  \(F \rightarrow \sqrt{3}/4\)).  This confirmation is a great triumph of our understanding of physics at these mesoscopic scales.  (Interestingly, similar results are expected even with a non-degenerate electron gas - see here and here.)
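For the curious, here is a minimal numerical sketch (my own illustrative code, not from the paper) of the commonly used crossover form for the current noise of a diffusive wire with Fano factor \(F\).  The asymptotes are the physics - Johnson-Nyquist noise for \(eV \ll k_{\mathrm{B}}T\), and \(S_{I} \rightarrow 2eIF\) for \(eV \gg k_{\mathrm{B}}T\) - while the interpolation in between is exact only for energy-independent scattering:

import numpy as np

kB = 1.380649e-23      # Boltzmann constant, J/K
e  = 1.602176634e-19   # electron charge, C

def current_noise(V, T, R, F):
    # S_I = (4 kB T / R) * [1 + F * (x coth x - 1)], with x = eV / (2 kB T).
    # Limits: 4 kB T / R at V = 0;  (1 - F) 4 kB T / R + 2 e I F at eV >> kB T.
    x = e * np.asarray(V, dtype=float) / (2 * kB * T)
    xs = np.where(np.abs(x) < 1e-12, 1.0, x)   # dodge 0/0 at V = 0
    xcothx = np.where(np.abs(x) < 1e-12, 1.0, xs / np.tanh(xs))
    return (4 * kB * T / R) * (1 + F * (xcothx - 1))

V = np.linspace(0, 2e-4, 5)            # bias voltages (V); made-up values
T, R = 5.0, 100.0                      # 5 K, 100 ohm; made-up values
for F in (1.0, 1/3, np.sqrt(3)/4):     # Poissonian, non-interacting, hot-electron
    print(F, current_noise(V, T, R, F))

Plotting \(S_{I}\) versus bias for these three \(F\) values shows why careful measurement can distinguish \(1/3\) from \(\sqrt{3}/4\), and why a strongly suppressed Fano factor, as we found in YbRh2Si2, stands out.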

We applied these same experimental approaches to nanowires we made from exquisite films of a strange metal, YbRh2Si2, and we found that the noise is much reduced from the usual result seen in Au wires (which we also confirmed).  We tested whether phonons could be responsible for the noise suppression, applying the same approach as had been done in the '90s (measurements on wires tens of microns long, where e-ph scattering should be important), and found (in addition to further confirming the e-ph energy loss results in Au from the '90s) that energy loss to phonons can't explain what we see in YbRh2Si2.

Some further points of interest:

  • Until recently there really has not been much attempt to push the theoretical analysis of these kinds of measurements beyond the 1990s/early 2000s results.  My colleague Qimiao Si and his group have looked at whether strong Fermi liquid corrections affect the expected noise, and the answer is "no".  Of course, there are all kinds of additional complications that one could imagine.
  • This work was only possible because of the existence of high quality thin films of the material, and the ability to fabricate nanostructures from this stuff without introducing so much disorder or chemical change as to ruin the material.  My collaborator Silke Bühler-Paschen and her group have spent years learning how to grow this and related materials, and long-term support for materials growth is critically important.  My student, the lead author on the study, did great work figuring out the fabrication.  It's really not trivial.  
  • I think it's worthwhile to consider pushing older techniques into new regimes and applying them to new materials systems.  The heyday of mesoscopics in the 1990s doesn't need to be viewed as a closed, somewhat completed subfield, but rather as a base from which to consider new ways to characterize the rich variety of materials and phases that we have to play with in condensed matter.  

Thursday, November 16, 2023

Faculty positions at Rice - follow-up

I had mentioned about 6 weeks ago that my department at Rice is searching in the quantum/AMO space for experiment and theory.   Now I want to put the larger context of this out there - Rice has four quantum-related searches going on right now:

Quantum experiment (PHYA): https://apply.interfolio.com/131378
Quantum theory (PHYA): https://apply.interfolio.com/131379
Quantum engineering (ECE): https://apply.interfolio.com/133316
Quantum materials (MSNE): https://apply.interfolio.com/135086

Interested candidates, we hope you will apply!  It's an exciting time here, and our quantum initiative folks can help make sure applications end up in the right place.  

Postdoctoral opportunities at Rice

I will be sending some emails shortly, but I wanted to point out postdoctoral opportunities here at Rice University.

The Smalley-Curl Institute is having a competition for two two-year postdoctoral fellow slots.  Click on the link for the details.  The requirements for a candidate:

  • Nomination by current SCI faculty member
  • Ph.D. in a field related to an SCI focus area
  • Successful Ph.D. thesis defense before start of appointment
  • Ph.D. completed no more than three years before the start of the appointment
I would be happy to work with an interested, competitive candidate on this, and the deadline for applying is December 31.  Research areas in my lab these days include:  nanostructure-based studies of correlated quantum materials, including noise-based measurements; studies of spin transport and thermally driven spin effects in insulating magnets, from basic science to applications in low-power electronics; plasmon-based nanophotonic light sources and plasmonic junctions for physical chemistry.  If you're a student finishing up and are interested, please contact me, and if you're a faculty member working with possible candidates, please feel free to point out this opportunity. 

Rice also has a university-wide endowed honorific postdoctoral program called the Rice Academy of Fellows.   Like all such things, it's very competitive, and it similarly has a deadline of January 3, 2024.  Again, applicants have to have a faculty mentor, so in case someone is interested in working with me on this, please contact me via email. 

Saturday, November 11, 2023

Scientific publishing - where are we going?

I think it's safe to say that anyone involved in scientific publishing will tell you that it's a mess and the trends are worrisome.  This week, this news release/article came out about this preprint which shows a number of the issues.  In brief (not all of this is in the preprint; some is me editorializing):

Figure 1 from this preprint

  • The number of scientific papers being published is growing at a rate that looks completely unsustainable.  In my opinion, it's problematic on multiple levels.  There aren't enough reviewers (though that doesn't bother all publishers) and the average paper gets smaller and smaller readership (raising the question of why bother to publish papers that no one reads).  Does it make sense that the number of papers is skyrocketing while the number of PhDs granted is falling?
  • Some publishers (especially Frontiers, Hindawi, MDPI) have boosted this by drastically cranking up the number of papers that they publish, through launching specialized journals with "special issues" designed to have super-short review times (assuming that review is even truly part of the process).  Lest you think this is only the province of publishers previously accused of being predatory, this week alone I have received five different "special issue" announcements from AIP journals.
  • Why do people do this?  To try to game the impact factor calculations.  I've aired my grievances before about why journal impact factor is a lousy metric.  
  • Why do people want to inflate impact factors?  Because that's how journals keep score, and some countries put in place big-time incentives tied to impact factor.  A publisher worries that if its journal's impact factor falls below some threshold, then the government of China, for example, will no longer view that journal as important, and then thousands of authors will stop submitting....
  • Open access is a complicating factor, with some publishers charging absolutely sky-high fees, while at the same time having very high profit margins.  In the US, at least, those charges can be much larger than what grants will support.
  • Over all of this is the concern that massively inflating the amount of scientific literature lowers its quality and undermines the credibility of science in general.  
Coincidentally, this week we hosted Steinn Sigurðsson for a colloquium.  He is now the scientific director of the arxiv, the preprint archive that went from a quick and dirty preprint sharing site in 1991 to an enormously important part of the global scientific enterprise.  In his talk he hit on some wild numbers.  The arxiv is up to around 20,000 papers per month now (!) (in part because new disciplines like quantitative biology are using the arxiv).  Thankfully the arxiv has recently landed some good support.  Their annual operating budget is around $3.5M, and this is an enormous bargain by any measure.  The arxiv is partnering with volunteer developers who are adding some neat functionality.  Unsurprisingly, generative AI is a serious concern, even more so than for the publishing houses.  

It's a transformative time, for sure.  Maybe what we are seeing is analogous to the fluctuations that happen when approaching a 2nd order phase transition, and we are headed for a real change in the way publishing works.  It's hard to see how the current trends can continue unabated.

Wednesday, November 01, 2023

Strategic planning + departmental reviews

It's been a while since I've written a post about the ways of academia, so I thought it might be time, though it's not exactly glamorous or exciting.  There are certain cycles in research universities, and two interrelated ones are the cycle of departmental strategic planning and the cycle of external departmental reviews.

Strategic planning can be extremely important, as it allows departments to take stock of where things are, what opportunities exist for improvement (in terms of research, teaching, departmental operations), and how the department aspires to move forward.  Often this can involve a hiring plan, based on demographic trends in the department (e.g., how many faculty lines are expected to be available in the next, say, five to seven years?) and on rising field/school/university research priorities (e.g., there is likely to be enormous investment in AI/ML in the coming years).  Discussions for strategic planning can be fraught, since even maintaining departmental faculty size means allotting new hires between different possible research areas in a zero-sum game.  Still, arriving at a departmental plan is often expected at one level up (that of a School or College, depending on the university's org chart labeling scheme), and having a plan that department members know and understand helps make transparent how decisions get made that shape the future of the department.  It doesn't make sense to reformulate these plans too frequently, since the ability to implement the plan can be strongly perturbed by, e.g., economic events, global pandemics, or big changes in university leadership.

Very often, deans (or provosts) also value periodic reviews of departments by an external visiting committee.  The visiting committee is typically put together with input from the department (research areas that should be represented, suggestions of possible reviewers) and invited to come for a couple of days of interviews and departmental presentations.  These reviews are typically very broad, looking at research, teaching, departmental climate, staffing levels and organization, infrastructure and space needs, etc.  It's important to talk to all stakeholders (departmental leadership, TT and NTT faculty, staff, undergrad and grad students, postdocs, and of course the dean or equivalent who is the intended recipient of the report). The expected output of these visits is a report to the dean (or provost).  Such a report can be very helpful for the department to get feedback on their plans and operations, and to serve as a way of putting priorities forward to the dean/provost level.  Similarly, often deans find these things valuable as a way to make certain arguments up to higher levels.  It seems to be human nature that a statement made by a nominally objective external committee can get more traction than the same statement made by locals.  Like strategic plans, it only makes sense to do external reviews on a timescale sufficiently long that the department would have a chance to address issues raised from the previous visit before the next one.  For both of these things, every five years is on the edge of being too frequent, and every ten years would definitely be too long an interval.

Participating in external visits takes time, but I've found it to be a very valuable experience.  It's allowed me to meet and work with faculty from a variety of places, and it can be very helpful to see how other institutions do things (even at a level of learning about tools like software that can be useful for tracking degree progress, or organizations that work to facilitate career placement at the graduate level).  

Friday, October 27, 2023

Reading material - orders of magnitude and difficult times

Over the past couple of weeks (and more) I have found a number of things to read that I wanted to pass on.  First, if you'd like a break from the seemingly continual stream of bad news in the world and enjoy good "think like a physicist"/dimensional analysis/order of magnitude estimate/Fermi problem discussions, I suggest:

On a much more sobering note, I was saddened to learn of the grave illness of Prof. Jan Zaanen, who has terminal cancer.  A colleague brought my attention to an essay (link here) that Prof. Zaanen has written in the hopes that it will be widely read, and I pass it along.  

More soon.

Wednesday, October 18, 2023

Scientific travel

Particularly in these post-pandemic, climate-change-addled, zoom-enabled times, I appreciate the argument that it's always worth asking, "Is this trip really necessary?"  We are in the age of remote work and zoom seminars that are attended by people from all over the world.  Is there sufficient benefit to in-person visits to justify travel for work?  I just got back from my first really lengthy science trip in a number of years, and it was definitely very valuable, with experiences and knowledge transfer that just could not have happened nearly as readily any other way.  

I was fortunate enough to be able to attend and speak at a (beginning of October) summer school at ISTA, which is a large and growing scientific institute in Klosterneuburg, outside of Vienna, and I was also able to visit my collaborator's lab at TU Wien as well as the Vienna MicroKelvin Laboratory.  I spent the following week visiting the Laboratoire de Physique des Solides in Orsay, hanging out with the quantum electronics group.  Many thanks to my hosts for helping to organize these trips and for making me feel so welcome.

In-person visits allow for longer, extensive, interactive conversations - standing at a whiteboard, or having coffee, or pointing at actual apparatus.  It's a completely different experience than talking to someone over zoom or over the phone.  I think I did a better job explaining our work, and I definitely think that I learned a lot more about diverse topics than if I'd only had brief virtual interactions.   As an experimentalist, it can be very valuable to learn details about how some measurements are actually done, even including which bits of equipment and instrumentation are preferred by others.  (LPS has a ton of Rohde and Schwarz equipment, which I've really not seen to that extent in the US.  I'd also never heard of mycryofirm and their closed cycle cryostats.)

As an added bonus, I got to visit the Musée Curie in Paris and see Marie Curie's lab.  Here is a photograph of a Geiger counter that they'd made c. 1930.  Hand-soldered, uninsulated wires.  The biggest tube is the actual Geiger-Müller tube which produces current pulses when ionizing radiation zips through it.  The other two tubes make an amplifier to crank up the current pulses enough to turn an electric motor that drives a mechanical counter on the far right outside the box.

Hopefully I will finish up some writing and be posting more soon.

Thursday, October 05, 2023

The Nobels, physics and chemistry

As you undoubtedly know, the 2023 Nobel in physics has been awarded to Pierre Agostini, Ferenc Krausz, and Anne L'Huillier, for the development of techniques associated with attosecond-scale optical pulses.  Here is the more popular write-up about this (including a good handwave of how attosecond pulses can be made) from the Nobel Foundation, and here is the more technical version.  A number of people (including friends and relatives) have asked me in the last couple of days about this, including what discoveries these techniques have led to, and how this work differs from that recognized by preceding Nobel prizes (like the 1999 chemistry prize for femtosecond chemistry, the 2005 prize in physics for frequency combs, and the half of the 2018 physics prize for femtosecond pulsed lasers).  This isn't really my area of expertise, but my impression from talking with people is that the attosecond work is thus far more of a technical achievement than a technique that has led to a series of groundbreaking scientific results or technologies.

Scientifically, the attosecond regime is very fast compared to the dynamics of, e.g., solids.  That said, attosecond techniques have been used to characterize condensed matter systems, as described here.  Crudely speaking, the relevant energy scale associated with 100 as is \(h/(10^{-16}\,\mathrm{s}) \sim 40\) eV, the kind of energy (in the deep ultraviolet range) associated with photoemission.  It makes sense that some of the results highlighted in the Nobel citation have to do with using these methods to measure time delays associated with photoemission - like seeing that 4f electrons take longer to photoemit than s and p electrons in other bands.  If readers can point to a great explanation that goes deeper than this, please leave it in the comments.

The 2023 chemistry prize has been awarded to Moungi Bawendi, Louis Brus, and Alexey Ekimov, for the discovery and development of semiconductor nanocrystals now popularly called quantum dots.  These systems are absolutely great platforms to demonstrate quantum confinement.  Take a bulk semiconductor and carve it up into pieces so small that the electronic wavefunctions get squeezed by the boundaries; this generally increases the energy spacings between levels, including the energy associated with the gap between the valence band and the conduction band.  That is, a semiconductor that might fluoresce in the red in the bulk can be chopped into pieces that fluoresce in the green or the blue (higher energies).  The story of these materials (their growth, how to make them uniform and stable without bad defects at their surfaces) is very cool.  Quantum dots are now widely used as luminescent materials in display devices, and they are also broadly employed as fluorophores for biological imaging and related applications.  (Louis Brus is a Rice alumnus - huzzah!  I've never met Dr. Ekimov, but in my experience, Brus and Bawendi are both very nice, down-to-earth people whose groups write clear, non-hype-ridden papers.)
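The simplest estimate of this size dependence is the particle-in-a-sphere ("Brus") formula, which treats the electron and hole as confined particles with effective masses \(m_{e}^{*}\) and \(m_{h}^{*}\) plus a Coulomb correction:
\[
E_{g}(R) \approx E_{g}^{\mathrm{bulk}} + \frac{\hbar^{2}\pi^{2}}{2R^{2}}\left(\frac{1}{m_{e}^{*}} + \frac{1}{m_{h}^{*}}\right) - \frac{1.8\, e^{2}}{4\pi \epsilon \epsilon_{0} R},
\]
so the effective gap of a nanocrystal of radius \(R\) grows roughly like \(1/R^{2}\) as the particle shrinks.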

Saturday, September 30, 2023

Faculty positions at Rice, + annual Nobel speculation

Trying to spread the word:

The Department of Physics and Astronomy at Rice University in Houston, Texas invites applications for two tenure-track faculty positions, one experimental and one theoretical, in the area of quantum science using atomic, molecular, or optical methods. This encompasses quantum information processing, quantum sensing, quantum networks, quantum transduction, quantum many-body physics, and quantum simulation conducted on a variety of platforms. The ideal candidates will intellectually connect AMO physics to topics in condensed matter and quantum information theory. In both searches, we seek outstanding scientists whose research will complement and extend existing quantum activities within the Department and across the University. In addition to developing an independent and vigorous research program, the successful applicants will be expected to teach, on average, one undergraduate or graduate course each semester, and contribute to the service missions of the Department and University. The Department anticipates making the appointments at the assistant professor level. A Ph.D. in physics or related field is required by June 30, 2024.

Applications for these positions must be submitted electronically at apply.interfolio.com/131378 (experimental) and apply.interfolio.com/131379 (theoretical). Applicants will be required to submit the following: (1) cover letter; (2) curriculum vitae; (3) statement of research; (4) statement on teaching; (5) statement on diversity, mentoring, and outreach; (6) PDF copies of up to three publications; and (7) the names, affiliations, and email addresses of three professional references. Rice University, and the Department of Physics and Astronomy, are strongly committed to a culturally diverse intellectual community. In this spirit, we particularly welcome applications from all genders and members of historically underrepresented groups who exemplify diverse cultural experiences and who are especially qualified to mentor and advise all members of our diverse student population. We will begin reviewing applications by November 15, 2023. To receive full consideration, all application materials must be received by December 15, 2023. The expected appointment date is July 2024.

____

In addition, the Nobels will be announced this week.  For the nth year in a row, I will put forward my usual thought that it could be Aharonov and Berry for geometric phases in physics (though I know that Pancharatnam is intellectually in there and died in 1969).  Speculate away below in the comments.  I'm traveling, but I will try to follow the discussion.

Tuesday, September 26, 2023

A few quick highlights

It's been a very busy time, hence my lower posting frequency.  It was rather intense trying to attend both the KITP conference and the morning sessions of the DOE experimental condensed matter PI meeting (pdf of agenda here).  A few quick highlights that I thought were interesting:

  • Kagome metals of the form AV3Sb5 are very complicated.  In these materials, in the a-b plane the V atoms form a Kagome lattice (before that one reader corrects me, I know that this is not formally a lattice from the crystallographic point of view, just using the term colloquially).  Band structure calculations show that there are rather flat bands (for an explanation, see here) near the Fermi level, and there are Dirac cones, van Hove singularities, Fermi surface nesting, etc.  These materials have nontrivial electronic topology, and CsV3Sb5 and KV3Sb5 both have charge density wave transitions and low-temperature superconductivity.  Here is a nice study of the CDW in CsV3Sb5, and here is a study that shows that there is no spontaneous breaking of time-reversal symmetry below that transition.  This paper shows that there is funky nonlinear electronic transport (apply a current at frequency \(\omega\), measure a voltage at frequency \(2 \omega\)) in CsV3Sb5 that is switchable in sign with an out-of-plane magnetic field.  Weirdly, that is not seen in KV3Sb5 even though the basic noninteracting band structures of the two materials are almost identical, implying that it has something to do with electronic correlation effects.
  • Related to that last paper, here is a review article about using focused ion beams for sample preparation and material engineering.  It's pretty amazing what can be done with these tools, including carving out micro/nanostructured devices from originally bulk crystals of interesting materials.  
  • The temperature-dependent part of the electrical resistivity of Fermi liquids is expected to scale like \(T^{2}\) as \(T \rightarrow 0\).  One can make a very general argument (that ignores actual kinematic restrictions on scattering) based on the Pauli exclusion principle that the inelastic e-e scattering rate should go like \(T^{2}\) (number of electron quasiparticles excited goes like \(T\), number of empty states available to scatter into also goes like \(T\)); see the schematic estimate after this list.  However, actually keeping track of momentum conservation, it turns out that one usually needs Umklapp scattering processes to get this.  Umklapp isn't necessary all the time, however.  In very low density metals, the Fermi wavevector is far from the Brillouin zone boundary and so Umklapp should not be important, but it is still possible to get \(T^{2}\) resistivity (see here as well).  Similarly, in 3He, a true Fermi liquid, there is no lattice, so there is no such thing as Umklapp, but at the lowest temperatures the \(T^{2}\) thermal conduction is still seen (though some weird things happen at higher temperatures).
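Schematically, that phase-space counting gives an electron-electron scattering rate
\[
\frac{1}{\tau_{ee}} \sim \frac{(k_{\mathrm{B}}T)^{2}}{\hbar E_{\mathrm{F}}},
\]
but a scattering rate only becomes a resistivity if the collisions degrade the total momentum of the electron system - hence the role of Umklapp (or other momentum sinks) in the bullet above.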
There are more, but I have to work on writing some other things.  More soon....

Sunday, September 17, 2023

Meetings this week

This week is the 2023 DOE experimental condensed matter physics PI meeting - in the past I’ve written up highlights of these here (2021), here (2019), here (2017), here (2015), and here (2013).  This year, I am going to have to present remotely, however, because I am giving a talk at this interesting conference at the Kavli Institute for Theoretical Physics.  I will try to give some takeaways of the KITP meeting, and if any of the ECMP attendees want to give their perspective on news from the DOE meeting, I’d be grateful for updates in the comments.

Thursday, September 07, 2023

Things I learned at the Packard Foundation meeting

Early in my career, I was incredibly fortunate to be awarded a David and Lucile Packard Foundation fellowship, and this week I attended the meeting in honor of the 35th anniversary of the fellowship program.  Packard fellowships are amazing, with awardees spanning the sciences (including math) and engineering, providing resources for a sustained period (5 years) with enormous flexibility.  The meetings have been some of the most fun ones I've ever attended, with talks by incoming and outgoing fellows that are short (20 min) and specifically designed to be accessible by scientifically literate non-experts.  My highlights from the meeting ten years ago (the last one I attended) are here.  Highlights from meetings back when I was a fellow are here, here, here, and here.

Here are some cool things that I learned at the meeting (some of which I'm sure I should've known), from a few of the talks + posters.  (Unfortunately I cannot stay for the last day, so apologies for missing some great presentations.)   I will further update this post later in the day and tomorrow.

  • By the 2040s, with the upcoming LISA and Cosmic Explorer/Einstein Telescope instruments, it's possible that we will be able to detect every black hole merger in the entire visible universe.
  • It's very challenging to have models of galaxy evolution that handle how supernovae regulate mass outflow and star formation to end up with what we see statistically in the sky.
  • Machine learning can be really good at disentangling overlapping seismic events.
  • In self-propelled/active matter, it's possible to start with particles that just have a hard-shell repulsion and still act like there is an effective attractive interaction that leads to clumping.
  • There are about \(10^{14}\) bacteria in each person, with about 360\(\times\) the genetic material of the person.  Also, the gut has lots of neurons, five times as many as the spinal cord (!).  The gut microbiome can seemingly influence concentrations of neurotransmitters.
  • Bees can deliberately damage leaves of plants to stress the flora and encourage earlier and more prolific flowering.
  • For some bio-produced materials that are nominally dry, their elastic properties and the dependence of those properties on humidity are seemingly controlled almost entirely by the water they contain.
  • It is now possible to spatially resolve gene expression (via mRNA) at the single cell level across whole slices of, e.g., mouse brain tissue.  Mind-blowing links here and here.
  • I knew that ordinary human red blood cells have no organelles, and therefore they can't really respond much to stimuli.  What I did not know is that maturing red blood cells (erythrocyte precursors) in bone marrow start with nuclei and can participate in immune response, and that red blood cells in fetuses (and then at trace level in pregnant mothers) circulate all the different progenitor cells, potentially playing an important role in immune response.
  • 45% of all deaths in the US can be attributed in part to fibrosis (scarring) issues (including cardiac problems), but somehow the uterus can massively regenerate monthly without scarring.  Also, zero common lab animals menstruate, which is a major obstacle for research; transgenic mice can now be made so that there are good animal models for study. 
  • Engineered cellulose materials can be useful for radiative cooling to the sky and can be adapted for many purposes, like water harvesting from the atmosphere with porous fabrics.


Thursday, August 31, 2023

What is the thermal Hall effect?

One thing that physics and mechanical engineering students learn early on is that there are often analogies between charge flow and heat flow, and this is reflected in the mathematical models we use to describe charge and heat transport.  We use Ohm's law, \(\mathbf{j}=\tilde{\sigma}\cdot \mathbf{E}\), which defines an electrical conductivity tensor \(\tilde{\sigma}\) that relates charge current density \(\mathbf{j}\) to electric fields \(\mathbf{E}=-\nabla \phi\), where \(\phi(\mathbf{r})\) is the electric potential.  Similarly, we can use Fourier's law for thermal conduction, \(\mathbf{j}_{Q} = - \tilde{\kappa}\cdot \nabla T\), where \(\mathbf{j}_{Q}\) is a heat current density, \(T(\mathbf{r})\) is the temperature distribution, and \(\tilde{\kappa}\) is the thermal conductivity.  


We know from experience that the electrical conductivity really has to be a tensor, meaning that the current and the electric field don't have to point along each other.  The most famous example of this, the Hall effect, goes back a long way, discovered by Edwin Hall in 1879.  The phenomenon is easy to describe.  Put a conductor in a magnetic field (directed along \(z\)), and drive a (charge) current \(I_{x}\) along it (along \(x\)), as shown, typically by applying a voltage along the \(x\) direction, \(V_{xx}\).  Hall found that a transverse voltage, \(V_{xy}\), then develops in proportion to the current.  The physical picture for this is something that we teach to first-year undergrads:  The charge carriers in the conductor obey the Lorentz force law and curve in the presence of a magnetic field.  There can't be a net current in the \(y\) direction because of the edges of the sample, so a transverse (\(y\)-directed) electric field has to build up.
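In the simplest free-electron (Drude) picture, with carrier density \(n\), carrier charge \(-e\), and scattering time \(\tau\), the steady-state resistivity tensor in a field \(B\hat{z}\) is
\[
\tilde{\rho} = \begin{pmatrix} \rho_{xx} & \rho_{xy} \\ -\rho_{xy} & \rho_{xx} \end{pmatrix}, \qquad \rho_{xx} = \frac{m}{n e^{2} \tau}, \qquad \rho_{xy} = \frac{B}{n e},
\]
so the transverse (Hall) voltage directly measures the sign and density of the charge carriers.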

There can also be a thermal Hall effect, when driving heat conduction in one direction (say \(x\)) leads to an additional temperature gradient in a transverse (\(y\)) direction.  The least interesting version of this (the Maggi–Righi–Leduc effect) is in fact a consequence of the regular Hall effect:  the same charge carriers in a conductor can carry thermal energy as well as charge, so thermal energy just gets dragged sideways.   

Surprisingly, insulators can also show a thermal Hall effect.  That's rather unintuitive, since whatever is carrying thermal energy in the insulator is not some charged object obeying the Lorentz force law.  Interestingly, there are several distinct mechanisms that can lead to thermal Hall response.  With phonons carrying the thermal energy, you can have magnetic field affecting the scattering of phonons, and you can also have intrinsic curving of phonon propagation due to Berry phase effects.  In magnetic insulators, thermal energy can also be carried by magnons, and there again you can have Berry phase effects giving you a magnon Hall effect.  There can also be a thermal Hall signal from topological magnon modes that run around the edges of the material.  In special magnetic insulators (Kitaev systems), there are thought to be special Majorana edge modes that can give quantized thermal Hall response, though non-quantized response argues that topological magnon modes are relevant in those systems.  The bottom line:  thermal Hall effects are real and it can be very challenging to distinguish between candidate mechanisms. 

(Note: Blogger now compresses the figures, so click on the image to see a higher res version.)




Wednesday, August 23, 2023

Some interesting recent papers - lots to ponder

As we bid apparent farewell to LK99, it's important to note that several other pretty exciting things have been happening in the condensed matter/nano world.  Here are a few papers that look intriguing (caveat emptor:  I have not had a chance to read these in any real depth, so my insights are limited.)

  • Somehow I had never heard of Pines' Demon until this very recent paper came out, and the story is told briefly here.  The wikipedia link is actually very good, so I don't know that I can improve upon the description.  You can have coupled collective modes for electrons in two different bands in a material, where the electrons in one band are sloshing anti-phase with the electrons in the other band.  The resulting mode can be "massless" (in the sense that its energy is linearly proportional to its momentum, like a photon's), and because it doesn't involve net real-space charge displacement, to first approximation it doesn't couple to light.  The UIUC group used a really neat, very sensitive angle-resolved electron scattering method to spot this for the first time, in high quality films of Sr2RuO4.  (An arxiv version of the paper is here.) 
  • Here is a theory paper in Science (arxiv version) that presents a general model of so-called strange metals (ancient post on this blog).  Strange metals appear in a large number of physical systems and are examples where the standard picture of metals, Fermi liquid theory, seems to fail.  I will hopefully write a bit more about this soon.  One of the key signatures of strange metals is a low temperature electrical resistivity that varies like \(\rho(T) = \rho_{0} + AT\), as opposed to the usual Fermi liquid result \(\rho(T) = \rho_{0} + AT^{2}\).  Explaining this and the role of interactions and disorder is a real challenge.  Here is a nice write-up by the Simons Foundation on this.
  • Scanning tunneling microscopy is a great spectroscopic tool, and here is an example where it's been possible to map out information about the many-body electronic states in magic-angle twisted bilayer graphene (arxiv version).  Very pretty images, though I need to think carefully about how to understand what is seen here.
  • One more very intriguing result is this paper, which reports the observation of the fractional quantum anomalous Hall effect (arxiv version).  As I'd mentioned here, the anomalous Hall effect (AHE, a spontaneous voltage appearing transverse to a charge current) in magnetic materials was discovered in 1881 and not understood until recently.  Because of cool topological physics, some materials show a quantized AHE.  In 2D electron systems, the fractional quantum Hall effect is deeply connected to many-body interaction effects.  Seeing fractional quantum Hall states spontaneously appear in the AHE is quite exciting, suggesting that rich many-body correlations can happen in these topological magnetic systems as well.  Note: I really need to read more about this - I don't know anything in depth here.
  • On the more applied side, this article is an extremely comprehensive review of the state of the art for transistors, the critical building block of basically every modern computing technology.  Sorry - I don't have a link to a free version (unless this one is open access and I missed it).  Anyway, for anyone who wants to understand modern transistor technology, where it is going, and why, I strongly encourage you to read this.  If I were teaching my grad nano class, I'd definitely use this as a reference.
  • Again on the applied side, here is a neat review of energy harvesting materials.  There is a lot of interest in finding ways to make use of energy that would otherwise go to waste (e.g. putting piezo generators in your clothing or footwear that could trickle charge your electronics while you walk around).  
  • In the direction of levity, in all too short supply these days, xkcd was really on-point this week.  For condensed matter folks, beware the quasiparticle beam weapon.  For those who do anything with electronics, don't forget this handy reference guide.

Thursday, August 17, 2023

Neutrality and experimental detective work

One of the remarkable aspects of condensed matter physics is the idea of emergent quasiparticles, where through the interactions of many underlying degrees of freedom, new excitations emerge that are long-lived and often can propagate around in ways very different than their underlying constituents.  Of course, it’s particularly interesting when the properties of the quasiparticles have quantum numbers or obey statistics that are transformed from their noninteracting counterparts.  For example, in the resonating valence bond model, starting from electrons with charge \(-e\) and spin 1/2, the low energy excitations are neutral spin-1/2 spinons and charge \(e\) holons.  It’s not always obvious in these situations whether the emergent quasiparticles act like fermions (obeying the Pauli principle and stacking up in energy) or bosons (all falling into the lowest energy state as temperature is reduced).  See here for an example.

Suppose there is an electrically insulating system that you think might host neutral fermionic excitations.  How would you be able to check?  One approach would be to look at the low temperature specific heat, which relates how much the temperature of an isolated object changes when a certain amount of disorganized thermal energy is added.  The result for (fermionic) electrons in a metal is well known:  because of the Pauli principle, the specific heat scales linearly with temperature, \(C \sim T\).  (In contrast, for the vibrational part of the specific heat due to bosonic phonons, \(C \sim T^3\) in 3D.)  So, if you have a crystalline(*) insulator that has a low temperature specific heat that is linear in temperature (or, equivalently, when you plot \(C/T\) vs. \(T\) there is a non-zero intercept at \(T=0\)), then this is good evidence for neutral fermions of some kind.  Such a system should also have a linear-in-\(T\) thermal conductivity, and an example of this is reported here.  This connects back to a post that I made a month ago.  Neutral fermions (presumably carrying spin) can lead to quantum oscillations in the specific heat (and other measured quantities).
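For reference, the standard results behind this:  the Sommerfeld expansion for a degenerate Fermi gas gives
\[
C_{\mathrm{el}} = \gamma T, \qquad \gamma = \frac{\pi^{2}}{3} k_{\mathrm{B}}^{2}\, g(E_{\mathrm{F}}),
\]
where \(g(E_{\mathrm{F}})\) is the density of states at the Fermi level - the \(T\)-linear specific heat is literally a count of the low-energy fermionic states - while the Debye phonon contribution in 3D goes like \(C_{\mathrm{ph}} \propto (T/\Theta_{\mathrm{D}})^{3}\) at temperatures well below the Debye temperature \(\Theta_{\mathrm{D}}\).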

This kind of detective work, considering which techniques to use and how to analyze the data, is the puzzle-solving heart of experimental condensed matter physics.  There is a palette of measurable quantities - how can you use those to test for complex underlying physics?


(*) It’s worth remembering that amorphous insulators generally have a specific heat that varies like \(T^{1.1}\) or so, because of the unreasonably ubiquitous tunneling two-level systems.  The neutral fermions I’m writing about in this post are itinerant entities in nominally perfect crystals, rather than the localized TLS in disordered solids.  

Friday, August 11, 2023

What is a metal-insulator transition?

The recent excitement about the alleged high temperature superconductor "LK99" has introduced some in the public to the idea of a metal-insulator or insulator-metal transition (MIT/IMT).  For example, one strong candidate explanation for the sharp drop in resistance as a function of temperature is a drastic change in the electronic (and structural) properties of Cu2S at around 328 K, as reported here.  

In condensed matter physics, a metal is usually defined as a material with an electrical conductivity that increases with decreasing temperature.  More technically, in a (macroscopic) metal it is possible to create an electronic excitation (moving some electron from one momentum to another, for example) at arbitrarily small energy cost.  A metal is said to have "gapless excitations" of the electrons.  Even more technically, a metal has a nonzero electronic density of states at the electronic chemical potential.   

In contrast, an insulator has an electronic conductivity that is low and decreases with decreasing temperature.  In an insulator, it costs a non-zero amount of energy to create an electronic excitation, and the larger that energy cost, the more insulating the material.  An insulator is said to have an "energy gap".  If that energy gap is small compared to the thermal energy available (\( \sim k_{\mathrm{B}}T\)), there will be some conduction because of thermally excited electrons (and holes).  One way to classify insulators is by the reason for the energy gap, though knowing the mechanism for certain is often challenging.  A material is a "band insulator" if that gap comes about just because of how the atoms are stacked in space and how each atom shares its electrons.  This is the case for diamond, for example, or for common semiconductors like Si or GaAs (called semiconductors because their energy gaps are not too large).  A material can be an insulator due primarily to electron-electron interactions (a Mott insulator or the related charge transfer insulator); a material can be an insulator primarily because of interactions between the electrons and the lattice structure (a Peierls insulator); a material can be an insulator because of disorder, which can lead to electrons being in localized states (an Anderson insulator).
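To put numbers on the thermally activated case:  in the simplest intrinsic-semiconductor picture, the carrier density (and hence the conductivity) is activated,
\[
\sigma(T) \propto \exp\left(-\frac{E_{g}}{2 k_{\mathrm{B}} T}\right),
\]
so a gap of even 1 eV, compared with \(k_{\mathrm{B}}T \approx 26\) meV at room temperature, suppresses conduction by many orders of magnitude.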

In some materials, there can be a change between metallic and insulating states as a function of some physically tunable parameter.  Common equilibrium control knobs are temperature, pressure, magnetic field, and density of charge carriers.  It's also possible to drive some materials between insulating and metallic states by hitting them with light or applying large electric fields.  

Sudden changes in properties can be very dramatic, as the Cu2S case shows.  That material tends to be in one crystal structure at high temperatures, in which it happens to be a band insulator with a large gap.  Then, as the temperature is lowered, the material spontaneously changes into a different crystal structure in which there is much more conduction.  There are other materials well known for similar transitions (often between a high temperature conducting state and a low temperature insulating state), such as VO2 and V2O3, in which the electrical conductivity can abruptly change by 5 orders of magnitude over a small temperature range.  

MIT/IMT materials can be of technological interest, particularly if their transitions are readily triggered.  For example, vanadium oxides are used in thermochromic and electrochromic switchable windows, because the optical properties of the material are drastically different in the conducting vs insulating phases.   The fundamental interest in MIT/IMTs systems is clear as well, especially when electronic interactions are thought to be responsible - for example, the rich array of conducting, superconducting, and insulating states that show up in twisted bilayer graphene as a function of carrier density (a representative paper here).  It's always interesting to consider how comparatively simple ingredients can lead to such rich response, through energetic (and entropic) competition between different states with wildly disparate properties.

Sunday, August 06, 2023

Desirable properties for a superconductor

Given the present interest, let's talk about what kind of properties one wants in a superconductor, as some people on social media seem ready to jump straight on the "what does superconductivity mean for bitcoin?" train.

First, the preliminaries.  Superconductivity is a state of matter in which the conduction electrons act collectively in an interesting way.  In the superconductors we know about, electrons pair up and can be described by a single collective quantum state (with a well-defined phase - the quantum state can be written as a complex quantity that has an amplitude and a phase angle, as in \(A e^{i\phi}\), where \(\phi\) is the phase).  A consequence of this is that there is an "energy gap" - it costs a certain amount of energy to create individual unpaired electrons.  It's this energy gap that allows dc current to flow without electrical resistance in a superconductor.

There is a length scale, the coherence length, over which the superconducting state tends to vary, like at the boundary of a material.  There is also a length scale, the penetration depth, over which magnetic field can penetrate into a superconductor.  Magnetic field is expelled from the bulk of a superconductor because the material spontaneously develops surface currents such that the field from those currents cancels out the external field in the bulk of the material.  Depending on the ratio of the coherence length and the penetration depth, a superconductor can be Type I (expels all magnetic field until the field \(H\) exceeds some critical value \(H_{c}\), at which point superconductivity dies) or Type II (allows magnetic field above a lower critical field \(H_{c1}\) to penetrate in the form of vortices, each with a non-superconducting core surrounded by screening currents, until superconductivity is eventually killed above some upper critical field \(H_{c2}\)).  Motion of vortices leads to energy losses, so it is desirable, especially for applications involving AC currents, to have the vortices be pinned in place somehow in the material, often by disorder.  It is this pinning that leads to superconducting levitation in fixed orientations relative to a magnet, even with the SC hanging below the magnet.

Superconductivity tends to die either by the pairs falling apart (common in Type I superconductors as temperature is increased until thermal energy exceeds the attractive pairing interaction) or by the loss of global phase coherence (a crude analogy: the dance partners are still paired up, but each pair is dancing to its own tune).
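
To make the Type I vs. Type II distinction concrete, here's a minimal sketch using the Ginzburg-Landau criterion (\(\kappa \equiv \lambda/\xi > 1/\sqrt{2}\) implies Type II); the length scales below are rough, representative textbook values, just for illustration:

    import math

    # (material, coherence length xi, penetration depth lambda), both in nm.
    # Rough, representative textbook numbers for illustration only.
    materials = [
        ("Al",   1600.0,  16.0),
        ("Pb",     83.0,  37.0),
        ("Nb",     38.0,  44.0),
        ("YBCO",    1.5, 150.0),
    ]

    for name, xi, lam in materials:
        kappa = lam / xi
        kind = "Type II" if kappa > 1.0 / math.sqrt(2.0) else "Type I"
        print(f"{name:4s}  kappa = lambda/xi = {kappa:6.2f}  -> {kind}")

The elemental superconductors mostly come out Type I or marginally Type II, while the cuprates, with their tiny coherence lengths, are extreme Type II.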

Superconductors have a critical temperature above which global superconductivity is lost.  They also have critical field scales, as mentioned above.  Clearly, for many applications, it would be greatly desirable for a superconductor to have both a high critical temperature (obviously) and high critical fields.  Similarly, superconductors have a critical current density - some combination of the local field (from the current itself) exceeding the critical field and current-driven phase slips can lead to loss of superconductivity.  It would be great to have a high critical current density.  The relationship between critical temperature, critical field, and critical current density is not always simple, though they tend to correlate: if the superconducting state is very robust, all three quantities tend to be large.
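
As a rough illustration of how the critical field fades as the transition is approached, here's a sketch of the standard empirical two-fluid approximation \(H_{c}(T) \approx H_{c}(0)[1-(T/T_{c})^{2}]\) - an approximation, not a universal law, with Pb-like numbers thrown in as placeholders:

    def critical_field(T, Tc, Hc0):
        # Empirical two-fluid form, H_c(T) ~ H_c(0) [1 - (T/Tc)^2]; zero above Tc.
        return 0.0 if T >= Tc else Hc0 * (1.0 - (T / Tc) ** 2)

    # Illustrative, Pb-like numbers: Tc ~ 7.2 K, H_c(0) ~ 0.08 T
    Tc, Hc0 = 7.2, 0.08
    for T in [0.0, 2.0, 4.0, 6.0, 7.0]:
        print(f"T = {T:3.1f} K -> H_c ~ {1e3 * critical_field(T, Tc, Hc0):5.1f} mT")

The point is that operating anywhere near \(T_{c}\) leaves very little field (or current) headroom, which is why "critical temperature well above operating temperature" matters so much in practice.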

It would also be wonderful if a new superconducting family of materials were ductile.  The higher temperature superconductors (cuprates, pnictides, nickelates) are all ceramics, meaning that they are brittle and not readily formed into wires.  It's taken 36 years or so for people to get good at making wires and ribbons that incorporate the cuprate superconductors, typically by encasing them in powder form inside Cu or Ag tubes, then squeezing appropriately and annealing.

Lastly, and perhaps underappreciated from a practical perspective, it'd be nice if superconductors were air stable.  It's annoying to work with materials that react poorly with oxygen, with humidity in the air, or with O2 or water in the presence of UV light from the sun.  Having a material that is chemically very stable, with a clearly known and fixed stoichiometry, would be great.  Along with this, it would be nice if the material were easily made, at scale, without having to resort to crazy conditions (super high temperatures or pressures; weird, rare, or hazardous constituents).

How useful any candidate superconductor will be and on what timescale is set by the combination of these properties.  A room temperature superconductor that turns into goo in the presence of damp air would not be nearly as useful as one that is chemically stable sitting on a bench.  

For all the people who seem to be jumping to the conclusion that room temperature superconductivity would suddenly lead to breakthroughs in quantum information processing: that is far from clear.  Many of the processes that screw up superconducting qubits get worse at higher temperatures, even if the superconductivity itself is robust.  I'm not aware of anyone peddling qubits based on copper oxide superconductors right now, even though their transition temperatures are 10 times higher than that of Nb.

In short:  usefulness does not flow instantly from materials discovery, even if the material parameters all seem good.  Patience is hard to come by yet essential in trying to adapt new materials to applications.

Thursday, July 27, 2023

Condensed matter on the public stage, and not in a good way

This week, condensed matter physics has been getting far more broad public attention than usual, and while in the abstract I like our discipline getting noticed, this is definitely not how I’d have preferred it to happen.

First, more fun re Ranga Dias.  Fresh off renewed controversy about claims of room temperature superconductivity in Lu-N-H at high pressures (claims of reproduction of the effect seem unreliable to me), it’s come out that this paper, already under an “expression of concern”, is being retracted.  This has been widely reported - see here (hurray for student journalism) and here and here for example.  It is abundantly clear that data fabrication/copying has taken place here.  Between this, the other retraction, and the clear evidence of thesis content plagiarism, it’s hard to see any signs of credibility remaining.  

Then there is the claim, via preprints (here, here), of room temperature superconductivity at ambient pressure in a lead oxide compound from investigators in Korea.  Cutting to the chase: it is very unlikely, in my view, that this pans out, for multiple reasons.  Extraordinary claims hardly ever hold up.  There are multiple weird issues with the data in the figures: magnetic susceptibility data that shows up in both papers with the same units but axes that differ in magnitude by a factor of 7000 (which numbers are reliable, if either?); resistivity values that seem bizarrely large (0.01 Ohm-cm is bigger than the Mott-Ioffe-Regel limit - again, are the units right?); a specific heat that doesn't reach 3R at high temperatures; and it's not clear that the resistance is really zero in the nominally superconducting part of the V-I curves.  That said, if the video and the not-crazy-scale susceptibility data are trustworthy, this stuff is very diamagnetic, more so than graphite, which is quite unusual.  At least the authors do provide a comparatively straightforward synthesis recipe, so replication attempts should clear this up in a week or two.
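
For the curious, here's a back-of-the-envelope sketch of that Mott-Ioffe-Regel scale, assuming a simple 3D Boltzmann conductivity \(\sigma = e^{2}k_{F}^{2}\ell/(3\pi^{2}\hbar)\) with the mean free path \(\ell\) shrunk down to an interatomic spacing, and generic metallic parameters:

    import math

    hbar = 1.0546e-34  # J s
    e = 1.602e-19      # C

    # Boltzmann conductivity sigma = e^2 k_F^2 l / (3 pi^2 hbar), with the mean
    # free path l set equal to an interatomic spacing a (the Ioffe-Regel condition).
    k_F = 1.0e10   # generic metallic Fermi wavevector, 1/m
    a = 3.0e-10    # generic interatomic spacing, m

    sigma = e**2 * k_F**2 * a / (3.0 * math.pi**2 * hbar)   # S/m
    rho = 1.0 / sigma * 100.0                               # Ohm-m -> Ohm-cm
    print(f"rho_MIR ~ {rho:.0e} Ohm-cm")                    # ~ 4e-4 Ohm-cm

By that crude estimate, 0.01 Ohm-cm sits well beyond the MIR scale for ordinary metallic transport - hence the question about the units.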

None of this is thaaaaaaat unusual, by the way.  Claims of weird superconductivity surface at some steady rate.  It's easy to screw up measurements, especially in inhomogeneous materials.  Unfortunately, social media (esp. the site formerly known as twitter) drastically amplifies this stuff.  I assume Michio Kaku is going to be on tv any second now talking about how this will change the world.  Hopefully responsible journalists will be effective at pointing out that a non-reviewed preprint on the arxiv is not conclusive.

I’m traveling, so updates will be sparse, but I will try to keep an eye on the comments.

Sunday, July 23, 2023

Disorganized thoughts on "Oppenheimer"

I saw "Oppenheimer" today.  Spoiler warning, I suppose, though I think we all know how this story ends.  Just in case you were wondering, there is no post-credit scene to set up the sequel.  (For the humor-impaired: that was a joke.)

The movie was an excellent piece of film-making, and I hope it's an opportunity for a large viewing audience to learn about a reasonable approximation of incredibly consequential history.  Sure, I can nitpick about historical details (why did Nolan leave out "Now we are all sons of bitches", transfer a bet to a different person,  and omit Fermi dropping bits of paper to estimate the yield of the Trinity test?  Why did he show Vannevar Bush seemingly hanging out at Los Alamos?  Update: rereading The Making of the Atomic Bomb, I was surprised to learn that Bush apparently was, in fact, present at the Trinity test!  Also, I do now see on an updated cast list that Kistiakowsky was portrayed in the movie, so I may have been wrong about the bet as well.  Mea culpa.).  Still, the main points come through - the atmosphere of war-time Los Alamos, and the moral complexity and ambiguity of Oppenheimer and the bomb.  

The definitive work about the Manhattan Project is The Making of the Atomic Bomb by Richard Rhodes.  That book truly captures the feeling of the era and the project.  Rereading it now, it still amazes me how the physicists and chemists of the time were able to make such astonishing progress.  Reading about how Fermi & co. discovered moderation of neutrons (that is, the slowing of neutrons through elastic collisions with the hydrogen nuclei in materials like paraffin) is just mind-blowing as an experimentalist.  (They stumbled upon this by realizing that they got different experimental results if they ran their measurements on wood tables rather than marble tables within the same lab.)

I saw someone lamenting on twitter that this movie was unlikely to inspire a generation of young people to go into physics.  Clearly that was not the intent of the film at all.  I think it's a net positive if people come away from the movie with a sense of the history and the fact that individual personalities have enormous sway even in the face of huge historical events.  Many people in the story are physicists, but the point is that they're complicated people dealing with the morality of enormously consequential decisions (on top of the usual human frailties).  (One thing the movie gets right is Teller's relentless interest in "the super" and his challenges in working with others on the Manhattan Project.  If Teller had been a less challenging personality, the course of nuclear weapons development may have been very different.  It reminds me superficially of William Shockley, whose managerial skills or lack thereof directly led to the creation of Silicon Valley.) 

For those interested in reading more about the context of the Manhattan Project, I recommend a couple of items.  The Los Alamos Primer is the set of notes that were given to incoming Project members, and it makes for fascinating reading, accessible at the advanced undergrad level.  The Farm Hall transcripts are the transcribed recordings of the German scientists interned by the British in August, 1945.  They go from denial (the Americans couldn't possibly have developed a bomb) to damage control (clearly we slow-walked everything because we didn't really want the Nazis to get nuclear weapons) in the space of a couple of days.

Sunday, July 16, 2023

What are "quantum oscillations"?

For the first time in a couple of decades, I was visiting the Aspen Center for Physics, which is always a fun, intellectually stimulating experience.  (Side note: I sure hope that the rapidly escalating costs of everything in the Aspen area don't make this venue untenable in the future, and that there are growing generous financial structures that can allow this to be accessible for those of limited funding.)  One of the topics of discussion this week was "quantum oscillations" in insulators, and I thought it might be fun to try to explain, on some accessible level, just how weird those observations are.  

Historically, quantum oscillations are observed in metals and (doped) semiconductors, and they have been a great tool for understanding electronic structure in conductive materials, a topic sometimes called "fermiology".   First, I need to talk about Fermi surfaces.

Annoyingly, it's easiest to describe the electronic states in a crystal in terms of "reciprocal space" or \(\mathbf{k}\)-space, where the wave-like electronic states are labeled by some wavevector \(\mathbf{k}\) and have some (crystal) momentum given by \(\hbar \mathbf{k}\).  (Near the bottom of an energy band, the energy of such a state is typically something like \(E_{0} + \hbar^{2}k^{2}/(2m^{*})\), where \(m^{*}\) is an effective mass.)

At low temperatures, the electrons settle into their lowest energy states (toward low values of \(\mathbf{k}\)), but they stack up in energy because of the Pauli principle, so that there is some blob (possibly more than one) of filled states in \(\mathbf{k}\)-space, with a boundary called the Fermi surface, surrounded by empty states.  Because the relationship between energy and momentum, \(E(\mathbf{k})\), depends on the atoms in the material and the crystal structure, the Fermi surface can be complicated and have funny shapes, like the one shown in the link.  "Fermiology" is the term for trying to figure out, experimentally, what Fermi surfaces look like.  This matters because knowing which electronic states are the highest occupied affects many properties that you might care about.  The electrons in states right "at" the Fermi surface are the ones that have energetically nearby empty states and thus are the ones that respond to perturbations like electric fields, temperature gradients, etc.
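
To make that concrete with the simplest case: in the free-electron approximation the Fermi surface is just a sphere, with \(k_{F} = (3\pi^{2}n)^{1/3}\) set entirely by the carrier density \(n\).  A quick sketch, using the standard textbook conduction electron density for Cu:

    import math

    hbar = 1.0546e-34  # J s
    m_e = 9.109e-31    # kg
    eV = 1.602e-19     # J per eV

    n = 8.5e28  # conduction electron density of Cu, m^-3 (one electron per atom)

    k_F = (3.0 * math.pi**2 * n) ** (1.0 / 3.0)    # Fermi wavevector, 1/m
    E_F = (hbar * k_F) ** 2 / (2.0 * m_e) / eV     # free-electron Fermi energy, eV

    print(f"k_F ~ {k_F:.2e} 1/m")   # ~ 1.4e10 1/m
    print(f"E_F ~ {E_F:.1f} eV")    # ~ 7 eV

The real Cu Fermi surface is close to this sphere, though it has "necks" that reach out to touch the zone boundary, as in the second figure below.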

Now turn on a magnetic field.  Classically, free electrons in a magnetic field \(B\) with some velocity perpendicular to the field will tend to move in circles (in the plane perpendicular to the field) called cyclotron orbits, and that orbital motion has a characteristic cyclotron frequency, \(\omega_{c} = eB/m\).  In the quantum problem, free electrons in a magnetic field have allowed energies given by \((n+1/2)\hbar \omega_{c}\) - these quantized energies are called Landau levels.  Each Landau level is massively degenerate (the number of states it can hold grows with the sample size and the field), so the zillions of conduction electrons in a typical chunk of conductor fill up many Landau levels, with many electrons in each.
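
To get a feel for the scales, here's a sketch of the Landau level spacing \(\hbar\omega_{c}\) (in temperature units) and the degeneracy per unit area (one state per flux quantum per spin, \(eB/h\)) for the free-electron mass:

    e = 1.602e-19      # C
    m_e = 9.109e-31    # kg
    hbar = 1.0546e-34  # J s
    h = 6.626e-34      # J s
    k_B = 1.381e-23    # J/K

    for B in (1.0, 10.0):
        omega_c = e * B / m_e             # cyclotron frequency, rad/s
        spacing_K = hbar * omega_c / k_B  # Landau level spacing in temperature units
        degeneracy = e * B / h            # states per unit area per level (per spin)
        print(f"B = {B:4.1f} T: spacing ~ {spacing_K:5.2f} K, "
              f"degeneracy ~ {degeneracy:.1e} states/m^2")

The spacing is only about 1 K per tesla for the free-electron mass, which is one reason quantum oscillation measurements are done cold and at high fields - otherwise thermal smearing washes out the levels.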

[Figure: an electron with wavevector \(\mathbf{k}\) in a magnetic field \(\mathbf{B}\) will trace out an orbit (yellow) in \(\mathbf{k}\)-space.]
For electrons in a conducting crystal, the idea of cyclotron motion still works, though the energy of an electronic state involves both the magnetic field and the zero-field band structure.  For an electron with wavevector \(\mathbf{k}\), one can define a velocity \(\mathbf{v}= (1/\hbar) \nabla_{\mathbf{k}}E(\mathbf{k})\) and use that in the Lorentz force law to figure out how \(\mathbf{k}\) varies in time.  It turns out that an electron at the Fermi surface will trace out an orbit in both real space and \(\mathbf{k}\)-space.  (Of course, for this physics to matter, the system has to be sufficiently free of disorder and at sufficiently low temperatures that the electrons are unlikely to scatter as they trace out orbits.)
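
Here's a minimal numerical sketch of that semiclassical recipe for the simplest possible case, a free-electron dispersion \(E = \hbar^{2}k^{2}/2m\), where the \(\mathbf{k}\)-space orbit should be a circle traversed at \(\omega_{c}\) (crude forward-Euler time stepping, just to show the machinery):

    import numpy as np

    hbar = 1.0546e-34  # J s
    e = 1.602e-19      # C
    m = 9.109e-31      # kg (free-electron mass)

    B = np.array([0.0, 0.0, 10.0])      # 10 T field along z
    k = np.array([1.4e10, 0.0, 0.0])    # start on a Cu-scale Fermi surface, 1/m
    k0 = k.copy()

    omega_c = e * B[2] / m              # expected cyclotron frequency
    dt = 1.0e-3 / omega_c               # time step, small vs. the orbit period

    for _ in range(round(2.0 * np.pi / (omega_c * dt))):  # integrate one period
        v = hbar * k / m                                  # v = (1/hbar) grad_k E
        k = k + dt * (-e / hbar) * np.cross(v, B)         # hbar dk/dt = -e v x B

    # Forward Euler is crude, so expect ~0.3% drift rather than exact closure.
    print(f"|k| ratio after one period: {np.linalg.norm(k) / np.linalg.norm(k0):.4f}")

The same machinery, fed a realistic band structure \(E(\mathbf{k})\) instead of the free-electron parabola, is how one computes the orbits on surfaces like the Cu example below.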

Now imagine sweeping the magnetic field.  As \(B\) is ramped up, the discrete cyclotron energy levels sweep past the energy of the highest occupied electronic states, the Fermi surface.  Whenever a cyclotron level coincides with the Fermi energy, a large number of electronic states sits right at that energy, changing the number of states available to undergo transitions - scattering that modifies the electrical resistance, or shifts between spin states because of the external magnetic field.  The result is that quantities like the resistance and the magnetization start to oscillate, periodic in \(1/B\).  (It's a bit more complicated than that for messy-looking Fermi surfaces - oscillations in measured quantities happen when "extremal orbits" like the ones shown in the second figure are just bracketed by contours of cyclotron energy levels.  The period in \(1/B\) is inversely proportional to the area in \(\mathbf{k}\)-space enclosed by the orbit.)
[Figure: the Fermi surface of Cu.  If a magnetic field is directed as shown, there are two orbits (purple) that will contribute oscillations in resistivity and magnetization.]

Bottom line:  in clean conductors at low temperatures and large magnetic fields, it is possible to see oscillations in certain measured quantities that are periodic in \(1/B\), and that period allows us to infer the cross-sectional area of the Fermi surface in \(\mathbf{k}\)-space.  Oscillations of the resistivity are called Shubnikov-de Haas oscillations, and oscillations of the magnetization are called de Haas-van Alphen oscillations.
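
The quantitative version of that last statement is the Onsager relation, \(F = (\hbar/2\pi e)A_{k}\), where \(F\) is the oscillation frequency in \(1/B\) and \(A_{k}\) is the \(\mathbf{k}\)-space area enclosed by the extremal orbit.  A sketch, treating the Cu "belly" orbit as a free-electron circle:

    import math

    hbar = 1.0546e-34  # J s
    e = 1.602e-19      # C

    # Onsager relation: oscillation frequency F (in tesla) = (hbar / 2 pi e) A_k,
    # where A_k is the extremal cross-sectional area of the Fermi surface.
    k_F = 1.4e10                    # free-electron-like k_F for Cu, 1/m
    A_k = math.pi * k_F**2          # circular cross-section of a Fermi sphere

    F = hbar * A_k / (2.0 * math.pi * e)
    print(f"F ~ {F:.1e} T")          # ~ 6e4 T, near the measured Cu belly value
    print(f"period in 1/B ~ {1.0/F:.1e} 1/T")

Running the logic in reverse - measure the period in \(1/B\), infer \(A_{k}\) - is exactly how fermiology extracts Fermi surface cross-sections from the data.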

These quantum oscillations, measured as a function of field at many different field orientations, have allowed us to learn a lot about the Fermi surfaces in many conducting systems.   

Imagine the surprise when de Haas-van Alphen oscillations were found in a material whose bulk is expected to be electrically insulating!  More on this soon.