Friday, December 31, 2021

A book review, and wishes for a happy new year

I was fortunate enough to receive a copy of Andy Zangwill's recent biography of Phil Anderson, A Mind Over Matter:  Philip Anderson and the Physics of the Very Many.  It's a great book that I would recommend to any physics student or graduate interested in learning about one of the great scientists of the 20th century.  Zangwill does an excellent job with the difficult task of describing (in a way accessible to scientists, if not necessarily always the lay public) the rise of solid-state physics in the last century and its transformation, with significant guidance from Anderson, into what we now call condensed matter.  This alone is reason to read the book - it's more accessible than the more formally historical (also excellent) Out of the Crystal Maze and a good pairing with Solid State Insurrection (which I discussed here).  

This history seamlessly provides context for the portrait of Anderson, a brilliant, intuitive theorist who prized profound, essential models over computational virtuosity, and who had a litany of achievements that is difficult to list in its entirety.  The person described in the book jibes perfectly with my limited direct interactions with him and the stories I heard from my thesis advisor and other Bell Labs folks.  Some lines ring particularly true (with all that says about the culture of our field):  "Anderson never took very long to decide if a physicist he had just met was worth his time and respect."

On a separate note:  Thanks for reading, and I wish you a very happy new year!  I hope that you and yours have a safe, healthy, and fulfilling 2022.

Monday, December 27, 2021

US News graduate program rankings - bear this in mind

The US News rankings of graduate programs have a surprisingly out-sized influence.  Prospective graduate students seem to pay a lot of attention to these, as do some administrators.  All ranking schemes have issues:  Trying to encapsulate something as complex and multi-variate as research across a whole field + course offerings + student life + etc. in a single number is inherently an oversimplification. 

The USNWR methodology is not secret - here is how they did their 2018 rankings.   As I wrote over a decade ago, it's a survey.  That's all.  No detailed metrics about publications or research impact or funding or awards or graduation rates or post-graduation employment.  It's purely a reputational survey of department chairs/heads and "deans, other administrators and/or senior faculty at schools and programs of Ph.D. Physics programs", to quote the email I received earlier this month.   (It would be nice to know who gets the emails besides chairs - greater transparency would be appreciated.)

This year for physics, they appear to have sent the survey to 188 departments (the ones in the US that granted PhDs in the last five years), and historically the response rate is about 22%.  This implies that the opinions of a distressingly small number of people are driving these rankings, which are going to have a non-perturbative effect on graduate recruiting (for example) for the next several years.   I wish people would keep that in mind when they look at these numbers as if they are holy writ.  
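To make the sample-size point concrete, here is the arithmetic.  (The number of respondents per responding department is my own hypothetical guess; the survey goes to more than just chairs.)

```python
# Back-of-the-envelope: how many opinions actually drive the rankings?
# 188 surveyed departments and a ~22% response rate are from the post;
# ~2 respondents per responding department is a hypothetical assumption.
departments_surveyed = 188
response_rate = 0.22
respondents_per_department = 2  # e.g., chair plus one dean/senior faculty member (assumed)

responding_departments = departments_surveyed * response_rate
total_voters = responding_departments * respondents_per_department
print(f"~{responding_departments:.0f} responding departments, ~{total_voters:.0f} individual voters")
```

Roughly forty departments' worth of opinions, setting rankings that prospective students treat as authoritative.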

(You also have to be careful about rough analytics-based approaches.  If you ranked departments based purely on publications-per-faculty-member, for example, you would select for departments that are largely made up of particle physics experimentalists.  Also, as the NRC found out the last time they did their decadal rankings, the quality of data entry is incredibly important.)

My advice to students:  Don't place too much emphasis on any particular ranking scheme, and actually look closely at department and research group websites when considering programs.  

Saturday, December 18, 2021

No, a tardigrade was not meaningfully entangled with a qubit

This week this paper appeared on the arxiv, claiming to have entangled a tardigrade with a superconducting transmon qubit system.  My readers know that I very rarely call out a paper in a negative way here, because that's not the point of this blog, but this seems to be getting a lot of attention, including in Physics World and New Scientist.  I also don't know how serious the authors were about this - it could be a tongue-in-cheek piece.  That said, it's important to point out that the authors did not entangle a tardigrade with a qubit in any meaningful sense.  This is not "quantum biology".

Tardigrades are amazingly robust.  We now have a demonstration that you can cool a tardigrade in high vacuum down to millikelvin temperatures, and if you are sufficiently gentle with the temperature and pressure changes, it is possible to revive the little creature.  

What the authors did here was put a tardigrade on top of the capacitive parts of one of two coupled transmon qubits.  The tardigrade is mostly (frozen) water, and here it acts like a dielectric, shifting the resonance frequency of the one qubit that it sat on.   (It is amazing deep down that one can approximate the response of all the polarizable bits of the tardigrade as a dielectric function, but the same could be said for any material.)
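To see why a blob of dielectric shifts the qubit frequency, you can treat the transmon crudely as an LC oscillator, \(f = 1/(2\pi \sqrt{LC})\); adding a little capacitance \(\delta C\) shifts the frequency by \(\delta f/f \approx -\delta C/2C\).  A minimal sketch, with all numbers purely illustrative (not taken from the paper):

```python
import math

# Transmon as an LC oscillator: f = 1/(2*pi*sqrt(L*C)).  A lossless dielectric
# (the frozen tardigrade) on the capacitor pads adds a small delta_C, pulling
# the resonance down by df/f ~ -delta_C/(2C).  All values below are hypothetical.
L = 10e-9          # effective inductance, henries (assumed)
C = 80e-15         # shunt capacitance, farads (assumed)
delta_C = 0.5e-15  # extra capacitance from the dielectric (assumed)

f0 = 1.0 / (2 * math.pi * math.sqrt(L * C))
f1 = 1.0 / (2 * math.pi * math.sqrt(L * (C + delta_C)))
print(f"bare frequency: {f0/1e9:.3f} GHz")
print(f"shift: {(f1 - f0)/1e6:.1f} MHz (fractional: {(f1 - f0)/f0:.2e})")
```

The same calculation would go through for any dielectric - a grain of silicon, a speck of ice - which is exactly the point: a frequency pull is classical circuit loading, not entanglement.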

This is not entanglement in any meaningful sense. If it were, you could say by the same reasoning that the qubits are entangled with the macroscopic silicon chip substrate.  The tardigrade does not act as a single quantum object with a small number of degrees of freedom.  The dynamics of the tardigrade's internal degrees of freedom do not act to effectively decohere the qubit (which is what happens when a qubit is entangled with many dynamical degrees of freedom that are then traced over).  

Atoms and molecules in our bodies are constantly entangling at a quantum level with each other and with the environment around us.  Decoherence means that trying to look at these tiny constituents and see coherent quantum processes related to entanglement generally becomes hopeless on very short timescales.  People still argue over exactly how the classical world seems to emerge from this constant churning of entanglement - it is fascinating.  Just nothing to do with the present paper. 

Saturday, December 11, 2021

Real progress on machine learning for density functional theory

(Sorry about the slow pace of posting.  The end of the semester has been very intense, including a faculty retreat for our department last week.)

I've written before (here, here, and here) about density functional theory, arguably one of the most impactful intellectual achievements of 20th century physics.   DFT is one approach to trying to solve the quantum electronic structure problem for molecules or solids containing many electrons.  As explained in the links above, the idea is powerful.  It turns out that the ground state (lowest energy state) electronic density as a function of position, \(n(\mathbf{r})\), contains all the information needed to calculate basically anything you could want to know about the ground state.  There is a functional \(E[n(\mathbf{r})]\), for example, that will give you the energy of the full-on, interacting many-electron ground state.  It's possible to construct a non-interacting electron model that can get you arbitrarily close to the true, correct \(n(\mathbf{r})\).  The tricky bit is that there is no exact analytical expression for the functional \(E[n(\mathbf{r})]\), which includes a particularly tricky contribution called the exchange-correlation part of the functional, \(E_{\mathrm{xc}}[n(\mathbf{r})]\).  Because we are talking about functionals rather than functions, \(E_{\mathrm{xc}}[n(\mathbf{r})]\) might depend in a non-local way on \(n(\mathbf{r})\) and its derivatives at all points in space - there is no reason to think it will be simple to write down.  
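As a concrete (and deliberately crude) illustration of what "a functional of the density" means in practice, here is the textbook local-density approximation for just the exchange piece, evaluated for a toy density on a grid.  This is the simplest known approximation, nowhere near the exact functional the DeepMind work is chasing:

```python
import numpy as np

# LDA exchange (Hartree atomic units): E_x[n] = -(3/4)(3/pi)^(1/3) * integral of n(r)^(4/3).
# A "functional" here just means: feed in a density sampled on a grid, get out a number.
def lda_exchange_energy(n, voxel_volume):
    """Integrate the LDA exchange energy density over a gridded density n."""
    coeff = -(3.0 / 4.0) * (3.0 / np.pi) ** (1.0 / 3.0)
    return coeff * np.sum(n ** (4.0 / 3.0)) * voxel_volume

# Toy density: a 3D Gaussian blob normalized to one electron (purely illustrative).
L, N = 10.0, 64
x = np.linspace(-L / 2, L / 2, N)
X, Y, Z = np.meshgrid(x, x, x, indexing="ij")
n = np.exp(-(X**2 + Y**2 + Z**2) / 2.0)
dV = (x[1] - x[0]) ** 3
n /= np.sum(n) * dV  # normalize to one electron

print(f"LDA exchange energy for this toy density: {lda_exchange_energy(n, dV):.4f} Ha")
```

The exact \(E_{\mathrm{xc}}[n(\mathbf{r})]\) could depend nonlocally on the density everywhere; everything that makes DFT hard is hidden in going beyond this kind of local ansatz.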


I wrote six years ago about the idea that machine learning techniques might make it possible to get a working version of something close to the exact \(E_{\mathrm{xc}}[n(\mathbf{r})]\), even if we can't readily write it down in some closed form.  Now it seems that real progress has been made in this direction.  Here is a blog post from the DeepMind team about their paper in Science this week, where they demonstrate a new functional that they claim is accurate when benchmarked against exact calculations on test systems, computationally tractable, and satisfies fundamental properties that have to hold for the true exact functional.  They argue that their code is more than just a fancy look-up table and that it contains generalizable knowledge, so that it's useful well beyond their specific training test cases.  

If this is so, then it could be a major step forward in (for some definitions of the term) first-principles calculations of molecular and material properties.  I'm curious about whether the new functional will actually let us gain some physical insight into why physics requires that particular underlying mathematical structure.  Still, even if we end up with a "black box" that allows greatly improved calculations, that would really be something.  I'd appreciate it if knowledgable DFT/electronic structure experts could comment here on how excited we should be about this.

Sunday, November 28, 2021

LEDs - condensed matter/nanostructures having real impact

I'd written two years ago about the pervasiveness of light emitting diodes for holiday decorations.  While revising some notes for my class on nanoscience and nanotechnology, I recently came upon some numbers that really highlight the LED as a great example of condensed matter (and recently nanoscience) having a serious positive impact on energy consumption and environmental impacts. 

Image from here.

Back when I was growing up, incandescent light bulbs were common, and pretty lousy at generating light for a given amount of energy input.  Incandescents produce something like 20 lumens/W, while compact fluorescent bulbs are more like 60 lm/W.  In contrast, LED lighting is well over 100 lm/W and is hitting numbers like 200 lm/W in more expensive bulbs, and in theory could reach more like 325 lm/W.   (For good sources of information about this, I recommend this report by the International Energy Agency, and this 2020 report (pdf) from the US Department of Energy.)   LED "white" lighting works either by having a UV LED that excites the same kind of phosphors that are in fluorescent bulbs, or by "color mixing" through having red, green, and blue LEDs all in one package.  (The "nano" comes into this both through the precision growth of the semiconductors and in some cases nanostructuring to enhance the fraction of emitted light that actually gets out of the LED.)
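A quick sanity check on what those efficacy numbers mean for a single familiar bulb (the ~800 lumen output of a "60 W-equivalent" and the 3 hours/day usage are my own ballpark assumptions):

```python
# Power required to produce ~800 lumens (a classic "60 W-equivalent" bulb)
# at the efficacies quoted above.  All numbers are ballpark.
lumens_needed = 800.0
efficacy = {"incandescent": 20.0, "CFL": 60.0, "LED (typical)": 110.0, "LED (best)": 200.0}  # lm/W

for tech, lm_per_W in efficacy.items():
    print(f"{tech:>15}: {lumens_needed / lm_per_W:5.1f} W")

# Annual energy for one bulb running 3 hours/day (assumed):
hours = 3 * 365
kwh_incandescent = (lumens_needed / efficacy["incandescent"]) * hours / 1000
kwh_led = (lumens_needed / efficacy["LED (typical)"]) * hours / 1000
print(f"one bulb, one year: {kwh_incandescent:.0f} kWh (incandescent) vs {kwh_led:.0f} kWh (LED)")
```

A factor of five or more per socket, multiplied across billions of sockets, is how you get to the gigaton-scale numbers below.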

Six years ago, lighting accounted for about 15% of global electricity demand.  In just a few years, LEDs have gone from a few % of market share for new lighting to well above 50% of market share, and there is no sign of this slowing down.  The transition to LEDs is expected to save hundreds of billions of dollars per year in energy costs, gigatons per year in CO2 emissions, and to stave off the need to construct over a hundred new municipal-scale power plants over the next decade.  

This is a big deal.  One way to cast "the energy problem" is that there is no clear, environmentally reasonable path toward raising the standard of living of billions of people up to the level of per capita energy consumption seen in the most developed economies.  Cutting that per capita energy use would be great, and LED lighting is a true success story in that regard.  

Sunday, November 21, 2021

Hanle magnetoresistance - always more to learn....

You would think that, by now, we would have figured out basically all there is to know about how comparatively simple metals conduct electricity, even in the presence of a magnetic field.  I mean, Maxwell and Faraday etc. were figuring out electric and magnetic fields a century and a half ago.  Lorentz wrote down the force on a moving charge in a magnetic field in 1895.  The Hall effect goes back to 1879.  Sommerfeld and his intellectual progeny laid the groundwork for a quantum theory of electronic conduction starting about a hundred years ago.  We have had good techniques for measuring electrical resistances (that is, sourcing a current and measuring the voltage differences between different places on a material) for many decades, and high quality magnets for around as long.  

Surprisingly, even in very recent times we are still finding out previously unknown effects that influence the resistance of a metal in a magnetic field.  Let me give you an example.  

I'd written here about the spin Hall effect and its inverse, which were only "discovered" relatively recently.  In brief, because of strong spin-orbit coupling (SOC) effects on the electronic structure of comparatively heavy metals (Pt, Ta, W), passing a current through a thin film strip of such a material generates a spin current, leading to the accumulation of spin at the top and bottom of the strip.  If those interfaces are in contact with magnetic materials, exchange processes can take place so that there is a net transfer of angular momentum between the metal and the magnetic system.  

There is actually a correction to the resistance of the SOC metal:  The spin accumulation can lead to a diffusive spin current between the top and bottom surfaces, which (thanks to the inverse spin Hall effect, ISHE) gives an additive kick to the charge current (and effectively lowers the resistance of the metal from what it would be in the absence of the spin Hall physics).  If the top and bottom interfaces are in contact with a magnetic system and therefore affect the spin accumulation, that correction can be modified depending on the orientation of the magnetization of the magnetic material, leading to the spin Hall magnetoresistance.  

Spin Hall/inverse spin Hall
resistive correction,
adapted from here.

That's not the end of the story, however.  Even without an adjoining magnetic material, there is an additional magnetoresistive correction, \(\delta \rho(\mathbf{H})\), to the resistivity of the SOC metal.  If the magnetic field has a component transverse to the direction of the SHE accumulated spins, the spins will precess about that field, and that can affect the ISH correction to the resistivity.  This was predicted in 2007 by Dyakonov (arxiv, PRL), and it was found experimentally several years later, as reported in PRL (arxiv version here).  There are readily measurable effects in both the longitudinal resistivity \(\rho_{xx}\) (voltage measured along the direction of the current) and the transverse resistivity \(\rho_{xy}\) (voltage measured transverse to the current, as in the Hall effect, but this holds even when the external magnetic field is in the plane of the film).
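For a feel for the field scales involved: the accumulated spins precess at the Larmor frequency \(\omega = g \mu_{B} B/\hbar\), and the correction gets scrambled once \(\omega \tau_{s} \sim 1\), where \(\tau_{s}\) is the spin relaxation time.  The picosecond-scale \(\tau_{s}\) below is a hypothetical value for a strong-SOC metal, and the Lorentzian factor is only the characteristic shape, not Dyakonov's full result:

```python
import math

# Larmor precession of SHE-accumulated spins.  When omega * tau_s ~ 1, precession
# dephases the spin accumulation and the ISHE resistivity correction is suppressed,
# roughly like 1/(1 + (omega*tau_s)^2).  tau_s is an assumed, illustrative value.
mu_B = 9.274e-24   # Bohr magneton, J/T
hbar = 1.055e-34   # J*s
g = 2.0
tau_s = 1e-12      # spin lifetime, ~ps for a strong-SOC metal (assumed)

for B in [0.1, 1.0, 10.0]:  # applied field in tesla
    omega = g * mu_B * B / hbar
    suppression = 1.0 / (1.0 + (omega * tau_s) ** 2)
    print(f"B = {B:5.1f} T: omega*tau_s = {omega * tau_s:.3f}, suppression factor = {suppression:.3f}")
```

With a ~ps spin lifetime you need fields of several tesla before precession matters, which is consistent with Hanle magnetoresistance being a small, high-field effect that took a while to notice.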

Hanle magnetoresistance idea, 
adapted from here.

This correction is called the Hanle magnetoresistance.  

(Aside:  There is some interesting scientific history behind the name.  Hanle was the first to explain an atomic physics optical effect, where the precession of magnetic moments of a gas of atoms in a magnetic field affects the polarization of light passing through the gas.  In condensed matter, the name "Hanle effect" shows up in discussions of spin transport in metals.  The first time I ever encountered the term was in this paper, which foreshadows the discovery of giant magnetoresistance.  A ferromagnetic emitter contact is used to inject spin-polarized electrons into a non-magnetic metal, aluminum.  Those electrons diffuse over to a second ferromagnetic collector contact, where their ability to enter that contact (and hence the resistance of the gadget) depends on the relative alignment of the spins and the magnetization of the collector.  If there is a magnetic field perpendicular to the plane of the device, the spins precess while the electrons diffuse, and one can analyze the magnetoresistance to infer the spin relaxation time in the metal.)

One of my students and I have been scratching our heads trying to see if we really understand the Hanle magnetoresistance, which we have been measuring recently as a by-product of other work.  I think it's pretty amazing that we are still discovering new effects in something as simple as the resistance of a metal in a magnetic field.

Saturday, November 13, 2021

The community of department chairs

For the vast majority of professors, there is no formal training process to prepare them to become chair or head of a department.  That makes access to the experiences and knowledge of others an invaluable resource.  In recent years, the APS has been sponsoring conferences of physics (or physics & astronomy) department chairs, and that's great, but pre-dating that have been electronic mailing lists for department chairs and heads*.  There is a long-standing, somewhat appropriately named "Midwest Physics Department Chairs" email listserv, and there is an analogous American Astronomical Society astronomy chairs listserv.  

The chairs mailing list has been a great way to learn how processes work at other places, and to get advice or sanity checks.  Sometimes it can be very helpful to be able to say to your administration, "here is how everyone else does this."  Not everything translates, as large public universities have some real structural differences in operations compared to private universities, but it's still been informative. Examples of recent discussion topics in no particular order:

  • Rough startup costs for hires in different subfields (and how those costs are borne between departments, deans, provosts, etc.)
  • Qualifying/candidacy exams - what they cover (undergrad v grad), their value or lack thereof
  • Promotion and tenure processes 
  • Diversity/equity/inclusion at all levels
  • Graduate admissions in the post-standardized-test era
  • International students in the era of covid + recent changes in student visa policies
  • Various curricular issues (incorporating computation; lab staffing)
  • Mental health at all levels (undergrads, grad students, faculty, staff)

The group also has an annual get-together.  Last weekend I attended a meeting (face to face!) of about 30 physics department chairs at the exotic O'Hare Airport Hilton in Chicago.  While not everyone was able to make it, it was helpful to talk and compare notes.  People had a lot to say about teaching methods and what will stick around post-pandemic.  It was also very informative to learn what it takes financially and in terms of personnel to support a successful bridge program.  

Being chair or head can be isolating, and it's good having a community of people who understand the weird issues that can come up.  

* The definitions are not rigid, but a chair is often elected and expected to make decisions through consensus and voting, while a head is appointed and typically has more autonomy and authority.  As one former head at a big place once told me, though, you basically need consensus as a head, too, otherwise you can't get anything done.

Saturday, November 06, 2021

The noise is the signal

 I am about to attend a gathering of some physics department chairs/heads from around the US, and I'll write some about that after the meeting, but I wanted to point out a really neat paper (arxiv version here) in a recent issue of Science.  A group at Leiden has outfitted their scanning tunneling microscope with the ability to measure not just the tunneling current, but the noise in the tunneling current, specifically the "shot noise" that results out of equilibrium because charge is transported by the tunneling of discrete carriers.  See here for a pretty extensive discussion about how charge shot noise is a way to determine experimentally whether electrons are tunneling one at a time independently, or whether they are, for example, being transported two at a time because of some kind of pairing.
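The underlying noise arithmetic is simple: for uncorrelated (Poissonian) tunneling of charge quanta \(q\), the current noise power spectral density is \(S_{I} = 2qI\), so transporting charge in pairs at fixed current doubles the noise.  A minimal sketch:

```python
# Poissonian shot noise: S_I = 2*q*I.  At fixed current, doubling the effective
# charge (Cooper pair / Andreev transport, q = 2e) doubles the noise - which is
# how the measurement distinguishes single electrons from pairs.
e = 1.602e-19  # elementary charge, C

def shot_noise(I, q=e):
    """Shot noise current spectral density for uncorrelated tunneling, A^2/Hz."""
    return 2.0 * q * I

I = 1e-9  # a 1 nA tunneling current (illustrative)
print(f"electrons (q = e):  S_I = {shot_noise(I, e):.2e} A^2/Hz")
print(f"pairs     (q = 2e): S_I = {shot_noise(I, 2 * e):.2e} A^2/Hz")
print(f"ratio: {shot_noise(I, 2 * e) / shot_noise(I, e):.1f}")
```

The ratio of measured noise to \(2eI\) (a Fano-factor-style comparison) is what signals pairing even when the conductance itself looks featureless.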

Adapted from Fig. 1 of this paper.
The experiment is quite pretty, looking at disordered thin films of TiN, with a macroscopic superconducting transition temperature of \(T_{c} =\) 2.95 K.  With the shot noise measurement, the experimenters see enhanced noise at low applied voltages consistent with pairing (with pairs being transported presumably by the process of Andreev reflection).  The interesting point is that this enhanced noise persists up to temperatures as high as 2.7 times \(T_{c}\), despite the fact that the tunneling conductance \(dI/dV\) shows no sign of a gap or pseudogap up there.  This implies that superconductivity in this material dies as \(T\) exceeds \(T_{c}\) not because the pairing between electrons falls apart, but instead because of the loss of the global coherence needed for the superconducting state.  That's an exciting result.  

I'm a big fan of noise measurements and applying them to a broader class of condensed matter systems.  We'd seen enhanced noise in cuprate tunnel junctions above \(T_{c}\) and at large biases, as mentioned here, but in the cuprates such persistence of pairing is less surprising than in the comparatively "simple" TiN system.  Noise measurements on demand via STM should be quite the enabling capability!

Sunday, October 24, 2021

The physics of ornithopters

One thing that the new Dune film captures extremely well is the idea that the primary small-capacity air transportation mode on Arrakis is travel by ornithopter.  The choice of flapping wings as a lift/propulsion mechanism can be justified within the fictional universe by the idea that jet turbines probably won't do well in an atmosphere with lots of suspended dust and sand, especially on take-off and landing.  Still, I think Frank Herbert decided on ornithopters because it just sounded cool.

The actual physics and engineering of flight via flapping wings is complicated.  This site is a good place to do some reading.  The basic idea is not hard to explain.  To get net lift, in the cyclical flapping motion of a wing, somehow the drag force pushing downward on the wing during the upstroke has to be more than balanced by the flux of momentum pushed downward on the wing's downstroke.  To do this, the wing's geometry can't be unchanging during the flapping.  The asymmetry between up and down strokes is achieved through the tilting (at the wing base and along the wing) and flexing of the wing during the flapping motion.  

The ornithopters in the new movie have wings on the order of 10 m long, with wing motions that look like those of a dragonfly, and the wings are able to flap up and down at an apparent frequency of a couple of hundred hertz (!).  If you try to run some numbers on the torque, power, and material strength/weight that would be required to do this, you can see pretty quickly why this has not worked too well yet as a strategy on earth.   (As batteries, motor technology, and light materials continue to improve, perhaps ornithopters will become more than a fun hobby.)  
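Here is what "running some numbers" looks like.  The wing mass and flapping amplitude are my own guesses, and I've deliberately used a conservative 100 Hz rather than the couple of hundred that appears on screen:

```python
import math

# Order-of-magnitude physics of a movie ornithopter wing.
wing_length = 10.0  # m (from the film)
freq = 100.0        # flaps per second (conservative screen estimate)
amplitude = 1.0     # m, peak-to-peak tip excursion (deliberately small guess)

# Tip speed for sinusoidal flapping: v_max = pi * f * (peak-to-peak amplitude)
v_tip = math.pi * freq * amplitude
print(f"wing tip speed ~ {v_tip:.0f} m/s (~Mach {v_tip / 343.0:.1f})")

# Kinetic energy reversed each half-cycle, modeling the wing as a uniform rod
# pivoting at its base (I = m*L^2/3) with a hypothetical 20 kg mass:
m_wing = 20.0
I_rod = m_wing * wing_length**2 / 3.0
omega_max = v_tip / wing_length          # peak angular velocity, rad/s
KE = 0.5 * I_rod * omega_max**2
power = KE * 2 * freq                    # energy thrown away and resupplied twice per cycle
print(f"peak KE per wing ~ {KE / 1e3:.0f} kJ, mechanical power ~ {power / 1e6:.0f} MW")
```

Even with generous assumptions, each wing needs tens of megawatts and near-sonic tip speeds, which is why nothing like this flies on earth.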

This issue - that cool gadgets in sci-fi or superhero movies would need apparently unachievable power densities at low masses - is common (see, e.g., Tony Stark's 3 GW arc reactor that fits in your hand, weighs a few pounds, and somehow doesn't have to radiate GW of waste heat), and that's ok; the stories are not meant to be too realistic. Still, the ornithopter fulfills its most important purpose in the movie:  It looks awesome.  

Sunday, October 17, 2021

Brief items - Sarachik, Feynman, NSF postdocs and more

 Here are several items of interest:

  • I was saddened to learn of the passing of Myriam Sarachik, a great experimental physicist and a generally impressive person.  I was thinking about writing a longer piece about her, but this New York Times profile from last year is better than anything I could do.  This obituary retells the story to some degree. (I know that it's pay-walled, but I can't find a link to a free version.)  In the early 1960s, after fighting appalling sexism to get a doctorate and a position at Bell Labs, she did foundational experimental work looking at the effect of dilute magnetic impurities on the conduction of nonmagnetic metals.  For each impurity, the magnetic atom has an unpaired electron in a localized orbital.  A conduction electron of opposite spin could form a singlet to fill that orbital, but the on-site Coulomb repulsion of the electron already there makes that energetically forbidden except as a virtual intermediate state for a scattering process.  The result is that scattering by magnetic impurities gets enhanced as \(T\) falls, leading to an upturn in the resistivity \(\rho(T)\) that is logarithmic in \(T\) at low temperatures.  Eventually the localized electron is entangled with the conduction electrons to form a singlet, and the resistivity saturates.  This is known as the Kondo effect after the theoretical explanation of the problem, but Sarachik's name could credibly have been attached.  Her family met with a personal tragedy from which it took years to recover.  Later in her career, she did great work looking at localization and the metal-insulator transition in doped semiconductors.  She also worked on the quantum tunneling of magnetization in so-called single-molecule magnets, and was a key player in the study of the 2D metal-insulator transition in silicon MOSFETs.  I was fortunate enough to meet her when she came through Rice in about 2003, and she was very generous with her time meeting with me when I was a young assistant professor.  
Sarachik also had a great service career, serving as APS President around that time.  Heck of a career! 
  • The audio recordings of the famous Feynman Lectures on Physics are now available for free to stream from Caltech.  You can also get to these from the individual lectures by a link on the side of each page.
  • There is a new NSF postdoctoral fellowship program for math and physical sciences.  I would be happy to talk to anyone who might be interested in pursuing one of these who might want to work with me.  Please reach out via email.
  • I've written before about the "tunneling time" problem - how long does quantum mechanical tunneling of a particle through a barrier take?  Here is an experimental verification of one of the most counterintuitive results in this field:  the farther "below" the barrier the particle is (in the sense of having a smaller fraction of the kinetic energy needed classically to overcome the potential barrier), the faster the tunneling.  A key experimental technique here is the use of a "Larmor clock", with the precession of the spin of a tunneling atom acting as the time-keeping mechanism.
  • Did you know that it is possible, in Microsoft Word, to turn on some simple LaTeX-style symbolic coding?  The key is to enable "Math Autocorrect", and then typing \alpha will automatically be turned into \(\alpha\).  (I know doing scientific writing in Word seems heretical to some, but not everyone in every discipline is facile with LaTeX/Overleaf.)

Sunday, October 10, 2021

The Purcell effect - still mind-blowing.

The Purcell effect is named after E. M. Purcell, a Nobel-winning physicist who also was a tremendous communicator, author of one of the great undergraduate textbooks and a famous lecture about the physical world from the point of view of, e.g., a bacterium.  I've written about this before here, and in a comment I include the complete (otherwise paywalled) text of the remarkable original "paper" that describes the effect.

When we calculate things like the Planck black-body spectrum, we use the "density of states" for photons - for a volume \(V\), we are able to count up how many electromagnetic modes are available with frequency between \(\nu\) and \(\nu + \mathrm{d}\nu\), keeping in mind that for each frequency, the electric field can be polarized in two orthogonal directions.  The result is \( (8\pi/c^3)\nu^2 \mathrm{d}\nu\) states per unit volume of "free space".
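For a sense of scale, counting modes with that density of states in an everyday volume:

```python
import math

# Free-space photon density of states: (8*pi/c^3)*nu^2 modes per unit volume
# per unit frequency (both polarizations included).  Count the modes in a
# 1 cm^3 volume within a narrow 1 MHz slice at an optical frequency:
c = 3.0e8   # m/s
V = 1e-6    # m^3 (1 cm^3)
nu = 5e14   # Hz, visible light
dnu = 1e6   # Hz bandwidth

modes = (8 * math.pi / c**3) * nu**2 * V * dnu
print(f"free-space modes in a 1 MHz slice at 500 THz, 1 cm^3: {modes:.2e}")
```

Hundreds of thousands of modes in even a razor-thin frequency slice - whereas a high-Q cavity of comparable size concentrates all the coupling into essentially one mode within its bandwidth.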

In a cavity, though, the situation is different - instead, there is, roughly speaking, one electromagnetic mode per the bandwidth of the cavity per the volume of the cavity.  In other words, the effective density of states for photons in the cavity is different than that in free space.  That has enormous ramifications:  The rates of radiative processes, even those that we like to consider as fundamental, like the rate at which electronically excited atoms radiatively decay to lower states, can be altered in a cavity.  This is the basis for a lot of quantum optics work, as in cavity quantum electrodynamics.  Similarly, the presence of an altered (from free space) photon density of states also modifies the spectrum of thermal radiation from that cavity away from the Planck black-body spectrum.  
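The standard way to quantify the rate enhancement is Purcell's factor, the ratio of spontaneous emission rates in the cavity versus free space, \(F_{P} = (3/4\pi^{2})(\lambda/n)^{3}(Q/V)\).  The cavity parameters below are illustrative, not tied to any particular experiment:

```python
import math

# Purcell factor: F_P = (3/(4*pi^2)) * (lambda/n)^3 * (Q / V_mode).
# Large Q (long photon storage) and small mode volume both enhance emission.
def purcell_factor(wavelength, n_index, Q, V_mode):
    return (3.0 / (4.0 * math.pi**2)) * (wavelength / n_index) ** 3 * Q / V_mode

lam = 1.55e-6                   # m, telecom wavelength (illustrative)
n_index = 3.5                   # refractive index, e.g. a semiconductor microcavity
Q = 1e4                         # quality factor (assumed)
V_mode = (lam / n_index) ** 3   # near-diffraction-limited mode volume (assumed)

print(f"Purcell enhancement F_P ~ {purcell_factor(lam, n_index, Q, V_mode):.0f}")
```

With a wavelength-cubed mode volume, \(F_{P}\) reduces to \(3Q/4\pi^{2}\): hundreds-fold faster decay for quite ordinary cavity numbers, which is why the effect is so central to cavity QED.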

Consider an excited atom in the middle of such a cavity.  When it is going to emit a photon, how does it "know" that it's in a cavity rather than in free space, especially if the cavity is much larger than an atom?  The answer is, somehow through the electromagnetic couplings to the atoms that make up the cavity.  This is remarkable, at least to me.   (It's rather analogous to how we picture the Casimir effect, where you can think about the same physics either, e.g., as due to altering local vacuum fluctuations of the EM field in the space between conducting plates, or as due to fluctuating dipolar forces because of fluctuating polarizations on the plates.)

Any description of a cavity (or plasmonic structure) altering the local photon density of states is therefore really short-hand.  In that approximation, any radiative process in question is tacitly assuming that an emitter or absorber in there is being influenced by the surrounding material.  We just are fortunate that we can lump such complicated, relativistically retarded interactions into an effective photon density of states that differs from that in free space. 

Tuesday, October 05, 2021

Spin glasses and the Nobel

The Nobel Prize in physics this year was a bit of a surprise, at least to me.  As one friend described it, it's a bit of a Frankenprize, stitched together out of rather disparate components.  (Apologies for the slow post - work was very busy today.)  As always, it's interesting to read the more in-depth scientific background of the prize.  I was unfamiliar with the climate modeling of Manabe and Hasselmann, and this was a nice intro.

The other prize recipient was Giorgio Parisi, a statistical mechanician whose key cited contribution was in the theory of spin glasses, but was generalizable to many disordered systems with slow, many-timescale dynamics including things like polymers and neural networks.  

The key actors in a spin glass are excess spins - local magnetic moments that you can picture as little magnetic dipoles. In a generic spin glass, there is both disorder (as shown in the upper panel of the cartoon, spins - in this case iron atoms doped into copper - are at random locations, and that leads to a broad distribution of spin-spin interactions in magnitude and sign) and frustration (interactions such that flipping spin A to lower its interaction energy with spin B ends up raising the interaction energy with spin C, so that there is no simple configuration of spins that gives a global minimum of the interaction energy).  One consequence of this is a very complicated energy landscape, as shown in the lower panel of the cartoon.  There can be a very large number of configurations that all have about the same total energy, and flipping between these configurations can require a lot of energy such that it is suppressed at low temperatures.  These magnetic systems then end up having slow, "glassy" dynamics with long, non-exponential relaxations, in the same way that structural glasses (e.g., SiO2 glass) can get hung up in geometric configurations that are not the global energetic minimum (crystalline quartz, in the SiO2 case).  
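Frustration is easy to see in the smallest possible example: three Ising spins on a triangle with antiferromagnetic bonds.  No configuration can satisfy all three bonds at once, so the minimum energy leaves one "unhappy" bond and is shared by six degenerate states - a miniature version of the rugged landscape in the cartoon:

```python
from itertools import product

# Three Ising spins on a triangle, antiferromagnetic bonds:
# E = J * sum of s_i * s_j over bonds, with J > 0 so anti-aligned pairs are favored.
J = 1.0
bonds = [(0, 1), (1, 2), (0, 2)]

energies = {}
for spins in product([-1, 1], repeat=3):
    E = sum(J * spins[i] * spins[j] for i, j in bonds)
    energies[spins] = E

E_min = min(energies.values())
ground_states = [s for s, E in energies.items() if E == E_min]
print(f"minimum energy: {E_min} (one unsatisfied bond), degeneracy: {len(ground_states)}")
```

Scale this up to thousands of randomly placed, randomly coupled spins and the degenerate-minima count explodes, which is exactly what makes the thermodynamics so hard.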

The standard tools of statistical physics are difficult to apply to the glassy situation.  A key assumption of equilibrium thermodynamics is that, for a given total energy, a system is equally likely to be found in any microscopic configuration that has that total energy.  Being able to cycle through all those configurations is called ergodicity.  In a spin glass at low temperatures, the potential landscape means that the system can get easily hung up in a local energy minimum, becoming non-ergodic.  

An approach that Parisi took to this problem involved "replicas", where one considers the whole system as an ensemble of replica systems, and a key measure of what's going on is the similarity of configurations between the replicas.  Parisi himself summarizes this in this pretty readable (for physicists) article.  One of Parisi's big contributions was showing that the Ising spin glass model of Sherrington and Kirkpatrick is exactly solvable.
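As a loose numerical sketch of the overlap idea (mine, not Parisi's actual construction, which is an analytic trick involving the \(n \to 0\) limit of \(n\) copies): build a random Sherrington-Kirkpatrick coupling matrix, quench two independently initialized copies ("replicas") of the system into local minima, and compute their overlap \(q\).  The statistics of this overlap over many realizations is the object at the heart of replica symmetry breaking.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64

# Sherrington-Kirkpatrick couplings: all-to-all, Gaussian, variance ~ 1/N.
J = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))
J = (J + J.T) / 2.0
np.fill_diagonal(J, 0.0)

def quench(spins, sweeps=200):
    """Zero-temperature single-spin-flip quench toward a local minimum
    of E = -sum_{i<j} J_ij s_i s_j (align each spin with its local field)."""
    for _ in range(sweeps):
        for i in range(N):
            local_field = J[i] @ spins
            spins[i] = 1.0 if local_field > 0 else -1.0
    return spins

# Two "replicas": identical couplings, independent random starting points.
s_a = quench(rng.choice([-1.0, 1.0], size=N))
s_b = quench(rng.choice([-1.0, 1.0], size=N))

# The replica overlap; its distribution over many runs probes the
# many-valley structure of the energy landscape.
q = float(s_a @ s_b) / N
print(q)
```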

I learned about spin glasses as a doctoral student, since the interacting two-level systems in structural glasses at milliKelvin temperatures act a lot like a spin glass (TLS coupled to each other via a dipolar elastic interaction, and sometimes an electric dipolar interaction), complete with slow relaxations, differences between field-cooled and zero-field-cooled properties, etc.  

Parisi has made contributions across many diverse areas of physics.  Connecting his work to that of the climate modelers is a bit of a stretch thematically - sure, they all worry about dynamics of complex systems, but that's a really broad umbrella.  Still, it's nice to see recognition for the incredibly challenging problem of strongly disordered systems.

Sunday, October 03, 2021

Annual Nobel speculation thread

Once more, the annual tradition:  Who do people think will win the Nobel this year in physics or chemistry?  I have repeatedly and incorrectly suggested Aharonov and Berry for geometric phases.  There is a lot of speculation on social media about Aspect, Zeilinger, and Clauser for Bell's inequality tests, and quantum cascade lasers and photonic bandgap/metamaterials have also come up.  Other suggestions I've seen online include superconducting qubits (with various combinations of people) and twisted bilayer graphene, though both of those may be a bit early.  


Tuesday, September 28, 2021

Science/tech consulting in creative arts

I've watched the first two episodes of the new adaptation of Foundation.  It surely looks gorgeous, though there are some script challenges (even apart from the challenge of trying to adapt an enormous book series that was always long on ideas and short on character development).   The issues I've spotted seem mostly to be ones of poor script editing for consistency.  (The emperor of the Galactic Empire says in the first episode that the imperial population is 8 trillion, and then in the second episode a character says that the core worlds alone have a population of 40 trillion.  The latter number is more reasonable, given the size of Asimov's empire.)  

Watching this, I again think it would be great fun to do scientific/technical consulting for TV, movies, and even books. I'm on the list for the Science and Entertainment Exchange, though all I've ever done is give a tiny bit of feedback to a would-be author.  (My expertise probably looks too narrow, and not living in southern California seems to be a major filter.)  

It feels like there are some similarities to the role of science in public policy.  In the creative productions, science can contribute (and these media can be a great way of getting scientific ideas out into the public), but in the end plot and what can practically be implemented will always drive the final product.  In policy, science and technical knowledge should definitely factor in when relevant, but fundamentally there are social and political factors that can overwhelm those influences in decision-making.  Now back to our regularly scheduled psychohistorical crisis....

Wednesday, September 22, 2021

DOE Experimental Condensed Matter PI meeting, Day 3

Here are some tidbits from the last day of this year's meeting.  (I didn't really get to see the other posters in my own poster session, so apologies for missing those.  For the curious:  the meeting attendees alternate between posters and 15 minute talks from year to year.)

  • It's been known for a while that combining magnetism with topological insulator materials can lead to a rich phase diagram.  Tuning composition is a powerful tool.  Likewise, the van der Waals nature of these systems means that it's possible to look systematically through a family of related materials.
  • Tuning composition in flat-band kagome metals is also of interest.
  • I had not appreciated just how important specific crystal growth approaches (e.g., rapid quenching vs. slow annealed cooling) are to the properties of some magnetic/topological materials, such as Fe5GeTe2.  
  • Strain can be a powerful tool for tuning electronic topology in some materials such as ZrTe5, and driving certain phonon modes via laser offers the potential of controlled switching of topological properties.
  • Quantum oscillations (e.g., magnetization as a function of 1/H) are a conventional way to learn about Fermi surfaces, and it is always bizarre when that kind of response shows up in a correlated material that is nominally an insulator, or in thermal transport but not electrical transport.
  • Speaking of quantum oscillations in insulators, how about thermal transport in the spin liquid phase of \(\alpha\)-RuCl3?  Looks like some kind of bosonic edge mode is responsible.
  • If transition metal dichalcogenides are starting to bore you, perhaps you'd be more interested in trichalcogenides, which can be grown as individual 1D chains within carbon and boron nitride nanotubes.
Thanks to everyone for making the meeting enjoyable and informative, even if we couldn't get together in a random Marriott in Maryland this time.  

Tuesday, September 21, 2021

DOE Experimental Condensed Matter PI meeting, Day 2

 More highlights from the meeting.  Office hours for my class conflicted with a couple of the talks, so these are briefer than I would've liked.

  • It is possible to use coherent x-ray scattering to look at time variations in the domain structure of an antiferromagnet.  In the magnetic diffraction pattern there is speckle near the magnetic Bragg spots that bops around as the domain structure fluctuates.
  • Amorphous magnetic alloys can show some really wild spin textures.  
  • By growing a few nanometers of a paramagnetic metal, Bi2Ir2O7, on top of an insulating spin ice, Dy2Ti2O7, it's possible to get enough coupling that field-driven spin ice transitions can generate magnetoresistance signatures in the metal layer.
  • Square planar nickelates can look a lot like the copper oxide superconductors in terms of band dispersion and possible "strange" metallicity.
  • Some rare-earth intermetallic compounds can have an impressively rich magnetic phase diagram.
  • I learned that some pyrochlore iridates can exhibit a kind of topological metallic state with giant anomalous Hall response.
  • I had not previously appreciated how wild it is that one can engineer ferroelectric response in stacks of 2D materials that are not intrinsically ferroelectric, as in hBN or even WSe2.
  • Ultrasound attenuation can be a heck of a tool for looking at superconductivity and other electronic transport properties.
  • Strontium titanate remains a really interesting test case for understanding exotic superconductivity, with its superconducting dome as a function of doping, very low carrier density, and incipient ferroelectricity.  Phonons + paraelectric fluctuations + spin-orbit coupling appear to be the big players.
  • A related system in the sense of near-ferroelectricity and low carrier density is at interfaces of KTaO3.
  • This experiment in graphene/hBN/graphene stacks (with encapsulating hBN and graphite top and bottom gates) is an extremely pretty, tunable demonstration of superfluidity of bilayer excitons, an effect previously seen in one limit in GaAs systems.

Monday, September 20, 2021

DOE Experimental Condensed Matter PI meeting, Day 1

Somehow I found the first day of the virtual meeting more exhausting than when we do these things in person, probably because I had to go teach in the middle of the event.  A sampling of highlights:

  • Rather than relying on relatively crude methods to create defect centers in diamond for quantum sensing (or qubit purposes), one can use chemistry to build transition metal complexes with designer ligand fields (and hence energy level structures), as demonstrated here.
  • I now understand better why it has historically been so difficult to demonstrate, experimentally, that the quasiparticles in the fractional quantum Hall effect obey fractional (anyonic) statistics.  In an interferometer, it's critical to use screening (from top and bottom electron gases that act like capacitor plates) to reduce Coulomb interactions between edge states and the bulk. Once that's done, you can see clear evidence of fractional (anyonic) phase slips.
  • Some truly exceptional people can still do research even while being a university president.  At very low energies in an Ising ferromagnet with an in-plane magnetic field, hyperfine interactions can lead to hybridization of magnetic levels and the formation of "electronuclear" spin excitations.
  • Ultraclean ABC-stacked graphene trilayers can show remarkably rich response, dominated by strong electron-electron interaction effects.
  • High quality crystal growth can drastically lower the defect densities in transition metal dichalcogenides.  That makes it possible to construct bilayers of WSe2, for example, that can host apparent excitonic condensates.  Similar physics can be seen in MoSe2/WSe2 bilayers, where it is clear that exciton-exciton interactions can be very strong.
  • Pulling and pushing on a sample can lead to elastocaloric effects (like when a rubber band cools upon being stretched), and these can reveal otherwise hidden properties and phase transitions.
More tomorrow.  (One fun idea from a colleague:  Perhaps the program officers have hidden a secret easter egg token somewhere in the Gather virtual poster area, and whoever finds it gets a bonus award supplement to support a summer undergrad....)

DOE Experimental Condensed Matter Physics PI meeting, 2021

Every two years, the US Department of Energy Experimental Condensed Matter Physics program has a principal investigator meeting, and I've written up highlights of these for a while (for 2019, see a, b, c; for 2017, see a, b, c; for 2015, see a, b, c; for 2013, see a, b).

The meetings have always been very educational for me.  They're a chance to get a sense of the breadth of the whole program and the research portfolio.  It is unfortunate that the covid pandemic has forced this year's meeting to be virtual.  I'll do my best to summarize some tidbits in posts over the next three days.

Friday, September 17, 2021

Moiré materials and the Mott transition

There are back-to-back papers in Nature this week, one out of Columbia and one out of Cornell, using bilayers of transition metal dichalcogenides to examine the Mott transition.  (Sorry for the brevity - I'm pressed for time right now, but I wanted to write something....)

As I described ages ago here, imagine a lattice of sites, each containing one electron.  While quantum statistics would allow each site to be doubly occupied (thanks to spin), if the on-site electron-electron repulsion \(U\) is sufficiently strong (large compared to the kinetic energy scale \(t\) associated with hopping between neighboring sites), then the interacting system will be an insulator even though the non-interacting version would be a metal.  Moving away from this half-filling condition, you can get conduction, just as having an empty site allows those sliding tile puzzles to work.
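The competition between \(U\) and \(t\) can be made quantitative in the smallest possible example, the two-site Hubbard model at half filling, a standard textbook exercise sketched here (not a calculation from either paper).  As \(U/t\) grows, the exact ground state pushes out double occupancy, which is the caricature of Mott insulating behavior.

```python
import numpy as np

def two_site_hubbard_singlet(t, U):
    """Singlet sector of the two-site Hubbard model at half filling.
    Basis: {covalent singlet (one electron per site),
            doublon on left site, doublon on right site}."""
    s2t = np.sqrt(2.0) * t
    return np.array([[0.0, -s2t, -s2t],
                     [-s2t,  U,  0.0],
                     [-s2t, 0.0,  U]])

t, U = 1.0, 8.0
vals, vecs = np.linalg.eigh(two_site_hubbard_singlet(t, U))
E0 = vals[0]          # eigh returns eigenvalues in ascending order
gs = vecs[:, 0]

# Exact ground-state energy: (U - sqrt(U^2 + 16 t^2)) / 2
print(E0)  # ≈ -0.472 for U/t = 8

# Probability of finding a doubly occupied site in the ground state:
# suppressed for U >> t, i.e., the charges localize.
doublon_weight = gs[1] ** 2 + gs[2] ** 2
print(doublon_weight)
```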

As discussed here, stacking bilayers of 2D materials can lead to the formation of a moiré lattice, where the interlayer interactions result in an effective periodic array of potential wells.  The Columbia folks got a moiré pattern by using a 4-5 degree twisted bilayer of WSe2, while the Cornell folks instead used an aligned bilayer of MoTe2 and WSe2 (where the moiré comes from the differing lattice constants).  In both cases, you end up with a triangular moiré lattice (encapsulated in hBN to provide a clean charge environment and protection from the air).  
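For a sense of scale, the small-angle moiré period of a twisted homobilayer is roughly \(a/(2 \sin(\theta/2))\).  The lattice constant below is an approximate literature value for WSe2 and the angle is just the middle of the quoted range, so treat the result as an estimate rather than a number from the papers.

```python
import math

a = 0.328e-9                  # WSe2 in-plane lattice constant, m (approximate)
theta = math.radians(4.5)     # twist angle, mid-range of the 4-5 degrees quoted

# Moire superlattice period for a small twist angle
lam = a / (2 * math.sin(theta / 2))
print(lam * 1e9)              # ≈ 4.2 nm: about an order of magnitude
                              # larger than the atomic lattice
```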

The investigators are able to tune the systems in multiple ways.  With overall gate voltage, they can capacitively tune the "filling", the ratio of number of "free" charges to number of moiré lattice sites.  By adjusting top gate vs. back gate, they can tune the vertical electric field across the bilayer, and that is a way of tuning interactions by pushing around localized wavefunctions for the lattice sites.  

Both groups find that they can tune in/out of a Mott insulating phase when they're at one carrier per moiré lattice site.  Interestingly, both groups see that the Mott transition is continuous (second-order) - there is no sudden onset of insulating response as a function of tuning either knob.  Instead, there appears to be quantum critical scaling, and regions of linear-in-\(T\) temperature dependence of the resistivity (a possible indicator of a strange metal) on either side of the insulating region.  The Cornell folks are able to do magnetic circular dichroism measurements to confirm that the transition does not involve obvious magnetic ordering. 

This is very pretty work, and it shows the promise of the moiré lattice approach for studying fundamental issues (like whether or not the Mott transition in a triangular lattice is continuous).  I'm sure that there will be much more to come in these and related systems.

Monday, September 06, 2021

What is the spin Hall effect?

The Hall Effect is an old (1879) story, told in first-year undergraduate physics classes for decades. Once students are told about the Lorentz force law, it's easy to make a handwave classical argument that something like the Hall Effect has to exist:  Drive a current in a conductor in the presence of a magnetic induction \(\mathbf{B}\).  Charged particles undergo a \(q \mathbf{v} \times \mathbf{B}\) force that pushes them transverse to their original \(\mathbf{v}\) direction.  In a finite slab of material with current perpendicular to \(\mathbf{B}\), the particles have to pile up at the transverse edge, leading to the development of a (Hall) voltage perpendicular to the direction of current flow and the magnetic induction.  You can measure the Hall voltage readily, and it's used for sensing magnetic fields, as well as figuring out charge carrier densities in materials.
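That carrier-density extraction follows from \(V_{H} = IB/(nqt)\) for a slab of thickness \(t\).  A quick numerical version (all numbers below are illustrative, not from any particular measurement):

```python
# Classical Hall effect: V_H = I * B / (n * q * t) for a slab of thickness t,
# so a measured Hall voltage gives the carrier density directly.
I = 1e-3        # drive current, A (hypothetical)
B = 1.0         # magnetic induction, T (hypothetical)
t = 100e-9      # slab thickness, m (hypothetical)
q = 1.602e-19   # carrier charge, C

V_H = 50e-6     # measured Hall voltage, V (hypothetical)
n = I * B / (q * t * V_H)   # carrier density, m^-3
print(f"{n:.3e} per m^3")   # ≈ 1.25e27 m^-3
```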

The spin Hall effect, in contrast, is a much newer idea.  It was first proposed by Dyakonov and Perel in 1971 as an extrinsic effect (that is, induced by scattering from impurities in a material), and this was revisited in 1999 by Hirsch and others.  It's also possible to have an intrinsic spin Hall effect (proposed here and here) due just to the electronic structure of a material itself, not involving impurities.

Adapted from here.

So what is the SHE?  In some non-magnetic conductors, in the absence of any external magnetic field, a charge current (say in the \(+x\) direction) results in a build-up of electrons with spin polarized up (down) along the \(z\) direction along the positive (negative) \(y\) edge of the material, as shown in the bottom left drawing of the figure.  Note that there is no net charge imbalance or transverse voltage - just a net spin imbalance. 

The SHE is a result of spin-orbit coupling - it's fundamentally a relativistic effect (!).  While we static observers see only electric fields in the material, the moving charge carriers in their frame of reference see effective magnetic fields, and that affects carrier motion.  In the extrinsic SHE, scattering of carriers from impurities ends up having a systematic spin dependence, so that spin-up carriers are preferentially scattered one way and spin-down carriers are scattered the other.  In the intrinsic SHE, there ends up being a spin-dependent term in the semiclassical velocity that one would get from the band structure, because of spin-orbit effects.  (The anomalous Hall effect, when one observes a Hall voltage correlated with the magnetization of a magnetic conductor, is closely related.  The net charge imbalance shows up because the populations of different spins are not equal in a ferromagnet.)  The result is a spin current density \(\mathbf{J}_{\mathrm{s}}\) that is perpendicular to the charge current density \(\mathbf{J}_{\mathrm{c}}\), and is characterized by a (material-dependent) spin Hall angle, \(\theta_{\mathrm{SH}}\), so that \(J_{\mathrm{s}} = (\hbar/2e)\theta_{\mathrm{SH}}J_{\mathrm{c}}\).
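Plugging illustrative numbers into that last relation (the spin Hall angle and charge current density below are made up for the example, not taken from any particular material):

```python
hbar = 1.0546e-34   # J s
e = 1.602e-19       # C
theta_SH = 0.1      # spin Hall angle (illustrative; material dependent)
J_c = 1e11          # charge current density, A/m^2 (illustrative)

# J_s = (hbar / 2e) * theta_SH * J_c : angular-momentum current density
J_s = (hbar / (2 * e)) * theta_SH * J_c
print(J_s)          # in units of (J s) / (s m^2), i.e., angular momentum flux
```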

There is also an inverse SHE:  if (appropriately oriented) spin polarized charge carriers are injected into a strong spin-orbit coupled non-magnetic metal (say along \(+x\) as in the bottom right panel of the figure), the result is a transverse (\(y\)-directed) charge current and transverse voltage build-up.  (It's this inverse SHE that is used to detect spin currents in spin Seebeck effect experiments.)

The SHE and ISHE have attracted a lot of interest for technological applications.  Generating a spin current via the SHE and using that to push around the magnetization of some magnetic material is called spin orbit torque, and here is a recent review discussing device ideas.

Wednesday, September 01, 2021

Rice University physics faculty search in experimental quantum science and technology

The Department of Physics and Astronomy at Rice University invites applications for tenure-track faculty positions in the broad area of experimental quantum science and technology. This encompasses quantum information processing, quantum sensing, quantum communication, quantum opto-mechanics, and quantum simulation in photonic, atomic/ionic, quantum-material, and other solid-state platforms. We seek outstanding scientists whose research will complement and extend existing activities in these areas within the Department and across the University. In addition to developing an independent and vigorous research program, the successful applicants will be expected to teach, on average, one undergraduate or graduate course each semester, and contribute to the service missions of the Department and University. The Department anticipates making appointments at the assistant professor level. A Ph.D. in physics or related field is required.

Beginning September 1, 2021, applications for this position must be submitted electronically at .

Applications for this position must be submitted electronically. Applicants will be required to submit the following: (1) cover letter; (2) curriculum vitae; (3) statement of research; (4) statement on teaching; (5) statement on diversity, mentoring, and outreach; (6) PDF copies of up to three publications; and (7) the names, affiliations, and email addresses of three professional references. Rice University, and the Department of Physics and Astronomy, are strongly committed to a culturally diverse intellectual community. In this spirit, we particularly welcome applications from all genders and members of historically underrepresented groups who exemplify diverse cultural experiences and who are especially qualified to mentor and advise all members of our diverse student population. We will begin reviewing applications November 15, 2021. To receive full consideration, all application materials must be received by January 1, 2022. The expected appointment date is July, 2022.  

Rice University is an Equal Opportunity Employer with commitment to diversity at all levels, and considers for employment qualified applicants without regard to race, color, religion, age, sex, sexual orientation, gender identity, national or ethnic origin, genetic information, disability or protected veteran status.

Saturday, August 28, 2021

What is the spin Seebeck effect?

Thermoelectricity is an old story, and I've also discussed it here.  Take a length of some conductor, and hold one end of that conductor at temperature \(T_{\mathrm{hot}}\), and hold the other end of that conductor at temperature \(T_{\mathrm{cold}}\).  The charge carriers in the conductor will tend to diffuse from the hot end toward the cold end.  However, if the conductor is electrically isolated, that can't continue, and a voltage will build up between the ends of the conductor, so that in the steady state there is no net flow of charge.  The ratio of the voltage to the temperature difference is given by \(S\), the Seebeck coefficient.  
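In numbers, with an illustrative Seebeck coefficient (none is quoted above; tens of \(\mu\)V/K is a reasonable order of magnitude for a metal-like conductor):

```python
# Open-circuit thermoelectric voltage: V = S * (T_hot - T_cold).
S = 50e-6                    # Seebeck coefficient, V/K (illustrative)
T_hot, T_cold = 310.0, 300.0 # end temperatures, K (illustrative)

V = S * (T_hot - T_cold)
print(V)                     # 5e-4 V, i.e., 0.5 mV across the conductor
```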

It turns out that spin, the angular momentum carried by electrons, can also lead to the generation of voltages in the presence of temperature differences, even when the material is an insulator and the electrons don't move.  

Let me describe an experiment for you.  Two parallel platinum wires are patterned next to each other on the surface of an insulator.  An oscillating current at angular frequency \(\omega\) is run through wire A,  while wire B is attached to a voltage amplifier feeding into a lock-in amplifier.  From everything we teach in first-year undergrad physics, you might expect some signal on the lock-in at frequency \(\omega\) because the two wires are capacitively coupled to each other - the oscillating voltage on wire A leads to the electrons on wire B moving back and forth because they are influenced by the electric field from wire A.  You would not expect any kind of signal on wire B at frequency \(2 \omega\), though, at least not if the insulator is ideal.

However, if that insulator is magnetically interesting (e.g., a ferrimagnet, an antiferromagnet, some kinds of paramagnet), it is possible to see a \(2 \omega\) signal on wire B.  

In the spin Seebeck effect, a temperature gradient leads to a build-up of a net spin density across the magnetic insulator.  This is analogous to the conventional Seebeck effect - in a magnetically ordered system, there is a flow of magnons from the hot side to the cold side, transporting angular momentum along.  This builds up a net spin polarization of the electrons in the magnetic insulator.  Those electrons can undergo exchange processes with the electrons in the platinum wire B, and if the spins are properly oriented, this causes a voltage to build up across wire B due to the inverse spin Hall effect.  

So, in the would-be experiment, the ac current in wire A generates a temperature gradient between wire A and wire B that oscillates at frequency \(2 \omega\).  An external magnetic field is used to orient the spins in the magnetic insulator, and if the transported angular momentum points the right direction, there is a \(2 \omega \) voltage signal on wire B.   
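The \(2\omega\) frequency comes from Joule heating: the dissipated power goes like \(I^2\), and \(\sin^2\) has a DC piece plus a piece at twice the drive frequency, so the temperature gradient oscillates at \(2\omega\).  A tiny numerical check of that statement:

```python
import numpy as np

# P(t) ∝ I(t)^2, and sin^2(w t) = 1/2 - cos(2 w t)/2: DC plus 2*omega.
f = 1.0                                    # drive frequency, arbitrary units
ts = np.linspace(0, 10, 4000, endpoint=False)
current = np.sin(2 * np.pi * f * ts)
power = current ** 2

spectrum = np.abs(np.fft.rfft(power))
freqs = np.fft.rfftfreq(len(ts), d=ts[1] - ts[0])
peak = freqs[1:][np.argmax(spectrum[1:])]  # skip the DC bin
print(peak)                                # 2.0: twice the drive frequency
```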

I think this is pretty neat - an effect that is purely due to the quantum properties of electrons and would just not exist in the classical electricity and magnetism that we teach in intro undergrad courses.

(On writing this, I realized that I've never written a post defining the spin Hall and related effects. I'll have to work on that....  Sorry for the long delay between postings.  The beginning of the semester has been unusually demanding of my time.)

Thursday, August 12, 2021

More amazingly good harmonic oscillators

 Harmonic oscillators are key elements of the physicist's toolkit for modeling the world.  Back at the end of March I wrote about some recent results using silicon nitride membranes to make incredibly high quality (which is to say, low damping) harmonic oscillators.  (Remember, the ideal harmonic oscillator that gets introduced in undergrad intro physics is a mass on a spring, with no friction or dissipation at all.  An ideal oscillator would have a \(Q\) factor that is infinite, and it would keep ringing forever once started.) This past week, two papers appeared on the arxiv showing that it's possible to design networks of (again) silicon nitride beams that have resonances at room temperature (in vacuum) with \(Q > 10^{9}\).  
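To see what \(Q > 10^{9}\) buys you: the oscillator's energy decays at rate \(\omega/Q\), so the amplitude ringdown time constant is \(\tau = Q/(\pi f)\).  With an illustrative resonance frequency (the actual mode frequencies in the papers may differ):

```python
import math

# Amplitude ringdown time of a lightly damped harmonic oscillator:
# energy decays at rate omega/Q, so amplitude decays with tau = Q / (pi * f).
Q = 1e9
f = 1e5                 # resonance frequency, Hz (illustrative)

tau = Q / (math.pi * f)
print(tau)              # ≈ 3183 s: kick it once and it rings for the
                        # better part of an hour
```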

(a) A perimeter mode of oscillation. (b) A false-color electron micrograph of such a device.
One of these papers takes a specific motif, a suspended polygon made from beams, supported by anchoring beams coming from its vertices, as shown in the figure.  The resonant modes with the really high \(Q\) factors are modes of the perimeter, with nodes at the vertices.  This minimizes "clamping losses", damping that occurs at anchoring points (where the strain tends to be large, and where phonons can leak vibrational energy out of the resonator and into whatever is holding it).  

The other paper gets to a very similar design, through a process that combines biological inspiration (spiderwebs), physics insight, and machine learning/optimization to really maximize \(Q\).  

With tools like this, it's possible to do quantum mechanics experiments (that is, mechanics experiments where quantum effects are dominant) at or near room temperature.  Amazing.

Monday, August 09, 2021

Brief items

 It's been a busy week, so my apologies for the brevity, but here are a couple of interesting papers and sites that I stumbled upon:

  • Back when I first started teaching about nanoscience, I said that you'd really know that semiconductor quantum dots had hit the big time when you occasionally saw tanker trucks full of them going down the highway.  I think we're basically there.  Here is a great review article that summarizes the present state of the art.
  • Reaching back a month, I thought that this is an impressive piece of work.  They combine scanning tunneling microscopy, photoluminescence with a tunable optical source, and having the molecule sitting on a layer of NaCl to isolate it from the electronic continuum of the substrate.  The result is amazingly (to me) sharp spectral features in the emission, spatially resolved to the atomic scale.
  • The emergence of python and the ability to embed it in web pages through notebooks has transformative educational potential, but it definitely requires a serious investment of time and effort.  Here is a fluid dynamics course from eight years ago that I found the other day - hey, it was new to me.
  • For a more up-to-the-minute example, here is a new course about topology and condensed matter.  Now if only I had time to go through this before the impending start of the new semester.
  • This preprint is also an important one.  There have been some major reports in the literature about quantum oscillations (e.g., resistivity or magnetization vs. magnetic field ) being observed in insulators.  This paper shows that one must be very careful, since the use of graphite gates can lead to a confounding effect that comes from those gates rather than the material under examination.
  • This PNAS paper is a neat one.  It can be hard to grow epitaxial films of some "stubborn" materials, ones involving refractory metals (high melting points, very low vapor pressures, often vulnerable to oxidation).  This paper shows that instead one can use solid forms of precursor compounds containing those metals.  The compounds sublime with reasonably high vapor pressures, and if one can work out their decomposition properly, it's possible to grow nice films and multilayers of otherwise tough materials.  (I'd need to be convinced that the purity achieved from this comparatively low temperature approach is really good.)

Monday, August 02, 2021

Metallic water!

What does it take to have a material behave as a metal, from the physicist's perspective?  I've written about this before (wow, I've been blogging for a long time).  Fundamentally, there have to be "gapless" charge-carrying excitations, so that the application of even a tiny electric field allows those charge carriers to transition into states with (barely) higher kinetic energies and momenta.  

Top: a droplet of NaK alloy.  Bottom: that droplet coated with adsorbed water that has become a metal.  From here.
In conventional band insulators, the electronic states are filled right up to the brim in an energy band.  Apply an electric field, and an electron has no states available into which it can go without somehow grabbing enough energy to make it all the way to the bottom of the next (conduction) band.  Since that band gap can be large (5.5 eV for diamond, 8.5 eV for NaCl), no current flows, and you have an insulator.

This is, broadly speaking, the situation in liquid water. (Even though it's a liquid, the basic concept of bands of energy levels is still helpful, though of course there are no Bloch waves as in crystalline solids.)  According to calculations and experiments, the band gap in ordinary water is about 7 eV.  You can dissolve ions in water and have those carry a current - that's the whole deal with electrolytes - but ordinarily water is not a conductor based on electrons.  It is possible to inject some electrons into water, and these end up "hydrated" or "solvated" thanks to interactions with the surrounding polar water molecules and the hydronium and hydroxyl ions floating around, but historically this does not result in a metal.  To achieve metallicity, you'd have to inject or borrow so many electrons that they could get up into that next band.

This paper from late last week seems to have done just that.  A few molecular layers of water adsorbed on the outside of a droplet of liquid sodium-potassium metal apparently end up taking in enough electrons (\( \sim 5 \times 10^{21}\) per cc) to become metallic, as detected through optical measurements of the conductivity (including a plasmon resonance).   It's rather transient, since chemistry continues and the whole thing oxidizes, but the result is quite neat!
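A free-electron back-of-the-envelope using that quoted density (my estimate, not a number from the paper) shows why this counts as a real metal: the Fermi energy of the injected electrons comes out around an eV, a thoroughly degenerate electron gas.

```python
import math

hbar = 1.0546e-34        # J s
m_e = 9.109e-31          # electron mass, kg
n = 5e21 * 1e6           # 5e21 per cc converted to per m^3

# Free-electron gas: k_F = (3 pi^2 n)^(1/3), E_F = hbar^2 k_F^2 / (2 m)
k_F = (3 * math.pi**2 * n) ** (1.0 / 3.0)
E_F = hbar**2 * k_F**2 / (2 * m_e)
E_F_eV = E_F / 1.602e-19
print(E_F_eV)            # ≈ 1.1 eV, far above k_B T at room temperature
```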

Friday, July 30, 2021

Workshop highlights: Spins, 1D topo materials from carbon, and more

While virtual meetings can be draining (no breaks to go hiking; no grabbing a beer and catching up, especially when attendees are spread out across 7 timezones), this workshop was a great way for me to catch up on some science that I'd been missing.  I can't write up everything (mea culpa), but here are a few experimental highlights:

  • Richard Berndt's group has again shown that shot noise integrated with STM is powerful, and they have used tunneling noise measurements to probe where and how spin-polarized transport happens through single radical-containing molecules on gold surfaces.
  • Katharina Franke's group has looked at what happens when you have a localized spin on the surface of a superconductor.  Exchange coupling can rip apart Cooper pairs and bind a quasiparticle in what are called Yu-Shiba-Rusinov states.  With STM, it is possible to map these and related phenomena spatially, and the states can also be tuned via tip height, leading to very pretty data.
  • Pavel Jelinek gave a talk with some really eye-popping images as well as cool science (amazing polymers from here).  I had not realized before that in 1D conjugated systems (think polyacetylene) it is possible to see a topological transition as a function of length, between a conjugated state (with valence-band-like orbitals filled, and conduction-band-like orbitals empty) and another conjugated state that has an unpaired electron localized at each end (equivalent to surface states) with effectively band inversion (empty valence-band-like states above filled conduction-band-like states) in the middle.  You can actually make polymers (shown here) that show these properties and image the end states via STM.  
  • Latha Venkataraman spoke about intellectually related work.  Ordinarily, even with a conjugated oligomer, conductance falls exponentially with increasing molecular length.   However, under the right circumstances, you can get the equivalent topological transition, creating resonant states localized at the molecular ends, and over some range of lengths, you can get electronic conduction increasing with increasing molecular length.  As the molecule gets longer the resonances become better defined and stronger, though at even larger lengths the two end states decouple from each other and conductance falls again.
  • Jascha Repp did a really nice job laying out their technique that is AFM with single-charge-tunneling to give STM-like information for molecules on insulating substrates.  Voltage pulses are applied in sync with the oscillating tip moving into close proximity with the molecule, such that single charges can be added or removed each cycle.  This is detected through shifts in the mechanical resonance of the AFM cantilever due to the electrostatic interactions between the tip and the molecule.  This enables time-resolved measurements as well, to look at things like excited state lifetimes in individual molecules.
The meeting is wrapping up today, and the discussions have been a lot of fun.  Hopefully we will get together in person soon!