Saturday, September 07, 2024

Seeing through tissue and Kramers-Kronig

There is a paper in Science this week that is just a great piece of work.  The authors find that by dyeing living tissue with a particular biocompatible dye molecule, they can make that tissue effectively transparent, so you can see through it.  The paper includes images (and videos) that are impressive. 
Seeing into a living mouse, adapted from here.

How does this work?  There are a couple of layers to the answer.  

Light scatters at the interface between materials with dissimilar optical properties, summarized mathematically in the frequency-dependent index of refraction, \(n\), which is related to the complex dielectric function \(\tilde{\epsilon}\); light within a material travels with a phase velocity of \(c/n\).  Water and fatty molecules have different indices, for example, so little droplets of fat in suspension scatter light strongly, which is why milk is, well, milky.  This kind of scattering is mostly why visible light doesn't make it through your skin very far.  Lower the mismatch between indices, and you turn down scattering at the interfaces.  Here is a cute demo of this that I pointed out about 15 (!) years ago:


Frosted glass scatters visible light well because it has surface bumpiness on the scale of the wavelength of visible light, and the index of refraction of glass is about 1.5 for visible light, while air has an index close to 1.  Fill in those bumps with something closer to the index of glass, like clear plastic packing tape, and suddenly you can see through frosted glass.  
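
To put rough numbers on this, here is a quick sketch (my own illustrative values, not from the paper) using the normal-incidence Fresnel reflectance for a single interface, \(R = ((n_{1}-n_{2})/(n_{1}+n_{2}))^{2}\):

```python
# Normal-incidence Fresnel reflectance at an interface between indices n1, n2:
# R = ((n1 - n2) / (n1 + n2))**2
def reflectance(n1, n2):
    return ((n1 - n2) / (n1 + n2)) ** 2

R_glass_air = reflectance(1.5, 1.0)    # frosted-glass bump against air
R_glass_tape = reflectance(1.5, 1.47)  # bump filled by tape (n ~ 1.47, an assumed value)

print(f"glass/air:  R = {R_glass_air:.4f}")   # 0.04
print(f"glass/tape: R = {R_glass_tape:.2e}")
print(f"reflection per interface reduced ~{R_glass_air / R_glass_tape:.0f}x")
```

Matching the indices to within a couple of percent cuts the reflected (scattered) intensity per interface by more than two orders of magnitude, which is the whole trick.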

In the dyed tissue, the index of refraction of the water-with-dye becomes closer to that of the fatty molecules that make up cell membranes, making that layer of tissue have much-reduced scattering, and voilà, you can see a mouse's internal organs.  Amazingly, this index matching idea is the plot device in H. G. Wells's The Invisible Man!

The physics question is then, how and why does the dye, which looks yellow and absorbs strongly in the blue/purple, change the index of refraction of the water in the visible?  The answer lies with a concept that very often seems completely abstract to students, the Kramers-Kronig relations.  

We describe how an electric field (from the light) polarizes a material using the frequency-dependent complex permittivity \(\tilde{\epsilon}(\omega) = \epsilon'(\omega) + i \epsilon''(\omega)\), where \(\omega\) is the frequency.  What this means is that there is a polarization that happens in-phase with the driving electric field (proportional to the real part of \(\tilde{\epsilon}(\omega)\)) and a polarization that lags or leads the phase of the driving electric field (the imaginary part, which leads to dissipation and absorption).   

The functions \(\epsilon'(\omega)\) and \(\epsilon''(\omega)\) can't be anything you want, though. Thanks to causality, the response of a material now can only depend on what the electric field has done in the past.  That restriction means that, when we decide to work in the frequency domain by Fourier transforming, there are relationships, the K-K relations, that must be obeyed between integrals of \(\epsilon'(\omega)\) and \(\epsilon''(\omega)\).  The wikipedia page has both a traditional (and to many students, obscure) derivation, as well as a time-domain picture.  

So, the dye molecules, with their very strong absorption in the blue/purple, make \(\epsilon''(\omega)\) really large in that frequency range.  The K-K relations require some compensating changes in \(\epsilon'(\omega)\) at lower frequencies to make up for this, and the result is the index matching described above.  
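
To see this concretely, here is a numerical sketch with a single Lorentz-oscillator absorption line standing in for the dye (all parameters invented for illustration): feed \(\epsilon''(\omega)\) through the K-K integral and you recover \(\epsilon'(\omega)\), including the enhancement on the low-frequency side of the absorption peak.

```python
import numpy as np

# Single Lorentz oscillator standing in for the dye's strong blue absorption.
# Made-up parameters:  eps(w) = 1 + wp^2 / (w0^2 - w^2 - i*g*w)
w0, g, wp = 2.0, 0.2, 0.7
w = np.linspace(1e-3, 20.0, 4000)
h = w[1] - w[0]
denom = (w0**2 - w**2) ** 2 + (g * w) ** 2
eps2 = wp**2 * g * w / denom                      # imaginary part (absorption)
eps1_exact = 1 + wp**2 * (w0**2 - w**2) / denom   # real part (analytic answer)

# Kramers-Kronig: eps1(w) = 1 + (2/pi) P int_0^inf w' eps2(w') / (w'^2 - w^2) dw'
# Maclaurin's method for the principal value: evaluate the integrand only at
# grid points of opposite parity to w_i, so the singular point is never sampled.
f = w * eps2
eps1_kk = np.empty_like(w)
for i in range(len(w)):
    j = np.arange(1 - i % 2, len(w), 2)           # opposite-parity indices
    eps1_kk[i] = 1 + (2 / np.pi) * 2 * h * np.sum(f[j] / (w[j] ** 2 - w[i] ** 2))

# The absorption peak at w0 "pulls up" eps1 (and hence n) at lower frequencies.
i_low = np.argmin(np.abs(w - 1.0))
print("eps1 from K-K at w = 1 (below the peak):", round(eps1_kk[i_low], 3))
print("analytic value:                         ", round(eps1_exact[i_low], 3))
```

The reconstructed \(\epsilon'(\omega)\) agrees with the analytic Lorentzian everywhere, and below the resonance it sits well above 1 even where \(\epsilon''\) is negligible - absorption in the blue raising the index in the red, exactly the effect exploited by the dye.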

This work seems like it should have important applications in medical imaging, and it's striking to me that this had not been done before.  The K-K relations have been known in their present form for about 100 years.  It's inspiring that new, creative insights can still come out of basic waves and optics.

Saturday, August 31, 2024

Items of interest

The start of the semester has been very busy, but here are some items that seem interesting:

  • As many know, there has been a lot of controversy in recent years about high pressure measurements of superconductivity.  Here is a first-hand take by one of the people who helped bring the Dias scandal into the light.  It's a fascinating if depressing read.
  • Adapted from [1].
    Related, a major challenge in the whole diamond anvil cell search for superconductivity is performing measurements that are more robust and determinative than 4-point resistance measurements and optical spectroscopy.  Back in March I had pointed out a Nature paper incorporating nitrogen-vacancy centers into the diamond anvils themselves to attempt in situ magnetometry of the Meissner effect.  Earlier this month, I saw this Phys Rev Lett paper, in which the authors have incorporated a tunnel junction directly onto the diamond anvil facet.  In addition to the usual Au leads for conduction measurements, they also have Ta leads that are coated with a native Ta2O5 oxide layer that functions as a tunnel barrier.  They've demonstrated clean-looking tunneling spectroscopy on sulfur at 160 GPa, which is pretty impressive.  Hopefully this will eventually be applied to the higher pressures and more dramatic systems of, e.g., H2S, reported to show 203 K superconductivity.  I do wonder if they will have problems applying this to hydrides, as one could imagine that having lots of hydrogen around might not be good for the oxide tunnel barriers. 
  • Saw a talk this week by Dr. Dev Shenoy, head of the US DoD's microelectronics effort.  It was very interesting and led me down the rabbit hole of learning more about the extreme ultraviolet lithography machines that are part of the state of the art.  The most advanced of these are made by ASML, are as big as a freight car, and cost almost $400M apiece.  Intel put up a video about taking delivery of one.  The engineering is pretty ridiculous.  Working with 13.5 nm light, you have to use mirrors rather than lenses, and the flatness/precision requirements on the optics are absurd.  It would really be transformative if someone could pull a SpaceX and come up with an approach that works as well but only costs $50M per machine, say.  (Of course, if it were easy, someone would have done it.  I'm also old enough to remember Bell Labs' effort at a competing approach, projective electron beam lithography.)
  • Lastly, Dan Ralph from Cornell has again performed a real pedagogical service to the community.  A few years ago, he put on the arXiv a set of lecture notes about the modern topics of Berry curvature and electronic topology meant to slot into an Ashcroft and Mermin solid state course.  Now he has uploaded another set of notes, this time on electron-electron interactions, the underpinnings of magnetism, and superconductivity, that again are at the right level to modernize and complement that kind of a course.  Highly recommended.

Saturday, August 17, 2024

Experimental techniques: bridge measurements

When we teach undergraduates about materials and measuring electrical resistance, we tend to gloss over the fact that there are specialized techniques for this - it's more than just hooking up a battery and an ammeter.  If you want high precision results, such as measuring the magnetoresistance \(\Delta R(B)\), where \(B\) is a magnetic field, to a part in \(10^{5}\) or better, more sophisticated tools are needed.  Bridge techniques comprise one class of these: instead of, say, measuring the voltage drop across a sample carrying a known current, you measure the difference between that voltage drop and the voltage drop across a known reference resistor.   

Why is this good?  Well, imagine that your sample resistance is something like 1 kOhm, and you want to look for changes in that resistance on the order of 10 milliOhms.  Often we need to use relatively low currents because in condensed matter physics we are doing low temperature measurements and don't want to heat up the sample.  If you used 1 microAmp of current, then the voltage drop across the sample would be about 1 mV and the changes you're looking for would be 10 nV, which is very tough to measure on top of a 1 mV background.  If you had a circuit where you were able to subtract off that 1 mV and only look at the changes, this is much more do-able.
Wheatstone bridge, from wikipedia

Sometimes in undergrad circuits, we teach the Wheatstone bridge, shown at right.  The idea is, you dial around the variable resistor \(R_{2}\) until the voltage \(V_{G} = 0\).  When the bridge is balanced like this, that means that \(R_{2}/R_{1} = R_{x}/R_{3}\), where \(R_{x}\) is the sample you care about and \(R_{1}\) and \(R_{3}\) are reference resistors that you know.  Now you can turn up the sensitivity of your voltage measurement to be very high, since you're looking at deviations away from \(V_{G} = 0\).   
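
As a toy numerical version of this (values chosen to match the example above, not any particular instrument):

```python
# Off-balance output of a Wheatstone bridge driven by V_s, with the sample Rx
# in one arm.  Numbers mimic the example above: ~1 kOhm sample, ~1 uA current,
# looking for a 10 mOhm change.
def bridge_output(v_s, r1, r2, r3, rx):
    # V_G = difference between the midpoints of the two voltage dividers
    return v_s * (rx / (r3 + rx) - r2 / (r1 + r2))

R = 1000.0     # reference resistors, Ohms
v_s = 2e-3     # 2 mV drive -> ~1 uA through the 2 kOhm sample arm

print(bridge_output(v_s, R, R, R, R))         # balanced: 0 V
print(bridge_output(v_s, R, R, R, R + 0.01))  # 10 mOhm shift: ~5 nV
```

Near balance, \(V_{G} \approx V_{s}\Delta R/(4R)\): the 1 mV background has been nulled away, and the voltmeter only has to resolve the few-nV deviation itself.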

You can do better in sensitivity by using an AC voltage source instead of the battery shown, and then use a lock-in amplifier for the voltage detection across the bridge.  That helps avoid some slow, drift-like confounding effects or thermoelectric voltages. 

Less well-known:  Often in condensed matter and nanoscale physics, the contact resistances where the measurement leads are attached aren't negligible.  If we are fortunate we can set up a four-terminal measurement that mitigates this concern, so that the voltage measured on the sample is ideally not influenced by the contacts where current is injected or collected.  
A Kelvin bridge, from wikipedia

Is there a way to do a four-terminal bridge measurement?  Yes, it's called a Kelvin bridge, shown at right in its DC version.  When done properly, you can use variable resistors to null out the contact resistances.  This was originally developed back in the late 19th/early 20th century to measure resistances smaller than an Ohm or so (and so even small contact resistances can be relevant).  In many solid state systems, e.g., 2D materials, contact resistances can be considerably larger, so this comes in handy even for larger sample resistances.  

There are also capacitance bridges and inductance bridges - see here for something of an overview.  A big chunk of my PhD involved capacitance bridge measurements to look at changes in the dielectric response with \(10^{-7}\) levels of sensitivity.

One funny story to leave you:  When I was trying to understand all about the Kelvin bridge while I was a postdoc, I grabbed a book out of the Bell Labs library about AC bridge techniques that went back to the 1920s.  The author kept mentioning something cautionary about looking out for "the head effect".  I had no idea what this was; the author was English, and I wondered whether this was some British/American language issue, like how we talk about electrical "ground" in the US, but in the UK they say "earth".  Eventually I realized what this was really about.  Back before lock-ins and other high sensitivity AC voltmeters were readily available, it was common to run an AC bridge at a frequency of something like 1 kHz, and to use a pair of headphones as the detector.  The human ear is very sensitive, so you could listen to the headphones and balance the bridge until you couldn't hear the 1 kHz tone anymore (meaning the AC \(V_{G}\) signal on the bridge was very small).  The "head effect" is when you haven't designed your bridge correctly, so that the impedance of your body screws up the balance of the bridge when you put the headphones on.  The "head effect" = bridge imbalance because of the capacitance or inductance of your head.  See here.

Sunday, August 04, 2024

CHIPS and Science, NSF support, and hypocrisy

Note: this post is a semi-rant about US funding for science education; if this isn't your cup of tea, read no further.


Two years ago, the CHIPS and Science Act (link goes to the full text of the bill, via the excellent congress.gov service of the Library of Congress) was signed into law.  This has gotten a lot of activity going in the US related to the semiconductor industry, as briefly reviewed in this recent discussion on Marketplace.  There are enormous investments by industry in semiconductor development and manufacturing in the US (as well as funding through US agencies such as DARPA, e.g.).  It was recognized in the act that the long-term impact of all of this will be contingent in part upon "workforce development" - having ongoing training and education of cohorts of people who can actually support all of this.  The word "workforce" shows up 222 times in the actual bill.   Likewise, there is appreciation that basic research is needed to set up sustained success and competitiveness - that's one reason why the act authorizes $81B over five years for the National Science Foundation, which would have roughly doubled the NSF budget over that period.

The reality has been sharply different.  Authorizations are not the same thing as appropriations, and the actual appropriation last year fell far short of the aspirational target.  NSF's budget for FY24 was $9.085B (see here) compared with $9.899B for FY23; the STEM education piece was $1.172B in FY24 (compared to $1.371B in FY23), a 14.5% year-over-year reduction.  That's even worse than the House version of the budget, which had proposed to cut STEM education by 12.8%.  In the current budget negotiations (see here), the House is now proposing an additional 14.7% cut specifically to STEM education.  Just to be clear, that is the part of NSF's budget that is supposed to oversee the workforce development parts of CHIPS and Science.  Specifically, the bill says that the NSF is supposed to support "undergraduate scholarships, including at community colleges, graduate fellowships and traineeships, postdoctoral awards, and, as appropriate, other awards, to address STEM workforce gaps, including for programs that recruit, retain, and advance students to a bachelor's degree in a STEM discipline concurrent with a secondary school diploma, such as through existing and new partnerships with State educational agencies."  This is also the part of NSF that does things like Research Experience for Undergraduates and Research Experience for Teachers programs, and postdoctoral fellowships.  

Congressional budgeting in the US is insanely complicated and fraught for many reasons.  Honest, well-motivated people can have disagreements about priorities and appropriate levels of government spending.  That said, I think it is foolish not to support the educational foundations needed for the large investments in high tech manufacturing and infrastructure.  The people who oppose this kind of STEM education support tend to be the same people who also oppose allowing foreign talent into the country in high tech sectors.  If the US is serious about this kind of investment for future tech competitiveness, half-measures and failing to follow through are decidedly not helpful.

Sunday, July 28, 2024

Items of interest

A couple of interesting papers that I came across this week:

  • There has long been interest in purely electronic cooling techniques (no moving parts!) that would work at cryogenic temperatures.  You're familiar with ordinary evaporative cooling - that's what helps cool down your tea or coffee when you blow across the top of your steaming mug, and it's what makes you feel cold when you step out of the shower.  In evaporative cooling, the most energetic molecules can escape from the liquid into the gas phase, and the remaining molecules left behind reestablish thermal equilibrium at a lower temperature.  One can make a tunnel junction between a normal metal and a superconductor, and under the right circumstances, the hottest (thermally excited) electrons in the normal metal can be driven into the superconductor, leading to net cooling of the remaining electrons in the normal metal.  This is pretty neat, but it's had somewhat limited utility due to relatively small cooling power - here is a non-paywalled review that includes discussion of these approaches.  This week, the updated version of this paper went on the arXiv, demonstrating that in Al/AlOx/Nb junctions it is possible to cool from about 2.4 K to about 1.6 K, purely via electronic means.  This seems like a nice advance, especially as the quantum info trends have pushed hard on improving wafer-level Nb electronics.
  • I've written before about chirality-induced spin selectivity (see the first bullet here).  This is a still poorly understood phenomenon in which electrons passing through a chiral material acquire a net spin polarization, depending on the handedness of the chirality and the direction of the current.  This new paper in Nature is a great demonstration.  Add a layer of chiral perovskite to the charge injection path of a typical III-V multiple quantum well semiconductor LED, and the emitted light acquires a net circular polarization, the sign of which depends on the sign of the chirality.  This works at room temperature, by the way.  

Saturday, July 20, 2024

The physics of squeaky shoes

In these unsettling and trying times, I wanted to write about the physics of a challenge I'm facing in my professional life: super squeaky shoes.  When I wear a particularly comfortable pair of shoes at work and walk down some hallways in my building (but not all of them), my shoes squeak very loudly with every step.  How and why does this happen, physically?  

The shoes in question.

To understand this, we need to talk a bit about friction, the sideways interfacial force between two surfaces when one surface is sheared (or attempted to be sheared) with respect to the other.  (Tribology is the study of friction, btw.)  In introductory physics we teach some (empirical) "laws" of friction, described in detail on the wikipedia page linked above as well as here:

  1. For static friction (no actual sliding of the surfaces relative to each other), the frictional force \(F_{f} \le \mu_{s}N\), where \(\mu_{s}\) is the "coefficient of static friction" and \(N\) is the normal force (pushing the two surfaces together).  The force is directed in the plane and takes on the magnitude needed so that no sliding happens, up to its maximum value, at which point the surfaces start slipping relative to each other.
  2. For sliding or kinetic friction, \(F_{f} = \mu_{k}N\), where \(\mu_{k}\) is the coefficient of kinetic or sliding friction, and the force is directed in the plane to oppose the relative sliding motion.  The friction coefficients depend on the particular materials and their surface conditions.
  3. The friction forces are independent of the apparent contact area between the surfaces.  
  4. The kinetic friction force is independent of the relative sliding speed between the surfaces.
These "laws", especially (3) and (4), are truly weird once we know a bit more about physics, and I discuss this a little in my textbook.  The macroscopic friction force is emergent, meaning that it is a consequence of the materials being made up of many constituent particles interacting.  It's not a conservative force, in that energy dissipated through the sliding friction force doing work is "lost" from the macroscopic movement of the sliding objects and ends up in the microscopic vibrational motion (and electronic distributions, if the objects are metals).  See here for more discussion of friction laws.

Shoe squeaking happens because of what is called "stick-slip" motion.  When I put my weight on my right shoe, the rubber sole of the shoe deforms and elastic forces (like a compressed spring) push the rubber to spread out, favoring sliding of the rubber at the rubber-floor interface.  At some point, the local static friction maximum force is exceeded and the rubber begins to slide relative to the floor.  That lets the rubber "uncompress" some, so that the spring-like elastic forces are reduced, and if they fall back below \(\mu_{s}N\), that bit of sole will stick on the surface again.  A similar situation is shown in this model from Wolfram, looking at a mass (attached to an anchored spring) interacting with a conveyor belt.  If this start/stop cyclic motion happens at acoustic frequencies in the kHz range, it sounds like a squeak, because the start-stop motion excites sound waves in the air (and the solid surfaces).  This stick-slip phenomenon is also why brakes on cars and bikes squeal, why hinges on doors in spooky houses creak, and why that one board in your floor makes that weird noise.  It's also used in various piezoelectric actuators.
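
Here is a minimal simulation in the spirit of that Wolfram model - a mass on a spring riding on a moving belt, with made-up parameters - that reproduces the periodic stick-slip cycle:

```python
import numpy as np

# Mass on an anchored spring, resting on a belt moving at constant speed.
# All parameters are invented for illustration.
m, k, g = 1.0, 100.0, 9.8        # mass (kg), spring constant (N/m), gravity (m/s^2)
mu_s, mu_k = 0.6, 0.4            # static / kinetic friction coefficients
v_belt = 0.1                     # belt speed, m/s
N = m * g                        # normal force

dt, t_end = 1e-4, 5.0
x, v, sticking = 0.0, v_belt, True
slips = 0                        # count stick -> slip events (the "squeaks")
x_max = 0.0

for _ in range(int(t_end / dt)):
    if sticking:
        x += v_belt * dt                      # carried along with the belt
        if abs(k * x) > mu_s * N:             # spring force beats static friction
            sticking, slips = False, slips + 1
    else:
        # kinetic friction drags the mass in the direction of relative belt motion
        a = (-k * x + mu_k * N * np.sign(v_belt - v)) / m
        v_new = v + a * dt
        # re-stick when the relative velocity crosses zero and static
        # friction can hold the spring force
        if (v_new - v_belt) * (v - v_belt) <= 0 and abs(k * x) <= mu_s * N:
            v, sticking = v_belt, True
        else:
            v = v_new
            x += v * dt
    x_max = max(x_max, x)

print(f"{slips} slip events in {t_end} s; max stretch {x_max:.4f} m "
      f"(static limit {mu_s * N / k:.4f} m)")
```

Each stick-to-slip event is one "squeak"; roughly speaking, stiffer rubber (larger \(k\)) pushes the cycle rate up toward audible frequencies, while a slicker interface (smaller \(\mu_{s}\)) shrinks the stretch the sole builds up before letting go.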

Macroscopic friction emerges from a zillion microscopic interactions and is affected by the chemical makeup of the surfaces, their morphology and roughness, any adsorbed layers of moisture or contaminants (remember: every surface around you right now is coated in a few molecular layers of water and hydrocarbon contamination), and van der Waals forces, among other things.  The reason my shoes squeak in some hallways but not others has to do with how the floors have been cleaned.  I could stop the squeaking by altering the bottom surface of my soles, though I wouldn't want to use a lubricant that is so effective that it seriously lowers \(\mu_{s}N\) and makes me slip.  

Friction is another example of an emergent phenomenon that is everywhere around us, of enormous technological and practical importance, and has some remarkable universality of response.  This kind of emergence is at the heart of the physics of materials, and trying to predict friction and squeaky shoes starting from elementary particle physics is just not do-able. 


Sunday, July 14, 2024

Brief items - light-driven diamagnetism, nuclear recoil, spin transport in VO2

Real life continues to make itself felt in various ways this summer (and that's not even an allusion to political madness), but here are three papers (two from others and a self-indulgent plug for our work) you might find interesting.

  • There has been a lot of work in recent years particularly by the group of Andrea Cavalleri, in which they use infrared light to pump particular vibrational modes in copper oxide superconductors (and other materials) (e.g. here).  There are long-standing correlations between the critical temperature for superconductivity, \(T_{c}\), and certain bond angles in the cuprates.  Broadly speaking, using time-resolved spectroscopy, measurements of the optical conductivity in these pumped systems show superconductor-like forms as a function of energy even well above the equilibrium \(T_{c}\), making it tempting to argue that the driven systems are showing nonequilibrium superconductivity.  At the same time, there has been a lot of interest in looking for other signatures, such as signs of the way superconductors expel magnetic flux through the famous Meissner effect.  In this recent result (arXiv here, Nature here), magneto-optic measurements in this same driven regime show signs of field build-up around the perimeter of the driven cuprate material in a magnetic field, as would be expected from Meissner-like flux expulsion.  I haven't had time to read this in detail, but it looks quite exciting.  
  • Optical trapping of nanoparticles is a very useful tool, and with modern techniques it is possible to measure the position and response of individual trapped particles to high precision (see here and here).  In this recent paper, the group of David Moore at Yale has been able to observe the recoil of such a particle due to the decay of a single atomic nucleus (which spits out an energetic alpha particle).  As an experimentalist, I find this extremely impressive, in that they are measuring the kick given to a nanoparticle a trillion times more massive than the ejected helium nucleus.  
  • From our group, we have published a lengthy study (arXiv here, Phys Rev B here) of local/longitudinal spin Seebeck response in VO2, a material with an insulating state that is thought to be magnetically inert.  This corroborates our earlier work, discussed here.  In brief, in ideal low-T VO2, the vanadium atoms are paired up into dimers, and the expectation is that the unpaired 3d electrons on those atoms form singlets with zero net angular momentum.  The resulting material would then not be magnetically interesting (though it could support triplet excitations called triplons).  Surprisingly, at low temperatures we find a robust spin Seebeck response, comparable to what is observed in ordered insulating magnets like yttrium iron garnet.  It seems to have the wrong sign to be from triplons, and it doesn't seem possible to explain the details using a purely interfacial model.  I think this is intriguing, and I hope other people take notice.
Hoping for more time to write as the summer progresses.  Suggestions for topics are always welcome, though I may not be able to get to everything.