Tuesday, October 01, 2024

CHIPS and Science - the reality vs the aspiration

I already wrote about this issue here back in August, but I wanted to highlight a policy statement that I wrote with colleagues as part of the Election 2024: Policy Playbook from Rice's Baker Institute, which "delivers nonpartisan, expert insights into key issues at stake on the 2024 campaign trail and beyond. Presented by Rice University and the Baker Institute for Public Policy, the series offers critical context, analysis, and recommendations to inform policymaking in the United States and Texas."

The situation is summarized in this graph.  It will be very difficult to achieve the desired policy goals of the CHIPS and Science Act if congressional appropriations don't come anywhere close to the targets authorized in the Act.  What the plot does not show are the cuts to the STEM education components of NSF and other agencies, even though a main goal of the Act is supposed to be education and workforce development to support the semiconductor industry.

Anyway, please take a look.  It's a very brief document.

Sunday, September 29, 2024

Annual Nobel speculation thread

Not that prizes are the be-all and end-all, but this has become an annual tradition.  Who are your speculative laureates this year for physics and chemistry?  As I did last year and for several years before that, I will put forward my usual thought that the physics prize could go to Aharonov and Berry for geometric phases in physics (even though Pancharatnam is intellectually in there as well; he died in 1969).  This is a long shot, as always.  Given that attosecond experiments were honored last year, AMO/quantum info foundations in 2022, and climate + spin glasses/complexity in 2021, it seems like astro is "due".

Sunday, September 22, 2024

Lots to read, including fab for quantum and "Immaterial Science"

Sometimes there are upticks in the rate of fun reading material.  In the last few days:

  • A Nature paper has been published by a group of authors predominantly from IMEC in Belgium, in which they demonstrate CMOS-compatible manufacturing of superconducting qubit hardware (Josephson junctions and transmon qubits based on aluminum) across 300 mm diameter wafers.  This is a pretty big deal - their method for making the Al/AlOx/Al tunnel junctions is different from the shadow evaporation method routinely used in small-scale fab.  They find quite good performance of the individual qubits, with strong uniformity across the whole wafer, testing representative devices chosen at random.  They did not actually perform multi-qubit operations, but what they have shown is certainly a necessary step if there is ever going to be truly large-scale quantum information processing based on this kind of superconducting approach.
  • Interestingly, Friday on the arXiv, a group led by researchers at Karlsruhe demonstrated spin-based quantum dot qubits in Si/SiGe, made on 300 mm substrates.  This fab process comes complete with an integrated Co micromagnet for help in conducting electric dipole spin resonance.  They demonstrate impressive performance in terms of single-qubit properties and operations, with the promise that the coherence times would be at least an order of magnitude longer if they had used isotopically purified 28Si material.  (The nuclear spins of the stray 29Si atoms in the ordinary Si used here are a source of decoherence.)  
So, while tremendous progress has been made with atomic physics approaches to quantum computing (optical tweezer arrays, trapped ions), it's not wise to count out the solid-state approaches.  The engineering challenges are formidable, but solid-state platforms are based on fab approaches that can make billions of transistors per chip, with complex 3D integration.

  • On the arXiv this evening is also this review about "quantum geometry", which seems like a pretty readable overview of how the underlying structure of the wavefunctions in crystalline solids (the part historically neglected for decades, but now appreciated through its relevance to topology and a variety of measurable consequences) affects electronic and optical response.  I just glanced at it, but I want to make time to look it over in detail.
  • Almost 30 years ago, Igor Dolgachev at Michigan did a great service by writing up a brief book entitled "A Brief Introduction to Physics for Mathematicians".  That link is to the pdf version hosted on his website.  Interesting to see how this is presented, especially since a number of approaches routinely shown to undergrad physics majors (e.g., almost anything we do with Dirac delta functions) generally horrify rigorous mathematics students.
  • Also fun (big pdf link here) is the first fully pretty and typeset issue of the amusing Journal of Immaterial Science, shown at right.  There is a definite chemistry slant to the content, and I encourage you to read their (satirical) papers as they come out on their website.


Monday, September 16, 2024

Fiber optics + a different approach to fab

 Two very brief items of interest:

  • This article is a nice popular discussion of the history of fiber optics and the remarkable progress it's made for telecommunications.  If you're interested in a more expansive but very accessible take on this, I highly recommend City of Light by Jeff Hecht (not to be confused with Eugene Hecht, author of the famous optics textbook).
  • I stumbled upon an interesting effort by Yokogawa, the Japanese electronics manufacturer, to provide an alternative path for semiconductor device prototyping that they call minimal fab.  The idea is that, instead of prototyping circuits on 200 mm wafers or larger (the industry standard for large-scale production is 200 mm or 300 mm; efforts to go up to 450 mm wafers have been shelved for now), there are times when it makes sense to work on 12.5 mm substrates.  Their setup uses maskless photolithography and is intended to be used without needing a cleanroom.  Admittedly, this strongly limits device dimensions to 1970s-era micron scales (presumably this could be pushed to 1-2 microns with a fancier litho tool), and it's designed for single-layer processing (not many-layer alignments with vias).  Still, this could be very useful for startup efforts, and apparently it's so simple that a child could use it.

Saturday, September 07, 2024

Seeing through tissue and Kramers-Kronig

There is a paper in Science this week that is just a great piece of work.  The authors find that by dyeing living tissue with a particular biocompatible dye molecule, they can make that tissue effectively transparent, so you can see through it.  The paper includes images (and videos) that are impressive. 
Seeing into a living mouse, adapted from here.

How does this work?  There are a couple of layers to the answer.  

Light scatters at the interface between materials with dissimilar optical properties, summarized mathematically by the frequency-dependent index of refraction \(n\), which is related to the complex dielectric function \(\tilde{\epsilon}\); light within a material travels with a phase velocity of \(c/n\).  Water and fatty molecules have different indices, for example, so little droplets of fat in suspension scatter light strongly, which is why milk is, well, milky.  This kind of scattering is mostly why visible light doesn't make it very far through your skin.  Lower the mismatch between the indices, and you turn down the scattering at the interfaces.  Here is a cute demo of this that I pointed out about 15 (!) years ago:


Frosted glass scatters visible light well because it has surface bumpiness on the scale of the wavelength of visible light, and the index of refraction of glass is about 1.5 for visible light, while air has an index close to 1.  Fill in those bumps with something closer to the index of glass, like clear plastic packing tape, and suddenly you can see through frosted glass.  
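To put numbers on that, here is a minimal sketch in python (normal-incidence Fresnel reflection at a single smooth interface, with representative indices I've picked just for illustration; real frosted glass involves rough surfaces and multiple scattering, but the dependence on index mismatch is the point):

```python
# Normal-incidence reflectance at a single interface:
# R = ((n1 - n2) / (n1 + n2))**2 -- matching the indices turns scattering off.
def reflectance(n1, n2):
    return ((n1 - n2) / (n1 + n2)) ** 2

print(f"glass/air  : R = {reflectance(1.5, 1.0):.4f}")    # ~4% per interface
print(f"glass/tape : R = {reflectance(1.5, 1.49):.1e}")   # ~1e-5, essentially clear
print(f"water/lipid: R = {reflectance(1.33, 1.46):.1e}")  # tissue-like mismatch
```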

In the dyed tissue, the index of refraction of the water-with-dye becomes closer to that of the fatty molecules that make up cell membranes, making that layer of tissue have much-reduced scattering, and voilà, you can see a mouse's internal organs.  Amazingly, this index matching idea is the plot device in H. G. Wells's The Invisible Man!

The physics question is then, how and why does the dye, which looks yellow and absorbs strongly in the blue/purple, change the index of refraction of the water in the visible?  The answer lies with a concept that very often seems completely abstract to students, the Kramers-Kronig relations.  

We describe how an electric field (from the light) polarizes a material using the frequency-dependent complex permittivity \(\tilde{\epsilon}(\omega) = \epsilon'(\omega) + i \epsilon''(\omega)\), where \(\omega\) is the frequency.  What this means is that there is a polarization that happens in-phase with the driving electric field (proportional to the real part of \(\tilde{\epsilon}(\omega)\)) and a polarization that lags or leads the phase of the driving electric field (the imaginary part, which leads to dissipation and absorption).   

The functions \(\epsilon'(\omega)\) and \(\epsilon''(\omega)\) can't be anything you want, though.  Thanks to causality, the response of a material now can only depend on what the electric field has done in the past.  That restriction means that, when we decide to work in the frequency domain by Fourier transforming, there are relationships, the K-K relations, that must be obeyed: \(\epsilon'\) at any one frequency is fixed by an integral of \(\epsilon''\) over all frequencies, and vice versa.  The wikipedia page has both a traditional (and to many students, obscure) derivation, as well as a time-domain picture.
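Written out (this is the standard textbook form, modulo unit conventions), one of the pair is

\[ \epsilon'(\omega) = 1 + \frac{2}{\pi}\, \mathcal{P}\int_{0}^{\infty} \frac{\omega'\, \epsilon''(\omega')}{\omega'^{2} - \omega^{2}}\, d\omega' , \]

where \(\mathcal{P}\) denotes the Cauchy principal value, and there is a companion relation expressing \(\epsilon''\) in terms of an integral over \(\epsilon'\).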

So, the dye molecules, with their very strong absorption in the blue/purple, make \(\epsilon''(\omega)\) really large in that frequency range.  The K-K relations then force a compensating increase in \(\epsilon'(\omega)\) at frequencies below the absorption, raising the index of the dyed water in the red/green part of the visible, and the result is the index matching described above.
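If you want to see that mechanism in action, here is a little numerical sketch (a toy Lorentz-oscillator absorber with made-up parameters standing in for the dye, not the actual molecular response):

```python
import numpy as np

# Toy model: a Lorentz oscillator absorbing at w0 (think blue/purple), plus a
# brute-force numerical K-K transform of eps2 to recover eps1.
w0, gamma, wp = 3.0, 0.3, 1.0                  # resonance, width, strength (arb. units)
w = np.linspace(1e-3, 20.0, 200001)            # uniform frequency grid
dw = w[1] - w[0]
d = (w0**2 - w**2)**2 + (gamma * w)**2
eps2 = wp**2 * gamma * w / d                   # imaginary part: the absorption peak
eps1_exact = 1 + wp**2 * (w0**2 - w**2) / d    # known real part, for comparison

def kk_eps1(i):
    """eps1(w_i) = 1 + (2/pi) PV int w' eps2(w') / (w'^2 - w_i^2) dw'.
    Crude principal value: zero out the singular grid point; on a uniform
    grid the large contributions just above and below it nearly cancel."""
    integrand = w * eps2 / (w**2 - w[i]**2)
    integrand[i] = 0.0
    return 1 + (2 / np.pi) * np.sum(integrand) * dw

for wq in (1.5, 2.0, 2.5):                     # frequencies below the absorption
    i = int(np.argmin(np.abs(w - wq)))
    print(f"w = {wq}: K-K gives eps1 = {kk_eps1(i):.4f}, exact = {eps1_exact[i]:.4f}")
# Below the absorption, eps1 (and hence n) is pulled upward -- the same
# index-raising effect that the dye produces in the visible.
```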

This work seems like it should have important applications in medical imaging, and it's striking to me that this had not been done before.  The K-K relations have been known in their present form for about 100 years.  It's inspiring that new, creative insights can still come out of basic waves and optics.

Saturday, August 31, 2024

Items of interest

The start of the semester has been very busy, but here are some items that seem interesting:

  • As many know, there has been a lot of controversy in recent years about high pressure measurements of superconductivity.  Here is a first-hand take by one of the people who helped bring the Dias scandal into the light.  It's a fascinating if depressing read.
  • Related, a major challenge in the whole diamond anvil cell search for superconductivity is trying to apply measurement techniques more robust and conclusive than 4-point resistance measurements and optical spectroscopy.  Back in March I had pointed out a Nature paper incorporating nitrogen-vacancy centers into the diamond anvils themselves to attempt in situ magnetometry of the Meissner effect.  Earlier this month, I saw this Phys Rev Lett paper, in which the authors have incorporated a tunnel junction directly onto the diamond anvil facet.  In addition to the usual Au leads for conduction measurements, they also have Ta leads that are coated with a native Ta2O5 oxide layer that functions as a tunnel barrier.  They've demonstrated clean-looking tunneling spectroscopy on sulphur at 160 GPa, which is pretty impressive.  Hopefully this will eventually be applied to the higher pressures and more dramatic systems of, e.g., H2S, reported to show 203 K superconductivity.  I do wonder if they will have problems applying this to hydrides, as one could imagine that having lots of hydrogen around might not be good for the oxide tunnel barriers.
  • Saw a talk this week by Dr. Dev Shenoy, head of the US DoD's microelectronics effort.  It was very interesting and led me down the rabbit hole of learning more about the extreme ultraviolet lithography machines that are part of the state of the art.  The most advanced of these are made by ASML, are as big as a freight car, and cost almost $400M apiece.  Intel put up a video about taking delivery of one.  The engineering is pretty ridiculous.  Working with 13.5 nm light, you have to use mirrors rather than lenses, and the flatness/precision requirements on the optics are absurd.  It would really be transformative if someone could pull a SpaceX and come up with an approach that works as well but costs only $50M per machine, say.  (Of course, if it were easy, someone would have done it.  I'm also old enough to remember Bell Labs' effort at a competing approach, projection electron beam lithography.)
  • Lastly, Dan Ralph from Cornell has again performed a real pedagogical service to the community.  A few years ago, he put on the arXiv a set of lecture notes about the modern topics of Berry curvature and electronic topology meant to slot into an Ashcroft and Mermin solid state course.  Now he has uploaded another set of notes, this time on electron-electron interactions, the underpinnings of magnetism, and superconductivity, that again are at the right level to modernize and complement that kind of a course.  Highly recommended.

Saturday, August 17, 2024

Experimental techniques: bridge measurements

When we teach undergraduates about materials and measuring electrical resistance, we tend to gloss over the fact that there are specialized techniques for this - it's more than just hooking up a battery and an ammeter.  If you want high-precision results, such as measuring the magnetoresistance \(\Delta R(B)\), where \(B\) is a magnetic field, to a part in \(10^{5}\) or better, more sophisticated tools are needed.  Bridge techniques are one class of these: instead of, say, measuring the voltage drop across a sample carrying a known current, you measure the difference between that voltage drop and the voltage drop across a known reference resistor.

Why is this good?  Well, imagine that your sample resistance is something like 1 kOhm, and you want to look for changes in that resistance on the order of 10 milliOhms.  Often we need to use relatively low currents, because in condensed matter physics we are doing low temperature measurements and don't want to heat up the sample.  If you used 1 microAmp of current, then the voltage drop across the sample would be about 1 mV, and the changes you're looking for would be 10 nV, which is very tough to measure on top of a 1 mV background.  If you had a circuit that let you subtract off that 1 mV and look only at the changes, the measurement becomes much more doable.
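Just to spell out those numbers (a trivial sketch in python):

```python
# 1 kOhm sample, 10 mOhm change of interest, 1 uA measurement current.
R, dR, I = 1e3, 10e-3, 1e-6
print(f"background voltage: {I * R * 1e3:.1f} mV")   # 1.0 mV
print(f"signal of interest: {I * dR * 1e9:.0f} nV")  # 10 nV
print(f"fractional change : {dR / R:.0e}")           # 1e-05
```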
Wheatstone bridge, from wikipedia

Sometimes in undergrad circuits, we teach the Wheatstone bridge, shown at right.  The idea is, you dial around the variable resistor \(R_{2}\) until the voltage \(V_{G} = 0\).  When the bridge is balanced like this, that means that \(R_{2}/R_{1} = R_{x}/R_{3}\), where \(R_{x}\) is the sample you care about and \(R_{1}\) and \(R_{3}\) are reference resistors that you know.  Now you can turn up the sensitivity of your voltage measurement to be very high, since you're looking at deviations away from \(V_{G} = 0\).   
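Here is a quick sketch of the unbalanced bridge output with this labeling (component values are invented, and an ideal high-impedance detector is assumed):

```python
# One divider is R1-R2, the other is R3-Rx; balance means R2/R1 = Rx/R3.
def bridge_voltage(V_in, R1, R2, R3, Rx):
    return V_in * (R2 / (R1 + R2) - Rx / (R3 + Rx))

R1 = R2 = R3 = 1e3           # made-up 1 kOhm reference arms
V_in = 2e-3                  # drive chosen to put ~1 uA through the sample arm
print(bridge_voltage(V_in, R1, R2, R3, Rx=1e3))          # balanced: 0.0 V
print(bridge_voltage(V_in, R1, R2, R3, Rx=1e3 + 10e-3))  # 10 mOhm shift: ~ -5 nV
```

Instead of hunting for a 10 nV change on top of a 1 mV background, you are now looking for a few nV riding on top of (nominally) zero.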

You can do better in sensitivity by using an AC voltage source instead of the battery shown, and then use a lock-in amplifier for the voltage detection across the bridge.  That helps avoid some slow, drift-like confounding effects or thermoelectric voltages. 
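As a cartoon of what the lock-in buys you (all amplitudes and noise levels here are invented for illustration): multiply the measured signal by the reference sine wave and average, and the component at the reference frequency survives while broadband noise and slow drifts wash out.

```python
import numpy as np

# Toy lock-in demodulation of a tiny bridge-imbalance signal.
rng = np.random.default_rng(0)
f_ref, fs = 1e3, 1e6                             # 1 kHz drive, 1 MHz sampling
t = np.arange(0, 1.0, 1 / fs)                    # 1 second of data
A = 5e-9                                         # 5 nV imbalance signal at f_ref
signal = (A * np.sin(2 * np.pi * f_ref * t)
          + 1e-7 * rng.standard_normal(t.size)   # 100 nV rms broadband noise
          + 5e-7 * np.sin(2 * np.pi * 0.1 * t))  # slow drift, e.g. thermoelectric

ref = np.sin(2 * np.pi * f_ref * t)
X = 2 * np.mean(signal * ref)                    # in-phase demodulated amplitude
print(f"recovered: {X * 1e9:.2f} nV (true: {A * 1e9:.0f} nV)")
```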

Less well-known:  Often in condensed matter and nanoscale physics, the contact resistances where the measurement leads are attached aren't negligible.  If we are fortunate, we can set up a four-terminal measurement that mitigates this concern, so that the voltage measured on the sample is ideally not influenced by the contacts where current is injected or collected.
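A trivial sketch of why the four-terminal geometry helps (values invented; the voltmeter is assumed ideal, so no current flows through the voltage leads):

```python
# Two-wire vs. four-wire measurement with contact resistances in series.
R_sample, R_contact, I = 1e3, 200.0, 1e-6    # ohms, ohms per contact, amps

V_two_wire = I * (2 * R_contact + R_sample)  # voltmeter across the current leads
V_four_wire = I * R_sample                   # ideal voltmeter on separate leads
print(f"2-wire apparent R: {V_two_wire / I:.0f} ohm")   # 1400 ohm, contacts included
print(f"4-wire apparent R: {V_four_wire / I:.0f} ohm")  # 1000 ohm, just the sample
```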
A Kelvin bridge, from wikipedia

Is there a way to do a four-terminal bridge measurement?  Yes, it's called a Kelvin bridge, shown at right in its DC version.  When done properly, you can use variable resistors to null out the contact resistances.  This was originally developed back in the late 19th/early 20th century to measure resistances smaller than an Ohm or so (where even small contact resistances are relevant).  In many solid state systems, e.g., 2D materials, contact resistances can be considerably larger, so this comes in handy even for larger sample resistances.

There are also capacitance bridges and inductance bridges - see here for something of an overview.  A big chunk of my PhD involved capacitance bridge measurements to look at changes in the dielectric response with \(10^{-7}\) levels of sensitivity.

One funny story to leave you:  When I was trying to understand all about the Kelvin bridge while I was a postdoc, I grabbed a book out of the Bell Labs library about AC bridge techniques that went back to the 1920s.  The author kept mentioning something cautionary about looking out for "the head effect".  I had no idea what this was; the author was English, and I wondered whether this was some British/American language issue, like how we talk about electrical "ground" in the US, but in the UK they say "earth".  Eventually I realized what this was really about.  Back before lock-ins and other high sensitivity AC voltmeters were readily available, it was common to run an AC bridge at a frequency of something like 1 kHz, and to use a pair of headphones as the detector.  The human ear is very sensitive, so you could listen to the headphones and balance the bridge until you couldn't hear the 1 kHz tone anymore (meaning the AC \(V_{G}\) signal on the bridge was very small).  The "head effect" is when you haven't designed your bridge correctly, so that the impedance of your body screws up the balance of the bridge when you put the headphones on.  The "head effect" = bridge imbalance because of the capacitance or inductance of your head.  See here.