Saturday, April 12, 2025

What is multiferroicity?

(A post summarizing recent US science-related events will be coming later.  For now, here is my promised post about multiferroics, inspired in part by a recent visit to Rice by Yoshi Tokura.)

Electrons carry spins and therefore magnetic moments (that is, they can act in some ways like little bar magnets), and as I was teaching undergrads this past week, under certain conditions some of the electrons in a material can spontaneously develop long-range magnetic order.  That is, rather than being, on average, randomly oriented, below some critical temperature the spins take on a pattern that repeats throughout the material.  In the ordered state, if you know the arrangement of spins in one (magnetic) unit cell of the material, that pattern is repeated over many (perhaps all, if the system is a single domain) of the unit cells.  By picking out this pattern, the material lowers its overall symmetry compared to the non-ordered state.  (There can be local moment magnets, when the electrons with the magnetic moments are localized to particular atoms; there can also be itinerant magnets, when the mobile electrons in a metal take on a net spin polarization.)  The most famous kind of magnetic order is ferromagnetism, when the magnetic moments spontaneously align along a particular direction, often leading to magnetic fields projected out of the material.  Magnetic materials can be metals, semiconductors, or insulators.
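
To make the idea of spontaneous ordering below a critical temperature concrete, here is a minimal toy-model sketch (my own illustration; the post above doesn't reference any particular model).  It runs Metropolis Monte Carlo on the classical 2D Ising model, the standard textbook cartoon of a ferromagnet, and shows a large net magnetization appearing below the ordering temperature and vanishing above it.  The lattice size, temperatures, and sweep counts are arbitrary illustrative choices.

```python
# Toy 2D Ising ferromagnet via Metropolis Monte Carlo (illustrative sketch only).
# Units: J = k_B = 1; the exact 2D Ising critical temperature is ~2.27.
import numpy as np

rng = np.random.default_rng(0)

def metropolis_sweep(spins, T):
    """One sweep of single-spin-flip updates on an L x L lattice with periodic boundaries."""
    L = spins.shape[0]
    for _ in range(L * L):
        i, j = rng.integers(L), rng.integers(L)
        # Sum of the four nearest-neighbor spins
        nn = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2.0 * spins[i, j] * nn          # energy cost of flipping spin (i, j)
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[i, j] = -spins[i, j]

L = 16
for T in (1.5, 3.5):                          # below and above T_c
    spins = rng.choice(np.array([-1, 1]), size=(L, L))   # random (disordered) starting state
    for _ in range(2000):
        metropolis_sweep(spins, T)
    print(f"T = {T}: |<m>| ~ {abs(spins.mean()):.2f}")
# Expect |<m>| near 1 at T = 1.5 (spontaneously ordered) and near 0 at T = 3.5.
```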

In insulators, an additional kind of order is possible, based on electric polarization, P.  There is subtlety about defining polarization, but for the purposes of this discussion, the question is whether the atoms within each unit cell bond appropriately and are displaced below some critical temperature to create a net electric dipole moment, leading to ferroelectricity.  (Antiferroelectricity is also possible.) Again, the ordered state has lower symmetry than the non-ordered state.  Ferroelectric materials have some interesting applications.  

BiFeO3, a multiferroic antiferromagnet,
image from here.

Multiferroics are materials that have simultaneous magnetic order and electric polarization order.  A good recent review is here.  For applications, obviously it would be convenient if both the magnetic and polarization ordering happened well above room temperature.  There can be deep connections between the magnetic order and the electric polarization - see this paper, and this commentary.   Because of these connections, the low energy excitations of multiferroics can be really complicated, like electromagnons.  Similarly, there can be combined "spin textures" and polarization textures in such materials - see here and here.   Multiferroics raise the possibility of using applied voltages (and hence electric fields) to flip P, and thus toggle around M.  This has been proposed as a key enabling capability for information processing devices, as in this approach.  These materials are extremely rich, and it feels like their full potential has not yet been realized.  

Sunday, March 30, 2025

Science updates - brief items

Here are a couple of neat papers that I came across in the last week.  (Planning to write something about multiferroics as well, once I have a bit of time.)

  • The idea of directly extracting useful energy from the rotation of the earth sounds like something out of an H. G. Wells novel.  At a rough estimate (and it's impressive to me that AI tools are now able to provide a convincing step-by-step calculation of this; I tried w/ gemini.google.com) the rotational kinetic energy of the earth is about \(2.6 \times 10^{29}\) J; a rough sanity check of this number is sketched in the code just after this list.  The tricky bit is, how do you get at it?  You might imagine constructing some kind of big space-based pick-up coil and getting some inductive voltage generation as the earth rotates its magnetic field past the coil.  Intuitively, though, it seems like while sitting on the (rotating) earth, you should in some sense be comoving with respect to the local magnetic field, so it shouldn't be possible to do anything clever that way.  It turns out, though, that Lorentz forces still apply when moving a wire through the axially symmetric parts of the earth's field.  This has some conceptual contact with Faraday's dc electric generator.   With the right choice of geometry and materials, it is possible to use such an approach to extract some (tiny at the moment) power.  For the theory proposal, see here.  For an experimental demonstration, using thermoelectric effects as a way to measure this (and confirm that the orientation of the cylindrical shell has the expected effect), see here.  I need to read this more closely to decide if I really understand the nuances of how it works.
  • On a completely different note, this paper came out on Friday.  (Full disclosure:  The PI is my former postdoc and the second author was one of my students.)  It's an impressive technical achievement.  We are used to the fact that usually macroscopic objects don't show signatures of quantum interference.  Inelastic interactions of the object with its environment effectively suppress quantum interference effects on some time scale (and therefore some distance scale).  Small molecules are expected to still show electronic quantum effects at room temperature, since they are tiny and their electronic levels are widely spaced, and here is a review of what this could do in electronic measurements.  Quantum interference effects should also be possible in molecular vibrations at room temperature, and they could manifest themselves through the vibrational thermal conduction through single molecules, as considered theoretically here.  This experimental paper does a bridge measurement to compare the thermal transport between a single-molecule-containing junction between a tip and a surface, and an empty (farther spaced) twin tip-surface geometry.  They argue that they see differences between two kinds of molecules that originate from such quantum interference effects.
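
As promised in the first bullet above, here is a rough back-of-the-envelope check of the quoted rotational kinetic energy (my own sketch, treating the earth as a uniform sphere; the real moment of inertia is a bit smaller, about \(0.33\,MR^{2}\), because of the dense core):

```python
# Rough estimate of the earth's rotational kinetic energy, KE = (1/2) I omega^2.
import math

M = 5.97e24            # mass of the earth, kg
R = 6.371e6            # mean radius, m
T_sidereal = 86164.0   # length of the sidereal day, s

I = 0.4 * M * R**2             # uniform-sphere moment of inertia, (2/5) M R^2
omega = 2 * math.pi / T_sidereal
KE = 0.5 * I * omega**2
print(f"KE ~ {KE:.2e} J")      # ~2.6e29 J, consistent with the number quoted above
```
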
As for more global issues about the US research climate, there will be more announcements soon about reductions in force and the forthcoming presidential budget request.  (Here is an online petition regarding the plan to shutter the NIST atomic spectroscopy group.)  Please pay attention to these issues, and if you're a US citizen, I urge you to contact your legislators and make your voice heard.  

Thursday, March 20, 2025

March Meeting 2025, Day 4 and wrap-up

 I saw a couple of interesting talks this morning before heading out:

  • Alessandro Chiesa of Parma spoke about using spin-containing molecules potentially as qubits, and about chiral-induced spin selectivity (CISS) in electron transfer.  Regarding the former, here is a review.  Spin-containing molecules can have interesting properties as single qubits, or, for spins higher than 1/2, qudits, with unpaired electrons often confined to a transition metal or rare earth ion somewhat protected from the rest of the universe by the rest of the molecule.  The result can be very long coherence times for their spins.  Doing multi-qubit operations is very challenging with such building blocks, however.  There are some theory proposals and attempts to couple molecular qubits to superconducting resonators, but it's tough!   Regarding chiral induced spin selectivity, he discussed recent work trying to use molecules where a donor region is linked to an acceptor region via a chiral bridge, and trying to manipulate spin centers this way.  A question in all the CISS work is, how can the effects be large when spin-orbit coupling is generally very weak in light, organic molecules?  He has a recent treatment of this, arguing that if one models the bridge as a chain of sites with large U/t, where U is the on-site repulsion energy and t is the hopping contribution, then exchange processes between sites can effectively amplify the otherwise weak spin-orbit effects.  I need to read and think more about this.
  • Richard Schlitz of Konstanz gave a nice talk about some pretty recent research using a scanning tunneling microscope tip (with magnetic iron atoms on the end) to drive electron paramagnetic resonance in a single pentacene molecule (sitting on MgO on Ag, where it tends to grab an electron from the silver and host a spin).  The experimental approach was initially explained here.  The actual polarized tunneling current can drive the resonance, and exactly how depends on the bias conditions.  At high bias, when there is strong resonant tunneling, the current exerts a damping-like torque, while at low bias, when tunneling is far off resonance, the current exerts a field-like torque.  Neat stuff.
  • Leah Weiss from Chicago gave a clear presentation about not-yet-published results (based on earlier work), doing optically detected EPR of Er-containing molecules.  These condense into mm-sized molecular crystals, with the molecular environment being nice and clean, leading to very little inhomogeneous broadening of the lines.  There are spin-selective transitions that can be driven using near telecom-wavelength (1.55 μm) light.  When the (anisotropic) g-factors of the different levels are different, there are some very promising ways to do orientation-selective and spin-selective spectroscopy.  Looking forward to seeing the paper on this.
And that's it for me for the meeting.  A couple of thoughts:
  • I'm not sold on the combined March/April meeting.  Six years ago when I was a DCMP member-at-large, the discussion was all about how the March Meeting was too big, making it hard to find and get good deals on host sites, and maybe the meeting should split.  Now they've made it even bigger.  Doesn't this make planning more difficult and hosting more expensive since there are fewer options?  (I'm not an economist, but....)  A benefit for the April meeting attendees is that grad students and postdocs get access to the career/networking events held at the MM.  If you're going to do the combination, then it seems like you should have the courage of your convictions and really mingle the two, rather than keeping the March talks in the convention center and the April talks in site hotels.
  • I understand that van der Waals/twisted materials are great laboratories for physics, and that topological states in these are exciting.  Still, by my count there were 7 invited sessions broadly about this topic, and 35 invited talks on this over four days seems a bit extreme.  
  • By my count, there were eight dilution refrigerator vendors at the exhibition (Maybell, Bluefors, Ice, Oxford, Danaher/Leiden, Formfactor, Zero-Point Cryo, and Quantum Design if you count their PPMS insert).  Wow.  
I'm sure there will be other cool results presented today and tomorrow that I am missing - feel free to mention them in the comments.

Wednesday, March 19, 2025

March Meeting 2025, Day 3

Another busy day at the APS Global Physics Summit.  Here are a few highlights:

  • Shahal Ilani of the Weizmann gave an absolutely fantastic talk about his group's latest results from their quantum twisting microscope.  In a scanning tunneling microscope, because tunneling happens at an atomic-scale location between the tip and the sample, the momentum in the transverse direction is not conserved - that is, the tunneling averages over a huge range of k vectors for the tunneling electron.  In the quantum twisting microscope, electrons tunnel from a flat (graphite) patch something like \(d \approx 100\) nm across, coherently, through a couple of layers of some insulator (like WSe2) and into a van der Waals sample.  In this case, k in the plane is comparatively conserved, and by rotating the sample relative to the tip, it is possible to build up a picture of the sample's electronic energy vs. k dispersion, rather like in angle-resolved photoemission.  This has allowed, e.g., mapping of phonons via inelastic tunneling.  His group has applied this to magic angle twisted bilayer graphene, a system that has a peculiar combination of properties, where in some ways the electrons act like very local objects, and in other ways they act like delocalized objects.  The answer seems to be that this system at the magic angle is a bit of an analog of a heavy fermion system, where there are sort of local moments (living in very flat bands) interacting and hybridizing with "conduction" electrons (bands crossing the Fermi level at the Brillouin zone center).  The experimental data (movies of the bands as a function of energy and k in the plane as the filling is tuned via gate) are gorgeous and look very much like theoretical models.
  • I saw a talk by Roger Melko about applying large language models to try to get efficient knowledge of many-body quantum states, or at least the possible outputs of evolution of a quantum system like a quantum computer based on Rydberg atoms.  It started fairly pedagogically, but I confess that I got lost in the AI/ML jargon about halfway through.
  • Francis M. Ross, recipient of this year's Keithley Award, gave a great talk about using transmission electron microscopy to watch the growth of materials in real time.  She had some fantastic videos - here is a review article about some of the techniques used.  She also showed some very new work using a focused electron beam to make arrays of point defects in 2D materials that looks very promising.
  • Steve Kivelson, recipient of this year's Buckley Prize, presented a very nice talk about his personal views on the theory of high temperature superconductivity in the cuprates.  One basic point:  these materials are balancing between multiple different kinds of emergent order (spin density waves, charge density waves, electronic nematics, perhaps pair density waves).   This magnifies the effects of quenched disorder, which can locally tip the balance one way or another.  Recent investigations of the famous 2D square lattice Hubbard model show this as well.  He argues that for a broad range \(1/2 < U/t < 8\), where U is the on-site repulsion and t is the hopping term (the standard Hubbard Hamiltonian is written out just below this list for reference), the ground state of the Hubbard model is in fact a charge density wave, not a superconductor.  However, if there is some amount of disorder in the form of \(\delta t/t \sim 0.1\)-\(0.2\), the result is a robust, unavoidable superconducting state.  He further argues that increasing the superconducting transition temperature requires striking a balance between the underdoped case (strong pairing, weak superfluid phase stiffness) and the overdoped case (weak pairing, strong superfluid stiffness), and that one way to achieve this would be in a bilayer with broken mirror symmetry (say different charge reservoir layers above and below, and/or a big displacement field perpendicular to the plane).  (Apologies for how technical that sounded - hard to reduce that one to something super accessible without writing much more.)
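
For reference (my addition, not something shown in the talks above), the single-band Hubbard model that the U and t in Kivelson's argument refer to is conventionally written as
\[
H = -t \sum_{\langle i,j \rangle, \sigma} \left( c^{\dagger}_{i\sigma} c_{j\sigma} + \mathrm{h.c.} \right) + U \sum_{i} n_{i\uparrow} n_{i\downarrow},
\]
where \(t\) is the nearest-neighbor hopping, \(U\) is the on-site repulsion, and the sums run over lattice sites and spin; the physics is controlled by \(U/t\) and the electron filling.
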
A bit more tomorrow before I depart back to Houston.

March Meeting 2025, Day 2

I spent a portion of today catching up with old friends and colleagues, so fewer highlights, but here are a couple:

  • Like a few hundred other people, I went to the invited talk by Chetan Nayak, leader of Microsoft's quantum computing effort. It was sufficiently crowded that the session chair warned everyone about fire code regulations and that people should not sit on the floor blocking the aisles.  To set the landscape:  Microsoft's approach to quantum computing is to develop topological qubits based on interesting physics that is predicted to happen (see here and here) if one induces superconductivity (via the proximity effect) in a semiconductor nanowire with spin-orbit coupling.  When the right combination of gate voltage and external magnetic field is applied, the nanowire should cross into a topologically nontrivial state with majorana fermions localized to each end of the nanowire, leading to "zero energy states" seen as peaks in the conductance dI/dV centered at zero bias (V=0).  A major challenge is that disorder in these devices can lead to other sources of zero-bias peaks (Andreev bound states).  A 2023 paper outlines a protocol that is supposed to give good statistical feedback on whether a given device is in the topologically interesting or trivial regime.  I don't want to rehash the history of all of this.  In a paper published last month, a single proximitized, gate-defined InAs quantum wire is connected to a long quantum dot to form an interferometer, and the capacitance of that dot is sensed via RF techniques as a function of the magnetic flux threading the interferometer, showing oscillations with period h/2e, interpreted as charge parity oscillations of the proximitized nanowire.  In new data, not yet reported in a paper, Nayak presented measurements on a system comprising two such wires and associated other structures.  The argument is that each wire can be individually tuned simultaneously into the topologically nontrivial regime via the protocol above.  Then interferometer measurements can be performed in one wire (the Z channel) and in a configuration involving two ends of different wires (the X channel), and they interpret their data as early evidence that they have achieved the desired majorana modes and their parity measurements.  I look forward to when a paper is out on this, as it is hard to make informed statements about this based just on what I saw quickly on slides from a distance.  
  • In a completely different session, Garnet Chan gave a very nice talk about applying advanced quantum chemistry and embedding techniques to look at some serious correlated materials physics.  Embedding methods are somewhat reminiscent of mean field theories:  Instead of trying to solve the Schrödinger equation for a whole solid, for example, you can treat the solid as a self-consistent theory of a unit cell or set of unit cells embedded in a more coarse-grained bath (made up of other unit cells appropriately averaged).  See here, for example. He presented recent results on computing the Kondo effect of magnetic impurities in metals, understanding the trends of antiferromagnetic properties of the parent cuprates, and trying to describe superconductivity in the doped cuprates.  Neat stuff.
  • In the same session, my collaborator Silke Buehler-Paschen gave a nice discussion of ways to use heavy fermion materials to examine strange metals, looking beyond just resistivity measurements.  Particularly interesting is the idea of trying to figure out quantum Fisher information, which in principle can tell you how entangled your many-body system is (that is, estimating how many other degrees of freedom are entangled with one particular degree of freedom).  See here for an intro to the idea, and here for an implementation in a strange metal, Ce3Pd20Si6.  
More tomorrow....

(On a separate note, holy cow, the trade show this year is enormous - seems like it's 50% bigger than last year.  I never would have dreamed when I was a grad student that you could go to this and have your pick of maybe 10 different dilution refrigerator vendors.  One minor mystery:  Who did World Scientific tick off?  Their table is located on the completely opposite side of the very large hall from every other publisher.)

Monday, March 17, 2025

March Meeting 2025, Day 1

The APS Global Physics Summit is an intimate affair, with a mere 14,000 attendees, all apparently vying for lunch capacity for about 2,000 people.   The first day of the meeting was the usual controlled chaos of people trying to learn the layout of the convention center while looking for talks and hanging out having conversations.  On the plus side, the APS wifi seems to function well, and the projectors and slide upload system are finally technologically mature (though the pointers/clickers seem to have some issues).  Some brief highlights of sessions I attended:

  • I spent the first block of time at this invited session about progress in understanding quantum spin liquids and quantum spin ice.  Spin ices are generally based on the pyrochlore structure, where atoms hosting local magnetic moments sit at the vertices of corner-sharing tetrahedra, as I had discussed here.  The idea is that the crystal environment and interactions between spins are such that the moments are favored to satisfy the ice rules, where in each tetrahedron two moments point inward toward the center and two point outward.  Classically there are a huge number of spin arrangements that all have about the same ground state energy.  (A tiny counting sketch after this list illustrates this degeneracy for a single tetrahedron.)  In a quantum spin ice, the idea is that quantum fluctuations are large, so that the true ground state would be some enormous superposition of all possible ice-rule-satisfying configurations.  One consequence of this is that there are low energy excitations that look like an emergent form of electromagnetism, including a gapless photon-like mode.  Bruce Gaulin spoke about one strong candidate quantum spin ice, Ce2Zr2O7, in a very pedagogical talk that covered all this.  A relevant recent review is this one.   There were two other talks in the session also about pyrochlores, an experimentally focused one by Sylvain Petit discussing Tb2Ti2O7 (see here), and a theory talk by Yong-Baek Kim focused again on the cerium zirconate.    Also in the session was an interesting talk by Jeff Rau about K2IrCl6, a material with a completely different structure that (above its ordering temperature of 3 K) acts like a "nodal line spin liquid".
  • In part because I had students speaking there, I also attended a contributed session about nanomaterials (wires, tubes, dots, particles, liquids).  There were some neat talks.  The one that I found most surprising was from the Cha group at Cornell, where they were using a method developed by the Schroer group at Yale (see here and here) to fabricate nanowires of two difficult-to-grow, topologically interesting metals, CoIn3 and RhIn3.  The idea is to create a template with an array of tubular holes, and squeeze that template against a bulk crystal of the desired material at around 350 °C, so that the crystal is extruded into the holes to form wires.  Then the template can be etched away and the wires recovered for study.  I'm amazed that this works.
  • In the afternoon, I went back and forth between the very crowded session on fractional quantum anomalous Hall physics in stacked van der Waals materials, and a contributed session about strange metals.  Interesting stuff for sure.
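
As mentioned in the spin ice bullet above, here is a tiny counting sketch (mine, not from the talks) of where the classical degeneracy comes from: on a single tetrahedron, each of the four moments points either "in" (+1) or "out" (-1) along its local axis, and exactly six of the sixteen configurations satisfy the two-in/two-out ice rule.

```python
# Enumerate the "two-in, two-out" ice-rule states of a single tetrahedron.
from itertools import product

configs = list(product((+1, -1), repeat=4))           # all 2^4 = 16 spin configurations
ice_states = [c for c in configs if sum(c) == 0]      # two "in" and two "out"
print(f"{len(ice_states)} of {len(configs)} configurations satisfy the ice rule")
# -> 6 of 16.  On the full pyrochlore lattice this local degeneracy proliferates
#    into an extensive number of nearly degenerate classical states.
```
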
I'm still trying to figure out what to see tomorrow, but there will be another update in the evening.

Sunday, March 16, 2025

March Meeting 2025, Day 0

Technically, this year the conference is known as the APS Global Physics Summit rather than the March Meeting, but I'm keeping my blog post titles consistent with previous years.   Over 14,000 physicists have descended upon Anaheim, and there are parallel events in more than a dozen countries around the world as well.

Late this afternoon I attended an APS town hall session about "Protecting Science".  There were brief remarks by APS President John Doyle, APS CEO Jonathan Bagger, and APS External Affairs Officer Francis "Slake" Slakey, followed by an audience Q&A.  It was a solid event attended by about 300 people in person and more online, as the society tries to thread its way through some very challenging times for science and scholarship in the US.  Main take-aways from the intro remarks:

  • The mission and values of the APS have not changed. 
  • Paraphrasing:  We must explain to the public and officials the wonder of science and the economic impact of what we do.  Discovery and application reinforce each other, and this dynamic is what drives progress.  We need the public to hear this.  We need Congress to hear this.  We need the executive branch and its advisors to hear this.   APS needs to promote physics, and physicists need to tell the truth, even when uncomfortable.  The truth is our currency with the public.  It is our superpower.  APS is not a blue or red state organization; it's an organization that champions physics.
  • Slake thanked and asked the audience to stand and thank the many federal science agency employees who are feeling dispirited and unsupported.  "You are part of this community and no federal disruption is going to change that."
  • Slake also mentioned that the critical short-term issue is the upcoming budget.  The White House will announce its version in April, and the APS is pursuing a 50-state coordinated approach to have people speak to their congressional delegations in their states and districts, to explain what the harm and true costs are if the science agency budgets are slashed.  They are targeting certain key states in particular (Alaska, Kansas, Indiana, Pennsylvania, Maine, South Dakota were mentioned).
  • APS is continuing its support for bridge and mentorship programs, as well as the STEP-UP program; see here.  These programs are open to all.  
Tomorrow, some highlights of the scientific program.  Apologies for unavoidably missing a lot of cool stuff - I go to my students' sessions and try to see other topics that interest me, but because the meeting is so large, with so many parallel talks, I know that I inevitably can't see all the exciting science.

Tuesday, March 11, 2025

The 2025 Wolf Prize in Physics

One nice bit of condensed matter/nanoscale physics news:  This year's Wolf Prize in Physics has gone to three outstanding scientists, Jim Eisenstein, Moty Heiblum, and Jainendra Jain, each of whom has done very impactful work involving 2D electron gases - systems of electrons confined to move only in two dimensions by the electronic structure and alignment of energy bands at interfaces between semiconductors.  Of particular relevance to these folks are the particularly clean 2D electron gases at the interfaces between GaAs and AlGaAs, or in GaAs quantum wells embedded in AlGaAs.

A thread that connects all three of these scientists is the fractional quantum Hall effect (FQHE) in these 2D systems.  Electrons confined to move in 2D, in the presence of a magnetic field perpendicular to the plane of motion, form a remarkable system.  The quantum wavefunction of an electron in this situation changes as the magnetic induction B is increased.  The energy levels of such an electron are given by \((n+1/2)\hbar\omega_{c}\), where \(\omega_{c} \equiv eB/m\) is the cyclotron frequency.  These energy levels are called Landau levels.  The ratio between the 2D density of electrons and the density of magnetic flux in fundamental units (\(B/(h/e)\)) is called the "filling factor", \(\nu\), and when this is an integer, the Hall conductance is quantized in fundamental units - see here.
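
As a minimal numerical illustration of the filling factor just defined (my own sketch; the density below is a typical ballpark value for a GaAs/AlGaAs 2D electron gas, not a number from the article):

```python
# Landau level filling factor nu = n_2D / (B / (h/e)) for a representative 2D electron gas.
h = 6.62607015e-34    # Planck constant, J s
e = 1.602176634e-19   # elementary charge, C

n2d = 3.0e15          # 2D electron density, m^-2 (i.e., 3e11 cm^-2)
for B in (1.0, 6.2, 12.4, 24.8):          # perpendicular magnetic field, tesla
    nu = n2d * h / (e * B)
    print(f"B = {B:5.1f} T  ->  nu ~ {nu:.2f}")
# For this density: nu ~ 12.4 at 1 T, ~2 at 6.2 T, ~1 at 12.4 T, and ~1/2 at 24.8 T.
```
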
Figure 4 from this article by Jain, with \(R_{xx}(B)\) data from here.  Notice how the data around \(B=0\) looks a lot like the data around \(\nu=1/2\), which looks a lot like the data around \(\nu=1/4\).

A remarkable thing happens when \(\nu=1/2\) - see the figure above.  There is no quantum Hall effect there; in fact, if you look at the longitudinal resistance \(R_{xx}\) as a function of \(B\) near \(\nu=1/2\), it looks remarkably like \(R_{xx}(B)\) near \(B=0\).  At this half-integer filling factor, the 2D electrons plus the magnetic flux "bundle together", leading to a state with new low-energy excitations called composite fermions that act like they are in zero magnetic field.  In many ways the FQHE looks like the integer quantum Hall effect for these composite fermions, though the situation is more complicated than that.  Jainendra Jain did foundational work on the theory of composite fermions, among many other things.

Jim Eisenstein has done a lot of great experimental work involving composite fermions and even-denominator FQH states.  My postdoctoral mentor, Bob Willett, and he are the first two authors on the paper where an unusual quantum Hall state was discovered at \(\nu=5/2\), a state still under active investigation for potential topological quantum computing applications.   One particularly surprising result from Eisenstein's group was the discovery that some "high" Landau level even-denominator fillings (\(\nu=9/2, 11/2\)) showed enormously anisotropic resistances, with big differences between \(R_{xx}\) and \(R_{yy}\), an example of the onset of a "stripe" phase of alternating fillings.

Another very exciting result from Eisenstein's group used 2D electron gases in closely spaced parallel layers and in high magnetic fields, as well as 2D electron gases near 2D hole gases.  Both can allow the formation of excitons, bound states of electrons and holes, but with the electrons and holes in neighboring layers so that they could not annihilate each other.  Moreover, a Bose-Einstein condensation of those excitons is possible, leading to remarkable superflow of excitons and resonant tunneling between the layers.  This review article is a great discussion of all of this.

Moty Heiblum's group at the Weizmann Institute has been one of the world-leading groups investigating "mesoscopic" physics of confined electrons in the past 30+ years.  They have performed some truly elegant experiments using 2D electron gases as their platform.  A favorite of mine (mentioned in my textbook) is this one, in which they make a loop-shaped interferometer for electrons which shows oscillations in the conductance as they thread magnetic flux through the loop; they then use a nearby quantum point contact as a charge sensor near one arm of the interferometer, a which-path detector that tunably suppresses the quantum interference. 

His group also did foundational work on the use of shot noise as a tool to examine the nature and transport of charge carriers in condensed matter systems (an idea that I found inspiring).  Their results showing that the quasiparticles in the fractional quantum Hall regime can have fractional charges are remarkable.  More recently, they have shown how subtle these measurements really can be, in 2D electron systems that can support neutral excitations as well as charged ones.

All in all, this is a great recognition of outstanding scientists for a large volume of important, influential work.

(On a separate note:  I will be attending 3+ days of the APS meeting next week.  I'll try to do my usual brief highlight posts, time permitting.  If people have suggestions of cool content, please let me know.)

Thursday, March 06, 2025

Some updates on the NSF and related issues

Non-blog life has been very busy, and events have been changing rapidly, but I thought it would be a good idea to give a brief bulleted list of updates regarding the NSF and associated issues:
  • A court decision regarding who has the authority to fire probationary federal workers has led to the NSF hiring back 84 of the employees that it had previously dismissed, at least for now.  The Office of Personnel Management is still altering their wording on this.
  • There is likely some kind of continuing resolution in the offing in Congress, as the current funding stopgap expires on March 14.  If a CR passes that extends to the rest of the fiscal year (Sept 30), that would stave off any big cuts until next FY's budget.
  • At the same time, a number of NSF-funded Research Experiences for Undergraduates (REU) programs are being cancelled for this year.  This is very unfortunate, as REU programs are many undergrads' first exposure to real research, while also being a critical mechanism for students at non-research-heavy institutions to get research experience.
  • The concerns about next year's funding are real.  As I've written before, cuts and programmatic changes have been proposed by past presidents (including this one in his first term), but historically Congressional appropriators have tended not to follow those.  It seems very likely that the White House's budget proposal will be very bleak for science.  The big question is the degree to which Congress will ignore that.  
  • In addition to the budget, agencies (including NSF) have been ordered to prepare plans for reductions in force - staffing cuts - with deadlines to prepare those plans by 13 March and another set of plans by 14 April. 
  • Because of all this, a number of universities are cutting back on doctoral program admissions (either in specific departments or more broadly).  My sense is that universities with very large components of NIH funding thanks to medical schools are being particularly cautious.  Schools are being careful because many places guarantee some amount of support for at least several years, and it's difficult for them to be full-speed-ahead given uncertainties in federal sponsor budgets, possible endowment taxes, possible revisions to indirect cost policies, etc.
Enormous uncertainty remains in the wake of all of this activity, and this period of comparative quiet before the staffing plans and CR are due is an eerie calm.  (Reminds me of the line from here, about how it can be unsettling when a day goes by and you don't hear anything about the horse loose in the hospital.)

In other news, there is a national Stand Up for Science set of rallies tomorrow.  Hopefully the net impact of this will be positive.  The public and our legislators need to understand that support for basic science is not a partisan issue and has been the underpinning of enormous economic and technological progress.

Update:  My very brief remarks at the Stand Up for Science event at Rice today:

Hello everyone –

Thanks for turning out for this important event.  

Science research has shaped the world we know.  Our understanding of the universe (physics, chemistry, biology, mathematics, and all the engineering disciplines that have come from those foundations) is one of humanity’s great intellectual achievements.  We know a lot, and we know enough to know that we still have much more to learn.

One of the great things about basic science is that you never know where it can lead.  Basic research into heat-tolerant bacteria gave us the polymerase chain reaction technique, which led to the mapping of genomes, enormous advances in biology and medicine, and a lot of unrealistic scenes in TV police procedurals.  Basic research into semiconductors gave us the light-emitting diode, which has transformed lighting around the world and given us the laser pointer, the Blu-ray disc, and those annoying programmable holiday lights you see all the time.

Particularly since WWII, science research has been supported by the US government, with the idea that while industry is good at many things, there is a need for public support of research that does not have a short-term profit motive.  

Thanks to several agencies (the National Science Foundation, the National Institutes of Health, the Department of Energy, the Department of Defense, and others), the result has led to enormous progress, and great economic and societal benefit to the country and the world.  

We need to remind everyone – the general person on the street and the politicians in Austin and Washington – that science research and education is vital for our future.  Science is not partisan, and good science can and should inform policy making.  We face many challenges, and continued support for science and engineering research is essential to our future success.

Thanks again for turning out, and let’s keep reminding everyone that supporting science is incredibly important for all of us.

Sunday, February 23, 2025

What is "static electricity"/"contact electrification"/triboelectricity?

An early physics demonstration that many of us see in elementary school is that of static electricity:  an electrical insulator like a wool cloth or animal fur is rubbed on a glass or plastic rod, and suddenly the rod can pick up pieces of styrofoam or little bits of paper.  Alternately, a rubber balloon is rubbed against a kid's hair, and afterward the balloon is able to stick to a wall with sufficient force that static friction keeps the balloon from sliding down the surface.  The physics here is that when materials are rubbed together, there can be a net transfer of electrical charge from one to the other, a phenomenon called triboelectricity.  The electrostatic attraction between net charge on the balloon and the polarizable surface of the wall is enough to hold up the balloon.  
Balloons electrostatically clinging to a wall, from here.


The big mysteries are, how and why do charges transfer between materials when they are rubbed together?  As I wrote about once before, this is still not understood, despite more than 2500 years of observations.  The electrostatic potentials that can be built up through triboelectricity are not small.  They can be tens of kV, enough to cause electrons accelerating across those potentials to emit x-rays when they smack into the positively charged surface.  Whatever is going on, it's a way to effectively concentrate the energy from mechanical work into displacing charges.  This is how Wimshurst machines and Van de Graaff generators work, even though we don't understand the microscopic physics of the charge generation and separation.

There are disagreements to this day about the mechanisms at work in triboelectricity, including the role of adsorbates, surface chemistry, whether the charges transferred are electrons or ions, etc.  From how electronic charge transfer works between metals, or between metals and semiconductors, it's not crazy to imagine that somehow this should all come down to work functions or the equivalent.  Depending on the composition and structure of materials, the electrons in there can be bound more tightly (energetically deeper compared to the energy of an electron far away, also called the "vacuum level") or more loosely (energetically shallower, closer to the energy of a free electron).  It's credible that bringing two such materials in contact could lead to electrons "falling downhill" from the more loosely-binding material into the more tightly binding one.   That clearly is not the whole story, though, or this would've been figured out long ago.

This week, a new paper revealed an interesting wrinkle.  The net preference for picking up or losing charge seems to depend very clearly on the history of repeated contacts.  The authors used PDMS silicone rubber, and they find that repeated contacting can deterministically bake in a tendency for charge to flow one direction.  Using various surface spectroscopy methods, they find no obvious differences at the PDMS surface before/after the contacting procedures, but charge transfer is affected.  

My sneaking suspicion is that adsorbates will turn out to play a huge role in all of this.  This may be one of those issues like friction (see here too), where there is a general emergent phenomenon (net charge transfer) that can take place via multiple different underlying pathways.  Experiments in ultrahigh vacuum with ultraclean surfaces will undoubtedly show quantitatively different results than experiments in ambient conditions, but they may both show triboelectricity.






Wednesday, February 19, 2025

The National Science Foundation - this is not business as usual

The National Science Foundation was created 75 years ago, at the behest of Vannevar Bush, who put together the famed study, Science, The Endless Frontier, in 1945.  The NSF has played a critical role in a huge amount of science and engineering research since its inception, including advanced development of the laser, the PageRank algorithm that ended up in Google, and too many other contributions to list.

The NSF funds university research as well as some national facilities.  Organizationally, the NSF is an independent agency, meaning that it doesn’t reside under a particular cabinet secretary, though its Director is a presidential appointee who is confirmed by the US Senate.  The NSF comprises a number of directorates (most relevant for readers of this blog are probably Mathematical and Physical Sciences; Engineering;  and STEM Education, though there are several others).  Within the directorates are divisions (for example, MPS → Division of Materials Research; Division of Chemistry; Division of Physics; Division of Mathematics etc.).   Within each division are a variety of programs, spanning from individual investigator grants to medium to large center proposals, to group training grants, to individual graduate and postdoctoral fellowships.  Each program is administered by one or more program officers who are either scientists who have become civil servants, or "rotators", academics who take a leave of absence from their university positions to serve at the NSF for some number of years.  The NSF is the only agency whose mission historically has explicitly included science education.  The NSF's budget has been about $9B/yr (though until very recently there was supposedly bipartisan support for large increases), and 94% of its funds are spent on research, education, and related activities.  NSF funds more than 1/4 of all basic research done at universities in the US, and it also funds tech development, like small business innovation grants.

The NSF, more than any other agency that funds physical science and engineering research, relies on peer review.  Grants are reviewed by individual reviewers and/or panels.  Compared to other agencies, the influence of program officers in the review process is minimal.  If a grant doesn't excite the reviewers, it won't get funded.  This has its pluses and minuses, but it's less of a personal networking process than other agencies.  The success rate for many NSF programs is low, averaging around 25% in DMR, and 15% or so for graduate fellowships.  Every NSF program officer with whom I've ever interacted has been dedicated and professional.  

Well, yesterday the NSF laid off 11% of its workforce.  I had an exchange last night with a long-time NSF program director, who gave permission for me to share the gist, suitably anonymized.  (I also corrected typos.)  This person says that they want people to be aware of what's going on.  They say that NSF leadership is apparently helping with layoffs, and that "permanent Program Directors (feds such as myself) will be undergoing RIF or Reduction In Force process within the next month or so. So far, through buyout and firing today we lost about 16% of the workforce, and RIF is expected to bring it up to 50%."  When I asked further, this person said this was "fairly certain".   They went on:  "Another danger is budget.  We do not know what happens after the current CR [continuing resolution] ends March 14.  A long shutdown or another CR are possible.  For FY26 we are told about plans to reduce the NSF budget by 50%-75% - such reduction will mean no new awards for at least a year, elimination of divisions, merging of programs.  Individual researchers and professional societies can help by raising the voice of objection.  But realistically, we need to win the midterms to start real change.  For now we are losing this battle.  I can only promise you that NSF PDs are united as never before in our dedication to serve our communities of researchers and educators.  We will continue to do so as long as we are here."  On a related note, here is a thread by a just-laid-off NSF program officer.  Note that Congress has historically ignored presidential budget requests to cut NSF, but it's not at all clear that this can be relied upon now.

Voluntarily hobbling the NSF is, in my view, a terrible mistake that will take decades to fix.  The argument that this is a fiscally responsible thing to do is weak.  Total federal budget expenditures in FY24 were $6.75T.  The NSF budget was $9B, or 0.13% of the total.  The secretary of defense today said that their plan is to cut 8% of the DOD budget every year for the next several years.  That's a reduction of 9 NSF budgets per year.

I fully recognize that many other things are going on in the world right now, and many agencies are under similar pressures, but I wanted to highlight the NSF in particular.  Acting like this is business as usual, the kind of thing that happens whenever there is a change of administration, is disingenuous.  

Sunday, February 16, 2025

What are parastatistics?

While I could certainly write more about what is going on in the US these days (ahh, trying to dismantle organizations you don't understand), instead I want to briefly highlight a very exciting result from my colleagues, published in Nature last month.  (I almost titled this post "Lies, Damn Lies, and (para)Statistics", but that sounds like I don't like the paper.)

When we teach students about the properties of quantum objects (and about thermodynamics), we often talk about the "statistics" obeyed by indistinguishable particles.  I've written about aspects of this before.  "Statistics" in this sense means, what happens mathematically to the multiparticle quantum state \(|\Psi\rangle\) when two particles are swapped.  If we use the label 1 to mean the set of quantum numbers associated with particle 1, etc., then the question is, how are \(|\Psi(1,2)\rangle\) and \(|\Psi(2,1)\rangle\) related to each other.  We know that probabilities have to be conserved, so \(\langle \Psi(1,2)|\Psi(1,2)\rangle = \langle \Psi(2,1)|\Psi(2,1)\rangle\).

The usual situation is to assume \(|\Psi(2,1)\rangle = c\,|\Psi(1,2)\rangle\), where \(c\) is a complex number of magnitude 1.  If \(c = 1\), which is sort of the "common sense" expectation from classical physics, the particles are bosons, obeying Bose-Einstein statistics.   If \(c = -1\), the particles are fermions and obey Fermi-Dirac statistics.  In principle, one could have \(c = \exp(i\alpha)\), where \(\alpha\) is some phase angle.  Particles in that general case are called anyons, and I wrote about them here.  Low energy excitations of electrons (fermions) confined in 2D in the presence of a magnetic field can act like anyons, but it seems there can't be anyons in higher dimensions.

Being imprecise, when particles are "dilute" -- "far" from each other in terms of position and momentum -- we typically don't really need to worry much about what kind of quantum statistics govern the particles.  The distribution function - the average occupancy of a typical single-particle quantum state (labeled by a coordinate r, a wavevector k, and a spin σ as one possibility) - is much less than 1.  When particles are much more dense, though, the quantum statistics matter enormously.  At low temperatures, bosons can all pile into the (single-particle, in the absence of interactions) ground state - that's Bose-Einstein condensation.   In contrast, fermions have to stack up into higher energy states, since FD statistics imply that no two indistinguishable fermions can be in the same state - this is the Pauli Exclusion Principle, and it's basically why solids are solid.  If a gas of particles is at a temperature T and a chemical potential μ, then the distribution function as a function of energy \(\epsilon\) for bosons or fermions is given by \(f(\epsilon,\mu,T) = 1/(\exp((\epsilon-\mu)/k_{B}T) \pm 1)\), where the \(+\) sign is the fermion case and the \(-\) sign is the boson case.
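
A minimal numerical sketch of these two distribution functions (my own, with arbitrary units where \(k_{B}T = 1\) and \(\mu = 0\)):

```python
# Fermi-Dirac and Bose-Einstein occupancies, f = 1 / (exp((eps - mu)/kT) +/- 1).
import math

def fermi_dirac(eps, mu=0.0, kT=1.0):
    return 1.0 / (math.exp((eps - mu) / kT) + 1.0)

def bose_einstein(eps, mu=0.0, kT=1.0):
    # only meaningful for eps > mu, where the occupancy is positive
    return 1.0 / (math.exp((eps - mu) / kT) - 1.0)

for eps in (0.5, 1.0, 2.0, 5.0):
    print(f"eps = {eps}: f_FD = {fermi_dirac(eps):.3f}, f_BE = {bose_einstein(eps):.3f}")
# In the dilute limit (eps - mu >> kT) both approach the classical Boltzmann
# factor exp(-(eps - mu)/kT), consistent with the "dilute" discussion above.
```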

In the paper at hand, the authors take on parastatistics, the question of what happens if, besides spin, there are other "internal degrees of freedom" that are attached to particles described by additional indices that obey different algebras.  As they point out, this is not a new idea, but what they have done here is show that it is possible to have mathematically consistent versions of this that do not trivially reduce to fermions and bosons and can survive in, say, 3 spatial dimensions.  They argue that low energy excitations (quasiparticles) of some quantum spin systems can have these properties.  That's cool but not necessarily surprising - there are quasiparticles in condensed matter systems that are argued to obey a variety of exotic relations originally proposed in the world of high energy theory (Weyl fermions, Majorana fermions, massless Dirac fermions).  They also put forward the possibility that elementary particles could obey these statistics as well.  (Ideas transferring over from condensed matter or AMO physics to high energy is also not a new thing; see the Anderson-Higgs mechanism, and the concept of unparticles, which has connections to condensed matter systems where electronic quasiparticles may not be well defined.)
Fig. 1 from this paper, showing distribution functions for fermions, bosons, 
and more exotic systems studied in the paper.


Interestingly, the authors work out what the distribution function can look like for these exotic particles, as shown here (Fig. 1 from the paper).  The left panel shows how many particles can be in a single-particle spatial state for fermions (zero or one), bosons (up to \(\infty\)), and funky parastatistics-obeying particles of different types.  The right panel shows the distribution functions for these cases.  I think this is very cool.  When I've taught statistical physics to undergrads, I've told the students that no one has written down a general distribution function for systems like this.  Guess I'll have to revise my statements on this!




Saturday, February 08, 2025

Indirect costs + potential unintended consequences

It's been another exciting week where I feel compelled to write about the practice of university-based research in the US.  I've written about "indirect costs" before, but it's been a while.  I will try to get readers caught up on the basics of the university research ecosystem in the US, what indirect costs are, the latest (ahh, the classic Friday evening news dump) from NIH, and what might happen.  (A note up front:  there are federal laws regulating indirect costs, so the move by NIH will very likely face immediate legal challenges.  Update:  And here come the lawsuits.  Update 2:  Here is a useful explanatory video.)  Update 3:  This post is now closed (7:53pm CST 13 Feb).  When we get to the "bulldozer going to pile you lot onto the trash" level of discourse, there is no more useful discussion happening.

How does university-based sponsored technical research work in the US?  Since WWII, but particularly since the 1960s, many US universities conduct a lot of science and engineering research sponsored by US government agencies, foundations, and industry.  By "sponsored", I mean there is a grant or contract between a sponsor and the university that sends funds to the university in exchange for research to be conducted by one or more faculty principal investigators, doctoral students, postdocs, undergrads, staff scientists, etc.  When a PI writes a proposal to a sponsor, a budget is almost always required that spells out how much funding is being requested and how it will be spent.  For example, a proposal could say, we are going to study superconductivity in 2D materials, and the budget (which comes with a budget justification) says, to do this, I need $37000 per year to pay a graduate research assistant for 12 months, plus $12000 per year for graduate student tuition, plus $8000 in the first year for a special amplifier, plus $10000 to cover materials, supplies, and equipment usage fees.  Those are called direct costs.

In addition, the budget asks for funds to cover indirect costs.  Indirect costs are meant to cover the facilities and administrative costs that the university will incur doing the research - that includes things like, maintaining the lab building, electricity, air conditioning, IT infrastructure, research accountants to keep track of the expenses and generate financial reports, etc.  Indirect costs are computed as some percentage of some subset of the direct costs (e.g.,  there are no indirect costs charged on grad tuition or pieces of equipment more expensive than $5K).  Indirect cost rates have varied over the years but historically have been negotiated between universities and the federal government.  As I wrote eight years ago, "the magic (ahem) is all hidden away in OMB Circular A21 (wiki about it, pdf of the actual doc).  Universities periodically go through an elaborate negotiation process with the federal government (see here for a description of this regarding MIT), and determine an indirect cost rate for that university."  Rice's indirect cost rate is 56.5% for on-campus fed or industrial sponsored projects.  Off-campus rates are lower (if you're really doing the research at CERN, then logically your university doesn't need as much indirect).  Foundations historically try to negotiate lower indirect cost rates, often arguing that their resources are limited and paying for administration is not what their charters endorse.  The true effective indirect rate for universities is always lower than the stated number because of such negotiations.
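
To make the mechanics concrete, here is a rough sketch (my own numbers, using the hypothetical budget from a couple of paragraphs up and Rice's 56.5% on-campus rate; the precise list of exclusions varies, so this is only illustrative) of how the indirect costs on such a grant might be computed:

```python
# Illustrative indirect cost calculation for the hypothetical budget above.
# Grad tuition and the >$5K amplifier are excluded from the indirect cost base.
direct = {
    "grad RA salary":     37000,
    "grad tuition":       12000,   # excluded from the base
    "amplifier":           8000,   # equipment > $5K, excluded from the base
    "materials/supplies": 10000,
}
excluded = {"grad tuition", "amplifier"}
rate = 0.565                       # Rice on-campus federal/industrial rate

base = sum(cost for item, cost in direct.items() if item not in excluded)
indirect = rate * base
print(f"direct = ${sum(direct.values()):,}, base = ${base:,}, "
      f"indirect ~ ${indirect:,.0f}, total request ~ ${sum(direct.values()) + indirect:,.0f}")
# -> direct = $67,000, base = $47,000, indirect ~ $26,555, total ~ $93,555 per year.
```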

PIs are required to submit technical progress reports, and universities are required to submit detailed financial reports, to track these grants.  

This basic framework has been in place for decades, and it has resulted in the growth of research universities, with enormous economic and societal benefit.  Especially as industrial long term research has waned in the US (another screed I have written before), the university research ecosystem has been hugely important in contributing to modern technological society.  We would not have the internet now, for example, if not for federally sponsored research.

Is it ideal?  No.  Are there inefficiencies?  Sure.  Should the whole thing be burned down?  Not in my opinion, no.

"All universities lose money doing research."  This is a quote from my colleague who was provost when I arrived at Rice, and was said to me tongue-in-cheek, but also with more than a grain of truth.  If you look at how much it really takes to run the research apparatus, the funds brought in via indirect costs do not cover those costs.  I have always said that this is a bit like Hollywood accounting - if research was a true financial disaster, universities wouldn't do it.  The fact is that research universities have been willing to subsidize the additional real indirect costs because having thriving research programs brings benefits that are not simple to quantify financially - reputation, star faculty, opportunities for their undergrads that would not exist in the absence of research, potential patent income and startup companies, etc.

Reasonable people can disagree on what is the optimal percentage number for indirect costs.   It's worth noting that the indirect cost rate at Bell Labs back when I was there was something close to 100%.  Think about that.  In a globally elite industrial research environment, with business-level financial pressure to be frugal, the indirect rate was 100%.  

The fact is, if indirect cost rates are set too low, universities really will be faced with existential choices about whether to continue to support sponsored research.  The overall benefits of having research programs will not outweigh the large financial costs of supporting this business.

Congress has made these true indirect costs steadily higher.  Over the last decades, both because it is responsible stewardship and because it's good politics, Congress has passed laws requiring more and more oversight of research expenditures and security.  Compliance with these rules has meant that universities have had to hire more administrators - on financial accounting and reporting, research security, tech transfer and intellectual property, supervisory folks for animal- and human-based research, etc.  Agencies can impose their own requirements as well.  Some large center-type grants from NIH/HHS and DOD require preparation and submission of monthly financial reports. 

What did NIH do yesterday?  NIH put out new guidance (linked above) setting their indirect cost rate to 15% effective this coming Monday.  This applies not just to new grants, but also to awards already underway.  There is also a not very veiled threat in there that says, we have chosen for now not to retroactively go back to the start of current awards and ask for funds (already spent) to be returned to us, but we think we would be justified in doing so.   The NIH twitter feed proudly says that this change will produce an immediate savings to US taxpayers of $4B. 

What does this mean?  What are the intended and possible unintended consequences?  It seems very likely that other agencies will come under immense pressure to make similar changes.  If all agencies do so, and nothing else changes, this will mean tens of millions fewer dollars flowing to typical research universities every year.   If a university has $300M annually in federally sponsored research, then that would be generating under the old rules (assume 55% indirect rate) $194M of direct and $106M of indirect costs.  If the rate is dropped to 15% and the direct costs stay the same at $194M, then that would generate $29M of indirect costs, a net cut to the university of $77M per year.
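
The arithmetic in that example, as a quick sketch (just reproducing the numbers above, with the simplification that indirect is charged on all direct costs):

```python
# What dropping the indirect rate from ~55% to 15% means for a university with
# $300M/yr of federally sponsored research, holding direct costs fixed.
total_old = 300e6
old_rate, new_rate = 0.55, 0.15

direct = total_old / (1 + old_rate)      # ~$194M of direct costs
indirect_old = old_rate * direct         # ~$106M under the old rules
indirect_new = new_rate * direct         # ~$29M at the 15% rate
print(f"direct ~ ${direct/1e6:.0f}M, old indirect ~ ${indirect_old/1e6:.0f}M, "
      f"new indirect ~ ${indirect_new/1e6:.0f}M, "
      f"annual shortfall ~ ${(indirect_old - indirect_new)/1e6:.0f}M")
# -> ~$194M / $106M / $29M, a net cut of ~$77M per year, matching the text above.
```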

There will be legal challenges to all of this, I suspect. 

The intended consequences are supposedly to save taxpayer dollars and force universities to streamline their administrative processes.  However, given that Congress and the agencies are unlikely to lessen their reporting and oversight requirements, it's very hard for me to see how there can be some radical reduction in accounting and compliance staffs.  There seems to be a sentiment that this will really teach those wealthy elite universities a lesson, that with their big endowments they should pick up more of the costs.

One unintended consequence:  If this broadly goes through and sticks, universities will want to start charging as direct costs many expenses that have historically been covered by indirect costs.  For a grant like the one I described above, you could imagine asking for $1200 per year for electricity, $1000/yr for IT support, $3000/yr for lab space maintenance, etc.  This will create a ton of work for lawyers, as there will be a fight over what is or is not an allowable direct cost.  It will also create the need for even more accounting staff to track all of this.  This is the exact opposite of "streamlined" administrative processes.

A second unintended consequence:  Universities for which research is financially a much more marginal proposition would likely get out of those activities if they truly can't recover the costs of operating their offices of research.  This is the opposite of improving the situation and student opportunities at less elite universities.

From the purely realpolitik perspective that often appeals to legislators:  Everything that harms the US research enterprise effectively helps adversaries.  The US benefited enormously after WWII by building the premier global research environment.  Risking that should not be done lightly.

Don't panic.  There is nothing gained by freaking out.  Whatever happens, it will likely be a drawn out process.  It's best to be aware of what's happening, educated about what it means, and deliberate in formulating strategies that will preserve research excellence and capabilities.

(So help me, I really want my next post to be about condensed matter physics or nanoscale science!)

Tuesday, February 04, 2025

NSF targeted with mass layoffs, according to Politico; huge cuts in president’s budget request

According to this article at Politico, there was an all-hands meeting at NSF today (at least for the engineering directorate) where staff were told that there will be layoffs of 25-50% over the next two months.

This is an absolute catastrophe if it is accurately reported and comes to pass.  NSF is already understaffed.  This goes far beyond anything involving DEI, and is essentially a declaration that the US is planning to abrogate the federal role in supporting science and engineering research.  

Moreover, I strongly suspect that if this conversation is being had at NSF, it is likely being had at DOE and NIH.

I don't even know how to react to this, beyond encouraging my fellow US citizens to call their representatives and senators and make it clear that this would be an unmitigated disaster.

Update: looks like the presidential budget request will be for a 2/3 cut to the NSF.  Congress often goes against such recommendations, but this is certainly an indicator of what the executive branch seems to want.  


Saturday, February 01, 2025

An update, + a paper as a fun distraction

My post last week clearly stimulated some discussion.  I know people don't come here for political news, but as a professional scientist it's hard to ignore the chaotic present situation, so here are some things to read, before I talk about a fun paper:

  • Science reports on what is happening with NSF.  The short version: As of Friday afternoon, panels are delayed and funds (salary) are still not accessible for NSF postdoctoral fellows.  Here is NPR's take.
  • As of Friday afternoon, there is a new court order that specifically names the agency heads (including the NSF director), saying to disburse already approved funds according to statute.  
  • Update: The NSF is now allowing postdocs and GRF recipients to get paid; they are obeying the new court order.  See here and the FAQ specifically.
On this and a variety of other issues, it looks like we will see whether court orders actually compel action anymore.

Now to distract ourselves with dreams of the future, this paper was published in Nature Photonics, measuring radiation pressure exerted by a laser on a 50 nm thick silicon nitride membrane.  The motivation is a grand one:  using laser-powered light sails to propel interstellar probes up to a decent fraction (say 10% or more) of the velocity of light.  It's easy to sketch out the basic idea on a napkin, and it has been considered seriously for decades (see this 1984 paper).  Imagine a reflective sail, say 10 m\(^{2}\) in area and 100 nm thick.  When photons at normal incidence bounce from a reflective surface, they transfer momentum \(2\hbar \omega/c\) normal to the surface.  If the reflective surface is very thin and low mass, and you can bounce enough photons off it, you can get decent accelerations.  Part of the appeal is that this is a spacecraft where you effectively keep the engine (the whopping laser) here at home and don't have to carry it with you.  There are braking schemes so that you could try to slow the craft down when it reaches your favorite target system.
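To put rough numbers on that napkin sketch, here is a minimal estimate under made-up but plausible assumptions (the laser power, sail area, thickness, and density below are mine, not values from the paper):

```python
# Back-of-the-envelope light sail estimate.  All input values are my own
# illustrative assumptions, not parameters from the Nature Photonics paper.
c = 2.998e8            # speed of light, m/s
laser_power = 1e9      # assumed laser power actually hitting the sail, W (1 GW)
area = 10.0            # sail area, m^2
thickness = 100e-9     # sail thickness, m
density = 3.1e3        # rough density of silicon nitride, kg/m^3

# Perfect normal-incidence reflection: each photon transfers 2*hbar*omega/c of
# momentum, so a beam of power P exerts a force F = 2P/c on the sail.
force = 2 * laser_power / c              # ~ 6.7 N
sail_mass = density * area * thickness   # ~ 3 g (ignoring any payload)
acceleration = force / sail_mass         # ~ 2000 m/s^2

# Time to reach 10% of c at constant acceleration (ignores relativity,
# beam divergence, imperfect reflection, heating, ...).
time_to_tenth_c = 0.1 * c / acceleration # ~ 1.4e4 s, i.e. a few hours
print(f"force ~ {force:.1f} N, sail mass ~ {sail_mass*1e3:.1f} g, "
      f"a ~ {acceleration:.0f} m/s^2, time to 0.1c ~ {time_to_tenth_c/3600:.1f} h")
```

Even in this wildly optimistic, payload-free, perfectly reflective scenario, you need gigawatt-class laser power to keep the acceleration phase down to hours, which is part of why the engineering challenges below are so daunting.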

A laser-powered lightsail (image from CalTech)

Of course, actually doing this on a scale where it would be useful faces enormous engineering challenges (beyond building whopping lasers and operating them for years at a time with outstanding collimation and positioning).  Reflection won't be perfect, so there will be heating.  Ideally, you'd want a light sail that passively stabilizes itself in the center of the beam.  In this paper, the investigators implement a clever scheme to measure radiation forces, and they test ideas involving dielectric gratings etched into the sail to generate self-stabilization.   Definitely more fun to think about such futuristic ideas than to read the news.

(An old favorite science fiction story of mine is "The Fourth Profession", by Larry Niven.  The imminent arrival of an alien ship at earth is heralded by the appearance of a bright point in the sky, whose emission turns out to be the highly blue-shifted, reflected spectrum of the sun, bouncing off an incoming alien light sail.  The aliens really need humanity to build them a launching laser to get to their next destination.)

Friday, January 24, 2025

Turbulent times

While I've been absolutely buried under deadlines, it's been a crazy week for US science, and things are unlikely to calm down anytime soon.  As I've written before, I largely try to keep my political views off here, since that's not what people want to read from me, and I want to keep the focus on the amazing physics of materials and nanoscale systems.  (Come on, this is just cool - using light to dynamically change the chirality of crystals?  That's really nifty.)   

Still, it's hard to be silent, even just limiting the discussion to science-related issues.  Changes of presidential administrations always carry a certain amount of perturbation, as the heads of many federal agencies are executive branch appointees who serve at the pleasure of the president.  That said, the past week was exceptional for multiple reasons, including pulling the US out of the WHO as everyone frets about H5N1 bird flu; a highly disruptive freeze of activity within HHS (with lots of negative consequences even if it wraps up quickly); and the immediate purging from various agency websites of any programs or language related to DEI, with threatened punishment for employees who fail to report colleagues continuing any DEI-related activities.

Treating other people with respect, trying to make science (and engineering) welcoming to all, and trying to engage and educate the widest possible population in expanding human knowledge should not be controversial positions.  Saying that we should try to broaden the technical workforce, or that medical trials should involve women and multiple races should not be controversial positions.

What I wrote eight years ago is still true.  It is easier to break things than to build things.  Rash steps very often have lingering unintended consequences.  

Panic is not helpful.  Doomscrolling is not helpful.  Getting through challenging times requires determination, focus, and commitment to not losing one's principles.  

Ok, enough out of me.  Next week (deadlines permitting) I'll be back with some science, because that's what I do.


Saturday, January 04, 2025

This week in the arXiv: quantum geometry, fluid momentum "tunneling", and pasta sauce

Three papers caught my eye the other day on the arXiv at the start of the new year:

arXiv:2501.00098 - J. Yu et al., "Quantum geometry in quantum materials" - I hope to write up something about quantum geometry soon, but I wanted to point out this nice review even if I haven't done my legwork yet.  The ultrabrief point:  The single-particle electronic states in crystalline solids may be written as Bloch waves, of the form \(u_{n\mathbf{k}}(\mathbf{r})e^{i\mathbf{k}\cdot \mathbf{r}}\), where \(\hbar \mathbf{k}\) is the crystal momentum and \(u_{n\mathbf{k}}\) is a function with the real-space periodicity of the crystal lattice that carries an implicit \(\mathbf{k}\) dependence.  You can get very far in understanding solid-state physics without worrying about this, but it turns out that there are a number of very important phenomena that originate from the oft-neglected \(\mathbf{k}\) dependence of \(u_{n\mathbf{k}}\).  These include the anomalous Hall effect, the (intrinsic) spin Hall effect, the orbital Hall effect, etc.  Basically, the \(\mathbf{k}\) dependence of \(u_{n\mathbf{k}}\), in the form of derivatives, defines an internal "quantum" geometry of the electronic structure.  This review is a look at the consequences of quantum geometry for things like superconductivity, magnetic excitations, excitons, Chern insulators, etc. in quantum materials.
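To make "quantum geometry" a bit more concrete, the standard textbook object (not anything specific to this review) is the quantum geometric tensor of band \(n\),
\[
Q^{(n)}_{\mu\nu}(\mathbf{k}) = \langle \partial_{k_\mu} u_{n\mathbf{k}} | \left( 1 - | u_{n\mathbf{k}} \rangle \langle u_{n\mathbf{k}} | \right) | \partial_{k_\nu} u_{n\mathbf{k}} \rangle = g^{(n)}_{\mu\nu}(\mathbf{k}) - \tfrac{i}{2}\,\Omega^{(n)}_{\mu\nu}(\mathbf{k}),
\]
whose real part \(g^{(n)}_{\mu\nu}\) is the quantum metric and whose imaginary part encodes the Berry curvature \(\Omega^{(n)}_{\mu\nu}\), the quantity behind the anomalous and spin Hall responses mentioned above.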

Fig. 1 from arXiv:2501.01253
arXiv:2501.01253 - B. Coquinot et al., "Momentum tunnelling between nanoscale liquid flows" - In electronic materials there is a phenomenon known as Coulomb drag, in which a current driven through one electronic system (often a 2D electron gas) leads, through Coulomb interactions, to a current in adjacent but otherwise electrically isolated electronic system (say another 2D electron gas separated from the first by a few-nm insulating layer).  This paper argues that there should be a similar-in-spirit phenomenon when a polar liquid (like water) flows on one side of a thin membrane (like one or few-layer graphene, which can support electronic excitations like plasmons) - that this could drive flow of a polar fluid on the other side of the membrane (see figure).  They cast this in the language of momentum tunneling across the membrane, but the point is that it's some inelastic scattering process mediated by excitations in the membrane.  Neat idea.
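(For readers who haven't met Coulomb drag before, the usual textbook figure of merit - not something specific to this paper - is the drag resistivity,
\[
\rho_{D} \equiv \frac{E_{2}}{j_{1}},
\]
the electric field appearing in the passive layer per unit current density driven in the active layer, with sign conventions varying by author; roughly speaking, the fluid analog here would be the ratio of induced flow in the passive channel to driven flow in the active one.)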

arXiv:2501.00536 - G. Bartolucci et al., "Phase behavior of Cacio and Pepe sauce" - Cacio e pepe is a wonderful Italian pasta dish with a sauce made from pecorino cheese, pepper, and hot pasta cooking water that contains dissolved starch.  When prepared well, it's incredibly creamy, smooth, and satisfying.  The authors here perform a systematic study of the sauce properties as a function of temperature and starch concentration relative to cheese content, finding the part of parameter space to avoid if you don't want the sauce to "break" (condensing out clumps of cheese-rich material and ruining the sauce texture).  That's cool, but what is impressive is that they are actually able to model the phase stability mathematically and come up with a scientifically justified version of the recipe.  Very fun.