Wednesday, November 18, 2020

Hard condensed matter can be soft, too.

In the previous post, I mentioned that one categorization of "soft" condensed matter covers systems where quantum mechanics (beyond holding atoms together, etc.) is unimportant.  In that framing, "hard" condensed matter looks at systems where \(\hbar\) figures prominently, in the form of quantum many-body physics.  By that labeling, strongly interacting quantum materials are the "hardest" systems out there, with entanglement, tunneling, and quantum fluctuations leading to rich phenomena. 

Orientation textures in a liquid crystal, from wikipedia
Interestingly, in recent years it has become clear that these hard CM systems can end up having properties that are associated with some soft condensed matter systems.  For instance, liquid crystals are canonical soft matter systems.  As I'd explained long ago here, liquid crystals are fluids made up of objects with some internal directionality (e.g., a collection of rod-like molecules, where one can worry about how the rods are oriented in addition to their positions).  Liquid crystals can have a variety of phases, including ones where the system spontaneously picks out a direction and becomes anisotropic.  It turns out that sometimes the electronic fluid in certain conductors can spontaneously do this as well, acting in some ways like a nematic liquid crystal.  A big review of this is here.  One example of this occurs in 2D electronic systems in high magnetic fields in the quantum Hall regime; see here for theory and here for a representative experiment.  Alternately, see here for an example in a correlated oxide at the cusp of a quantum phase transition.
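To make "spontaneously picks out a direction and becomes anisotropic" a bit more concrete, here is a minimal sketch (mine, not from any of the linked papers) of how one quantifies nematic order for a collection of rod-like objects: build the traceless tensor \(Q_{ij} = \langle \tfrac{3}{2} u_i u_j - \tfrac{1}{2}\delta_{ij} \rangle\) from the rod orientations \(\mathbf{u}\); its largest eigenvalue is the order parameter \(S\) (0 for an isotropic fluid, 1 for perfect alignment), and the corresponding eigenvector is the director.  The function name and test distribution below are just illustrative.

```python
# A minimal sketch (not from the post) of the standard nematic order parameter.
import numpy as np

def nematic_order(orientations):
    """orientations: (N, 3) array of rod directions; the sign of each rod is irrelevant."""
    u = orientations / np.linalg.norm(orientations, axis=1, keepdims=True)
    # Traceless symmetric Q tensor: Q_ij = <(3/2) u_i u_j - (1/2) delta_ij>.
    Q = 1.5 * np.einsum('ni,nj->ij', u, u) / len(u) - 0.5 * np.eye(3)
    vals, vecs = np.linalg.eigh(Q)
    S = vals[-1]                 # largest eigenvalue: 0 (isotropic) to 1 (aligned)
    director = vecs[:, -1]       # the spontaneously chosen axis
    return S, director

# Rods mostly along z with some angular spread -> S close to 1
rng = np.random.default_rng(1)
rods = rng.normal(loc=[0.0, 0.0, 1.0], scale=0.2, size=(1000, 3))
print(nematic_order(rods))
```

Roughly speaking, the electronic nematics in the review linked above are characterized in an analogous spirit, with the anisotropy of transport (the resistivity tensor) playing the role of literal rod orientations.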

Another example:  hydrodynamics is definitely part of the conventional purview of soft condensed matter.   In recent years, however, it has become clear that there are times when the electronic fluid can also be very well-described by math that is historically the realm of classical fluids.   This can happen in graphene, or in more exotic Weyl semimetals, or perhaps in the exotic "strange metal" phase.  In the last of those, this is supposed to happen when the electrons are in such a giant, entangled, many-body situation that the quasiparticle picture doesn't work anymore.  
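Schematically (my shorthand, not an equation from those papers), the hydrodynamic regime is one where the electron drift velocity \(\mathbf{v}\) obeys a Navier-Stokes-like equation,

\[ m n\left[\partial_t \mathbf{v} + (\mathbf{v}\cdot\nabla)\mathbf{v}\right] \approx -ne\mathbf{E} + \eta \nabla^2 \mathbf{v} - \frac{mn}{\tau}\,\mathbf{v}, \]

where \(\eta\) plays the role of an electronic viscosity (set by momentum-conserving electron-electron collisions) and \(\tau\) is the momentum-relaxation time due to impurities and phonons.  When the viscous term dominates the \(1/\tau\) term, current flow in a narrow channel develops a Poiseuille-like profile, just as for a classical fluid in a pipe.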

Interesting that the hardest of hard condensed matter systems can end up having emergent properties that look like those of soft matter.

Saturday, November 14, 2020

Soft matter is hard!

This great article by Randall Munroe from the NY Times this week brings up, in its first illustration (reproduced here), a fact that surprises me on some level every time I really stop to think about it:  The physics of "soft matter", in this case the static and dynamic properties of sand, is actually very difficult, and much remains poorly understood.  


"Soft" condensed matter typically refers to problems involving solid, liquids, or mixed phases in which quantum mechanics is comparatively unimportant - if you were to try to write down equations modeling these systems, those equations would basically be some flavor of classical mechanics ("h-bar = 0", as some would say).  (If you want to see a couple of nice talks about this field, check out this series and this KITP talk.)  This encompasses the physics of classical fluids, polymers, and mixed-phase systems like ensembles of hard particles plus gas (sand!), suspensions of colloidal particles (milk, cornstarch in water), other emergent situations like the mechanical properties of crumping paper.  (Soft matter also is sometimes said to encompass "active matter", as in living systems, but it's difficult even without that category.)

Often, soft matter problems sound simple.  Take a broom handle, stick it a few inches into dry sand, and try to drag the handle sideways.  How much force does it take to move the handle at a certain speed?  This problem only involves classical mechanics.  Clearly the dominant relevant forces are gravity acting on the sand grains, friction between the grains, and the "normal force" that is the hard-core repulsion preventing sand grains from passing through each other or through the broom handle.  Maybe we need to worry about the interactions between the sand grains and the air in the spaces between grains.  Still, all of this sounds like something that should have been solved by a French mathematician in the 18th or 19th century - one of those people with a special function or polynomial named after them.  And yet, these problems are simultaneously extremely important for industrial purposes, and very difficult.
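To see why "only classical mechanics" is cold comfort, here is a toy sketch (mine, with made-up parameter values) of the standard soft-sphere, discrete-element way of encoding the grain-scale ingredients above: gravity, hard-core repulsion modeled as a stiff spring on overlap, and dissipation at contacts.  Even this toy version leaves out tangential (static) friction - exactly the piece that makes real granular problems so hard - and a realistic simulation of a dragged broom handle needs vastly more grains and far more careful contact models.

```python
# A toy 2D soft-sphere ("discrete element") sketch of the ingredients named above.
# All parameter values are illustrative guesses, not from the post.
import numpy as np

N = 20                       # number of grains
r = 0.5e-3                   # grain radius (m), ~0.5 mm sand
rho = 2600.0                 # grain density (kg/m^3)
m = rho * (4.0 / 3.0) * np.pi * r**3
k_n = 1e4                    # normal contact stiffness (N/m)
gamma = 5e-3                 # contact damping (kg/s) -- stands in for dissipation;
                             # tangential / static friction is omitted entirely here
g = np.array([0.0, -9.8])
dt = 1e-6                    # must resolve the contact time ~ sqrt(m / k_n)

# start the grains on a loose grid above a floor at y = 0 and let them settle
pos = np.array([[(i % 5) * 2.2 * r, (i // 5) * 2.2 * r + 1.5 * r]
                for i in range(N)], dtype=float)
vel = np.zeros((N, 2))

def forces(pos, vel):
    F = np.tile(m * g, (N, 1))                 # gravity on every grain
    for i in range(N):
        for j in range(i + 1, N):
            d = pos[j] - pos[i]
            dist = np.linalg.norm(d)
            overlap = 2.0 * r - dist
            if overlap > 0.0:                  # grains in contact: spring + damping
                n = d / dist
                vrel = np.dot(vel[j] - vel[i], n)
                fn = k_n * overlap - gamma * vrel
                F[i] -= fn * n
                F[j] += fn * n
        if pos[i, 1] < r:                      # crude floor contact
            F[i, 1] += k_n * (r - pos[i, 1]) - gamma * vel[i, 1]
    return F

F = forces(pos, vel)
for step in range(2000):                       # velocity-Verlet time stepping
    vel += 0.5 * dt * F / m
    pos += dt * vel
    F = forces(pos, vel)
    vel += 0.5 * dt * F / m
```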

A key issue is that many soft matter systems are hindered - the energy scales required to reshuffle their constituents (e.g., move grains of sand around and past each other) can be far larger than what's available from thermal fluctuations.  So, configurations get locked in, kinetically hung up or stuck.  This can mean that the past history of the system can be very important, in the sense that the system can get funneled into some particular configuration and then be unable to escape, even if that configuration isn't something "nice", like one that globally minimizes energy.  
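A back-of-the-envelope number (mine, not from the article) makes the point: lifting a typical dry sand grain (radius \(\sim 0.5\) mm, density \(\sim 2.6\) g/cm\(^3\), so mass \(\sim 1.4\times 10^{-6}\) kg) by one grain diameter costs

\[ \Delta E \sim m g d \approx (1.4\times 10^{-6}\,\mathrm{kg})(9.8\,\mathrm{m/s^2})(10^{-3}\,\mathrm{m}) \approx 1.4\times 10^{-8}\,\mathrm{J}, \]

while the thermal energy scale at room temperature is \(k_{\mathrm{B}}T \approx 4\times 10^{-21}\,\mathrm{J}\) - more than twelve orders of magnitude smaller.  Thermal motion is simply never going to reshuffle a sand pile the way it reshuffles molecules in a liquid.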

A message that I think is underappreciated:  Emergent dynamic properties, not obvious at all from the building blocks and their simple rules, can happen in such soft matter systems (e.g., oscillons and creepy non-Newtonian fluids), and are not just the province of exotic quantum materials.  Collective responses from many interacting degrees of freedom - this is what condensed matter physics is all about.

Sunday, November 08, 2020

Recently on the arxiv

A couple of papers caught my eye recently on the arxiv, when I wasn't preoccupied with the presidential election, the pandemic, or grant writing:

arxiv:2010.09986 - Zhao et al., Determination of the helical edge and bulk spin axis in quantum spin Hall insulator WTe2
Monolayer tungsten ditelluride is a quantum spin Hall insulator, meaning that the 2D "bulk" of a flake of the material is an insulator at low temperatures, while there are supposed to be helical edge states that run around the perimeter of the flake.  Because of spin-momentum locking, the preferred spin orientation of the electrons in those edges should be fixed, but that spin doesn't have to point perpendicular to the plane of the flake.  In this work, highly detailed transport measurements determine experimentally the orientation of that preferred direction.

arxiv:2011.01335 - Hossain et al., Observation of Spontaneous Ferromagnetism in a Two-Dimensional Electron System
For many years, people have been discussing the ground state of a dilute 2D layer of electrons in the limit of low density and a very clean system.  This system is ideal for demonstrating one of the most unintuitive consequences of the Pauli Principle:  As the electron density is lowered, and thus the average spacing between electrons increases, electron-electron interactions actually become increasingly dominant.  These investigators, working with electrons in an essentially 2D AlAs layer, show (through hysteresis in the electronic resistance as a function of applied magnetic field) the onset of ferromagnetism at sufficiently low electron densities.  
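The scaling argument behind that counterintuitive statement is short (standard textbook reasoning, not specific to this paper).  In 2D the Fermi wavevector goes as \(k_F \propto n^{1/2}\), so

\[ E_{\mathrm{kin}} \sim \frac{\hbar^2 k_F^2}{2m} \propto n, \qquad E_{\mathrm{Coul}} \sim \frac{e^2}{4\pi\epsilon\epsilon_0 \langle r \rangle} \propto n^{1/2}, \]

since the typical electron spacing is \(\langle r \rangle \sim n^{-1/2}\).  The ratio \(E_{\mathrm{Coul}}/E_{\mathrm{kin}} \propto n^{-1/2}\) (essentially the dimensionless \(r_s\) parameter) therefore grows without bound as the density is lowered - dilute means strongly interacting, and at low enough density interaction-driven states like ferromagnetism (or even Wigner crystallization) can win.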

arxiv:2011.02500 - Rodan-Legrain et al., Highly Tunable Junctions and Nonlocal Josephson Effect in Magic Angle Graphene Tunneling Devices
Over the last couple of years, it's become clear that "magic angle" twisted bilayer graphene is pretty remarkable.  It's a superconductor.  It's an orbital magnet.  It's a correlated insulator.  It's a floor wax and a dessert topping.  Here, the authors demonstrate that it is possible to make devices with this material that are sufficiently free of disorder that they can be tuned into a wide variety of structures - Josephson junctions, single-electron transistors, etc.  Pretty remarkable.


Sunday, November 01, 2020

Science, policy-making, and the right thing to do

I know people don't read this blog for politics, but the past week has seen a couple of very unusual situations, and I think it's worth having a discussion of science, its role in policy-making, and the people who work on these issues at the highest levels.   (If you want, view this as writing-therapy for my general anxiety and move on.)

As a political reality, it's important to understand that science does not, itself, make policy.  Public policy is complicated and messy because it involves people, who as a rule are also complicated and messy. Deciding to set fuel standards for non-electric cars to 200 miles per gallon beginning next year and requiring that the fuel all be made from biological sources would be extremely bold, but it would also be completely unworkable and enormously disruptive.  That said, when policy must be made that has a science and technology aspect, it's critical that actual scientific and engineering knowledge be presented at the table.  If science isn't in the room where it happens, then we can make bad situations worse.  (It's been one of the great privileges of my career to have had the chance to meet and interact with some of the people who have worked on policy.  One of the most depressing aspects of the past four years has been the denigration of expertise, the suggestion that no one with detailed technical knowledge can be trusted because they're assumed to be on the make.)  The pandemic has shined a spotlight on this, as well as showing the (also complicated and messy) scientific process of figuring out how the disease works.

A million years ago at the beginning of this week, the White House Office of Science and Technology Policy put out a press release, linking to a detailed report (pdf), about their science and technology accomplishments over the last four years.  The top highlight in the press release was "Ending the pandemic".  That language doesn't appear anywhere in the actual report, but it sure shows up front and center in the press release.  After this was met with, shall we say, great skepticism (almost 100,000 cases per day, about 1000 deaths per day doesn't sound like an ending to this), the administration walked it back, saying the release was "poorly worded".  The question that comes to mind:  How can Kelvin Droegemeier, the presidential science advisor and head of OSTP, continue in that position?  There is essentially zero chance that he approved that press release language.  It must have been added after he and OSTP staff produced and signed off on the report, and therefore it was either over his objections or without his knowledge.  Either way, under ordinary circumstances that would be the kind of situation that leads to an offer of resignation.  

In a weird complement to this, yesterday evening, Dr. Anthony Fauci gave an interview to the Washington Post, in which he stated a number of points with great frankness, including his opinion that the pandemic was in a very dangerous phase and that he disagreed in the strongest terms with Dr. Scott Atlas.  Dr. Atlas has seemingly become the chief advisor to the administration on the pandemic, despite holding views at odds with those of a large number of public health experts.  In the same Post article, the White House takes Dr. Fauci to task for airing his grievances publicly.  Again, the question comes to mind:  How can Dr. Fauci continue to serve on the coronavirus policy task force, when he clearly disagrees with how this is being handled?

As I alluded to back in late 2016, these situations remind me of this book, The Dilemmas of an Upright Man, about Max Planck and his difficult decision to remain in Germany and try to influence German science from within during WWII.  His rationale was that it was much better for German science if he stayed there, where he thought he could at least be a bit of a moderating influence, than for him to be completely outside the system.  

There are no easy answers here about the right course of action - to quit on principle when that might lead to more chaos, or to try to exert influence from within even in the face of clear evidence that such influence is minimal at best.  What I do know is that we face a complicated world filled with myriad challenges, and that science and engineering know-how is going to be needed in any credible effort to surmount those problems.  The cost of ignoring, or worse, actively attacking technical expertise is just too high.

Saturday, October 24, 2020

Silicon nanoelectronics is a truly extraordinary achievement.

Arguably the greatest technical and manufacturing achievement in all of history is around us all the time, supporting directly or indirectly a huge fraction of modern life, and the overwhelming majority of people don't give it a second's thought.  

I'm talking about silicon nanoelectronics (since about 2003, "microelectronics" is no longer an accurate description).  As I was updating notes for a class I'm teaching, the numbers really hit me.  A high-end microprocessor these days (say the AMD "Epyc" Rome) contains 40 billion transistors in a chip about 3 cm on a side.  These essentially all work properly, for many years at a time.  (Chips rarely die - power supplies and bad electrolytic capacitors are much more common causes of motherboard failure.)  No other manufacturing process of components for any product comes close to the throughput and reliability of transistors.  

The transistors on those chips are the culmination of many person-years of research.  They're FinFETs, made using what is labeled the 7 nm process.  Remember, transistors are switches, with the current flow between the source and drain electrodes passing through a channel the conductance of which is modulated by the voltage applied to a gate electrode.  The active channel length of those transistors, the distance between the source and drain, is around 16 nm, or about 50 atoms (!).  The positioning accuracy required for the lithography steps (when ultraviolet light and photochemistry are used to pattern the features) is down to about 3 nm.  These distances are controlled accurately across a single-crystal piece of silicon the size of a dinner plate.  That silicon is pure at the level of one atom out of every 10 trillion (!!).  
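Here's a quick sanity check of those numbers (my own arithmetic, using the textbook atomic density of crystalline silicon, about \(5\times 10^{22}\) atoms/cm\(^3\)):

```python
# Back-of-the-envelope check of the numbers above, using the textbook atomic
# density of crystalline silicon (~5e22 atoms per cubic centimeter).
n_Si = 5.0e22                                    # atoms per cm^3
spacing_nm = (1.0 / n_Si) ** (1.0 / 3.0) * 1e7   # mean interatomic spacing, ~0.27 nm
channel_nm = 16.0                                # quoted active channel length
print(channel_nm / spacing_nm)                   # ~ 50-60 atoms end to end

purity = 1e-13                                   # one impurity per ten trillion Si atoms
atoms_per_um3 = n_Si * 1e-12                     # 1 cubic micron = 1e-12 cm^3
print(atoms_per_um3 * purity)                    # ~ 0.005 impurity atoms per cubic micron
```

In other words, at that purity level there are only about five stray non-silicon atoms per thousand cubic microns of crystal.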

This is not an accident.  It's not good fortune.  Science (figuring out the rules of the universe) and engineering (applying those rules to accomplish a task or address a need) have given us this (see here and here).  It's the result of an incredible combination of hard-earned scientific understanding, materials and chemistry acumen, engineering optimization, and the boot-strapping nature of modern technology (that is, we can do this kind of manufacturing because we have advanced computational tools for design, control, and analysis, and we have those tools because of our ability to do this kind of manufacturing.)   

This technology would look like literal magic to someone from any other era of history - that's something worth appreciating.

Thursday, October 15, 2020

Emergent monopoles

One of the truly remarkable things about condensed matter physics is the idea that, from a large number of interacting particles that obey comparatively simple rules, there can emerge new objects  (in the sense of having well-defined sets of parameters like mass, charge, etc.) with properties that are not at all obviously related to those of the original constituents.   (One analogy I like:  Think about fans in a sports stadium doing The Wave.  The propagating wave only exists because of the cooperative response of thousands of people, and its spatial extent and propagation speed are not obviously related to the size of individual fans.)

A fairly spectacular example of this occurs in materials called spin ices, insulating materials that have unusual magnetic properties. A prime example is Dy2Ti2O7.  The figure here shows a little snippet of the structure.  The dysprosium atoms (which end up having angular momentum \(J = 15/2\), very large as compared to a spin-1/2 electron) sit at the corners of corner-sharing tetrahedra.  It's a bit hard to visualize, but the centers of those tetrahedra form the same spatial pattern as the locations of carbon atoms in a diamond crystal.  Anyway, because of some rather deep physics ("crystal field effects"), the magnetic moment of each Dy is biased to point either radially inward toward or radially outward from the center of the tetrahedron.  Moreover, because of interactions between the magnetic moments, it is energetically favorable for each tetrahedron to have two moments (shown as little arrows) pointing inward and two moments pointing outward.  This is the origin of the "ice" part of the name, since this two-in/two-out rule is the same thing seen in ordinary water ice, where each oxygen atom is coordinated by four hydrogen atoms, two strongly (closer, covalently bound) and two more weakly (farther away, hydrogen bonding).  The spin ice ordering in this material really kicks in at low temperatures, below 1 K.  
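A minimal sketch (mine, not from the post) of why two-in/two-out wins in the simplest nearest-neighbor model: label each of the four moments on a tetrahedron by \(\sigma = +1\) (pointing out) or \(-1\) (pointing in).  The nearest-neighbor interaction then makes the energy of a single tetrahedron proportional to \((\sum_i \sigma_i)^2\) up to a constant, so the six two-in/two-out states are the degenerate ground states.  The coupling value below is purely illustrative.

```python
# A minimal check (not from the post) of the ice rule in the nearest-neighbor model.
from itertools import product

J_eff = 1.0   # effective coupling between moments on one tetrahedron (illustrative units)

def tetrahedron_energy(sigmas):
    # sigma = +1 means "out", -1 means "in"; the sum over the 6 pairs on one
    # tetrahedron is ((sum sigma)^2 - 4) / 2, so the energy is minimized when
    # the sigmas sum to zero, i.e., two-in/two-out.
    total = sum(sigmas)
    return 0.5 * J_eff * (total**2 - len(sigmas))

levels = {}
for config in product([+1, -1], repeat=4):
    levels.setdefault(tetrahedron_energy(config), []).append(config)

for E in sorted(levels):
    n_out = sorted({c.count(+1) for c in levels[E]})
    print(f"E = {E:+.1f}: {len(levels[E])} states, moments pointing out: {n_out}")
# -> the lowest level has 6 states, all with exactly 2 moments out (the ice rule);
#    the next level (3-in/1-out or 1-in/3-out) hosts the monopole excitations.
```

The defect states in the next level up are exactly the 3-out/1-in and 1-out/3-in tetrahedra discussed next.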

So, what happens at rather warmer temperatures, say between 2 K and 15 K?  The lowest energy excitations of this system act like magnetic monopoles (!).  Now, except for the fact that electrical charge is quantized, there is no direct evidence for magnetic monopoles (isolated north and south poles that would interact with a Coulomb-like force law) in free space.  In spin ice, though, you can create an effective monopole/antimonopole pair by flipping some moments so that one tetrahedron is 3-out/1-in, and another is 1-out/3-in, as shown at right.  You can "connect" the monopole to the antimonopole by following a line of directed magnetic moments - this is a topological constraint, in the sense that you can see how having multiple m/anti-m pairs could interfere with each other.  This connection is the analog of a Dirac string (where you can think of the m/anti-m pair as opposite ends of an infinitesimally skinny solenoid).  

This is all fun to talk about, but is there really evidence for these emergent monopoles?  Yes.  A nice very recent review of the subject is here.  There are a variety of experiments (starting with magnetization and neutron scattering and ending up with more sophisticated measurements like THz optical properties and magnetic flux noise experiments looking at m/anti-m generation and recombination) that show evidence for monopoles and their interactions.  (full disclosure:  I have some thoughts on fun experiments to do in these and related systems.)  It's also possible to make two-dimensional arrays of nanoscale ferromagnets that can mimic these kinds of properties, so-called artificial spin ice.  This kind of emergence, when you can end up with excitations that act like exotic, interacting, topologically constrained (quasi)particles that seemingly don't exist elsewhere, is something that gets missed if one takes a reductionist view of physics.

Wednesday, October 14, 2020

Room temperature superconductivity!

As many readers know, the quest for a practical room temperature superconductor has been going on basically ever since Kamerlingh Onnes discovered superconductivity over 100 years ago.  If one could have superconductivity with high critical currents and high critical fields in a material that could readily be made into wires, for example, it would be absolutely transformative to the world.  (Just one example:  we lose 5-10% of generated electricity just in transmission lines due to resistive heating.)  

One exotic possibility suggested over 50 years ago by Neil Ashcroft (of textbook fame in addition to his scientific prestige) was that highly compressed metallic hydrogen could be a room temperature superconductor.  The basic ingredients for traditional superconductivity would be a high electronic density of states, light atoms (and hence a high sound speed, favorable for phonon-based pairing), and a strong electron-phonon coupling.  
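In the simplest weak-coupling (BCS) picture, those ingredients appear explicitly in the transition temperature,

\[ k_{\mathrm{B}} T_c \approx 1.13\,\hbar\omega_{\mathrm{D}}\, e^{-1/N(0)V}, \]

where the Debye frequency \(\omega_{\mathrm{D}}\) is large for light atoms and stiff bonds, \(N(0)\) is the electronic density of states at the Fermi level, and \(V\) parametrizes the electron-phonon attraction.  (Quantitative predictions for the hydrides use more sophisticated Migdal-Eliashberg-type calculations, but the qualitative logic is the same: light atoms plus strong coupling can push \(T_c\) very high.)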

In recent years, there have been striking advances in hydrogen-rich compounds with steadily increasing superconducting transition temperatures, including H2S (here and here) and LaH10 (here and here), all requiring very high (200+ GPa) pressures obtained in diamond anvil cells.  In those cool gadgets, tiny sample volumes are squeezed between the facets of cut gemstone-quality diamonds, and there is a great art in making electronic, optical, and magnetic measurements of samples under extreme pressures. 

Today, a new milestone has been reached and published.  Using these tools, the investigators (largely at Rochester) put some carbon-, sulphur-, and hydrogen-containing compounds in the cell, zapped them with a laser to do some in situ chemistry, and measured superconductivity with a transition temperature up to 287.7 K (!) at a pressure of 267 GPa (!!).  The evidence for superconductivity is both a resistive transition to (as near as can be seen) zero resistance, and an onset of diamagnetism (as seen through ac susceptibility).  

This is exciting, and a milestone, though of course there are many questions:  What is the actual chemical compound at work here?  How does superconductivity work - is it conventional or more exotic? Is there any pathway to keeping these properties without enormous externally applied pressure?  At the very least, this shows experimentally what people have been saying for a long time, that there is no reason in principle why there couldn't be room temperature (or above) superconductivity.