
Saturday, November 28, 2020

Brief items

 Several items of note:

  • Quanta Magazine remains a generally outstanding source of science articles and opinion pieces.  In this opinion column, high energy theorist Robbert Dijkgraaf gives his views on whether we are reaching "the end of physics".  Spoilers:  he thinks not, and condensed matter physics, with its emergence of remarkable phenomena from underlying building blocks, is one reason.
  • Similarly, I should have pointed earlier to this interesting article by Natalie Wolchover, who asked a number of physicists to define what they mean by "a particle".  I understand the mathematical answer ("a particle is an irreducible representation of the Poincaré group", meaning that it's an object defined by particular numbers describing how it changes, or doesn't, under translations in time and space and under rotations).  That also lends itself to a nice definition of a quasiparticle (such an object, but one that results from the collective action of underlying degrees of freedom, rather than existing in the universe's vacuum).  As an experimentalist, though, I confess a fondness for other perspectives.
  • Springer Nature has released its approach for handling open access publication.  I don't think I'm alone in thinking that its fee structure is somewhere between absurd and obscene.  It's simply absurd to think that the funding agencies (at least in the US) are going to allow people to budget €9,500 for a publication charge.  That's equivalent to four months of support for a graduate student in my discipline.  Moreover, the publisher is proposing to charge a non-refundable €2,190 fee just to have a manuscript evaluated for "guided open access" at Nature Physics.  Not that I lack confidence in the quality of my group's work, but how could I possibly justify spending that much for a 75% probability of a rejection letter?  Given that they do not pay referees, am I really supposed to believe that finding referees, soliciting reviews, and tracking manuscript progress costs the publisher €2,190 per submission? 
  • It's older news, but this approach to computation is an interesting one.  Cerebras is implementing neural networks in hardware, and they are doing this through wafer-scale processors (!) with trillions (!) of transistors and hundreds of thousands of cores.  There must be some impressive fault tolerance built into their network training approach, because otherwise I'd be surprised if even the amazing manufacturing reliability of the semiconductor industry could produce a decent yield of these processors.
  • Older still, one of my colleagues brought this article to my attention, about someone trying to come up with a way to play grandmaster-level chess after only a short period of preparation.  I don't buy into the hype, but it was an interesting look at how easy it now seems to be to pick up machine learning coding skills.  (Instead of deeply studying chess, the idea was to find a compact decision algorithm - one a person could memorize and evaluate mentally - by training a machine learning system against a dedicated chess engine.)

Wednesday, November 25, 2020

Statistical mechanics and Thanksgiving

Many books and articles have been written about the science of cooking and why different cooking methods work the way they do.  (An absolute favorite: J. Kenji López-Alt's work.  Make sure to watch his YouTube videos.)  Often the answers involve chemistry, as many reactions take place during cooking, including the Maillard reaction (the browning that comes from reactions between sugars and amino acids, generating enormous flavor) and the denaturing of proteins (the reason eggs solidify when hard-boiled or scrambled over heat).  Sometimes the answers involve biology, as in fermentation.

Occasionally, though, the real hero of the story is physics, in particular statistical mechanics.  Tomorrow is the Thanksgiving holiday in the US, and this traditionally involves cooking a turkey.  A technique gaining popularity is dry brining.  This oxymoronic name really means applying salt (often mixed with sugar, pepper, or other spices) to the surfaces of a piece of meat (say a turkey) and letting the salted meat sit in a refrigerated environment for a day or two prior to cooking.  What does this do?  

In statistical mechanics, we learn (roughly speaking) that systems approach equilibrium macroscopic states that correspond to the largest number of microscopic arrangements of the constituents.  Water is able to diffuse in and out of cells at some rate, as are solvated ions like Na⁺ and Cl⁻.  Once salt is on the turkey's surface, we have a non-equilibrium situation (well, at least a more severe one than before):  there are many more (by many orders of magnitude) ways to arrange the water molecules and ions now, such that some of the ions are inside the cells and some of the water is outside, solvating the salt.  The result is osmosis, and over the timescale of the dry brining, the moisture and salt ions redistribute themselves.  (The salt also triggers reactions in the cells that break down some proteins, but that's chemistry, not physics.)  After cooking, the result is supposed to be a more flavorful, tender meal.
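
To put a rough number on that entropic driving force, here is a minimal back-of-the-envelope sketch (my own illustration, not something from the cooking literature) using the dilute-solution van 't Hoff relation; the salt concentrations below are assumed values, not measurements of an actual turkey.

```python
# Back-of-the-envelope sketch: the entropic driving force behind osmosis,
# estimated with the dilute-solution van 't Hoff relation, pi = c * k_B * T,
# where c is the number density of dissolved particles (Na+ and Cl- count
# separately).  All concentrations here are assumed values for illustration.

k_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro's number, 1/mol
T = 277.0            # refrigerator temperature (~4 C), K

def osmotic_pressure(molarity_nacl):
    """van 't Hoff estimate of osmotic pressure (Pa) for fully dissociated NaCl."""
    ions_per_formula = 2                                  # Na+ and Cl-
    c = ions_per_formula * molarity_nacl * 1000.0 * N_A   # particles per m^3
    return c * k_B * T

# Assumed: a nearly saturated surface brine (~6 mol/L) vs. a rough
# physiological interior (~0.15 mol/L equivalent salt).
surface = osmotic_pressure(6.0)
interior = osmotic_pressure(0.15)
print(f"surface brine: ~{surface / 1e5:.0f} bar, cell interior: ~{interior / 1e5:.1f} bar")
```

The precise numbers aren't the point; the point is that the concentration mismatch (hundreds of bar versus a few bar in this crude estimate) is an enormous thermodynamic push toward redistributing water and ions over that day or two in the refrigerator.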

So among the things for which to be thankful, consider the unlikely case of statistical mechanics.

(For a fun look at osmosis (!), try this short story if you can find it.)

Wednesday, November 18, 2020

Hard condensed matter can be soft, too.

In the previous post, I mentioned that one categorization of "soft" condensed matter is for systems where quantum mechanics is (beyond holding atoms together, etc.) unimportant.  In that framing, "hard" condensed matter looks at systems where \(\hbar\) figures prominently, in the form of quantum many-body physics.  By that labeling, strongly interacting quantum materials are the "hardest" systems out there, with entanglement, tunneling, and quantum fluctuations leading to rich phenomena. 

[Figure: orientation textures in a liquid crystal, from Wikipedia]
Interestingly, in recent years it has become clear that these hard CM systems can end up having properties that are associated with some soft condensed matter systems.  For instance, liquid crystals are canonical soft matter systems.  As I'd explained long ago here, liquid crystals are fluids made up of objects with some internal directionality (e.g., a collection of rod-like molecules, where one can worry about how the rods are oriented in addition to their positions).  Liquid crystals can have a variety of phases, including ones where the system spontaneously picks out a direction and becomes anisotropic.  It turns out that sometimes the electronic fluid in certain conductors can spontaneously do this as well, acting in some ways like a nematic liquid crystal.  A big review of this is here.  One example of this occurs in 2D electronic systems in high magnetic fields in the quantum Hall regime; see here for theory and here for a representative experiment.  Alternately, see here for an example in a correlated oxide at the cusp of a quantum phase transition.
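
To make "spontaneously picks out a direction" concrete, here is a minimal numerical sketch (my own illustration, not taken from any of the linked papers) of the standard nematic order parameter for rod-like objects in 2D:

```python
import numpy as np

def nematic_order(angles):
    """2D nematic order parameter for rod orientation angles (radians).

    Builds the traceless tensor Q_ij = <2 n_i n_j - delta_ij>; its largest
    eigenvalue is 0 for an isotropic fluid and 1 for perfectly aligned rods.
    Rods pointing at theta and theta + pi count as the same orientation.
    """
    n = np.column_stack([np.cos(angles), np.sin(angles)])   # unit vectors along each rod
    Q = 2.0 * np.einsum('ai,aj->ij', n, n) / len(angles) - np.eye(2)
    return np.linalg.eigvalsh(Q).max()

rng = np.random.default_rng(0)
isotropic = rng.uniform(0.0, np.pi, 10000)   # orientations scattered everywhere
aligned = rng.normal(0.0, 0.1, 10000)        # orientations clustered near the x axis
print(f"isotropic: {nematic_order(isotropic):.2f}")      # ~0: no preferred direction
print(f"nearly aligned: {nematic_order(aligned):.2f}")   # ~1: a direction has been picked
```

An electronic nematic is the same idea, except the "rods" are replaced by an anisotropy of the electron fluid itself, showing up in measurements like a resistivity that differs along nominally equivalent crystal directions.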

Another example:  hydrodynamics is definitely part of the conventional purview of soft condensed matter.   In recent years, however, it has become clear that there are times when the electronic fluid can also be very well-described by math that is historically the realm of classical fluids.   This can happen in graphene, or in more exotic Weyl semimetals, or perhaps in the exotic "strange metal" phase.  In the last of those, this is supposed to happen when the electrons are in such a giant, entangled, many-body situation that the quasiparticle picture doesn't work anymore.  

Interesting that the hardest of hard condensed matter systems can end up having emergent properties that look like those of soft matter.

Saturday, November 14, 2020

Soft matter is hard!

This great article by Randall Munroe from the NY Times this week brings up, in its first illustration (reproduced here), a fact that surprises me on some level every time I really stop to think about it:  The physics of "soft matter", in this case the static and dynamic properties of sand, is actually very difficult, and much remains poorly understood.  


"Soft" condensed matter typically refers to problems involving solid, liquids, or mixed phases in which quantum mechanics is comparatively unimportant - if you were to try to write down equations modeling these systems, those equations would basically be some flavor of classical mechanics ("h-bar = 0", as some would say).  (If you want to see a couple of nice talks about this field, check out this series and this KITP talk.)  This encompasses the physics of classical fluids, polymers, and mixed-phase systems like ensembles of hard particles plus gas (sand!), suspensions of colloidal particles (milk, cornstarch in water), other emergent situations like the mechanical properties of crumping paper.  (Soft matter also is sometimes said to encompass "active matter", as in living systems, but it's difficult even without that category.)

Often, soft matter problems sound simple.  Take a broom handle, stick it a few inches into dry sand, and try to drag the handle sideways.  How much force does it take to move the handle at a certain speed?  This problem involves only classical mechanics.  Clearly the dominant relevant forces are gravity acting on the sand grains, friction between the grains, and the "normal force" - the hard-core repulsion that prevents sand grains from passing through each other or through the broom handle.  Maybe we need to worry about the interactions between the sand grains and the air in the spaces between grains.  Still, all of this sounds like something that should have been solved by a French mathematician in the 18th or 19th century - one of those people with a special function or polynomial named after them.  And yet these problems are simultaneously extremely important for industrial purposes and very difficult.
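
For a sense of why "sounds simple" and "is simple" are different things, here is a hedged, dimensional-analysis-level sketch of that broom handle problem.  The scaling comes from just counting the available quantities; the prefactor and material numbers are assumptions of mine, not established results.

```python
def granular_drag_estimate(depth_m, diameter_m, packing_density_kg_m3=1600.0, eta=8.0):
    """Rough horizontal drag force (N) on a rod stuck vertically into dry sand.

    At slow speeds granular drag is roughly rate-independent, so the only
    available quantities are the weight density of the packed sand (rho * g),
    the rod diameter d, and the insertion depth z, which forces
    F ~ eta * rho * g * d * z**2.  Here eta is an assumed order-one-to-ten
    fudge factor; the scaling, not the number, is the point.
    """
    g = 9.81  # m/s^2
    return eta * packing_density_kg_m3 * g * diameter_m * depth_m ** 2

# Assumed example: a 2.5 cm diameter broom handle pushed 10 cm into dry sand.
print(f"~{granular_drag_estimate(0.10, 0.025):.0f} N of drag")  # a few tens of newtons
```

Everything hard about the problem - the history dependence, the grain-scale force chains, the jamming - is hiding inside that single fudge factor.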

A key issue is that many soft matter systems are hindered - the energy scales required to reshuffle their constituents (e.g., move grains of sand around and past each other) can be larger than what's available from thermal fluctuations.  So, configurations get locked in, kinetically hung up or stuck.  This means that the past history of the system can be very important, in the sense that the system can get funneled into some particular configuration and then be unable to escape, even if that configuration isn't something "nice", like one that globally minimizes the energy.
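
Here is a quick estimate (with assumed grain parameters) of just how lopsided that competition is for something like sand:

```python
import math

# Assumed grain parameters: 0.5 mm quartz sand at room temperature.
k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # K
g = 9.81             # m/s^2
d = 0.5e-3           # grain diameter, m
rho = 2650.0         # quartz density, kg/m^3

m = rho * (4.0 / 3.0) * math.pi * (d / 2.0) ** 3   # mass of one grain, ~2e-7 kg
barrier = m * g * d    # energy to lift a grain by roughly its own size past a neighbor
print(f"barrier / k_B T ~ {barrier / (k_B * T):.1e}")
# Of order 1e11: the Boltzmann factor exp(-barrier / k_B T) is effectively zero,
# so thermal fluctuations never rearrange the grains - the pile stays wherever
# its history left it.
```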

A message that I think is underappreciated:  Emergent dynamic properties, not obvious at all from the building blocks and their simple rules, can arise in such soft matter systems (e.g., oscillons and creepy non-Newtonian fluids), and are not just the province of exotic quantum materials.  Collective responses from many interacting degrees of freedom - this is what condensed matter physics is all about.

Sunday, November 08, 2020

Recently on the arxiv

A couple of papers caught my eye recently on the arxiv, when I wasn't preoccupied with the presidential election, the pandemic, or grant writing:

arxiv:2010.09986 - Zhao et al., Determination of the helical edge and bulk spin axis in quantum spin Hall insulator WTe2
Monolayer tungsten ditelluride is a quantum spin Hall insulator, meaning that the 2D "bulk" of a flake of the material is an insulator at low temperatures, while helical edge states are supposed to run around the perimeter of the flake.  Because of spin-momentum locking, the preferred spin orientation of the electrons in those edges should be fixed, but the spins don't have to point perpendicular to the plane of the flake.  In this work, highly detailed transport measurements experimentally determine the orientation of that preferred direction.

arxiv:2011.01335 - Hossain et al., Observation of Spontaneous Ferromagnetism in a Two-Dimensional Electron System
For many years, people have been discussing the ground state of a dilute 2D layer of electrons in the limit of low density and a very clean system.  This system is ideal for demonstrating one of the most unintuitive consequences of the Pauli principle:  as the electron density is lowered and the average spacing between electrons grows, electron-electron interactions actually become increasingly dominant, because the kinetic (Fermi) energy falls with density faster than the Coulomb energy does.  These investigators, working with electrons in an essentially 2D AlAs layer, show (through hysteresis in the electronic resistance as a function of applied magnetic field) the onset of ferromagnetism at sufficiently low electron densities.
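
To see why dilute means strongly interacting, here is a rough scaling sketch; the effective mass and dielectric constant are representative AlAs-like values I've assumed for illustration, not parameters taken from the paper.

```python
import math

hbar = 1.054571817e-34   # J s
m_e = 9.1093837015e-31   # kg
e = 1.602176634e-19      # C
eps0 = 8.8541878128e-12  # F/m

# Assumed, AlAs-like illustrative parameters (not taken from the paper):
m_star = 0.46 * m_e      # effective mass
eps_r = 10.0             # dielectric constant

def energy_scales(n):
    """Fermi and Coulomb energy scales (J) for a 2D electron density n (1/m^2)."""
    E_F = math.pi * hbar ** 2 * n / m_star    # kinetic scale, grows like n (spin-degenerate,
                                              # ignoring valley degeneracy)
    r = 1.0 / math.sqrt(math.pi * n)          # mean electron spacing, ~ 1/sqrt(n)
    E_C = e ** 2 / (4.0 * math.pi * eps0 * eps_r * r)   # Coulomb scale, grows like sqrt(n)
    return E_F, E_C

for n in (1e15, 1e14):   # lowering the density by a factor of 10
    E_F, E_C = energy_scales(n)
    print(f"n = {n:.0e} m^-2: E_Coulomb / E_Fermi ~ {E_C / E_F:.0f}")
# The ratio grows as the density drops: dilute electrons are *more* correlated.
```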

arxiv:2011.02500 - Rodan-Legrain et al., Highly Tunable Junctions and Nonlocal Josephson Effect in Magic Angle Graphene Tunneling Devices
Over the last couple of years, it's become clear that "magic angle" twisted bilayer graphene is pretty remarkable.  It's a superconductor.  It's an orbital magnet.  It's a correlated insulator.  It's a floor wax and a dessert topping.  Here, the authors demonstrate that it is possible to make devices with this material that are sufficiently free of disorder that they can be tuned into a wide variety of structures - Josephson junctions, single-electron transistors, etc.  Pretty remarkable.


Sunday, November 01, 2020

Science, policy-making, and the right thing to do

I know people don't read this blog for politics, but the past week has seen a couple of very unusual situations, and I think it's worth having a discussion of science, its role in policy-making, and the people who work on these issues at the highest levels.   (If you want, view this as writing-therapy for my general anxiety and move on.)

As a political reality, it's important to understand that science does not, itself, make policy.  Public policy is complicated and messy because it involves people, who as a rule are also complicated and messy. Deciding to set fuel standards for non-electric cars to 200 miles per gallon beginning next year and requiring that the fuel all be made from biological sources would be extremely bold, but it would also be completely unworkable and enormously disruptive.  That said, when policy must be made that has a science and technology aspect, it's critical that actual scientific and engineering knowledge be presented at the table.  If science isn't in the room where it happens, then we can make bad situations worse.  (It's been one of the great privileges of my career to have had the chance to meet and interact with some of the people who have worked on policy.  One of the most depressing aspects of the past four years has been the denigration of expertise, the suggestion that no one with detailed technical knowledge can be trusted because they're assumed to be on the make.)  The pandemic has shined a spotlight on this, as well as showing the (also complicated and messy) scientific process of figuring out how the disease works.

A million years ago at the beginning of this week, the White House Office of Science and Technology Policy put out a press release, linking to a detailed report (pdf), about their science and technology accomplishments over the last four years.  The top highlight in the press release was "Ending the pandemic".  That language doesn't appear anywhere in the actual report, but it sure shows up front and center in the press release.  After this was met with, shall we say, great skepticism (almost 100,000 new cases per day and about 1,000 deaths per day don't sound like an ending to this), the administration walked it back, saying the release was "poorly worded".  The question that comes to mind:  How can Kelvin Droegemeier, the presidential science advisor and head of OSTP, continue in that position?  There is essentially zero chance that he approved that press release language.  It must have been added after he and the OSTP staff produced and signed off on the report, and therefore it was either over his objections or without his knowledge.  Either way, under ordinary circumstances that would be the kind of situation that leads to an offer of resignation.

In a weird complement to this, yesterday evening Dr. Anthony Fauci gave an interview to the Washington Post, in which he stated a number of points with great frankness, including his opinion that the pandemic was in a very dangerous phase and that he disagreed in the strongest terms with Dr. Scott Atlas.  Dr. Atlas has seemingly become the chief advisor to the administration on the pandemic, despite holding views at odds with those of a large number of public health experts.  In the same article, the White House takes Dr. Fauci to task for airing his grievances publicly.  Again, the question comes to mind:  How can Dr. Fauci continue to serve on the coronavirus policy task force, when he clearly disagrees with how this is being handled?

As I alluded to back in late 2016, these situations remind me of this book, The Dilemmas of an Upright Man, about Max Planck and his difficult decision to remain in Germany and try to influence German science during WWII.  His rationale was that it was much better for German science if he stayed, where he thought he could at least be a bit of a moderating influence, than for him to be completely outside the system.

There are no easy answers here about the right course of action - to quit on principle when that might lead to more chaos, or to try to exert influence from within even in the face of clear evidence that such influence is minimal at best.  What I do know is that we face a complicated world filled with myriad challenges, and that science and engineering know-how is going to be needed in any credible effort to surmount those problems.  The cost of ignoring, or worse, actively attacking technical expertise is just too high.