Wednesday, December 30, 2020

End of the year, looking back and looking forward

 A few odds and ends at the close of 2020:

  • This was not a good year, for just about anyone.  Please, let's take better care of each other (e.g.) and ourselves!  
  • The decision to cancel the in-person 2020 APS March Meeting looks pretty darn smart in hindsight.
  • Please take a moment and consider how amazing it is that in less than a year, there are now multiple efficacious vaccines for SARS-CoV-2, using different strategies, when no one had ever produced a successful vaccine for any coronavirus in the past.  Logistical problems of distribution aside, this is a towering scientific achievement.  People who don't "believe" in vaccines, yet are willing to use (without thinking) all sorts of other scientific and engineering marvels, are amazing to me, and not in a good way.  For a compelling book about this kind of science, I again recommend this one, as I had done ten years ago.
  • I also recommend this book about the history of money.  Fascinating and extremely readable.  It's remarkable how we ended up where we are in terms of fiat currencies, and the fact that there are still fundamental disagreements about economics is both interesting and sobering.
  • As is my habit, I've been thinking again about the amazing yet almost completely unsung intellectual achievement that is condensed matter physics.  The history of this is filled with leaps that are incredible in hindsight - for example, the Pauli principle in 1925, the formulation of the Schroedinger equation in 1926, and Bloch's theorem for electrons in crystals in 1928 (!!).  I've also found that there is seemingly only one biography of Sommerfeld (just started it) and no book-length biography of Felix Bloch (though there are this and this).  
  • Four years ago I posted about some reasons for optimism at the end of 2016.  Globally speaking, these are still basically valid, even if it doesn't feel like it many days.  Progress is not inevitable, but there is reason for hope.
Thanks for reading, and good luck in the coming year.  

Saturday, December 19, 2020

The physics of beskar

 In keeping with my previous posts about favorite science fiction condensed matter systems and the properties of vibranium, I think we are overdue for an observational analysis of the physics of beskar.  Beskar is the material of choice of the Mandalorians in the Star Wars universe.  It is apparently an alloy (according to Wookieepedia), and it is most notable for being the only material that can resist direct attack by lightsaber, as well as deflecting blaster shots.   

Like many fictional materials, beskar has whatever properties are needed to drive the plot and look cool doing so, but it's still fun to think about what would have to be going on in the material for it to behave the way it appears on screen.  

In ingot form, beskar looks rather like Damascus steel (or perhaps Valyrian steel, though without the whole dragonfire aspect).  That's a bit surprising, since the texturing in damascene steel involves phase separation upon solidification from the melt, while the appearance of beskar is homogeneous when it's in the form of armor plating or a spear.  From the way people handle it, beskar seems to have a density similar to steel, though perhaps a bit lower.

Beskar's shiny appearance says that at least at optical frequencies the material is a metal, meaning it has highly mobile charge carriers.  Certainly everyone calls it a metal.  That is interesting in light of two of its other obvious properties:  an extremely high melting point (we know that lightsabers can melt through extremely tough metal plating, as in blast doors); and extremely poor thermal conductivity.  (Possible spoilers for The Mandalorian S2E8 - it is possible to hold a beskar spear with gloved hands mere inches from where the spear is visibly glowing orange.)  Because mobile charge carriers tend to conduct heat very well (see the Wiedemann-Franz relation), it's tricky to have metals that are really bad thermal conductors.  This is actually a point consistent with beskar being an alloy, though.  Alloys tend to have higher electrical resistivity and poorer thermal conduction than pure substances.  
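(For the curious, here's a minimal numerical sketch of that Wiedemann-Franz bookkeeping; the conductivities for copper and stainless steel are rough handbook values, and the point is just how far a fictional material would have to deviate.)

```python
# Minimal sketch of the Wiedemann-Franz relation: the electronic thermal
# conductivity of a metal follows from its electrical conductivity via
# kappa / sigma = L0 * T, with L0 the Sommerfeld value of the Lorenz number.

k_B = 1.380649e-23   # J/K
e   = 1.602177e-19   # C
L0  = (3.14159265**2 / 3) * (k_B / e)**2   # ~2.44e-8 W Ohm / K^2

def wf_thermal_conductivity(sigma, T):
    """Electronic thermal conductivity (W/m/K) from sigma (S/m) at temperature T (K)."""
    return L0 * sigma * T

# Rough handbook conductivities: pure copper vs. a disordered alloy (stainless steel).
for name, sigma in [("Cu", 5.96e7), ("stainless steel", 1.45e6)]:
    print(f"{name}: kappa_el ~ {wf_thermal_conductivity(sigma, 300.0):.0f} W/m/K")
# Cu: ~440 W/m/K; stainless: ~11 W/m/K.  Alloying really does wreck thermal
# conduction, but a shiny metal you can grip inches from red heat would need
# to beat this bound by orders of magnitude.
```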

The high melting temperature is consistent with the nice acoustic properties of beskar (as seen here, in S2E7), and its extreme mechanical toughness.  The high melting temperature is tricky, though, because there is on-screen evidence that beskar may be melted (for forging into armor) without being heated to glowing.  Indeed, at about 1:02 in this video, the Armorer is able to melt a beskar ingot at the touch of a button on a console.  This raises a very interesting possibility, that beskar is close to a solid-liquid phase transition that may be tuned to room temperature via a simple external parameter (some externally applied field?).  This must be something subtle, because otherwise you could imagine anti-beskar weapons that would turn Mandalorian armor into a puddle on the floor.  

Regardless of the inconsistencies in its on-screen portrayal (which are all minor compared to the way dilithium has been shown), beskar is surely a worthy addition to fictional materials science.  This is The Way.

 

Thursday, December 17, 2020

Brief items

Here are a few interesting links as we look toward the end of a long year:

  • Brian Skinner of Gravity and Levity has a long and excellent thread on twitter about cool materials.
  • Subir Sachdev at Harvard has put his entire semester's worth of lectures on youtube for his course on Quantum Phases of Matter
  • New data on stellar distances makes the Hubble constant problem even worse, as explained in this nice article by the reliably excellent Natalie Wolchover.
  • In case you were wondering, we are nowhere near done with magnetic tape as a storage medium, especially since it's comparatively cheap and can now hold 317 Gb/in\(^{2}\).
  • If you aren't impressed by SpaceX's initial flight test of their latest rocket, I don't know what to say to you.  They were trying several brand new things at the same time, and almost got it all to work on the first try.  FYI, the green exhaust at the end is from the engine running hot and fuel-deprived, so the oxygen is burning the copper alloy engine lining.
  • This paper uses nanomechanical resonators immersed in fluid to study Brownian motion.  The resonator is getting kicked randomly by collisions with the fluid molecules, and looking at the noise in the displacement is a neat probe of the fluid's dynamics.  
  • In this paper, the authors are able to resolve inelastic electron tunneling spectra even above room temperature.  That's actually very surprising!
  • Here is a perspective article about plasmonic catalysis, trying to drive chemical reactions by optical excitation of collective electronic modes in conductive nanostructures.  

Thursday, December 10, 2020

Photonic quantum supremacy, protein folding, and "computing"

In the last week or two, there have been a couple of big science stories that I think raise some interesting issues about what we consider to be scientific computing.

In one example, Alphafold, a machine learning/AI approach to predicting protein structure, has demonstrated that it is really good at predicting protein structure.  Proteins are polymers made up of sequences of many amino acids, and in biological environments they fold up into complex shapes (with structural motifs like alpha helices and beta sheets) held together by hydrogen bonds. Proteins do an amazing number of critical things in organisms (like act as enzymes to promote highly specific chemical reactions, or as motor units to move things around, or to pump specific ions and molecules in and out of cells and organelles).  Their ability to function in the wet, complex, constantly fluctuating biological environment is often dependent on minute details in their folded structure.  We know the structures of only some proteins, and generally only as snapshots, because actually getting a structure requires crystallizing the protein molecules and performing high-precision x-ray diffraction measurements on those crystals.  The challenge of understanding how proteins end up in particular functional structures based on their amino acid sequence is called the protein folding problem.  The statistical physics of folding is complex but usefully considered in terms of free energy landscapes.  It is possible to study large numbers of known protein structures and look for covariances (see here), correlations in sequences that show up commonly across many organisms.  Alphafold was trained on something like 100,000 structures and associated data, and is now good enough at predicting structures that it can actually allow people to solve complex x-ray diffraction data that was previously not analyzable, leading to new solved structures.  

This is very nice and will be a powerful tool, though like all such news stories one should be wary of the hype.  It does raise questions, and I would like to hear from experts:  Do we actually have greater foundational understanding of protein structure now?  Or have we created an extraordinarily effective interpolative look-up table?  It's useful either way, but the former might have more of an impact on our ability to understand the dynamics of proteins.  

That's a lot of optical components!
The second big story of the week is the photonic quantum supremacy achievement by a large group from USTC in China.  Through a very complex arrangement of optical components (see image), they report having used boson sampling to determine statistical information about the properties of matrices at a level that would take an insanely long time with a classical computer.  Here, as with Google's quantum supremacy claim (mentioned here), I again have to ask:  This is an amazing technical achievement, but is it really a computation, as opposed to an analog emulation or simulation?  If I filmed cream being stirred into coffee, and I analyzed the images to infer the flow of energy down to smaller and smaller length scales, I would describe that as an experiment, not as me doing a computation to solve the Navier-Stokes equations (which would also be very challenging to do with high precision on a classical computer).  Perhaps it's splitting hairs, and quantum simulation is very interesting, but it does seem distinct to me from what most people would call computing.
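For a rough sense of where the classical hardness comes from: in the original Aaronson-Arkhipov formulation of boson sampling, output probabilities are given by permanents of submatrices of the interferometer's unitary, and the best known exact classical algorithms for the permanent scale exponentially.  Here's a minimal illustrative sketch of Ryser's formula (the example matrix is made up, and the experiment itself obviously doesn't run anything like this):

```python
# Minimal sketch of Ryser's formula for the matrix permanent, the quantity at
# the heart of (Aaronson-Arkhipov) boson sampling's classical hardness.  Cost
# is O(2^n * n^2), so it blows up exponentially with matrix size.
from itertools import combinations

def permanent(A):
    """Permanent of a square matrix (list of lists) via Ryser's formula."""
    n = len(A)
    total = 0.0
    for r in range(1, n + 1):
        for cols in combinations(range(n), r):
            prod = 1.0
            for row in A:
                prod *= sum(row[c] for c in cols)
            total += (-1) ** r * prod
    return (-1) ** n * total

print(permanent([[1, 2], [3, 4]]))  # 1*4 + 2*3 = 10
# At n = 50 there are ~10^15 subsets to sum - already a supercomputer-scale
# job, which is why sampling many photons classically is so daunting.
```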

Anyway, between AI/ML and quantum information sciences, it is surely an exciting time in the world of computing, broadly construed. 

(Sorry for the slow posting - end of semester grading + proposal writing have taken a lot of time.)

Saturday, November 28, 2020

Brief items

 Several items of note:

  • Quanta Magazine remains a generally outstanding source of science articles and opinion pieces.  In this opinion column,  high energy theorist Robert Dijkgraaf gives his views on whether we are reaching "the end of physics".  Spoilers:  he thinks not, and condensed matter physics, with its emergence of remarkable phenomena from underlying building blocks, is one reason.
  • Similarly, I should have pointed earlier to this interesting article by Natalie Wolchover, who asked a number of physicists to define what they mean by "a particle".  I understand the mathematical answer ("a particle is an irreducible representation of the Poincaré group", meaning that it's an object defined by having particular numbers describing how it changes or doesn't under translations in time, space, and rotation).  That also lends itself to a nice definition of a quasiparticle (such an object, but one that results from the collective action of underlying degrees of freedom, rather than existing in the universe's vacuum).  As an experimentalist, though, I confess a fondness for other perspectives.
  • Springer Nature has released its approach for handling open access publication.  I don't think I'm alone in thinking that its fee structure is somewhere between absurd and obscene.  It's simply unrealistic to think that the funding agencies (at least in the US) are going to allow people to budget €9,500 for a publication charge.  That's equivalent to four months of support for a graduate student in my discipline.  Moreover, the publisher is proposing to charge a non-refundable €2,190 fee just to have a manuscript evaluated for "guided open access" at Nature Physics.  Not that I lack confidence in the quality of my group's work, but how could I possibly justify spending that much for a 75% probability of a rejection letter?  Given that they do not pay referees, am I really supposed to believe that finding referees, soliciting reviews, and tracking manuscript progress costs the publisher €2,190 per submission? 
  • It's older news, but this approach to computation is an interesting one.  Cerebras is implementing neural networks in hardware, and they are doing this through wafer-scale processors (!) with trillions (!) of transistors and hundreds of thousands of cores.  There must be some impressive fault tolerance built into their network training approach, because otherwise I'd be surprised if even the amazing manufacturing reliability of the semiconductor industry would produce a decent yield of these processors.
  • Older still, one of my colleagues brought this article to my attention, about someone trying to come up with a way to play grandmaster-level chess in a short period of time.  I don't buy into the hype, but it was an interesting look at how easy it seems to be now to pick up machine learning coding skills.  (Instead of deeply studying chess, the idea was to find a compact, memorizable/mentally evaluatable decision algorithm for chess based on training a machine learning system against a dedicated chess engine.) 

Wednesday, November 25, 2020

Statistical mechanics and Thanksgiving

Many books and articles have been written about the science of cooking, and why different cooking methods work the way that they do.  (An absolute favorite: J. Kenji López-Alt's work.  Make sure to watch his youtube videos.)  Often the answers involve chemistry, as many reactions take place during cooking, including the Maillard reaction (browning reactions between sugars and amino acids that give enormous flavor) and denaturing of proteins (the reason that eggs hard-boil and firm up when scrambled over heat).  Sometimes the answers involve biology, as in fermentation.  

Occasionally, though, the real hero of the story is physics, in particular statistical mechanics.  Tomorrow is the Thanksgiving holiday in the US, and this traditionally involves cooking a turkey.  A technique gaining popularity is dry brining.  This oxymoronic name really means applying salt (often mixed with sugar, pepper, or other spices) to the surfaces of a piece of meat (say a turkey) and letting the salted meat sit in a refrigerated environment for a day or two prior to cooking.  What does this do?  

In statistical mechanics, we learn (roughly speaking) that systems approach equilibrium macroscopic states that correspond to the largest number of microscopic arrangements of the constituents.  Water is able to diffuse in and out of cells at some rate, as are solvated ions like Na+ and Cl-.  Once salt is on the turkey's surface, we have a non-equilibrium situation (well, at least a more severe one than before):  there are many more (by many orders of magnitude) ways to arrange the water molecules and ions now, such that some of the ions are inside the cells, and some of the water is outside, solvating the salt.  The result is osmosis, and over the timescale of the dry brining, the moisture and salt ions redistribute themselves.  (The salt also triggers reactions in the cells to break down some proteins, but that's chemistry not physics.)  After cooking, the result is supposed to be a more flavorful, tender meal.
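If you want a back-of-the-envelope check that a day or two is the right timescale, here's a sketch assuming a small-ion diffusivity of about 1e-9 m\(^2\)/s (the free-water value; the effective value in muscle tissue is several times smaller):

```python
# Order-of-magnitude sketch: is a day or two the right timescale for salt to
# diffuse centimeters into meat?  Assume D ~ 1e-9 m^2/s for small ions in
# water (in tissue the effective D is several times smaller), and use the
# 1D diffusion estimate t ~ L^2 / (2 D).

D = 1e-9                                   # m^2/s, assumed ion diffusivity
for L_cm in (0.5, 1.0, 2.0):
    L = L_cm * 1e-2                        # penetration depth, m
    t_hours = L**2 / (2 * D) / 3600
    print(f"L = {L_cm} cm  ->  t ~ {t_hours:.0f} hours")
# 1 cm -> ~14 hours; 2 cm -> ~56 hours.  A day or two in the refrigerator is
# just what simple diffusion demands.
```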

So among the things for which to be thankful, consider the unlikely case of statistical mechanics.

(For a fun look at osmosis (!), try this short story if you can find it.)

Wednesday, November 18, 2020

Hard condensed matter can be soft, too.

In the previous post, I mentioned that one categorization of "soft" condensed matter is for systems where quantum mechanics is (beyond holding atoms together, etc.) unimportant.  In that framing, "hard" condensed matter looks at systems where \(\hbar\) figures prominently, in the form of quantum many-body physics.  By that labeling, strongly interacting quantum materials are the "hardest" systems out there, with entanglement, tunneling, and quantum fluctuations leading to rich phenomena. 

Orientation textures in a liquid crystal, from wikipedia
Interestingly, in recent years it has become clear that these hard CM systems can end up having properties that are associated with some soft condensed matter systems.  For instance, liquid crystals are canonical soft matter systems.  As I'd explained long ago here, liquid crystals are fluids made up of objects with some internal directionality (e.g., a collection of rod-like molecules, where one can worry about how the rods are oriented in addition to their positions).  Liquid crystals can have a variety of phases, including ones where the system spontaneously picks out a direction and becomes anisotropic.  It turns out that sometimes the electronic fluid in certain conductors can spontaneously do this as well, acting in some ways like a nematic liquid crystal.  A big review of this is here.  One example of this occurs in 2D electronic systems in high magnetic fields in the quantum Hall regime; see here for theory and here for a representative experiment.  Alternately, see here for an example in a correlated oxide at the cusp of a quantum phase transition.

Another example:  hydrodynamics is definitely part of the conventional purview of soft condensed matter.   In recent years, however, it has become clear that there are times when the electronic fluid can also be very well-described by math that is historically the realm of classical fluids.   This can happen in graphene, or in more exotic Weyl semimetals, or perhaps in the exotic "strange metal" phase.  In the last of those, this is supposed to happen when the electrons are in such a giant, entangled, many-body situation that the quasiparticle picture doesn't work anymore.  

Interesting that the hardest of hard condensed matter systems can end up having emergent properties that look like those of soft matter.

Saturday, November 14, 2020

Soft matter is hard!

This great article by Randall Munroe from the NY Times this week brings up, in its first illustration (reproduced here), a fact that surprises me on some level every time I really stop to think about it:  The physics of "soft matter", in this case the static and dynamic properties of sand, is actually very difficult, and much remains poorly understood.  


"Soft" condensed matter typically refers to problems involving solid, liquids, or mixed phases in which quantum mechanics is comparatively unimportant - if you were to try to write down equations modeling these systems, those equations would basically be some flavor of classical mechanics ("h-bar = 0", as some would say).  (If you want to see a couple of nice talks about this field, check out this series and this KITP talk.)  This encompasses the physics of classical fluids, polymers, and mixed-phase systems like ensembles of hard particles plus gas (sand!), suspensions of colloidal particles (milk, cornstarch in water), other emergent situations like the mechanical properties of crumping paper.  (Soft matter also is sometimes said to encompass "active matter", as in living systems, but it's difficult even without that category.)

Often, soft matter problems sound simple.  Take a broom handle, stick it a few inches into dry sand, and try to drag the handle sideways.  How much force does it take to move the handle at a certain speed?  This problem only involves classical mechanics.  Clearly the dominant forces that are relevant are gravity acting on the sand grains, friction between the grains, and the "normal force" that is the hard-core repulsion preventing sand grains from passing through each other or through the broom handle.  Maybe we need to worry about the interactions between the sand grains and the air in the spaces between grains.  Still, all of this sounds like something that should have been solved by a French mathematician in the 18th or 19th century - one of those people with a special function or polynomial named after them.  And yet, these problems are simultaneously extremely important for industrial purposes, and very difficult.

A key issue is that many soft matter systems are hindered - the energy scales required to reshuffle their constituents (e.g., move grains of sand around and past each other) can be larger than what's available from thermal fluctuations.  So, configurations get locked in, kinetically hung up or stuck.  This can mean that the past history of the system can be very important, in the sense that the system can get funneled into some particular configuration and then be unable to escape, even if that configuration isn't something "nice", like one that globally minimizes energy.  
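To make the "larger than thermal fluctuations" point quantitative, here's a crude sketch for sand, with an assumed 0.5 mm quartz grain and the (purely illustrative) choice of lifting a grain by its own diameter as the minimal rearrangement move:

```python
# Crude sketch of why sand is "athermal": compare k_B*T to the gravitational
# energy needed to lift one grain by its own diameter (an illustrative choice
# for the minimal rearrangement move).  Assumed: 0.5 mm quartz grain.

import math

k_B, T, g = 1.380649e-23, 300.0, 9.81      # SI units
d   = 0.5e-3                               # grain diameter, m
rho = 2650.0                               # quartz density, kg/m^3
m   = rho * (math.pi / 6) * d**3           # grain mass, ~2e-7 kg
E_move = m * g * d                         # lift grain by one diameter, J

print(f"E_move / k_B T ~ {E_move / (k_B * T):.1e}")   # ~2e11
# Thermal kicks fall short of the rearrangement scale by eleven orders of
# magnitude, so configurations stay frozen unless shaking or shear intervenes.
```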

A message that I think is underappreciated:  Emergent dynamic properties, not obvious at all from the building blocks and their simple rules, can happen in such soft matter systems (e.g., oscillons and creepy non-Newtonian fluids), and are not just the province of exotic quantum materials.  Collective responses from many interacting degrees of freedom - this is what condensed matter physics is all about.

Sunday, November 08, 2020

Recently on the arxiv

A couple of papers caught my eye recently on the arxiv, when I wasn't preoccupied with the presidential election, the pandemic, or grant writing:

arxiv:2010.09986 - Zhao et al., Determination of the helical edge and bulk spin axis in quantum spin Hall insulator WTe2
Monolayer tungsten ditelluride is a quantum spin Hall insulator, meaning that the 2D "bulk" of a flake of the material is an insulator at low temperatures, while there are supposed to be helical edge states that run around the perimeter of the flake.  Because of spin-momentum locking, the preferred spin orientation of the electrons in those edges should be fixed, but the spin doesn't have to point perpendicular to the plane of the flake.  In this work, highly detailed transport measurements determine experimentally the orientation of that preferred direction.

arxiv:2011.01335 - Hossain et al., Observation of Spontaneous Ferromagnetism in a Two-Dimensional Electron System
For many years, people have been discussing the ground state of a dilute 2D layer of electrons in the limit of low density and a very clean system.  This system is ideal for demonstrating one of the most unintuitive consequences of the Pauli Principle:  As the electron density is lowered, and thus the average spacing between electrons increases, electron-electron interactions actually become increasingly dominant.  These investigators, working with electrons in an essentially 2D AlAs layer, show (through hysteresis in the electronic resistance as a function of applied magnetic field) the onset of ferromagnetism at sufficiently low electron densities.  
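A minimal sketch of the underlying scaling argument, with assumed AlAs-like illustrative parameters rather than the paper's actual numbers:

```python
# Sketch of the scaling argument: in 2D the Coulomb energy per electron goes
# as n^(1/2) while the Fermi (kinetic) energy goes as n, so dilution makes
# interactions relatively stronger.  Parameters are assumed AlAs-like values
# (m* ~ 0.46 m_e, epsilon_r ~ 10); valley degeneracy and the paper's actual
# numbers are ignored in this cartoon.

import math

hbar, m_e, e, eps0 = 1.0546e-34, 9.109e-31, 1.602e-19, 8.854e-12
m_star, eps_r = 0.46 * m_e, 10.0

def coulomb_over_fermi(n):                 # n = electrons per m^2
    r = 1.0 / math.sqrt(math.pi * n)       # typical inter-electron spacing
    E_C = e**2 / (4 * math.pi * eps0 * eps_r * r)
    E_F = math.pi * hbar**2 * n / m_star   # 2D Fermi energy, spin-degenerate
    return E_C / E_F

for n in (1e15, 1e14, 1e13):               # decreasing density
    print(f"n = {n:.0e} /m^2 : E_C/E_F ~ {coulomb_over_fermi(n):.0f}")
# The ratio climbs (~16, ~49, ~155) as the density drops - the Pauli-principle
# surprise described above.
```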

arxiv:2011.02500 - Rodan-Legrain et al., Highly Tunable Junctions and Nonlocal Josephson Effect in Magic Angle Graphene Tunneling Devices
Over the last couple of years, it's become clear that "magic angle" twisted bilayer graphene is pretty remarkable.  It's a superconductor.  It's an orbital magnet.  It's a correlated insulator.  It's a floor wax and a dessert topping.  Here, the authors demonstrate that it is possible to make devices with this material that are sufficiently free of disorder that they can be tuned into a wide variety of structures - Josephson junctions, single-electron transistors, etc.  Pretty remarkable.


Sunday, November 01, 2020

Science, policy-making, and the right thing to do

I know people don't read this blog for politics, but the past week has seen a couple of very unusual situations, and I think it's worth having a discussion of science, its role in policy-making, and the people who work on these issues at the highest levels.   (If you want, view this as writing-therapy for my general anxiety and move on.)

As a political reality, it's important to understand that science does not, itself, make policy.  Public policy is complicated and messy because it involves people, who as a rule are also complicated and messy. Deciding to set fuel standards for non-electric cars to 200 miles per gallon beginning next year and requiring that the fuel all be made from biological sources would be extremely bold, but it would also be completely unworkable and enormously disruptive.  That said, when policy must be made that has a science and technology aspect, it's critical that actual scientific and engineering knowledge be presented at the table.  If science isn't in the room where it happens, then we can make bad situations worse.  (It's been one of the great privileges of my career to have had the chance to meet and interact with some of the people who have worked on policy.  One of the most depressing aspects of the past four years has been the denigration of expertise, the suggestion that no one with detailed technical knowledge can be trusted because they're assumed to be on the make.)  The pandemic has shined a spotlight on this, as well as showing the (also complicated and messy) scientific process of figuring out how the disease works.

A million years ago at the beginning of this week, the White House Office of Science and Technology Policy put out a press release, linking to a detailed report (pdf), about their science and technology accomplishments over the last four years.  The top highlight in the press release was "Ending the pandemic".  That language doesn't appear anywhere in the actual report, but it sure shows up front and center in the press release.  After this was met with, shall we say, great skepticism (almost 100,000 cases per day, about 1000 deaths per day doesn't sound like an ending to this), the administration walked it back, saying the release was "poorly worded".  The question that comes to mind:  How can Kelvin Droegemeier, the presidential science advisor and head of OSTP, continue in that position?  There is essentially zero chance that he approved that press release language.  It must have been added after he and OSTP staff produced and signed off on the report, and therefore it was either over his objections or without his knowledge.  Either way, under ordinary circumstances that would be the kind of situation that leads to an offer of resignation.  

In a weird complement to this, yesterday evening, Dr. Anthony Fauci gave an interview to the Washington Post, in which he stated a number of points with great frankness, including his opinion that the pandemic was in a very dangerous phase and that he disagreed in the strongest terms with Dr. Scott Atlas.  Dr. Atlas has seemingly become the chief advisor to the administration on the pandemic, despite holding views at odds with those of a large number of public health experts.  The White House in the same Post article takes Dr. Fauci to task for airing his grievances publicly.  Again, the question comes to mind:  How can Dr. Fauci continue to serve on the coronavirus policy task force, when he clearly disagrees with how this is being handled?

As I alluded to back in late 2016, these situations remind me of this book, The Dilemmas of an Upright Man, about Max Planck and his difficult decision to remain in Germany and try to influence German science during WWII.  His rationale was that it was much better for German science if he stayed there, where he thought he could at least be a bit of a moderating influence, than for him to be completely outside the system.  

There are no easy answers here about the right course of action - to quit on principle when that might lead to more chaos, or to try to exert influence from within even in the face of clear evidence that such influence is minimal at best.  What I do know is that we face a complicated world filled with myriad challenges, and that science and engineering know-how is going to be needed in any credible effort to surmount those problems.  The cost of ignoring, or worse, actively attacking technical expertise is just too high.

Saturday, October 24, 2020

Silicon nanoelectronics is a truly extraordinary achievement.

Arguably the greatest technical and manufacturing achievement in all of history is around us all the time, supporting directly or indirectly a huge fraction of modern life, and the overwhelming majority of people don't give it a second's thought.  

I'm talking about silicon nanoelectronics (since about 2003, "microelectronics" is no longer an accurate description).  As I was updating notes for a class I'm teaching, the numbers really hit me.  A high end microprocessor these days (say the AMD "Epyc" Rome) contains 40 billion transistors in a chip about 3 cm on a side.  These essentially all work properly, for many years at a time.  (Chips rarely die - power supplies and bad electrolytic capacitors are much more common causes of failure of motherboards.)  No other manufacturing process of components for any product comes close to the throughput and reliability of transistors.  

The transistors on those chips are the culmination of many person-years of research.  They're FinFETs, made using what is labeled the 7 nm process.  Remember, transistors are switches, with the current flow between the source and drain electrodes passing through a channel the conductance of which is modulated by the voltage applied to a gate electrode.  The active channel length of those transistors, the distance between the source and drain, is around 16 nm, or about 50 atoms (!).  The positioning accuracy required for the lithography steps (when ultraviolet light and photochemistry are used to pattern the features) is down to about 3 nm.  These distances are controlled accurately across a single-crystal piece of silicon the size of a dinner plate.  That silicon is pure at the level of one atom out of every 10 trillion (!!).  
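As a quick back-of-the-envelope check of those scales (the only ingredient not quoted above is the silicon lattice constant):

```python
# Back-of-the-envelope check on the scales quoted above.  The one input not in
# the text is the silicon lattice constant, 0.543 nm (atomic planes every
# ~0.27 nm along a cube edge).

a_Si    = 0.543e-9                  # m
channel = 16e-9                     # m, quoted active channel length
print(f"atomic planes across the channel ~ {channel / (a_Si / 2):.0f}")
# ~59, i.e., of order 50 atoms, as quoted.

transistors, chip_side = 40e9, 0.03        # Epyc-class chip, ~3 cm on a side
per_mm2 = transistors / (chip_side * 1e3) ** 2
print(f"transistors per mm^2 ~ {per_mm2:.1e}")
# ~4e7 per square millimeter - and essentially every one of them works.
```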

This is not an accident.  It's not good fortune.  Science (figuring out the rules of the universe) and engineering (applying those rules to accomplish a task or address a need) have given us this (see here and here).  It's the result of an incredible combination of hard-earned scientific understanding, materials and chemistry acumen, engineering optimization, and the boot-strapping nature of modern technology (that is, we can do this kind of manufacturing because we have advanced computational tools for design, control, and analysis, and we have those tools because of our ability to do this kind of manufacturing.)   

This technology would look like literal magic to someone from any other era of history - that's something worth appreciating.

Thursday, October 15, 2020

Emergent monopoles

One of the truly remarkable things about condensed matter physics is the idea that, from a large number of interacting particles that obey comparatively simple rules, there can emerge new objects  (in the sense of having well-defined sets of parameters like mass, charge, etc.) with properties that are not at all obviously related to those of the original constituents.   (One analogy I like:  Think about fans in a sports stadium doing The Wave.  The propagating wave only exists because of the cooperative response of thousands of people, and its spatial extent and propagation speed are not obviously related to the size of individual fans.)

A fairly spectacular example of this occurs in materials called spin ices, insulating materials that have unusual magnetic properties. A prime example is Dy2Ti2O7.  The figure here shows a little snippet of the structure.  The dysprosium atoms (which end up having angular momentum \(J = 15/2\), very large as compared to a spin-1/2 electron) sit at the corners of corner-sharing tetrahedra.  It's a bit hard to visualize, but the centers of those tetrahedra form the same spatial pattern as the locations of carbon atoms in a diamond crystal.  Anyway, because of some rather deep physics ("crystal field effects"), the magnetic moments of each Dy are biased to point either radially inward toward or radially outward from the center of the tetrahedron.  Moreover, because of interactions between the magnetic moments, it is energetically favorable for each tetrahedron to have two moments (shown as little arrows) pointing inward and two pointing outward.  This is the origin of the "ice" part of the name, since this two-in/two-out rule is the same thing seen in ordinary water ice, where each oxygen atom is coordinated by four hydrogen atoms, two strongly (closer, covalently bound) and two more weakly (farther away, hydrogen bonding).  The spin ice ordering in this material really kicks in at low temperatures, below 1 K.  
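The two-in/two-out bookkeeping is simple enough to verify directly, and it's also the input to Pauling's famous estimate of the residual entropy of ice; here's a minimal sketch:

```python
# The two-in/two-out counting, checked directly: enumerate the 2^4 = 16 Ising
# configurations of one tetrahedron and count those obeying the ice rule.
from itertools import product
import math

configs = list(product((+1, -1), repeat=4))     # +1 = moment in, -1 = out
ice_obeying = [c for c in configs if sum(c) == 0]
print(len(ice_obeying), "of", len(configs))     # 6 of 16

# Pauling's classic residual-entropy estimate follows from that 6/16:
# S/(N k_B) = ln 2 + (1/2) ln(6/16) = (1/2) ln(3/2) ~ 0.20 per spin.
print("S per spin / k_B ~", 0.5 * math.log(1.5))
```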

So, what happens at rather warmer temperatures, say between 2 K and 15 K?  The lowest energy excitations of this system act like magnetic monopoles (!).  Now, except for the fact that electrical charge is quantized, there is no direct evidence for magnetic monopoles (isolated north and south poles that would interact with a Coulomb-like force law) in free space.  In spin ice, though, you can create an effective monopole/antimonopole pair by flipping some moments so that one tetrahedron is 3-out/1-in, and another is 1-out/3-in, as shown at right.  You can "connect" the monopole to the antimonopole by following a line of directed magnetic moments - this is a topological constraint, in the sense that you can see how having multiple m/anti-m pairs could interfere with each other.  This connection is the analog of a Dirac string (where you can think of the m/anti-m pair as opposite ends of an infinitesimally skinny solenoid).  

This is all fun to talk about, but is there really evidence for these emergent monopoles?  Yes.  A nice very recent review of the subject is here.  There are a variety of experiments (starting with magnetization and neutron scattering and ending up with more sophisticated measurements like THz optical properties and magnetic flux noise experiments looking at m/anti-m generation and recombination) that show evidence for monopoles and their interactions.  (full disclosure:  I have some thoughts on fun experiments to do in these and related systems.)  It's also possible to make two-dimensional arrays of nanoscale ferromagnets that can mimic these kinds of properties, so-called artificial spin ice.  This kind of emergence, when you can end up with excitations that act like exotic, interacting, topologically constrained (quasi)particles that seemingly don't exist elsewhere, is something that gets missed if one takes a reductionist view of physics.

Wednesday, October 14, 2020

Room temperature superconductivity!

As many readers know, the quest for a practical room temperature superconductor has been going on basically ever since Kamerlingh Onnes discovered superconductivity over 100 years ago.  If one could have superconductivity with high critical currents and high critical fields in a material that could readily be made into wires, for example, it would be absolutely transformative to the world.  (Just one example:  we lose 5-10% of generated electricity just in transmission lines due to resistive heating.)  

One exotic possibility suggested over 50 years ago by Neil Ashcroft (of textbook fame in addition to his scientific prestige) was that highly compressed metallic hydrogen could be a room temperature superconductor.  The basic ingredients for traditional superconductivity would be a high electronic density of states, light atoms (and hence a high sound speed for phonon-based pairing), and a strong electron-phonon coupling.  

In recent years, there have been striking advances in hydrogen-rich compounds with steadily increasing superconducting transition temperatures, including H2S (here and here) and LaH10 (here and here), all requiring very high (200+ GPa) pressures obtained in diamond anvil cells.  In those cool gadgets, tiny sample volumes are squeezed between the facets of cut gemstone-quality diamonds, and there is a great art in making electronic, optical, and magnetic measurements of samples under extreme pressures. 

Today, a new milestone has been reached and published.  Using these tools, the investigators (largely at Rochester) put some carbon, sulphur, and hydrogen containing compounds in the cell, zapped them with a laser to do some in situ chemistry, and measured superconductivity with a transition temperature up to 287.7 K (!) at a pressure of 267 GPa (!!).  The evidence for superconductivity is both a resistive transition to (as near as can be seen) zero resistance, and an onset of diamagnetism (as seen through ac susceptibility).  

This is exciting, and a milestone, though of course there are many questions:  What is the actual chemical compound at work here?  How does superconductivity work - is it conventional or more exotic? Is there any pathway to keeping these properties without enormous externally applied pressure?  At the very least, this shows experimentally what people have been saying for a long time, that there is no reason in principle why there couldn't be room temperature (or above) superconductivity.



Saturday, October 10, 2020

How fast can sound be in a solid or liquid?

There is a new paper here that argues through dimensional analysis for an upper limit to the speed of sound in solids and liquids (when the atoms bump up against each other).  The authors derive that the maximum speed of sound is, to within numerical factors of order 1, given by \(v_{\mathrm{max}}/c = \alpha \sqrt{m_{e}/(2m_{p})} \), where \(\alpha\) is the fine structure constant, and \(m_{e}\) and \(m_{p}\) are the masses of the electron and proton, respectively.  Numerically, that ends up being about 36 km/s.  

It's a neat argument, and I agree with the final result, but I actually think there's a more nuanced way to think about this than the approach of the authors.  Sound speed can be derived from some assumptions about continuum elasticity, and is given by \(v_{s} = \sqrt{K/\rho}\), where \(K\) is the bulk modulus and \(\rho\) is the mass density.  The bulk modulus measures how much pressure it takes to produce a given fractional change in volume, \(K = -V(\partial P/\partial V)\).  So, a squishy soft substance has a low bulk modulus, because when the pressure goes up, its volume goes down comparatively a lot.
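Before going further, it's worth checking the numbers; here's a quick sketch of the bound alongside \(v_{s} = \sqrt{K/\rho}\) for ordinary solids, using rough handbook values:

```python
# Numerical check of the proposed bound, plus v_s = sqrt(K/rho) for a couple
# of ordinary solids (rough handbook bulk moduli and densities).
import math

alpha, m_e, m_p, c = 7.2974e-3, 9.109e-31, 1.6726e-27, 2.9979e8
v_max = alpha * math.sqrt(m_e / (2 * m_p)) * c
print(f"v_max ~ {v_max / 1e3:.0f} km/s")                 # ~36 km/s

for name, K, rho in [("steel", 160e9, 7850.0), ("diamond", 443e9, 3515.0)]:
    print(f"{name}: v_s ~ {math.sqrt(K / rho) / 1e3:.1f} km/s")
# Steel ~4.5 km/s and even diamond ~11 km/s sit comfortably below the bound.
```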

The authors make the statement "It has been ascertained that elastic constants are governed by the density of electromagnetic energy in condensed matter phases."  This is true, but for the bulk modulus I would argue that this is true indirectly, as a consequence of the Pauli principle.  I wrote about something similar previously, explaining why you can't push solids through each other even though the atoms are mostly empty space.  If you try to stuff two atoms into the volume of one atom, it's not the Coulomb repulsion of the electrons that directly stops this from happening.  Rather, the Pauli principle says that cramming those additional electrons into that tiny volume would require the electrons to occupy higher atomic energy levels.  The typical scale of those atomic energy levels is something like a Rydberg, so that establishes one cost of trying to compress solids or liquids; that Rydberg scale of energy is how the authors get to the fine structure constant and the masses of the electron and proton in their result.  

I would go further and say that this is really the ultimate limiting factor on sound speed in dense material.  Yes, interatomic chemical bonds are important - as I'd written, they establish why solids deform instead of actually merging when squeezed.  It's energetically cheaper to break or rearrange chemical bonds (on the order of a couple of eV in energy) than to push electrons into higher energy states (several eV or more - real Rydberg scales).  

Still, it's a cool idea - that one can do intelligently motivated dimensional analysis and come up with an insight into the maximum possible value of some emergent quantity like sound speed.  (Reminds me of the idea of a conjectured universal bound on diffusion constants for electrons in metals.)



Thursday, October 08, 2020

Postdoc opportunities

There is a postdoc opportunity coming up in my lab to look at light emission from molecular-scale plasmonic nanostructures.  It's going to be very cool, looking at (among other things) photon counting statistics (this kind of thing), coupling plasmon-based emission to single fluorophores, all kinds of fun.  Please check it out and share with those who might be interested:  https://jobs.rice.edu/postings/24792 

In addition:  The Smalley-Curl Institute is happy to announce that it is accepting applications for two J. Evans Attwell-Welch Postdoctoral Research Associate positions.   Highly competitive, the Attwell-Welch fellowship was established in 1998 to provide Ph.D. recipients in nanoscience- and nanotechnology-related fields an opportunity to further their basic scientific research experience.

The deadline for the Evans Attwell-Welch submissions is Monday Dec 7th, 2020.  Applications containing the candidate’s resume, a two-page research project, and a letter of support from an SCI member must be emailed to sci@rice.edu before the deadline.  Only applicants sponsored by an SCI Rice faculty member will be considered.   

I would be happy to work with a potential applicant, particularly one interested in strongly correlated nanostructures and spin transport in magnetic insulators.  If you're a student finishing up and are interested, please contact me, and if you're a faculty member working with possible candidates, please feel free to point out this opportunity.   Rice University is an Equal Opportunity Employer with commitment to diversity at all levels, and considers for employment qualified applicants without regard to race, color, religion, age, sex, sexual orientation, gender identity, national or ethnic origin, genetic information, disability or protected veteran status.

Saturday, October 03, 2020

Annual Nobel speculation, + nanoscale views on twitter

It's that annual tradition:  Who do people think will win the Nobel this year in physics?  Or chemistry?  On the physics side, I've repeatedly predicted (incorrectly) Aharonov and Berry for geometric phases.  Another popular suggestion from years past is Aspect, Zeilinger, and Clauser for Bell's inequality tests.   Speculate away in the comments.

I've also finally taken the plunge and created @NanoscaleViews on twitter.  Hopefully this will help reach a broader audience, even if I don't have the time to fall down the twitter rabbit hole constantly.

Thursday, September 24, 2020

The Barnett Effect and cool measurement technique

 I've written before about the Einstein-de Haas effect - supposedly Einstein's only experimental result (see here, too) - a fantastic proof that spin really is angular momentum.  In that experiment, a magnetic field is flipped, causing the magnetization of a ferromagnet to reorient itself to align with the new field direction.  While Einstein and de Haas thought about amperean current loops (the idea that magnetization came from microscopic circulating currents that we would now call orbital magnetism), we now know that magnetization in many materials comes from the spin of the electrons.  When those spins reorient, angular momentum has to be conserved somehow, so it is transferred to/from the lattice, resulting in a mechanical torque that can be measured.

Less well-known is the complement, the Barnett effect.  Take a ferromagnetic material and rotate it. The mechanical rotational angular momentum gets transferred (via rather complicated physics, it turns out) at some rate to the spins of the electrons, causing the material to develop a magnetization along the rotational axis.  This seems amazing to me now, knowing about spin.  It must've really seemed nearly miraculous back in 1915 when it was measured by Barnett.

So, how did Barnett actually measure this, with the technology available in 1915?  Here's the basic diagram of the scheme from the original paper:


There are two rods, each of which can be rotated about its long axis.  The rods pass through counterwound coils, so that if there is a differential change in magnetic flux through the two coils, that generates a current that flows through the fluxmeter.  The Grassot fluxmeter is a fancy galvanometer - basically a coil suspended on a torsion fiber between poles of a magnet.  Current through that coil leads to a torque on the fiber, which is detected in this case by deflection of a beam of light bounced off a mirror mounted on the fiber.  The paper describes the setup in great detail, and getting this to work clearly involved meticulous experimental technique and care.  It's impressive how people were able to do this kind of work without all the modern electronics that we take for granted.  Respect.
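To put some (illustrative, assumed) numbers on just how delicate this experiment was:

```python
# Why the measurement was heroic: rotation at angular frequency omega acts on
# the electron spins like an effective field B ~ omega / gamma, where gamma is
# the electron gyromagnetic ratio.  Rotation rates here are illustrative, not
# Barnett's actual values.
import math

gamma = 1.76e11                       # rad s^-1 T^-1, electron gyromagnetic ratio
for rev_per_s in (10, 100):
    omega = 2 * math.pi * rev_per_s
    print(f"{rev_per_s} rev/s -> B_eff ~ {omega / gamma:.1e} T")
# Even 100 rev/s gives only ~3.6e-9 T, four orders of magnitude below Earth's
# field - hence the counterwound coils and the sensitive fluxmeter.
```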

Monday, September 21, 2020

Rice ECE assistant professor position in Quantum Engineering

The Department of Electrical and Computer Engineering at Rice University invites applications for a tenure track Assistant Professor Position in the area of experimental quantum engineering, broadly defined. Under exceptional circumstances, more experienced senior candidates may be considered. Specific areas of interest include, but are not limited to: quantum computation, quantum sensing, quantum simulation, and quantum networks.

The department has a vibrant research program in novel, leading-edge research areas, has a strong culture of interdisciplinary and multidisciplinary research with great national and international visibility, and is ranked #1 nationally in faculty productivity.* With multiple faculty involved in quantum materials, quantum devices, optics and photonics, and condensed matter physics, Rice ECE considers these areas as focal points of quantum engineering research in the coming decade. The successful applicant will be required to teach undergraduate courses and build a successful research program.

The successful candidate will have a strong commitment to teaching, advising, and mentoring undergraduate and graduate students from diverse backgrounds. Consistent with the National Research Council’s report, Convergence: Facilitating Transdisciplinary Integration of Life Sciences, Physical Sciences, Engineering, and Beyond, we are seeking candidates who have demonstrated ability to lead and work in research groups that “… [integrate] the knowledge, tools, and ways of thinking…” from engineering, mathematics, and computational, natural, social and behavioral sciences to solve societal problems using a convergent approach.

Applicants should submit a cover letter, curriculum vitae, statements of research and teaching interests, and at least three references through the Rice faculty application website: http://jobs.rice.edu/postings/24582. The deadline for applications is January 15, 2021; review of applications will commence November 15, 2020. The position is expected to be available July 1, 2021. Additional information can be found on our website: http://www.ece.rice.edu.

Rice University is a private university with a strong reputation for academic excellence in both undergraduate and graduate education and research. Located in the economically dynamic, internationally diverse city of Houston, Texas, 4th largest city in the U.S., Rice attracts outstanding undergraduate and graduate students from across the nation and around the world. Rice provides a stimulating environment for research, teaching, and joint projects with industry.

The George R. Brown School of Engineering ranks among the top 20 of undergraduate engineering programs (US News & World Report) and is strongly committed to nurturing the aspirations of faculty, staff, and students in an inclusive environment. Rice University is an Equal Opportunity Employer with commitment to diversity at all levels and considers for employment qualified applicants without regard to race, color, religion, age, sex, sexual orientation, gender identity, national or ethnic origin, genetic information, disability, or protected veteran status. We seek greater representation of women, minorities, people with disabilities, and veterans in disciplines in which they have historically been underrepresented; to attract international students from a wider range of countries and backgrounds; to accelerate progress in building a faculty and staff who are diverse in background and thought; and we support an inclusive environment that fosters interaction and understanding within our diverse community.

*http://news.rice.edu/2007/11/30/rices-electrical-engineering-and-computer-science-programs-rank-no-1/

Tenure-track faculty position in Astronomy at Rice University

The Department of Physics and Astronomy at Rice University invites applications for a tenure-track faculty position in astronomy in the general field of galactic star formation, including the formation and evolution of planetary systems. We seek an outstanding theoretical, observational, or computational astronomer whose research will complement and extend existing activities in these areas within the Department. In addition to developing an independent and vigorous research program, the successful applicant will be expected to teach, on average, one undergraduate or graduate course each semester, and contribute to the service missions of the Department and University. The Department anticipates making the appointment at the assistant professor level. A Ph.D. in astronomy/astrophysics or related field is required. 

Applications for this position must be submitted electronically at http://jobs.rice.edu/postings/24588. Applicants will be required to submit the following: (1) cover letter; (2) curriculum vitae; (3) statement of research; (4) statement on teaching, mentoring, and outreach; (5) PDF copies of up to three publications; and (6) the names, affiliations, and email addresses of three professional references. Rice University is committed to a culturally diverse intellectual community. In this spirit, we particularly welcome applications from all genders and members of historically underrepresented groups who exemplify diverse cultural experiences and who are especially qualified to mentor and advise all members of our diverse student population. We will begin reviewing applications December 1, 2020. To receive full consideration, all application materials must be received by January 1, 2021. The appointment is expected to begin in July, 2021. 

Rice University is an Equal Opportunity Employer with a commitment to diversity at all levels, and considers for employment qualified applicants without regard to race, color, religion, age, sex, sexual orientation, gender identity, national or ethnic origin, genetic information, disability, or protected veteran status. We encourage applicants from diverse backgrounds to apply.


Thursday, September 10, 2020

The power of a timely collaboration

Sometimes it takes a while to answer a scientific question, and sometimes that answer ends up being a bit unexpected.  Three years ago, I wrote about a paper from our group, where we had found, much to our surprise, that the thermoelectric response of polycrystalline gold wires varied a lot as a function of position within the wire, even though the metal was, by every reasonable definition, a good, electrically homogeneous material.  (We were able to observe this by using a focused laser as a scannable heat source, and measuring the open-circuit photovoltage of the device as a function of the laser position.)  At the time, I wrote "Annealing the wires does change the voltage pattern as well as smoothing it out.  This is a pretty good indicator that the grain boundaries really are important here."

What would be the best way to test the idea that somehow the grain boundaries within the wire were responsible for this effect?  Well, the natural thought experiment would be to do the same measurement in a single crystal gold wire, and then ideally do a measurement in a wire with, say, a single grain boundary in a known location.  

Fig. 4 from this paper
Shortly thereafter, I had the good fortune to be talking with Prof. Jonathan Fan at Stanford.  His group had, in fact, come up with a clever way to create single-crystal gold wires, as shown at right.  Basically they create a wire via lithography, encapsulate it in silicon oxide so that the wire is sitting in its own personal crucible, and then melt/recrystallize the wire.  Moreover, they could build upon that technique as in this paper, and create bicrystals with a single grain boundary.  Focused ion beam could then be used to trim these to the desired width (though in principle that can disturb the surface).

We embarked on a rewarding collaboration that turned out to be a long, complicated path of measuring many, many device structures of various shapes, sizes, and dimensions.  My student Charlotte Evans, measuring the photothermoelectric (PTE) response of these, worked closely with members of Prof. Fan's group - Rui Yang grew and prepared devices, and Lucia Gan did many hours of back-scatter electron diffraction measurements and analysis, for comparison with the photovoltage maps.  My student Mahdiyeh Abbasi learned the intricacies of finite element modeling to see what kind of spatial variation of the Seebeck coefficient \(S\) would be needed to reproduce the photovoltage maps.  
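To see the logic of the measurement in miniature, here's a toy one-dimensional cartoon (emphatically not the finite element model used in the paper), with invented numbers chosen only for scale:

```python
# Toy 1D cartoon of the photothermoelectric mapping: the open-circuit voltage
# from a scanned heat spot is V_oc = -integral of S(x) dT/dx dx, so a local
# anomaly in S(x) shows up as a feature in the V_oc-vs-position map.
import numpy as np

x = np.linspace(0.0, 10e-6, 2001)               # position along wire, m
S = np.full_like(x, 1.7e-6)                     # uniform Seebeck coeff., V/K
S[(x > 4e-6) & (x < 5e-6)] += 0.1e-6            # assumed local S anomaly

def v_oc(x0, width=0.5e-6, dT=1.0):
    """Open-circuit voltage for a Gaussian hot spot centered at x0."""
    T = dT * np.exp(-((x - x0) / width) ** 2)   # laser-induced temperature bump
    return -np.sum(S * np.gradient(T, x)) * (x[1] - x[0])

v_map = [v_oc(x0) for x0 in np.linspace(1e-6, 9e-6, 81)]
print(f"peak-to-peak signal ~ {max(v_map) - min(v_map):.1e} V")
# With uniform S the integral vanishes; a signal appears only where the hot
# spot overlaps the S anomaly - that's the contrast in the measured maps.
```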

From Fig. 1 of our new paper.  Panel g upper shows the local crystal misorientation as found from electron back-scatter diffraction, while panel g lower shows a spatial map of the PTE response.  The two patterns definitely resemble each other (panel h), and this is seen consistently across many devices.

A big result of this was published this week in PNAS.  The surprising result:  Individual high-angle grain boundaries produce a PTE signal so small as to be unresolvable in our measurement system.  In contrast, though, the PTE measurement could readily detect tiny changes in Seebeck response that correlate with small local misorientations of the local single crystal structure.  The wire is still a single crystal, but it contains dislocations and disclinations and stacking faults and good old-fashioned strain due to interactions with the surroundings when it crystallized.  Some of these seem to produce detectable changes in thermoelectric response.  When annealed, the PTE features smooth out and reduce in magnitude, as some (but not all) of the structural defects and strain can anneal away.  

So, it turns out it's likely not the grain boundaries that cause Seebeck variations in these nanostructures - instead it's likely residual strain and structural defects from the thin film deposition process, something to watch out for in general for devices made by lithography and thin film processing.  Also, opto-electronic measurements of thermoelectric response are sensitive enough to detect very subtle structural inhomogeneities, an effect that can in principle be leveraged for things like defect detection in manufactured structures.  It took a while to unravel, but it is satisfying to get answers and see the power of the measurement technique.

Tuesday, September 08, 2020

Materials and popular material

This past week was a great one for my institution, as the Robert A. Welch Foundation and Rice University announced the creation of the Welch Institute for Advanced Materials.  Exactly how this is going to take shape and grow is still in the works, but the stated goals of materials-by-design and making Rice and Houston a global destination for advanced materials research are very exciting.  

Long-time readers of this blog know my view that the amazing physics of materials is routinely overlooked in part because materials are ubiquitous - for example, the fact that the Pauli principle in some real sense is what is keeping you from falling through the floor right now.  I'm working on refining a few key concepts/topics that I think are translatable to the general reading public.  Emergence, symmetry, phases of matter, the most important physical law most people have never heard about (the Pauli principle), quasiparticles, the quantum world (going full circle from the apparent onset of the classical to using collective systems to return to quantum degrees of freedom in qubits).   Any big topics I'm leaving out?

Saturday, August 29, 2020

Diamond batteries? Unlikely.

The start of the academic year at Rice has been very time-intensive, leading to the low blogging frequency.  I will be trying to remedy that, and once some of the dust settles I may well create a twitter account to point out as-they-happen results and drive traffic this way.  

In the meantime, there has been quite a bit of media attention this week paid to the claim by NDB that they can make nanodiamond-based batteries with some remarkable properties.  This idea was first put forward in this video.  The eye-popping part of the news release is this:  "And it can scale up to electric vehicle sizes and beyond, offering superb power density in a battery pack that is projected to last as long as 90 years in that application – something that could be pulled out of your old car and put into a new one."

The idea is not a new one.  The NDB gadget is a take on a betavoltaic device.  Take a radioactive source that is a beta emitter - in this case 14C, which decays into 14N plus an antineutrino plus an electron with an average energy of 49 keV - and capture the electrons and, ideally, the energy from the decay.  Betavoltaic devices produce power for a long time, depending on the half-life of the radioactive species (here, 5700 years).  The problem is that the power of these systems is very low, which greatly limits their utility.  In applications where you need higher instantaneous power, the NDB approach appears to be to use the betavoltaic gizmo to trickle-charge an integrated supercapacitor that can support high output powers.

To get a sense of the numbers:  if you had 14 grams of 14C (a mole) and perfectly efficient capture of the decay energy, my estimate of the total power available is about 13 mW: \( (6.02\times 10^{23} \times 49000~\mathrm{eV} \times 1.602\times 10^{-19}~\mathrm{J/eV})/2 \) divided by \( 5700~\mathrm{yr} \times 365.25~\mathrm{d/yr} \times 86400~\mathrm{s/d} \).  If you wanted to charge the equivalent of a full Tesla battery (80 kW-h), it would take \( (80000~\mathrm{W\,h} \times 3600~\mathrm{s/h})/(0.013~\mathrm{W}) \approx 2.2\times 10^{10} \) seconds.  Even if you had 10 kg of pure 14C (around 714 moles, so roughly 9 W), that would take you close to a year.
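The same numbers in a few lines of code (the factor of 2 is a stand-in for \(\ln 2 \approx 0.69\) in converting half-life to decay rate; assumptions as above):

```python
# Back-of-the-envelope betavoltaic numbers for one mole of 14C.
N_A    = 6.02e23                   # atoms per mole
E_avg  = 49e3 * 1.602e-19          # average beta energy per decay, J (49 keV)
t_half = 5700 * 365.25 * 86400     # half-life of 14C, s

# Average power over the first half-life, assuming perfect energy capture
# (half the atoms decay in t_half; 1/2 vs ln 2 is within rounding here).
P = (N_A * E_avg / 2) / t_half
print(f"power per mole: {P * 1e3:.0f} mW")                 # ~13 mW

E_pack = 80e3 * 3600               # 80 kW-h Tesla-scale pack, in J
print(f"charge time, 1 mole: {E_pack / P:.1e} s")          # ~2.2e10 s
moles = 10e3 / 14                  # 10 kg of pure 14C
print(f"charge time, 10 kg: {E_pack / (P * moles) / 86400:.0f} days")  # ~1 year
```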

Now, the actual image in the press release-based articles shows a chip-based battery labeled "100 nW", which is very reasonable.  This technology is definitely clever, but it just does not have the average power densities needed for an awful lot of applications.


Tuesday, August 18, 2020

Black Si, protected qubits, razor blades, and a question

The run-up to the new academic year has been very time-intensive, so blogging has unfortunately been slow.  Here are three interesting papers I came across recently:

  • In this paper (just accepted at Phys Rev Lett), the investigators have used micro/nanostructured silicon to make an ultraviolet photodetector with an external quantum efficiency (the ratio of the number of charges generated to the number of incoming photons) greater than 100%.  The trick is carrier multiplication - a sufficiently energetic electron or hole can in principle excite additional carriers through "impact ionization".  In the nano community, it has been argued that nanostructuring can help this, because nm-scale structural features can help fudge (crystal) momentum conservation restrictions in the impact ionization process.  Here, however, the investigators show that nanostructuring is irrelevant to the process, which instead has more to do with the Si band structure and how it couples to the incident UV radiation.  
  • In this paper (just published in Science), the authors have been able to implement something quite clever that's been talked about for a while.  It's been known since the early days of discussing quantum computing that one can try to engineer a quantum bit that lives in a "decoherence-free subspace" - basically try to set up a situation where your effective two-level quantum system (made from some building blocks coupled together) is much more isolated from the environment than the building blocks themselves individually.  Here they have done this using a particular kind of defect in silicon carbide "dressed" with applied microwave EM fields.  They can increase the coherence time of the composite system by 10000x compared with the bare defect.
  • This paper in Science uses very cool in situ electron microscopy to show how even comparatively soft hairs can dull the sharp edge of steel razor blades.  See this cool video that does a good job explaining this.  Basically, with the proper angle of attack, the hair can torque the heck out of the metal at the very end of the blade, leading to microfracturing and chipping.
And here is my question:  would it be worth joining twitter and tweeting about papers?  I've held off for a long time, for multiple reasons.  With the enormous thinning of science blogs, I do wonder, though, whether I'd reach more people.

Wednesday, August 05, 2020

The energy of the Beirut explosion

The explosion in Beirut yesterday was truly awful and shocking, and my heart goes out to the residents.  It will be quite some time before a full explanation is forthcoming, but it sure sounds like the source was a shipment of explosives-grade ammonium nitrate that had been impounded from a cargo ship and (improperly?) stored for several years.

Interestingly, it is possible in principle to get a good estimate of the total energy yield of the explosion from cell phone video of the event.  The key is a fantastic example of dimensional analysis, a technique somehow more common in an engineering education than in a physics one.  The fact that all of our physical quantities have to be defined by an internally consistent system of units is actually a powerful constraint that we can use in solving problems.  For those interested in the details of this approach, you should start by reading about the Buckingham Pi Theorem.  It seems abstract and its applications seem a bit like art, but it is enormously powerful.  

The case at hand was analyzed by the British physicist G. I. Taylor, who was able to take still photographs of the Trinity atomic bomb test published in a magazine and estimate the yield of the bomb.  Assume that a large amount of energy \(E\) is deposited instantly in a tiny volume at time \(t=0\), and this produces a shock wave that expands spherically with some radius \(R(t)\) into the surrounding air of mass density \(\rho\).  If you assume that this captures all the essential physics in the problem, then \(R\) must in general depend only on \(t\), \(\rho\), and \(E\).  Now, \(R\) has units of length (meters).  The only way to combine \(t\), \(\rho\), and \(E\) into something with the units of length is \( (E t^2/\rho)^{1/5}\).  That implies \( R = k (E t^2/\rho)^{1/5} \), where \(k\) is some dimensionless number, probably on the order of 1.  If you cared about precision, you could go and do an experiment:  detonate a known amount of dynamite on a tower, film the whole thing with a high speed camera, and determine \(k\) experimentally.  I believe the constant is found to be close to 1.  
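Just to make the unit bookkeeping explicit (this is only a restatement of the argument above):  \( \left[ E t^{2}/\rho \right] = \frac{(\mathrm{kg\,m^{2}\,s^{-2}})(\mathrm{s^{2}})}{\mathrm{kg\,m^{-3}}} = \mathrm{m^{5}} \), so \( (E t^{2}/\rho)^{1/5} \) indeed carries units of meters, and no other combination of \(E\), \(t\), and \(\rho\) does.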

Flipping things around and solving (and taking \(k \approx 1\)), we find \(E = R^5 \rho/t^2\).  (A more detailed version of this derivation is here.)  

This youtube video is the best one I could find in terms of showing a long-distance view of the explosion with some kind of background scenery for estimating the scale.  Based on the "before" view and the skyline in the background, and a google maps satellite image of the area, I very crudely estimated the radius of the shockwave at about 300 m at \(t = 1\) second.  Using 1.2 kg/m\(^{3}\) for the density of air, that gives an estimated yield of about 3 trillion Joules, or the equivalent of around 0.72 kT of TNT.   That's actually pretty consistent with the idea that there were 2750 tons of ammonium nitrate to start with, though the agreement is probably fortuitous - that fifth power of the radius can really push the numbers around.
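Here is the same estimate in a few lines of code (the 300 m at 1 s input is my rough read of the video, as above):

```python
# Taylor-style blast yield estimate, assuming k ~ 1 in R = k (E t^2 / rho)^(1/5).
R   = 300.0    # shock radius, m (crude estimate from video)
t   = 1.0      # time after detonation, s
rho = 1.2      # density of air, kg/m^3

E = R**5 * rho / t**2     # invert to get the energy, J
kT = 4.184e12             # J per kiloton of TNT equivalent
print(f"E ~ {E:.1e} J ~ {E / kT:.2f} kT of TNT")   # ~2.9e12 J, ~0.7 kT
```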

Dimensional analysis and scaling are very powerful - it's why people are able to do studies in wind tunnels or flow tanks and properly predict what will happen to full-sized aircraft or ships, even without fully understanding the details of all sorts of turbulent fluid flow.  Physicists should learn this stuff (and that's why I stuck it in my textbook).

Saturday, August 01, 2020

How long does quantum tunneling take?

The "tunneling time" problem has a long, fun history.  Here is a post that I wrote about this issue 13 years ago (!!).  In brief, in quantum mechanics a particle can "tunnel" through a "classically forbidden" region (a region where by simple classical mechanics arguments, the particle does not have sufficient kinetic energy to be there).  I've written about that more recently here, and the wikipedia page is pretty well done.  The question is, how long does a tunneling particle spend in the classically forbidden barrier?  

It turns out that this is not a trivial issue at all.  While that's a perfectly sensible question to ask from the point of view of classical physics, it's not easy to translate that question into the language of quantum mechanics.  In lay terms, a spatial measurement tells you where a particle is, but doesn't say anything about where it was, and without such a measurement there is uncertainty in the initial position and momentum of the particle.  

Some very clever people have thought about how to get at this issue.  This review article by Landauer and Martin caught my attention when I was in grad school, and it explains the issues very clearly.  One idea people had (Baz' and Rybachenko) is to use the particle itself as a clock.  If the tunneling particle has spin, you can prepare the incident particles to have that spin oriented in a particular direction.  Then have a magnetic field confined to the tunneling barrier.  Look at the particles that did tunnel through and see how far their spins have precessed.  This idea is shown below.
"Larmor clock", from this paper

This is a cute idea in theory, but extremely challenging to implement in an experiment.  However, this has now been done by Ramos et al. from the Steinberg group at the University of Toronto, as explained in this very nice Nature paper.  They are able to do this and actually see an effect that Landauer and others had discussed:  there is "back-action", where the presence of the magnetic field itself (essential for the clock) has an effect on the tunneling time.  Tunneling is not instantaneous, though it is faster than the simple "semiclassical" estimate (that one would get by taking the magnitude of the imaginary momentum in the barrier and using that to get an effective velocity).  Very cool.
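For a sense of scale, here's that simple semiclassical estimate in code, for a hypothetical electron tunneling through a 1 eV, 1 nm barrier (illustrative numbers of my choosing, not the parameters of the Toronto experiment, which used ultracold atoms):

```python
import numpy as np

# Semiclassical "imaginary velocity" tunneling-time estimate.
hbar = 1.055e-34   # J s
m    = 9.11e-31    # electron mass, kg
eV   = 1.602e-19   # J

V_minus_E = 1.0 * eV   # barrier height above particle energy (hypothetical)
d = 1e-9               # barrier width, m (hypothetical)

kappa = np.sqrt(2 * m * V_minus_E) / hbar  # decay constant in the barrier
v_eff = hbar * kappa / m                   # effective speed from |imaginary momentum|
tau   = d / v_eff                          # semiclassical traversal-time estimate
print(f"kappa = {kappa:.2e} 1/m, tau ~ {tau:.1e} s")   # ~ a couple of femtoseconds
```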

Saturday, July 25, 2020

Kitchen science: insulated cups

An impromptu science experiment this morning.  A few months ago we acquired some very nice insulated tumblers (initially from causebox and then more from here).  Like all such insulated items, these have inner and outer walls made from a comparatively lousy thermal conductor, in this case stainless steel.  (Steel is an alloy, and the disorder in its micro- and nanoscale structure scatters electrons, giving it a lower electrical (and hence thermal) conductivity than pure metals.)  Ideally the walls touch only at the very top lip of the cup where they are joined, and the space between the walls has been evacuated to minimize heat conduction by any trapped gas in there.  When working well, so that heat transfer has to take place along the thin metal wall, the interior wall of the cup sits very close to the temperature of whatever liquid is in there, and the exterior wall sits at room temperature.
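To see why that thin connecting wall matters, here's a rough conduction estimate; the wall thickness, cup dimensions, and temperature difference are my guesses for a generic tumbler, not measurements of these particular cups:

```python
import math

# Rough heat leak down the thin stainless wall joining the inner and outer cups.
k_ss   = 15.0     # thermal conductivity of stainless steel, W/(m K)
D      = 0.08     # cup diameter, m (assumed)
t_wall = 0.5e-3   # wall thickness, m (assumed)
h      = 0.15     # conduction path length from lip down to liquid level, m (assumed)
dT     = 20.0     # room temperature minus ice water, K

A = math.pi * D * t_wall     # conduction cross-section of the thin wall
Q = k_ss * A * dT / h        # steady-state conducted power, W
print(f"Q ~ {Q:.2f} W")      # ~0.25 W for these assumed numbers
```

A quarter of a watt melts only a few grams of ice per hour (the latent heat of fusion is about 334 J/g), which is why a good cup keeps ice for so long; a dent that brings the walls into contact opens up a much larger conduction path and short-circuits all of this.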

We accidentally dropped one of the cups this morning, making a dent near the base.  The question was, did this affect the thermal insulation of that cup?  To test this, we put four ice cubes and four ounces of water from our refrigerator into each cup and let them sit on the counter for 15 minutes.  Then we used an optical kitchen thermometer (with handy diode laser for pointing accuracy) to look at the exterior and interior wall temperatures.  (Apologies for the use of Fahrenheit units.)  Check this out.


The tumbler on the left is clearly doing a better job of keeping the outside warm and the inside cold.  If we then scrutinize the tumbler on the right we find the dent, which must be deep enough to bring the inner and outer walls barely into contact.


The bottom line:  Behold, science works.  Good insulated cups are pretty impressive engineering, but you really should be careful with them, because the layers really are close together and can be damaged.