Wednesday, July 18, 2018

Items of interest

While trying to write a few things (some for the blog, some not), I wanted to pass along some links of interest:

  • For those interested in the APS March Meeting:  The time to submit nominations for invited sessions for the Division of Condensed Matter Physics is now (deadline of August 24).  See here.  As a member-at-large for DCMP, I've been involved in the process for a couple of years now, and lots of high-quality nominations are the best way to get a really good meeting.  Please take the time to nominate!
  • Similarly, now is the time to nominate people for DCMP offices (deadline of Sept. 1).
  • There is a new tool available called Scimeter that is a rather interesting add-on to the arxiv.  It has done textual analysis of all the preprints on the arxiv, so you can construct a word cloud for an author (see at right for mine, which is surprisingly dominated by "field effect transistor" - I guess I use that phrase too often) or for a group of authors, or you can search for similar authors based on that same word-cloud analysis.  Additionally, the tool uses that analysis to compare the breadth of research topics spanned by different authors' papers.  Apparently I am 0.3 standard deviations broader than the mean broadness, whatever that means.
  • Thanks to a colleague, I stumbled on Fermat's Library, a great site that stockpiles some truly interesting and foundational papers across many disciplines and allows shared commenting in the margins (hence the Fermat reference).  

Sunday, July 08, 2018

Physics in the kitchen: Frying tofu

I was going to title this post "On the emergence of spatial and temporal coherence in frying tofu", or "Frying tofu:  Time crystal?", but decided that simplicity has virtues.

I was doing some cooking yesterday, and I was frying some firm tofu in a large, deep skillet in my kitchen.  I'd cut the stuff into roughly 2 cm by 2 cm by 1 cm blocks, separated by a few mm from each other but mostly covering the whole cooking surface, and was frying them in a little oil (enough to coat the bottom of the skillet) when I noticed something striking, thanks to the oil reflecting the overhead light.  The bubbles forming in the oil under/around the tofu were appearing and popping in what looked to my eye like very regular intervals, at around 5 Hz.  Moreover (and this was the striking bit), the bubbles across a large part of the whole skillet seemed to be reasonably well synchronized.  This went on long enough (a couple of minutes, until I needed to flip the food) that I really should have gone to grab my camera, but I missed my chance to immortalize this on youtube because (a) I was cooking, and (b) I was trying to figure out if this was some optical illusion.

From the physics perspective, here was a driven nonequilibrium system (heated from below by a gas flame and conduction through the pan) that spontaneously picked out a frequency for temporal oscillations, and apparently synchronized the phase across the pan well.  Clearly I should have filmed this and called it a classical time crystal.   Would've been a cheap and tasty paper.  (I kid, I kid.)

What I think happened is this.  The bubbles in this case were produced by the moisture inside the tofu boiling into steam (due to the local temperature and heat flux) and escaping from the bottom (hottest) surface of the tofu into the oil to make bubbles.  There has to be some rate of steam formation set by the latent heat of vaporization for water, the heat flux (and thus thermal conductivity of the pan, oil, and tofu), and the local temperature (again involving the thermal conductivity and specific heat of the tofu).  The surface tension of the oil, its density, and the steam pressure figure into the bubble growth and how big the bubbles get before they pop.  I'm sure someone far more obsessive than I am could do serious dimensional analysis about this.  The bubbles then couple to each other via the surrounding fluid, and synched up because of that coupling (maybe like this example with flames).   This kind of self-organization happens all the time - here is a nice talk about this stuff.  This kind of synchronization is an example of universal, emergent physics.
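For fun, here is a minimal numerical sketch (very much not a model of the actual bubble/oil dynamics) of the standard Kuramoto picture of synchronization: a bunch of oscillators with slightly different natural frequencies near 5 Hz that phase-lock once their mutual coupling is strong enough.  The number of oscillators, the frequency spread, and the coupling strength are all made-up illustrative values.

    import numpy as np

    # Minimal Kuramoto sketch: N phase oscillators with natural frequencies
    # near 5 Hz, coupled all-to-all.  Purely illustrative; all parameters
    # are made up, and this is not a model of the real bubble dynamics.
    rng = np.random.default_rng(0)
    N = 50                                   # number of "bubble sites"
    omega = 2 * np.pi * (5.0 + 0.2 * rng.standard_normal(N))  # natural frequencies (rad/s)
    K = 4.0                                  # all-to-all coupling strength (rad/s)
    theta = 2 * np.pi * rng.random(N)        # random initial phases
    dt, steps = 1e-3, 5000                   # 5 seconds of simulated time

    for _ in range(steps):
        # each oscillator is pulled toward the phases of all the others
        coupling = (K / N) * np.sin(theta[None, :] - theta[:, None]).sum(axis=1)
        theta += (omega + coupling) * dt

    # Kuramoto order parameter: ~0 for incoherent phases, ~1 when synchronized
    r = abs(np.exp(1j * theta).mean())
    print(f"order parameter r = {r:.2f}")

With the coupling turned down to zero, r stays small; crank K up past a threshold and the phases lock, which is qualitatively what the pan full of bubbles seemed to be doing.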

Tuesday, July 03, 2018

A metal superconducting transistor (?!)

A paper was published yesterday in Nature Nanotechnology that is quite surprising, at least to me, and I thought I should point it out.

The authors make superconducting wires (e-beam evaporated Ti in the main text, Al in the supporting information) that appear to be reasonably "good metals" in the normal state.  [For the Ti case, for example, the sheet resistance is about 10 Ohms per square, very far from the "quantum of resistance" \(h/2e^{2}\approx 12.9~\mathrm{k}\Omega\).  This suggests that the metal is electrically pretty homogeneous (as opposed to being a bunch of loosely connected grains).  Similarly, the inferred resistivity of around 30 \(\mu\Omega\)-cm is comparable to expectations for bulk Ti (which is actually a bit surprising to me).]
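As a quick sanity check on those quoted numbers (my own arithmetic, not anything taken from the paper), the sheet resistance and resistivity together imply a film thickness via \(\rho = R_{\square} t\):

    # Quick consistency check (my arithmetic, not the paper's):
    # resistivity = (sheet resistance) x (film thickness).
    R_square = 10.0         # Ohms per square, quoted for the Ti wires
    rho = 30e-8             # 30 micro-Ohm-cm, converted to Ohm-m
    t = rho / R_square      # implied film thickness in meters
    print(f"implied film thickness ~ {t * 1e9:.0f} nm")   # ~30 nm, plausible for an evaporated film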

The really surprising thing is that the application of a large voltage between a back-gate (the underlying Si wafer, separated from the wire by 300 nm of SiO2) and the wire can suppress the superconductivity, dialing the critical current all the way down to zero.  This effect happens symmetrically with either polarity of bias voltage. 

This is potentially exciting because having some field-effect way to manipulate superconductivity could let you do very neat things with superconducting circuitry. 

The reason this is startling is that ordinarily field-effect modulation of metals has almost no effect.  In a typical metal, a dc electric field only penetrates a fraction of an atomic diameter into the material - the gas of mobile electrons in the metal has such a high density that it can shift itself by a fraction of a nanometer and self-consistently screen out that electric field. 

Here, the authors argue (in a model in the supplemental information that I need to read carefully) that the relevant physical scale for the gating of the superconductivity is, empirically, the London penetration depth, a much longer spatial scale (hundreds of nm in typical low temperature superconductors).    I need to think about whether this makes sense to me physically.
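To put rough numbers on why this is startling, here is a back-of-the-envelope comparison of the two length scales, using generic free-electron parameters that I am assuming (a carrier density of \(5\times 10^{28}~\mathrm{m}^{-3}\)); these are not values from the paper, and the clean-limit London formula below will underestimate the penetration depth of dirty films.

    import numpy as np

    # Back-of-the-envelope comparison of the Thomas-Fermi screening length
    # (how far a static E field penetrates a normal metal) with the London
    # penetration depth.  Generic free-electron numbers, assumed by me.
    e = 1.602e-19        # electron charge, C
    m = 9.109e-31        # electron mass, kg
    eps0 = 8.854e-12     # vacuum permittivity, F/m
    mu0 = 4e-7 * np.pi   # vacuum permeability, H/m
    hbar = 1.055e-34     # reduced Planck constant, J s

    n = 5e28             # assumed carrier density, m^-3 (typical metal)

    # Free-electron Fermi energy and Thomas-Fermi screening length:
    # lambda_TF = sqrt(2 eps0 E_F / (3 n e^2))
    E_F = hbar**2 * (3 * np.pi**2 * n)**(2 / 3) / (2 * m)
    lam_TF = np.sqrt(2 * eps0 * E_F / (3 * n * e**2))

    # Clean-limit London penetration depth, crudely taking the superfluid
    # density equal to n:  lambda_L = sqrt(m / (mu0 n e^2))
    lam_L = np.sqrt(m / (mu0 * n * e**2))

    print(f"Thomas-Fermi screening length ~ {lam_TF * 1e9:.2f} nm")  # a small fraction of a nm
    print(f"London penetration depth      ~ {lam_L * 1e9:.0f} nm")   # tens of nm even in this clean limit

The point is just the ratio: even in this crude estimate the London depth is hundreds of times larger than the screening length, and dirty, thin films like these can have effective penetration depths larger still.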

Sunday, July 01, 2018

Book review: The Secret Life of Science

I recently received a copy of The Secret Life of Science:  How It Really Works and Why It Matters, by Jeremy Baumberg of Cambridge University.  The book is meant to provide a look at the "science ecosystem", and it seems to be unique, at least in my experience.  From the perspective of a practitioner but with a wider eye, Prof. Baumberg tries to explain much of the modern scientific enterprise: what modern science is (with an emphasis on "simplifiers" [often reductionists] vs. "constructors" [closer to engineers, building new syntheses] - this is rather similar to Narayanamurti's take described here), who the different stakeholders are, publication as currency, scientific conferences, science publicizing and reporting, how funding decisions happen, career paths and competition, etc. 

I haven't seen anyone else try to spell out, for a non-scientist audience, how the scientific enterprise fits together from its many parts, and that alone makes this book important - it would be great if someone could get some policy-makers to read it.  I agree with many of the book's main observations:

  • The actual scientific enterprise is complicated (as pointed out repeatedly with one particular busy figure that recurs throughout the text), with a bunch of stakeholders, some cooperating, some competing, and we've arrived at the present situation through a complex, emergent history of market forces, not some global optimization of how best to allocate resources or how to choose topics. 
  • Scientific publishing is pretty bizarre, functioning both to disseminate knowledge and as a way of keeping score; peer review is annoying in many ways but serves a valuable purpose; for-profit publications can distort people's behaviors because of the prestige associated with some of them.
  • Conferences are also pretty weird, serving purposes (networking, researcher presentation training) that are not really what used to be the point (putting out and debating new results).
  • Science journalism is difficult, with far more science than can be covered, squeezed resources for real journalism, incentives for PR that can oversimplify or amp up claims and controversy, etc.
The book ends with some observations and suggestions from the author's perspective on changes that might improve the system, with a realistic recognition that big changes will be hard.

It would be very interesting to get the take of someone in a very different scientific field (e.g., biochemistry) on Prof. Baumberg's presentation.  My own research interests align closely with his, so it's hard for me to judge whether his point of view on some matters matches up well with other fields.  (I do wonder about some of the numbers that appear.  Has the number of scientists in France really grown by a factor of three since 1980?  And by a factor of five in Spain over that time?)

If you know someone who is interested in a solid take on the state of play in (largely academic) science in the West today, this is a very good place to start. 

Monday, June 25, 2018

Don't mince words, John Horgan. What do you really think?

In his review of Sabine Hossenfelder's new book for Scientific American, John Horgan begins by saying:
Does anyone who follows physics doubt it is in trouble? When I say physics, I don’t mean applied physics, material science or what Murray Gell-Mann called “squalid-state physics.” I mean physics at its grandest, the effort to figure out reality. Where did the universe come from? What is it made of? What laws govern its behavior? And how probable is the universe? Are we here through sheer luck, or was our existence somehow inevitable?
Wow.  Way to back-handedly imply that condensed matter physics is not grand or truly important.  The frustrating thing is that Horgan knows perfectly well that condensed matter physics has been the root of multiple profound ideas (the Higgs mechanism, anyone?), as well as shaping basically all of the technology he used to write that review.   He goes out of his way here to make clear that he doesn't think any of that is really interesting.  Why do that as a rhetorical device?

Sunday, June 24, 2018

There is no such thing as a rigid solid.

How's that for a provocative, click-bait headline?

More than any other branch of physics, condensed matter physics highlights universality, the idea that some properties crop up repeatedly, in many physical systems, independent of and despite the differences in the microscopic building blocks of the system.  One example that affects you pretty much all the time is the emergence of rigid solids from the microscopic building blocks that are atoms and molecules.  You may never think about it consciously, but mechanically rigid solids make up much of our environment - our buildings, our furniture, our roads, even ourselves.

A quartz crystal is an example of a rigid solid. By solid, I mean that the material maintains its own shape without confining walls, and by rigid, I mean that it “resists deformation”. Deforming the crystal – stretching it, squeezing it, bending it – involves trying to move some piece of the crystal relative to some other piece of the crystal. If you try to do this, it might flex a little bit, but the crystal pushes back on you. The ratio between the pressure (say) that you apply and the percentage change in the crystal’s size is called an elastic modulus, and it’s a measure of rigidity. Diamond has a big elastic modulus, as does steel. Rubber has a comparatively small elastic modulus – it’s squishier. Rigidity implies solidity. If a hunk of material has rigidity, it can withstand forces acting on it, like gravity.  (Note that I'm already assuming that atoms can't pass through each other, which turns out to be a macroscopic manifestation of quantum mechanics, even though people rarely think of it that way.  I've discussed this recently here.)
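To make "resists deformation" a little more concrete, here is a toy comparison of the fractional deformation (strain) you get by applying the same stress to materials with different elastic moduli; the moduli below are order-of-magnitude textbook values that I'm plugging in purely for illustration.

    # Toy comparison: strain (fractional deformation) = stress / elastic modulus.
    # The moduli are rough, order-of-magnitude textbook values for illustration.
    stress = 1e6   # applied stress in Pa, roughly ten atmospheres

    young_moduli = {       # approximate Young's moduli, Pa
        "diamond": 1.1e12,
        "steel":   2.0e11,
        "quartz":  7.0e10,
        "rubber":  1.0e7,
    }

    for material, E in young_moduli.items():
        strain = stress / E
        print(f"{material:8s}: fractional deformation ~ {strain:.1e}")
    # diamond barely budges (~1e-6), while rubber deforms by ~10 percent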

Take away the walls of an aquarium, and the rectangular “block” of water in there can’t resist gravity and splooshes all over the table. In free fall, as on the International Space Station, a blob of water will pull itself into a sphere, as it doesn’t have the rigidity to resist surface tension, the tendency of a material to minimize its surface area.

Rigidity is an emergent property. One silicon or oxygen atom isn’t rigid, but somehow, when you put enough of them together under the right conditions, you get a mechanically solid object. A glass, in contrast to a crystal, looks very different if you zoom in to the atomic scale. In the case of silicon dioxide, while the detailed bonding (each silicon to four oxygens, each oxygen bridging two silicons) looks similar to the case of quartz, there is no long-range pattern to how the atoms are arranged. Indeed, while it would be incredibly difficult to do experimentally, if you could take an atomic-scale snapshot of silica, then from the positions of the atoms alone you wouldn’t be able to tell whether it was molten or a solidified glass.   However, despite the structural similarities to a liquid, solid glass is mechanically rigid. In fact, some glasses are far stiffer than crystalline solids – metallic glasses are highly prized for exactly this property – despite having a microscopic structure that looks like a liquid. 

Somehow, these two systems (quartz and silica glass), with very different detailed structures, have very similar mechanical properties on large scales. Maybe this example isn't too convincing. After all, the basic building blocks in both of those materials are really the same. However, mechanical rigidity shows up all the time in materials with comparatively high densities. Water ice is rigid. The bumper on your car is rigid. The interior of a hard-boiled egg is rigid. Concrete is rigid. A block of wood is rigid. A vacuum-packed bag of ground espresso-roasted coffee is rigid. Somehow, mechanical rigidity is a common collective fate of many-particle systems. So where does it originate? What conditions are necessary to have rigidity?

Interestingly, this question remains a subject of research.  Despite my click-bait headline, it sure looks like there are materials that are mechanically rigid.  However, it can be shown mathematically (!) that "equilibrium states of matter that break spontaneously translational invariance...flow if even an infinitesimal stress is applied".   That is, take some crystal or glass, where the constituent particles are sitting in well-defined locations (thus "breaking translational invariance"), and apply even a tiny bit of shear, and the material will flow.  It can be shown that the particles in the bulk of such a material can always rearrange by a tiny amount, and that these rearrangements end up propagating out to displace the surface of the material, which is really what we mean by "flow".   How do we reconcile this statement with what we see every day - for example, the fact that touching your kitchen table does not cause its surface to flow like a liquid?

Some of this is the kind of hair-splitting/no-true-Scotsman definitional stuff that shows up sometimes in theoretical physics.  A true equilibrium state would last forever.   To say that "equilibrium states of matter that break spontaneously translational invariance" are unstable under stress just means that the final, flowed rearrangement of atoms is energetically favored once stress is applied; it doesn't say anything about how long it takes the system to get there.

We see other examples of this kind of thing in condensed matter and statistical physics.  It is possible to superheat liquid water above its boiling point.  Under those conditions, the gas phase is thermodynamically favored, but to get from the homogeneous liquid to the gas requires creating a blob of gas, with an accompanying liquid/gas interface that is energetically expensive.  The result is an "activation barrier".
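For a sense of scale, here is a rough classical-nucleation-theory estimate of that barrier for water superheated 10 K above its 1-atm boiling point.  All the inputs are approximate, textbook-style values I'm assuming, and the point is only the order of magnitude.

    import numpy as np

    # Rough classical nucleation theory estimate for a vapor bubble in water
    # superheated 10 K above its 1-atm boiling point.  Inputs are approximate,
    # textbook-style values; only the order of magnitude matters here.
    gamma = 0.059        # liquid-vapor surface tension near 100 C, N/m
    L = 2.26e6           # latent heat of vaporization, J/kg
    rho_v = 0.6          # vapor density, kg/m^3
    T_b = 373.0          # boiling temperature, K
    dT = 10.0            # superheat, K
    kB = 1.381e-23       # Boltzmann constant, J/K

    # Free-energy gain per unit volume of vapor, critical bubble radius,
    # and barrier height:  r* = 2 gamma / dg,  dG* = 16 pi gamma^3 / (3 dg^2)
    dg = rho_v * L * dT / T_b
    r_star = 2 * gamma / dg
    dG_star = 16 * np.pi * gamma**3 / (3 * dg**2)

    print(f"critical bubble radius ~ {r_star * 1e6:.1f} microns")
    print(f"barrier ~ {dG_star / (kB * (T_b + dT)):.1e} kT")

The barrier comes out to hundreds of millions of \(k_{B}T\), which is why modestly superheated water can sit there in its metastable liquid state; in practice, boiling starts at pre-existing nucleation sites rather than by homogeneous nucleation.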

It turns out that this appears to be the right way to think about solids.  Solids only appear rigid on any useful timescale because the timescale to create defects and reach the flowed state is very, very long.  A recent discussion of this is here, with some really good references, in a paper that appeared just this spring in the Proceedings of the National Academy of Sciences of the US.  An earlier work (a PRL) trying to quantify how this all works is here, if you're interested.

One could say that this is a bit silly - obviously we know empirically that there are rigid materials, and any analysis saying they don't exist has to be off the mark somehow.  However, in science, particularly physics, this kind of study, where observation and some fairly well-defined model seem to contradict each other, is precisely where we tend to gain a lot of insight.  (This is something we have to be better at explaining to non-scientists....)


Monday, June 18, 2018

Scientific American - what the heck is this?

Today, Scientific American ran this on their blogs page.  This article calls to mind weird mysticism stuff like crystal energy, homeopathy, and tree waves (a reference that attendees of mid-1990s APS meetings might get), and would not be out of place in Omni Magazine in about 1979.

I’ve written before about SciAm and their blogs.  My offer still stands, if they ever want a condensed matter/nano blog that I promise won’t verge into hype or pseudoscience.