# nanoscale views

A blog about condensed matter and nanoscale physics. Why should high energy and astro folks have all the fun?

## Sunday, December 08, 2019

### Brief items

Here are some tidbits that came across my eyeballs this past week:

- I just ran into this article from early in 2019. It touches on my discussion about liquids, and is a great example of a recurring theme in condensed matter physics. The authors look at the vibrational excitations of liquid droplets on surfaces. As happens over and over in physics, the imposition of boundary conditions on the liquid motion (e.g., wetting conditions on the surface and an approximately incompressible liquid with a certain surface tension) leads to quantization of the allowed vibrations. Discrete frequencies/mode shapes/energies are picked out due to those constraints, leading to a "periodic table" of droplet vibrations. (This one looks moderately like atomic states, because spherical harmonics show up in the mode description, as they do when looking at atomic orbitals.)
- Another article from the past, this one from 2014 in IEEE Spectrum. It talks about how we arrived at the modern form of Maxwell's equations. Definitely a good read for those interested in the history of physics. Maxwell's theory was developing in parallel with what became vector calculus, and Maxwell's original description (like Faraday's intuition) was very mechanistic rather than abstract.
- Along those lines, this preprint came out recently promoting a graphical pedagogical approach to vector calculus. The spirit at work here is that Feynman's diagrammatic methods were a great way to teach people perturbative quantum field theory, and so perhaps a diagrammatic scheme for vector calc could be good. I'm a bit of a skeptic - I found the approach by Purcell to be very physical and intuitive, and this doesn't look simpler to me.
- This preprint about twisted bilayer graphene and the relationship between superconductivity and strongly insulating states caught my eye, and I need to read it carefully. The short version: while phase diagrams showing superconductivity and insulating states as a function of carrier density make it tempting to think that SC evolves out of the insulating states via doping (as likely in the cuprates), the situation may be more complicated.

## Saturday, November 30, 2019

### What is a liquid?

I wrote recently about phases of matter (and longer ago here). The phase that tends to get short shrift in the physics curriculum is the liquid, and this is actually a symptom indicating that liquids are not simple things.

We talk a lot about gases, and they tend to be simple in large part because they are low density systems - the constituents spend the overwhelming majority of their time far apart (compared to the size of the constituents), and therefore tend to interact with each other only very weakly. We can even consider the ideal limit of infinitesimal particle size and zero interactions, so that the only energy in the problem is the kinetic energy of the particles, and derive the Ideal Gas Law.
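To make that limit concrete, here is a minimal numerical sketch (my own illustration, not from the post): sample particle velocities from the Maxwell-Boltzmann distribution and check that the kinetic-theory pressure reproduces the Ideal Gas Law. The gas species (helium), temperature, and volume are arbitrary choices.

```python
import math
import random

kB = 1.380649e-23   # Boltzmann constant, J/K (exact, SI 2019)
m = 6.6335e-27      # mass of a helium atom, kg (assumed example species)
T = 300.0           # temperature, K (arbitrary choice)
N = 200_000         # number of sampled particles
V = 1.0e-3          # volume, m^3 (arbitrary choice)

random.seed(0)
# Each velocity component is Gaussian with variance kB*T/m (Maxwell-Boltzmann).
sigma = math.sqrt(kB * T / m)
vx2_mean = sum(random.gauss(0.0, sigma) ** 2 for _ in range(N)) / N

# Kinetic theory: pressure from momentum transfer to the walls, P = (N m / V) <v_x^2>.
P_kinetic = N * m * vx2_mean / V
# Ideal Gas Law prediction: P = N kB T / V.
P_ideal = N * kB * T / V
print(P_kinetic / P_ideal)   # ratio should be very close to 1
```

The two pressures agree (up to sampling noise) precisely because the only energy in the problem is kinetic.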

There is no such thing as an Ideal Liquid Law. That tells you something about the complexity of these systems right there.

A classical liquid is a phase of matter in which the constituent particles have a typical interparticle distance comparable to the particle size, and therefore interact strongly, with both a "hard core repulsion" so that the particles are basically impenetrable, and usually some kind of short-ranged attraction, whether from van der Waals forces or from longer-ranged/stronger interactions. The kinetic energy of the particles is sufficiently large that they don't bond rigidly to each other, and they therefore move past and around each other continuously. However, the density is so high that you can't even do very well by only worrying about pairs of interacting particles - you have to keep track of three-body, four-body, etc. interactions somehow.
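A standard cartoon of such a pair interaction is the Lennard-Jones potential (my own illustration - the post doesn't commit to any specific functional form), which has exactly the two features named above: a steep repulsion at short range and a weak van der Waals attraction at longer range. The parameters below are commonly quoted values for argon, assumed here as an example.

```python
# Lennard-Jones pair potential, U(r) = 4*eps*[(sigma/r)^12 - (sigma/r)^6].
# Parameters are roughly those for argon (assumed example values).
EPSILON = 1.65e-21   # well depth, J (about 120 K times kB)
SIGMA = 3.4e-10      # effective particle "size", m

def lj_potential(r):
    """Pair interaction energy at separation r (meters)."""
    sr6 = (SIGMA / r) ** 6
    return 4.0 * EPSILON * (sr6 ** 2 - sr6)

# The attractive well bottoms out at r = 2^(1/6) * sigma with depth -epsilon,
# while inside r ~ sigma the r^-12 term acts as the "hard core" repulsion.
r_min = 2 ** (1 / 6) * SIGMA
print(lj_potential(r_min) / EPSILON)   # -1.0: the well depth in units of epsilon
```

The steep \(r^{-12}\) wall is what makes the particles "basically impenetrable," and the shallow \(-r^{-6}\) tail is the short-ranged attraction.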

The very complexity of these strongly interacting collections of particles leads to the emergence of some simplicity at larger scales. Because the particles are cheek-by-jowl and impenetrable, liquids are about as incompressible as solids. The lack of tight bonding and enough kinetic energy to keep everyone moving means that, on average and on scales large compared to the particle size, liquids are homogeneous (uniform properties in space) and isotropic (uniform properties in all directions). When pushed up against solid walls by gravity or other forces, liquids take on the shapes of their containers. (If the typical kinetic energy per particle can't overcome the steric interactions with the local environment, then particles can get jammed. Jammed systems act like "rigid" solids.)

Because of the constant interparticle collisions, energy and momentum get passed along readily within liquids, leading to good thermal conduction (the transport of kinetic energy of the particles via microscopic, untraceable amounts we call heat) and viscosity (the transfer of transverse momentum between adjacent rough layers of particles just due to collisions - the fluid analog of friction). The lack of rigid bonding interactions means that liquids can't resist shear; layers of particles slide past each other. This means that liquids, like gases, don't have transverse sound waves. The flow of particles is best described by hydrodynamics, a continuum approach that makes sense on scales much bigger than the particles.
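One consequence of that incompressibility is worth a number: the longitudinal sound speed in a liquid is set by the bulk modulus \(K\) and density \(\rho\) via \(c = \sqrt{K/\rho}\), while the vanishing shear modulus is why there is no transverse sound. A quick sketch with textbook values for water (my own illustration, not from the post):

```python
import math

# Longitudinal sound speed in a liquid: c = sqrt(K / rho).
# Textbook values for water near room temperature (assumed):
K_water = 2.2e9      # bulk modulus, Pa
rho_water = 1.0e3    # density, kg/m^3

c_long = math.sqrt(K_water / rho_water)
print(round(c_long))   # roughly 1500 m/s, close to the measured speed of sound in water
```

A solid's shear modulus would enter an analogous formula for transverse sound; in a liquid that modulus is effectively zero, so the transverse branch simply isn't there.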

Quantum liquids are those for which the quantum statistics of the constituents are important to the macroscopic properties. Liquid helium is one such example. Physicists have also adopted the term "liquid" to mean any strongly interacting, comparatively incompressible, flow-able system, such as the electrons in a metal ("Fermi liquid").

Liquids are another example of emergence that is deep, profound, and so ubiquitous that people tend to look right past it. "Liquidity" is a set of properties so well-defined that a small child can tell you whether something is a liquid by looking at a video of it; those properties emerge largely independent of the microscopic details of the constituents and their interactions (water molecules with hydrogen bonds; octane molecules with van der Waals attraction; very hot silica in flowing lava); and none of those properties are obvious if one starts with, say, the Standard Model of particle physics.

## Monday, November 25, 2019

### General relativity (!) and band structure

Today we had a seminar at Rice by Qian Niu of the University of Texas, and it was a really nice, pedagogical look at this paper (arxiv version here). Here's the basic idea.

As I wrote about here, in a crystalline solid the periodic lattice means that single-particle electronic states look like Bloch waves, labeled by some wavevector \(\mathbf{k}\), of the form \(u_{\mathbf{k}}(\mathbf{r}) \exp(i \mathbf{k}\cdot \mathbf{r})\) where \(u_{\mathbf{k}}\) is periodic in space like the lattice. It is possible to write down semiclassical equations of motion for a wavepacket that starts centered around some spatial position \(\mathbf{r}\) and some (crystal) momentum \(\hbar \mathbf{k}\). These equations tell you that the momentum of the wavepacket changes with time due to the external forces (looking a lot like the Lorentz force law), and the position of the wavepacket has a group velocity, plus an additional "anomalous" velocity related to the Berry phase (which has to do with the variation of \(u_{\mathbf{k}}\) over the allowed values of \(\mathbf{k}\)).
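For concreteness, these semiclassical equations of motion in their standard textbook form (for a wavepacket in band \(n\) under applied electric and magnetic fields) read

\[
\hbar \dot{\mathbf{k}} = -e\left(\mathbf{E} + \dot{\mathbf{r}} \times \mathbf{B}\right), \qquad
\dot{\mathbf{r}} = \frac{1}{\hbar}\frac{\partial \varepsilon_{n}(\mathbf{k})}{\partial \mathbf{k}} - \dot{\mathbf{k}} \times \mathbf{\Omega}_{n}(\mathbf{k}),
\]

where \(\varepsilon_{n}(\mathbf{k})\) is the band energy (its gradient gives the group velocity) and \(\mathbf{\Omega}_{n}(\mathbf{k}) = \nabla_{\mathbf{k}} \times i\langle u_{n\mathbf{k}} | \nabla_{\mathbf{k}} u_{n\mathbf{k}} \rangle\) is the Berry curvature responsible for the anomalous velocity term.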

The paper asks the question: what are the semiclassical equations of motion for a wavepacket if the lattice is actually distorted a bit as a function of position in real space? That is, imagine a strain gradient, or some lattice deformation. In that case, the wavepacket can propagate through regions where the lattice is varying spatially on very long scales while still being basically periodic on shorter scales still long compared to the Fermi wavelength.

It turns out that the right way to tackle this is with the tools of differential geometry, the same tools used in general relativity. In GR, when worrying about how the coordinates of a particle change as it moves along, there is the ordinary velocity, and then there are other changes in the components of the velocity vector because the actual geometry of spacetime (the coordinate system) is varying with position. You need to describe this with a "covariant derivative", and that involves Christoffel symbols. In this way, gravity isn't a force - it's freely falling particles propagating as "straight" as they can, but the actual geometry of spacetime makes their trajectory look curved based on our choice of coordinates.
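For reference, the GR statement sketched above is the geodesic equation,

\[
\frac{d^{2}x^{\mu}}{d\tau^{2}} + \Gamma^{\mu}_{\ \nu\lambda}\,\frac{dx^{\nu}}{d\tau}\frac{dx^{\lambda}}{d\tau} = 0,
\]

where the Christoffel symbols \(\Gamma^{\mu}_{\ \nu\lambda}\) encode how the coordinate basis changes from point to point. The second term is exactly the "extra" change in the velocity components that comes from the geometry rather than from any applied force.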

For the semiclassical motion problem in a distorted lattice, something similar happens. You have to worry about how the wavepacket evolves both because of the local equations of motion, and because the wavepacket is propagating into a new region of the lattice where the \(u_{\mathbf{k}}\) functions are different because the actual lattice is different (and that also affects the Berry phase anomalous velocity piece). Local rotations of the lattice can lead to an effective Coriolis force on the wavepacket; local strain gradients can lead to effective accelerations of the wavepacket.

(For more fun, you can have temporal periodicity as well. That means you don't just have Bloch functions in 3d, you have Bloch-Floquet functions in 3+1d, and that's where I fell behind.)

Bottom line: The math of general relativity is an elegant way to look at semiclassical carrier dynamics in real materials. I knew that undergrad GR course would come in handy....


## Friday, November 22, 2019

### Recent results on the arxiv

Here are a few interesting results I stumbled upon recently:

- This preprint has become this Science paper that was published this week. The authors take a cuprate superconductor (YBCO) and use reactive ion etching to pattern an array of holes in the film. Depending on how long they etch, they can kill global superconductivity but leave the system such that it still behaves as a funny kind of metal (resistance decreasing with decreasing temperature), with some residual resistance at low temperatures. The Hall effect in this metallic state produces no signal - a sign that there is a balance between particle-like and hole-like carriers (particle-hole symmetry). For magnetic field perpendicular to the film, they also see magnetoresistance with features periodic in flux through one cell of the pattern, with a periodicity that indicates the charge carriers have charge 2*e*. This is an example of a "Bose metal". Neat! (The question about whether there are pairs without superconductivity touches on our own recent work.)
- This preprint was recently revised (and thus caught my eye in the arxiv updates). In it, the authors are using machine learning to try to find new superconductors. The results seem encouraging. I do wonder if one could do a more physics-motivated machine learning approach (that is, something with an internal structure to the classification scheme and the actual weighting procedure) to look at this and other related problems (like identifying which compounds might be growable via which growth techniques).
- This preprint is not a condensed matter topic, but has gotten a lot of attention. The authors look at a particular nuclear transition in \(^{4}\)He, and find a peculiar angular distribution for the electron-positron pairs that come out. The reason this is of particular interest is that this paper by the same investigators looking at a nuclear transition in \(^{8}\)Be three years ago found something very similar. If one assumes that there is a previously unobserved boson (a dark matter candidate, perhaps) of some sort with a mass of around 17 MeV that couples in there, that could explain both results. Intriguing, but it would be great if these observations were confirmed independently by a different group.
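The flux-periodicity argument in the first item is simple arithmetic worth making explicit: carriers of charge \(2e\) respond to the superconducting flux quantum \(h/2e\), half the periodicity expected for single electrons. A quick check with exact SI constants (a standalone snippet of my own, not code from the paper):

```python
# Exact SI (2019) defining constants:
H_PLANCK = 6.62607015e-34    # Planck constant, J*s
E_CHARGE = 1.602176634e-19   # elementary charge, C

# Flux periodicity for charge-2e pairs (superconducting flux quantum)
# versus charge-e single carriers:
phi_pair = H_PLANCK / (2 * E_CHARGE)
phi_single = H_PLANCK / E_CHARGE

print(f"{phi_pair:.3e}")   # ~2.068e-15 Wb per unit cell of the hole array
```

Seeing magnetoresistance features repeat every \(h/2e\) of flux through a pattern cell, rather than every \(h/e\), is what pins the carrier charge at \(2e\).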

## Tuesday, November 12, 2019

### Advice on proposal writing

Many many people have written about how to write scientific grant proposals, and much of that advice is already online. Rather than duplicate that work, and recognizing that sometimes different people need to hear advice in particular language, I want to link to some examples.




- Here (pdf) is some advice straight from the National Science Foundation about how to write a compelling proposal. It's older (2004) and a bit out of date, but the main points are foundational.
- This is a very good collection of advice that has been updated (2015) to reflect current practice about NSF.
- Here are lecture notes from a course at Illinois that touched on this as well, generalizing beyond the NSF.

This last one has taken a pre-eminent position of importance because it's something that can be readily counted and measured. There is a rough rule that many program officers in NSF and DOE will tell you; averaging over their programs, they get roughly one high impact paper per $100K total cost. They would like more, of course.

- *Talk with program officers before writing and submitting - know the audience*. Program officers (including foundation ones) tend to take real pride in their portfolios. Everyone likes funding successful, high-impact, exciting, trend-setting work. Still, particular program officers have areas of emphasis, in part so that there is not duplication of effort or support within an agency or across agencies. (This is especially true in areas like high energy theory, where if you've got DOE funding, you essentially can't get NSF support, and vice versa.) You will be wasting your time if you submit to the wrong program or pitch your idea to the wrong reviewing audience. NSF takes a strong line that their research directions are broadly set by the researchers themselves, via their deep peer review process (mail-in reviews, in-person or virtual panel discussions) and workshops that define programmatic goals. DOE likewise has workshops to help define major challenges and open questions, though my sense is that the department takes a more active role in delineating priorities. The DOD is more goal-directed, with program officers having a great deal of sway on topics of interest, and the prospect that such research may transition closer to technology-readiness. Foundations are idiosyncratic, but a common refrain is that they prefer to fund topics that are not already supported by federal agencies.
- *Think it through, and think like a referee*. When coming up with an idea, do your best to consider in some detail how you would actually pull this off. How could you tell if it works? What would the implications be of success? What are the likely challenges and barriers? If some step doesn't go as planned, is it a show-stopper, or are there other ways to go? As an experimentalist: Do you have the tools you need to do this? How big a signal are you trying to detect? Remember, referees are frequently asked to evaluate strengths and weaknesses of technical approach. Better to have this in mind at an early stage of the process.
- *Clearly state the problem, and explain the proposal's organization*. Reviewers might be asked to read several proposals in a short timeframe. It seems like a good idea to say up front, in brief (like in a page or so): What is the problem? What are the open scientific/engineering questions you are specifically addressing? What is your technical approach? What will the results mean? Then, explain the organization of the proposal (e.g., section 2 gives a more detailed introduction to the problem and open questions; section 3 explains the technical approach, including a timeline of proposed work; etc.). This lets readers know where to find things.

I'll confess: I got this organizational approach by emulating the structure of an excellent proposal that I reviewed a number of years ago. It was really terrific - clear; pedagogical, so that a non-expert in that precise area could understand the issues and ideas; very cleanly written; easy-to-read figures, including diagrams that really showed how the ideas would work.

- *Reviewing proposals* is very helpful in improving your own. Very quickly you will get a sense of what you think makes a good or bad proposal. NSF is probably the most open to getting new investigators involved in the reviewing process.
- *Don't wait until the last minute*. You know that classmate of yours from undergrad days, the one who used to brag about how they waited until the night before to blitz through a 20 page writing assignment? Amazingly, some of these people end up as successful academics. I genuinely don't know how they do it, because these days research funding is so competitive and proposals are detailed and complicated. There are many little formatting details that agencies enforce now. You don't want to get to an hour before the deadline and realize that all of your bibliographic references are missing a URL field. People really do read sections like data management plans and postdoctoral mentoring plans - you can't half-ass them. Also, while it is unlikely to sink a really good proposal, it definitely comes across badly to referees if there are missing or mislabeled references, figures, etc.

I could write more, and probably will amend this down the line, but work calls and this is at least a start.

## Thursday, November 07, 2019

### Rice Academy of Fellows 2020

As I had posted a year ago: Rice has a university-wide competitive postdoctoral fellow program known as the Rice Academy of Fellows. Like all such things, it's very competitive. The new application listing has gone live here with a deadline of January 3, 2020. Applicants have to have a faculty mentor, so in case someone is interested in working with me on this, please contact me via email. We've got some fun, exciting stuff going on!

## Friday, November 01, 2019

### Sorry for the hiatus

My apologies for the unusually long hiatus in posts. Proposal deadlines + department chair obligations + multiple papers in process made the end of October very challenging. Later next week I expect to pick up again. Suggested topics (in the comments?) are always appreciated. I realize I've never written an advice-on-grant-proposal-writing post. On the science side, I'm still mulling over the most accessible way to describe quantum Hall physics, and there are plenty of other "primer" topics that I should really write at some point.

If I hadn't been so busy, I would've written a post during the baseball World Series about how the hair of Fox Sports broadcaster Joe Buck is a study in anisotropic light scattering. Viewed straight on, it's a perfectly normal color, but when lit and viewed from an angle, it's a weirdly iridescent yellow - I'm thinking that this really might have interesting physics behind it, in the form of some accidental structural color.

