Monday, March 15, 2021

APS March Meeting, Day 1

As in past years, I'm going to try to give a few highlights of talks that I saw "at" the APS March Meeting.  Historically these are a blend of talks that usually have some connection to research topics that interest me, and subjects that I think are likely to be important or presented by particularly good speakers.  The meeting being virtual this year presents challenges.  On the one hand, because a very large fraction of the talks are being recorded, in principle I should be able to go back and watch anything that I otherwise would miss due to scheduling collisions or other commitments.  On the other hand, not traveling means that it's very hard to truly concentrate on the meeting without local work demanding some attention.  

(To simulate the true March Meeting experience, I was tempted to spend $4.50 on some terrible coffee this morning, and $11 on a slice of turkey, a slice of cheese, a sad slice of tomato, and a wilted lettuce leaf on white bread for lunch.)

  • Tim Hugo Taminiau from Delft presented a neat talk about using (the electron spins of) NV centers in diamond to examine and control 13C nuclear spins.  Through very impressive pulse sequences based on NMR techniques plus machine learning, his group has been able to determine the locations and couplings of tens of nuclear spins, and controllably create and manipulate entanglement among them.
  • Markus Raschke from Colorado gave a very nice presentation showcasing the impressive work that his group has done using the plasmonic resonance of a gold tip to do cavity quantum electrodynamics with individual emitters.   Even though the plasmonic cavity is leaky (low \(Q\)), the mode volume is tiny compared with the wavelength (\(V_{m} \sim 10^{-6} \lambda^{3}\)).  This lets them get into the strong coupling regime, with big splittings of the excitonic emission peaks in quantum dots and clear detection of the plexitonic (or polaritonic, depending on your terminology) states.  (A quick scaling argument for how a leaky cavity can still reach strong coupling appears just after this list.)
  • There was a nice session about strange metals, but I had to pop in and out of it.  One particularly interesting talk was given by Philip Phillips, who spoke about Noether's theorem(s) and the demise of charge quantization in the strange metal - see here.  (This relates to an experiment I'm very interested in trying.)  This talk also featured an unscheduled interruption for the first APS/Marvel's WandaVision crossover (see image).
  • Late in the day I was able to catch most of Bart van Wees's talk about spin transport in magnetic insulators, including the spin Seebeck effect.  The basic measurement approach is this one, using the inverse spin Hall effect to detect an incoming current of magnons driven either by spin injection or by a temperature gradient.  They have applied this approach to examine a number of material systems, including van der Waals antiferromagnets and the van der Waals Ising magnet CrBr3.  In the latter case, because the material is so chemically reactive, they had to do some clever sample fabrication to encapsulate it in hBN while countersinking their Pt spin Hall electrodes.
  • I also managed to see Bob Willett's talk showing actual interferometric demonstration of non-Abelian statistics at the \(\nu = 5/2\) and \(7/2\) fractional quantum Hall states.  These devices are amazing in that they preserve the material quality despite challenging fabrication, and the experiments are about the clearest evidence you can have for exotic fractional charge and statistics in these systems.
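
(For readers wondering how such a leaky cavity can reach strong coupling, here is a quick scaling argument - my own back-of-the-envelope sketch, not something from the talk.  The single-emitter coupling rate and the cavity loss rate are

\[ g = \mu \sqrt{\frac{\omega}{2 \hbar \epsilon_{0} V_{m}}}, \qquad \kappa = \frac{\omega}{Q}, \]

where \(\mu\) is the emitter's transition dipole moment.  Strong coupling requires \(2g \gtrsim \kappa, \gamma\), with \(\gamma\) the emitter linewidth.  Since \(g \propto V_{m}^{-1/2}\), shrinking the mode volume by a factor of \(10^{6}\) boosts \(g\) by a factor of \(10^{3}\), which can more than compensate for a plasmonic \(Q\) of only \(\sim 10\).)
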
There are some other talks from today that I want to see, but they will have to wait.  The virtual meeting format is ok, but there really is no substitute for talking to people face to face.  

Wednesday, March 10, 2021

Items leading into the APS March Meeting

This will be the first virtual APS March Meeting.  It's also taking place at a time when many universities in the US have eliminated spring recess, and no one is getting out of town for the conference.  This means that faculty and students are going to try to balance attending virtual talks and some level of networking/social interaction along with the usual business of the university.  Between that and the reluctance to sit immobile in front of a screen for many hours at a time, it will be interesting to see how this goes.  Here is the information available so far on how the meeting is actually going to work in terms of zoom/web access/discussions.  More information is reportedly on the way.

In the meantime, the biggest condensed matter news item of the week is the retraction of the Majorana fermion paper discussed here.  

  • The official investigative panel report on this matter is available here.  The panelists detail multiple issues with the paper, and conclude that "the most plausible explanation [is] that the authors were caught up in the excitement of the moment, and were themselves blind to the data that did not fit the goal they were striving for. They have "fooled themselves" in the way forewarned by Feynman in the speech we quoted at the beginning of section 3."
  • Another analysis is here.  An inescapable conclusion is that making the data sets available greatly helped in figuring out what went on here.  
  • Here is a youtube video that goes over this from the technical perspective.
In other news:
  • Here is an updated version of a paper by my postdoctoral mentor showing interferometric evidence for the braiding of other exotic quasiparticles.   This is an implementation of ideas related to these proposals (1, 2).  One point of commonality with the Majorana ideas:  exquisitely clean material is needed to see the interesting physics, and preserving that lack of disorder when fabricating devices is really hard.



Wednesday, March 03, 2021

Undergraduate labs - quick survey

I've already posted this survey on a mailing list of US physics and P&A department chairs, but more information would certainly be helpful.  I'm trying to do a bit of a survey of how departments at major US universities staff their undergraduate introductory labs (both the physics-for-engineers/majors sequence and the physics-for-biosciences/premeds sequence).  If you have this information and can provide it (identifying the university), I would be appreciative.

1) Do you have traditional-style intro labs, or a more active learning/discovery-based/modern pedagogy approach?

2) How many undergrads per lab section, how do they work (e.g. groups of 2) and how many lab TAs (or equivalent) per lab section?

3) Who is doing the supervision - what combo of graduate lab TAs, undergrad lab TAs, NTT instructors?

I've heard back from about 8 programs so far, but more would be helpful.  If you would prefer emailing me rather than using the comments, that's fine as well.

Friday, February 26, 2021

And more items of interest

 Here are some outreach/popularization tidbits:

Wednesday, February 24, 2021

Brief items

Here are a few items I came across in the last few days that may be of broader interest:

Sunday, February 21, 2021

Grad school admissions this year

Based on conversations with my colleagues at my institution and across the US, graduate program application rates in the US seem to be up quite a bit this year, including in physics and astronomy.  This is happening at the same time that many graduate programs are still working to handle the exceptional circumstances that arose due to the pandemic.  These include: 

  • lower graduation rates (as students are slower to graduate when there is increased uncertainty in the post-degree employment market, academic or otherwise); 
  • continued visa challenges with international students (e.g., students who have enrolled remotely from outside the US in fall '20 but have not yet been able to get here, and therefore may well need extra time to affiliate with a research advisor once they get to the US, presumably in the late spring or summer); 
  • restricted budgets to support existing and incoming students (especially at some public universities whose finances have been hardest hit by the pandemic-related economic fallout).
This whole mess increases the stress on graduate applicants by making an already fraught process even more competitive, in the sense of more people vying for fewer openings.  Graduate admissions is a complicated process driven very strongly by detailed needs that are often not visible to the applicant (e.g., if researchers in an area don't have a need for more students in a given year, something that may not be clear until January, admissions offers in that area are going to be limited).  I hope people know this, but it's worth stating explicitly:  Not getting admitted to a program is about the fit at the time between the needs of the program and the particular profile of the applicant, not a vote on anyone's worth as a scientist or person.  

For additional reference, here is the post I made last year about choosing a graduate program.

Sunday, February 14, 2021

Majoranas - a brief follow-up

As you can always tell by the frequency of my posting, work-related activities have been dominating my schedule of late.  In addition to the usual stuff (papers, proposals, the normal academic activities), this is the time of year when as department chair there are deadlines and activities associated with faculty and staff evaluations, departmental budgets, graduate admissions, teaching assignments for next year, etc.  Still, in the wake of this article from Wired and some breathless reactions in the news and social media, it's worth following up my prior post on the topic of solid state implementations of Majorana fermions and what the pending retraction of this paper means. 

There are two main issues.  First, it has become clear that it can be very challenging to achieve the experimental conditions needed to have clear, unambiguous evidence of Majorana quasiparticles in the superconductor/semiconductor nanowire architecture.  This is explained in detail here, for example.  The quality of the semiconductor and of the semiconductor/superconductor interface is extremely important, as disorder can lead to various confounding effects.  Interfaces are notoriously challenging.  ("God created the bulk; surfaces were invented by the devil." - Pauli)  There is no reason to think that it is impossible to reach the cleanliness level needed to see Majoranas in this type of structure, but like many materials problems, this seems like it will require even more effort.

Second is the particular issue of data presentation in this paper and whether it was misleading.  I have not personally looked at this in depth, but others have (twitter thread).  Snipping out segments of the gate voltage data without making that clear, and plotting only a limited range of gate voltage (leaving out regions where the conductance exceeds what is supposed to be the limiting value), is problematic.

It's important to separate these two issues.  The issues with this particular paper are not a reason to stop this experimental approach or give up trying to confirm Majoranas this way.  It's just hard, the community isn't there just yet, and this is a cautionary tale about triumphal press releases.

Tuesday, February 02, 2021

Bringing modern industrial nanofab to quantum computing

One big selling point of solid-state quantum computing platforms is the notion of scalability. The semiconductor industry has spent billions of dollars and millions of person-hours developing the capability of fabricating tens of billions of nanoscale electronic components in parallel with remarkable reliability.   Surely it's not crazy to think that this will be the key to creating large numbers of functioning qubits as well.

Like many ideas that look plausible at first glance, this becomes very complicated under greater scrutiny.  Many of the approaches that people have in mind for solid-state quantum computing are not necessarily compatible with the CMOS manufacturing processes that produced the chips powering the computer you're currently using.  Virtually all of the university groups working on these systems use university-type fabrication methods - electron beam lithography for patterning, lift-off processing, etc.  In contrast, industrial chip makers use very different processes: elaborate immersion photolithography, subtractive patterning, and a whole host of clever tricks that have driven forward the mass production of silicon nanoelectronics.  The situation gets even worse in terms of materials development if one considers attempts to use more exotic systems.  The most reasonable quantum computing platform to approach first, if one is worried about industrial compatibility, is probably spins in gate-defined quantum dots in silicon.

A team from Delft and Intel has done just that, as shown in this preprint.  They successfully demonstrate basic single-qubit effects like Rabi oscillations in single spins in quantum dots (single-electron transistors) defined in FinFETs, which they have patterned across a full 300 mm wafer (!) of isotopically pure 28Si (to avoid decoherence issues associated with nuclear spin).  They present data (which I have not read carefully) about how reproducible the properties of the single-electron transistors are across the wafer.   
The contrast between Si quantum devices produced through university fab (top) and elite industrial fab (bottom).

I think the figure here from their paper's supplementary material really shows the point in terms of fabrication methods.  At the top is a cross-sectional TEM image of a chain of quantum dot devices, where the bright lumpy features are the defining metal gates that were patterned by e-beam lithography and deposited by lift-off processing.  In contrast, at the bottom is a cross-sectional TEM of the nominally equivalent industrially made device.  Behold the result of the accumulation of decades of technique and experience.

Of course, they were able to do this because Intel decided that it was worth it to invest in developing the special purpose masks and the process flow necessary.   Universities ordinarily don't have access to the equipment or the specialists able to do this work.  This makes me wonder again, as I have several times over the years, whether it would have been worthwhile for DOE or NSF to have set up (perhaps with Intel or IBM as a public-private partnership) some fabrication hub that would actually give the broader university research community access to these capabilities and this expertise.   It would be very expensive, but it might have pushed technology farther ahead than having several "nanocenters" that don't necessarily have technology much different than what is available at the top two dozen university cleanrooms.  

 


Wednesday, January 27, 2021

Zero bias peaks - an example of the challenge of experimental (condensed matter) physics

The puzzle-solving aspect of experimental physics is one reason why it can be fun, but also why it can be very challenging.  In condensed matter, for example, we have limited experimental tools and can only measure certain quantities (e.g., voltages, currents, frequencies) in the lab, and we can only tune certain experimental conditions (e.g., temperature, applied magnetic field, voltages on electrodes).  Getting from there to an unambiguous determination of the underlying physics can be very difficult.

For example, when measuring electronic conduction in nanostructures, often we care about the differential conductance, \(dI/dV\), as a function of the bias voltage \(V\) applied across the system between a source and a drain electrode.  In an ideal resistor, \(dI/dV\) is just a constant as a function of the bias.  "Zero bias" \( (V = 0) \) is a special situation, when the electronic chemical potentials (the Fermi levels, at \(T = 0\)) of the source and drain electrodes are aligned.  In a surprisingly large number of systems, there is some feature in \(dI/dV\) that occurs at \(V = 0\).  The zero-bias conductance \( (dI/dV)(V=0)\) can be suppressed, or it can be enhanced, relative to the high bias limit.  These features are often called "zero bias anomalies", and there are many physical mechanisms that can produce them.
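
To make this concrete, here is a minimal numerical sketch (with invented numbers, not data from any actual device) of going from a "measured" \(I(V)\) trace to \(dI/dV\), for a toy system with a zero-bias peak:

```python
import numpy as np

# Minimal sketch (made-up numbers): extract dI/dV numerically from a
# "measured" I(V) curve.  The toy current is an ohmic background plus
# a small zero-bias feature.
V = np.linspace(-2e-3, 2e-3, 401)            # bias voltage, volts
G_bg = 1e-5                                  # background conductance, siemens
I = G_bg * V + 5e-9 * np.arctan(V / 1e-4)    # toy I(V) with a zero-bias anomaly

dIdV = np.gradient(I, V)                     # numerical differential conductance
i0 = np.argmin(np.abs(V))                    # index closest to V = 0
print(f"zero-bias dI/dV: {dIdV[i0]:.2e} S")  # enhanced at V = 0
print(f"high-bias dI/dV: {dIdV[-1]:.2e} S")  # back near the background value
```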

For example, in conduction through a quantum dot containing an odd number of electrons, at sufficiently low temperatures there can be a zero-bias peak in the conductance due to the Kondo effect, where magnetic processes lead to forward-scattering of electrons through the dot when the Fermi levels are aligned.  This Kondo resonance peak in \(dI/dV\) has a maximum possible height of \(2e^2/h\), and it splits into two peaks in a particular way as a magnetic field is applied.  In superconducting systems, Andreev processes can lead to zero bias peaks that have very different underlying physics, and different systematic dependences on magnetic field and voltage.
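
As a purely illustrative toy parametrization (mine - a real Kondo calculation is far more involved), one can picture such a peak as a Lorentzian capped at \(2e^2/h\) that splits in an applied field:

```python
import numpy as np

# Toy model (illustration only, not a real Kondo calculation): a
# zero-bias Lorentzian capped at 2e^2/h that splits into two peaks
# near eV = +/- g*mu_B*B in an applied magnetic field.
e, h, mu_B = 1.602e-19, 6.626e-34, 9.274e-24
G_K = 2 * e**2 / h                  # unitary-limit conductance, ~77.5 microsiemens
gamma = 50e-6                       # peak half-width in volts (assumed scale)

def dIdV(V, B=0.0, g=2.0):
    V_z = g * mu_B * B / e          # Zeeman splitting expressed as a voltage
    lor = lambda v0: 1.0 / (1.0 + ((V - v0) / gamma) ** 2)
    return 0.5 * G_K * (lor(+V_z) + lor(-V_z))

V = np.linspace(-5e-4, 5e-4, 1001)
print(f"B = 0:   peak height {dIdV(V).max():.2e} S (the 2e^2/h cap)")
print(f"B = 2 T: peaks near +/- {2.0 * mu_B * 2.0 / e * 1e6:.0f} microvolts")
```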

Zero bias anomalies have taken on a new significance in recent years because they are one signature that is predicted for solid-state implementations of Majorana fermions involving superconductors connected to semiconductor nanowires.  These exotic quasiparticles have topological properties that make them appealing as a possible platform for quantum computing.  Observations of zero bias anomalies in these structures have attracted enormous attention for this reason.

The tricky bit is, it has become increasingly clear that it is extremely difficult to distinguish conclusively between "Majorana zero modes" and cousins of the Andreev features that I mentioned above.  As I mentioned in my last post, there is a whole session at the upcoming APS meeting about this, recent papers, and now a retraction of a major claim in light of new interpretation.  It's a fascinating challenge that shows just how tricky these experiments and their analysis can be!  This stuff is just hard.

(Posting will likely continue to be slow - this is the maximally busy time of the year as department chair....)

Monday, January 18, 2021

Brief items, new year edition

 It's been a busy time, but here are a few items for news and discussion:

  • President-Elect Biden named key members of his science team, and for the first time ever has elevated the role of Presidential Science Advisor (and head of the White House Office of Science and Technology Policy) to a cabinet-level position.  
  • The President-Elect has also written a letter to the science advisor, outlining key questions that he wants to be considered.  
  • There is talk of a "Science New Deal", unsurprisingly directed a lot toward the pandemic, climate change, and American technological competitiveness.
  • The webcomic SMBC has decided to address controversy head on, reporting "Congressman Johnson comes out against Pauli Exclusion."  This would have rather negative unintended consequences, like destabilizing all matter more complex than elementary particles....
  • This session promises to be an interesting one at the March APS meeting, as it goes right to the heart of how difficult it is to distinguish Majorana fermion signatures in superconductor/semiconductor hybrid structures from spurious electrical features.  I may try to write more about this soon.
  • This paper (arxiv version) is very striking.  Looking in the middle of a sheet of WTe2 (that is, away from where the topological edge states live), the authors see quantum oscillations of the resistance as a function of magnetic field that look a lot like Landau quantization, even though the bulk of the material is (at zero field) quite insulating.  I need to think more carefully about the claim that this argues in favor of some kind of emergent neutral fermions.
  • Being on twitter for four months has made me realize how reality-warping that medium is.  Reading about science on twitter can be incredibly wearing - it feels like seemingly everyone else out there is publishing in glossy journals, winning major prizes, and landing huge grants.  This is, of course, a selection effect, but I don't think it's healthy.
  • I do think twitter has driven blog traffic up a bit, but I actually wonder if occasionally posting blog links to /r/physics on reddit would be far more effective in terms of outreach.  When one of my posts ends up there, it gets literally 50x the normal page views.  Still, I have an old-internet-user aversion to astroturfing.

Saturday, January 09, 2021

Questions that show who you are as a physicist

There are some cool new physics and nanoscience results out there, but after a frankly absurd week (in which lunatics stormed the US Capitol, the US reached 4000 covid deaths per day, and everything else), we need something light.  Stephen Colbert has started a new segment on his show called "The Colbert Questionert" (see an example here with Tom Hanks - sorry if that's region-coded to the US), in which he asks a list of fifteen questions that (jokingly) reveal the answerer's core as a human being.   These range from "What is your favorite sandwich?" to "What do you think happens when you die?".  Listening to this, I think we need some discipline-specific questions for physicists.  Here are some initial thoughts, and I'd be happy to get more suggestions in the comments.  

  • Food that you eat when traveling to a conference or talk but not at home?
  • Science fiction - yes or no?
  • What is your second-favorite subdiscipline of physics/astronomy/science?
  • Favorite graph:  linear-linear? Log-log?  Log-linear?  Double-log?  Polar?  Weird uninterpretable 3D representation that would make Edward Tufte's head explode?
  • Lagrangian or Hamiltonian?
  • Bayesian or frequentist?
  • Preferred interpretation of quantum mechanics/solution to the measurement problem?

    Friday, January 01, 2021

    Idle speculation can teach physics, vacuum energy edition

    To start the new year, a silly anecdote ending in real science.

    Back when I was in grad school, around 25 years ago, I was goofing around chatting with one of my fellow group members, explaining my brilliant (ahem) vacuum energy extraction machine.  See, I had read this paper by Robert L. Forward, which proposed the interesting idea that one could use the Casimir effect to somehow "extract energy from the vacuum" - see here (pdf).
    Fig from here.


    (For those not in the know: the Casimir effect is an attractive (usually) interaction between conductors that grows rapidly at very small separations.  The non-exotic explanation for the force is that it is a relativistic generalization of the van der Waals force.  The exotic explanation for the force is that conducting boundaries interact with zero-point fluctuations of the electromagnetic field, so that "empty" space outside the region of the conductors has higher energy density.   As explained in the wiki link and my previous post on the topic, the non-exotic explanation seemingly covers everything without needing exotic physics.)

    Anyway, my (not serious) idea was, conceptually, to make a parallel plate structure where one plate is gold (e.g.) and the other is one of the high temperature superconductors.  Those systems are rather lousy conductors in the normal state.  So, the idea was, cool the system just below the superconducting transition.  The one plate becomes superconducting, leading ideally to dramatically increased Casimir attraction between the plates.  Let the plates get closer, doing work on some external load.  Then warm the plates just slightly, so that the superconductivity goes away.  The attraction should lessen, and the plate would spring back, doing work of the opposite sign but smaller in magnitude.  It's not obvious that the energy required to switch the superconductivity is larger than the energy one could extract from running such a cycle.   Of course, there has to be a catch (as Forward himself points out in the linked pdf above).  In our conversation, I realized that the interactions between the plates would very likely modify the superconducting transition, probably in just the way needed to avoid extracting net energy through this process.
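
    To see the scales involved, here is a quick numerical estimate (my numbers; Forward's paper treats this more carefully) using the ideal-conductor parallel-plate expressions for the Casimir pressure and energy per unit area:

    ```python
    import numpy as np

    # Ideal-conductor Casimir results for parallel plates at separation d
    # (rough estimates; real materials, let alone superconductors, modify these):
    #   pressure:          P(d)   = -pi^2 hbar c / (240 d^4)
    #   energy per area:   U(d)/A = -pi^2 hbar c / (720 d^3)
    hbar, c = 1.0546e-34, 2.998e8

    def pressure(d):
        return -np.pi**2 * hbar * c / (240 * d**4)      # Pa, attractive

    def energy_per_area(d):
        return -np.pi**2 * hbar * c / (720 * d**3)      # J/m^2

    for d in (1e-6, 100e-9, 10e-9):
        print(f"d = {d:7.0e} m:  P = {pressure(d):9.2e} Pa,  "
              f"U/A = {energy_per_area(d):9.2e} J/m^2")
    ```

    At a 10 nm gap the pressure is comparable to atmospheric pressure, but the energy per unit area is only a few times \(10^{-4}\) J/m\(^2\) - any harvesting scheme is fighting for scraps even before the thermodynamic catch kicks in.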

    Fast forward to last week, when I randomly came upon this article.  Researchers actually did an experiment using nanomechanical resonators to try to measure the impact of the Casimir interactions on the superconducting transition in (ultrasmooth, quench-condensed) lead films.  They were not able to resolve anything (like a change in the transition temperature) in this first attempt, but it shows that techniques now exist to probe such tiny effects, and that idly throwing ideas around can sometimes stumble upon real physics.


    Wednesday, December 30, 2020

    End of the year, looking back and looking forward

     A few odds and ends at the close of 2020:

    • This was not a good year, for just about anyone.  Please, let's take better care of each other (e.g.) and ourselves!  
    • The decision to cancel the in-person 2020 APS March Meeting looks pretty darn smart in hindsight.
    • Please take a moment and consider how amazing it is that in less than a year, there are now multiple efficacious vaccines for SARS-CoV-2, using different strategies, when no one had ever produced a successful vaccine for any coronavirus in the past.  Logistical problems of distribution aside, this is a towering scientific achievement.  People who don't "believe" in vaccines, yet are willing to use (without thinking) all sorts of other scientific and engineering marvels, are amazing to me, and not in a good way.  For a compelling book about this kind of science, I again recommend this one, as I had done ten years ago.
    • I also recommend this book about the history of money.  Fascinating and extremely readable.  It's remarkable how we ended up where we are in terms of fiat currencies, and the fact that there are still fundamental disagreements about economics is both interesting and sobering.
    • As is my habit, I've been thinking again about the amazing yet almost completely unsung intellectual achievement that is condensed matter physics.  The history of this is filled with leaps that are incredible in hindsight - for example, the Pauli principle in 1925, the formulation of the Schroedinger equation in 1926, and Bloch's theorem for electrons in crystals in 1928 (!!).  I've also found that there is seemingly only one biography of Sommerfeld (just started it) and no book-length biography of Felix Bloch (though there are this and this).  
    • Four years ago I posted about some reasons for optimism at the end of 2016.  Globally speaking, these are still basically valid, even if it doesn't feel like it many days.  Progress is not inevitable, but there is reason for hope.
    Thanks for reading, and good luck in the coming year.  

    Saturday, December 19, 2020

    The physics of beskar

    In keeping with my previous posts about favorite science fiction condensed matter systems and the properties of vibranium, I think we are overdue for an observational analysis of the physics of beskar.  Beskar is the material of choice of the Mandalorians in the Star Wars universe.  It is apparently an alloy (according to Wookieepedia), and it is most notable for being the only material that can resist direct attack by lightsaber, as well as deflecting blaster shots.

    Like many fictional materials, beskar has whatever properties are needed to drive the plot and look cool doing so, but it's still fun to think about what would have to be going on in the material for it to behave the way it appears on screen.  

    In ingot form, beskar looks rather like Damascus steel (or perhaps Valyrian steel, though without the whole dragonfire aspect).  That's a bit surprising, since the texturing in damascene steel involves phase separation upon solidification from the melt, while the appearance of beskar is homogeneous when it's in the form of armor plating or a spear.  From the way people handle it, beskar seems to have a density similar to steel, though perhaps a bit lower.

    Beskar's shiny appearance says that at least at optical frequencies the material is a metal, meaning it has highly mobile charge carriers.  Certainly everyone calls it a metal.  That is interesting in light of two of its other obvious properties:  an extremely high melting point (we know that lightsabers can melt through extremely tough metal plating, as in blast doors); and extremely poor thermal conductivity.  (Possible spoilers for The Mandalorian S2E8 - it is possible to hold a beskar spear with gloved hands mere inches from where the spear is visibly glowing orange.)  Because mobile charge carriers tend to conduct heat very well (see the Wiedemann-Franz relation), it's tricky to have metals that are really bad thermal conductors.  This is actually a point consistent with beskar being an alloy, though.  Alloys tend to have higher electrical resistivity and poorer thermal conduction than pure substances.
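
    To put rough numbers on this (mine, obviously - beskar's resistivity is not in any handbook), the Wiedemann-Franz relation \(\kappa = L \sigma T\) tells you how electrically resistive a metal has to be for its electronic heat conduction to be as feeble as what's shown on screen:

    ```python
    # Wiedemann-Franz estimate of the electronic thermal conductivity,
    # kappa = L * sigma * T = L * T / rho.  The beskar resistivity below
    # is pure conjecture, chosen to give a wood-like kappa.
    L = 2.44e-8        # Lorenz number, W*ohm/K^2
    T = 300.0          # temperature, K

    materials = [("pure copper",             1.7e-8),    # resistivity, ohm*m
                 ("stainless steel (alloy)", 7.2e-7),
                 ("'beskar' (conjectured)",  1.0e-5)]

    for name, rho in materials:
        kappa = L * T / rho            # W/(m K), electronic contribution only
        print(f"{name:25s} kappa ~ {kappa:7.1f} W/(m K)")
    ```

    A resistivity several hundred times worse than copper's would pull the electronic thermal conductivity down toward that of wood, roughly what holding a spear inches from its glowing tip seems to demand, while leaving the material metallic and shiny.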

    The high melting temperature is consistent with the nice acoustic properties of beskar (as seen here, in S2E7), and its extreme mechanical toughness.  The high melting temperature is tricky, though, because there is on-screen evidence that beskar may be melted (for forging into armor) without being heated to glowing.  Indeed, at about 1:02 in this video, the Armorer is able to melt a beskar ingot at the touch of a button on a console.  This raises a very interesting possibility, that beskar is close to a solid-liquid phase transition that may be tuned to room temperature via a simple external parameter (some externally applied field?).  This must be something subtle, because otherwise you could imagine anti-beskar weapons that would turn Mandalorian armor into a puddle on the floor.  

    Regardless of the inconsistencies in its on-screen portrayal (which are all minor compared to the way dilithium has been shown), beskar is surely a worthy addition to fictional materials science.  This is The Way.

     

    Thursday, December 17, 2020

    Brief items

    Here are a few interesting links as we look toward the end of a long year:

    • Brian Skinner of Gravity and Levity has a long and excellent thread on twitter about cool materials.
    • Subir Sachdev at Harvard has put his entire semester's worth of lectures on youtube for his course on Quantum Phases of Matter.
    • New data on stellar distances makes the Hubble constant problem even worse, as explained in this nice article by the reliably excellent Natalie Wolchover.
    • In case you were wondering, we are nowhere near done with magnetic tape as a storage medium, especially since it's comparatively cheap and can now hold 317 Gb/in2.
    • If you aren't impressed by SpaceX's initial flight test of their latest rocket, I don't know what to say to you.  They were trying several brand new things at the same time, and almost got it all to work on the first try.  FYI, the green exhaust at the end is from the engine running hot and fuel-deprived, so the oxygen is burning the copper alloy engine lining.
    • This paper uses nanomechanical resonators immersed in fluid to study Brownian motion.  The resonator is getting kicked randomly by collisions with the fluid molecules, and looking at the noise in the displacement is a neat probe of the fluid's dynamics.  
    • In this paper, the authors are able to resolve inelastic electron tunneling spectra even above room temperature.  That's actually very surprising!
    • Here is a perspective article about plasmonic catalysis, trying to drive chemical reactions by optical excitation of collective electronic modes in conductive nanostructures.  

    Thursday, December 10, 2020

    Photonic quantum supremacy, protein folding, and "computing"

    In the last week or two, there have been a couple of big science stories that I think raise some interesting issues about what we consider to be scientific computing.

    In one example, Alphafold, a machine learning/AI approach to predicting protein structure, has demonstrated that it is really good at predicting protein structure.  Proteins are polymers made up of sequences of many amino acids, and in biological environments they fold up into complex shapes (with structural motifs like alpha helices and beta sheets) held together by hydrogen bonds. Proteins do an amazing number of critical things in organisms (like act as enzymes to promote highly specific chemical reactions, or as motor units to move things around, or to pump specific ions and molecules in and out of cells and organelles).  Their ability to function in the wet, complex, constantly fluctuating biological environment is often dependent on minute details in their folded structure.  We know the structures of only some proteins, and then only as snapshots, because actually getting a structure requires crystallizing the protein molecules and performing high precision x-ray diffraction measurements on those crystals.  The challenge of understanding how proteins end up in particular functional structures based on their amino acid sequence is called the protein folding problem.  The statistical physics of folding is complex but usefully considered in terms of free energy landscapes.  It is possible to study large numbers of known protein structures and look for covariances (see here), correlations in sequences that show up commonly across many organisms.  Alphafold was trained on something like 100,000 structures and associated data, and is now good enough at predicting structures that it can actually allow people to solve complex x-ray diffraction data that was previously not analyzable, leading to new solved structures.

    This is very nice and will be a powerful tool, though like all such news stories one should be wary of the hype.  It does raise questions, and I would like to hear from experts:  Do we actually have greater foundational understanding of protein structure now?  Or have we created an extraordinarily effective interpolative look-up table?  It's useful either way, but the former might have more of an impact on our ability to understand the dynamics of proteins.

    That's a lot of optical components!
    The second big story of the week is the photonic quantum supremacy achievement by a large group from USTC in China.  Through a very complex arrangement of optical components (see image), they report using boson sampling to determine statistical information about the properties of matrices at a level that would take an insanely long time with a classical computer.  Here, as with google's quantum supremacy claim (mentioned here), I again have to ask:  This is an amazing technical achievement, but is it really a computation, as opposed to an analog emulation or simulation?  If I filmed cream being stirred into coffee, and I analyzed the images to infer the flow of energy down to smaller and smaller length scales, I would describe that as an experiment, not as me doing a computation to solve the Navier-Stokes equations (which would also be very challenging to do with high precision on a classical computer).  Perhaps it's splitting hairs, and quantum simulation is very interesting, but it does seem distinct to me from what most people would call computing.
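
    For concreteness, here is a toy-scale sketch (my own illustration, nothing to do with the actual USTC apparatus) of the matrix quantity at the heart of boson sampling - the permanent - whose #P-hardness is what makes the classical comparison so lopsided.  For \(n\) photons in an \(m\)-mode network described by a unitary \(U\), the probability of a given output pattern is proportional to \(|\mathrm{Perm}(A)|^2\) for an \(n \times n\) submatrix \(A\) of \(U\):

    ```python
    import itertools
    import numpy as np

    def permanent(A):
        # brute-force definition of the matrix permanent; fine for tiny n,
        # hopeless for large n (which is the whole point)
        n = A.shape[0]
        return sum(np.prod([A[i, p[i]] for i in range(n)])
                   for p in itertools.permutations(range(n)))

    rng = np.random.default_rng(0)
    m, n = 8, 3                             # 8 modes, 3 photons (toy sizes)
    X = rng.normal(size=(m, m)) + 1j * rng.normal(size=(m, m))
    U, _ = np.linalg.qr(X)                  # a random unitary (roughly Haar)
    A = U[:n, :n]                           # photons enter/exit the first n modes
    print("relative output probability:", abs(permanent(A))**2)
    ```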

    Anyway, between AI/ML and quantum information sciences, it is surely an exciting time in the world of computing, broadly construed. 

    (Sorry for the slow posting - end of semester grading + proposal writing have taken a lot of time.)

    Saturday, November 28, 2020

    Brief items

     Several items of note:

    • Quanta Magazine remains a generally outstanding source of science articles and opinion pieces.  In this opinion column,  high energy theorist Robert Dijkgraaf gives his views on whether we are reaching "the end of physics".  Spoilers:  he thinks not, and condensed matter physics, with its emergence of remarkable phenomena from underlying building blocks, is one reason.
    • Similarly, I should have pointed earlier to this interesting article by Natalie Wolchover, who asked a number of physicists to define what they mean by "a particle".  I understand the mathematical answer ("a particle is an irreducible representation of the Poincaré group", meaning that it's an object defined by having particular numbers describing how it changes or doesn't under translations in time, space, and rotation).  That also lends itself to a nice definition of a quasiparticle (such an object, but one that results from the collective action of underlying degrees of freedom, rather than existing in the universe's vacuum).  As an experimentalist, though, I confess a fondness for other perspectives.
    • Springer Nature has released its approach for handling open access publication.  I don't think I'm alone in thinking that its fee structure is somewhere between absurd and obscene.  It's simply absurd to think that the funding agencies (at least in the US) are going to allow people to budget €9,500 for a publication charge.  That's equivalent to four months of support for a graduate student in my discipline.  Moreover, the publisher is proposing to charge a non-refundable €2,190 fee just to have a manuscript evaluated for "guided open access" at Nature Physics.  Not that I lack confidence in the quality of my group's work, but how could I possibly justify spending that much for a 75% probability of a rejection letter?  Given that they do not pay referees, am I really supposed to believe that finding referees, soliciting reviews, and tracking manuscript progress costs the publisher €2,190 per submission? 
    • It's older news, but this approach to computation is an interesting one.  Cerebras is implementing neural networks in hardware, and they are doing this through wafer-scale processors (!) with trillions (!) of transistors and hundreds of thousands of cores.  There must be some impressive fault tolerance built into their network training approach, because otherwise I'd be surprised if even the amazing manufacturing reliability of the semiconductor industry would produce a decent yield of these processors.
    • Older still, one of my colleagues brought this article to my attention, about someone trying to come up with a way to play grandmaster-level chess in a short period of time.  I don't buy into the hype, but it was an interesting look at how easy it seems to be now to pick up machine learning coding skills.  (Instead of deeply studying chess, the idea was to find a compact, memorizable/mentally evaluatable decision algorithm for chess based on training a machine learning system against a dedicated chess engine.) 

    Wednesday, November 25, 2020

    Statistical mechanics and Thanksgiving

    Many books and articles have been written about the science of cooking, and why different cooking methods work the way that they do.  (An absolute favorite: J. Kenji López-Alt's work.  Make sure to watch his youtube videos.)  Often the answers involve chemistry, as many reactions take place during cooking, including the Maillard reaction (the browning and caramelization of sugars and reactions with amino acids that give enormous flavor) and the denaturing of proteins (the reason that eggs hard-boil and firm up when scrambled over heat).  Sometimes the answers involve biology, as in fermentation.

    Occasionally, though, the real hero of the story is physics, in particular statistical mechanics.  Tomorrow is the Thanksgiving holiday in the US, and this traditionally involves cooking a turkey.  A technique gaining popularity is dry brining.  This oxymoronic name really means applying salt (often mixed with sugar, pepper, or other spices) to the surfaces of a piece of meat (say a turkey) and letting the salted meat sit in a refrigerated environment for a day or two prior to cooking.  What does this do?  

    In statistical mechanics, we learn (roughly speaking) that systems approach equilibrium macroscopic states that correspond to the largest number of microscopic arrangements of the constituents.  Water is able to diffuse in and out of cells at some rate, as are solvated ions like Na+ and Cl-.  Once salt is on the turkey's surface, we have a non-equilibrium situation (well, at least a more severe one than before):  there are many more (by many orders of magnitude) ways to arrange the water molecules and ions now, such that some of the ions are inside the cells, and some of the water is outside, solvating the salt.  The result is osmosis, and over the timescale of the dry brining, the moisture and salt ions redistribute themselves.  (The salt also triggers reactions in the cells to break down some proteins, but that's chemistry, not physics.)  After cooking, the result is supposed to be a more flavorful, tender meal.
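
    For a sense of scale (a back-of-the-envelope sketch with assumed numbers, not a serious model of muscle tissue), the van 't Hoff relation \(\Pi = i c R T\) gives the osmotic pressure associated with the salty layer at the surface:

    ```python
    # van 't Hoff osmotic pressure, Pi = i*c*R*T (dilute-solution estimate).
    # The surface brine concentration is an assumed, illustrative number.
    R = 8.314        # gas constant, J/(mol K)
    T = 277.0        # refrigerator temperature, K
    c = 1000.0       # salt concentration, mol/m^3 (= 1 mol/L, assumed)
    i = 2            # NaCl dissociates into Na+ and Cl-

    Pi = i * c * R * T
    print(f"osmotic pressure ~ {Pi/1e5:.0f} bar")    # ~ 46 bar
    ```

    Tens of bars is an enormous driving pressure by kitchen standards, which is part of why a day or two in the refrigerator is enough for the water and ions to redistribute.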

    So among the things for which to be thankful, consider the unlikely case of statistical mechanics.

    (For a fun look at osmosis (!), try this short story if you can find it.)

    Wednesday, November 18, 2020

    Hard condensed matter can be soft, too.

    In the previous post, I mentioned that one categorization of "soft" condensed matter is for systems where quantum mechanics is (beyond holding atoms together, etc.) unimportant.  In that framing, "hard" condensed matter looks at systems where \(\hbar\) figures prominently, in the form of quantum many-body physics.  By that labeling, strongly interacting quantum materials are the "hardest" systems out there, with entanglement, tunneling, and quantum fluctuations leading to rich phenomena. 

    Orientation textures in a liquid crystal, from wikipedia
    Interestingly, in recent years it has become clear that these hard CM systems can end up having properties that are associated with some soft condensed matter systems.  For instance, liquid crystals are canonical soft matter systems.  As I'd explained long ago here, liquid crystals are fluids made up of objects with some internal directionality (e.g., a collection of rod-like molecules, where one can worry about how the rods are oriented in addition to their positions).  Liquid crystals can have a variety of phases, including ones where the system spontaneously picks out a direction and becomes anisotropic.  It turns out that sometimes the electronic fluid in certain conductors can spontaneously do this as well, acting in some ways like a nematic liquid crystal.  A big review of this is here.  One example of this occurs in 2D electronic systems in high magnetic fields in the quantum Hall regime; see here for theory and here for a representative experiment.  Alternately, see here for an example in a correlated oxide at the cusp of a quantum phase transition.

    Another example:  hydrodynamics is definitely part of the conventional purview of soft condensed matter.   In recent years, however, it has become clear that there are times when the electronic fluid can also be very well-described by math that is historically the realm of classical fluids.   This can happen in graphene, or in more exotic Weyl semimetals, or perhaps in the exotic "strange metal" phase.  In the last of those, this is supposed to happen when the electrons are in such a giant, entangled, many-body situation that the quasiparticle picture doesn't work anymore.  

    Interesting that the hardest of hard condensed matter systems can end up having emergent properties that look like those of soft matter.

    Saturday, November 14, 2020

    Soft matter is hard!

    This great article by Randall Munroe from the NY Times this week brings up, in its first illustration (reproduced here), a fact that surprises me on some level every time I really stop to think about it:  The physics of "soft matter", in this case the static and dynamic properties of sand, is actually very difficult, and much remains poorly understood.  


    "Soft" condensed matter typically refers to problems involving solid, liquids, or mixed phases in which quantum mechanics is comparatively unimportant - if you were to try to write down equations modeling these systems, those equations would basically be some flavor of classical mechanics ("h-bar = 0", as some would say).  (If you want to see a couple of nice talks about this field, check out this series and this KITP talk.)  This encompasses the physics of classical fluids, polymers, and mixed-phase systems like ensembles of hard particles plus gas (sand!), suspensions of colloidal particles (milk, cornstarch in water), other emergent situations like the mechanical properties of crumping paper.  (Soft matter also is sometimes said to encompass "active matter", as in living systems, but it's difficult even without that category.)

    Often, soft matter problems sound simple.  Take a broomhandle, stick it a few inches into dry sand, and try to drag the handle sideways.  How much force does it take to move the handle at a certain speed?  This problem only involves classical mechanics.  Clearly the dominant forces that are relevant are gravity acting on the sand grains, friction between the grains, and the "normal force" that is the hard-core repulsion preventing sand grains from passing through each other or through the broom handle.  Maybe we need to worry about the interactions between the sand grains and the air in the spaces between grains.  Still, all of this sounds like something that should have been solved by a French mathematician in the 18th or 19th centuries - one of those people with a special function or polynomial named after them.  And yet, these problems are simultaneously extremely important for industrial purposes, and very difficult.

    A key issue is that many soft matter systems are hindered - the energy scales required to reshuffle their constituents (e.g., move grains of sand around and past each other) can be larger than what's available from thermal fluctuations.  So, configurations get locked in, kinetically hung up or stuck.  This can mean that the past history of the system can be very important, in the sense that the system can get funneled into some particular configuration and then be unable to escape, even if that configuration isn't something "nice", like one that globally minimizes energy.

    A message that I think is underappreciated:  Emergent dynamic properties, not obvious at all from the building blocks and their simple rules, can happen in such soft matter systems (e.g., oscillons and creepy non-Newtonian fluids), and are not just the province of exotic quantum materials.  Collective responses from many interacting degrees of freedom - this is what condensed matter physics is all about.

    Sunday, November 08, 2020

    Recently on the arxiv

    A couple of papers caught my eye recently on the arxiv, when I wasn't preoccupied with the presidential election, the pandemic, or grant writing:

    arxiv:2010.09986 - Zhao et al., Determination of the helical edge and bulk spin axis in quantum spin Hall insulator WTe2
    Monolayer tungsten ditelluride is a quantum spin Hall insulator, meaning that the 2D "bulk" of a flake of the material is an insulator at low temperatures, while there are supposed to be helical edge states that run around the perimeter of the flake.  Because of spin-momentum locking, the preferred spin orientation of the electrons in those edges should be fixed, but the spin doesn't have to point perpendicular to the plane of the flake.  In this work, highly detailed transport measurements pin down that preferred direction experimentally.

    arxiv:2011.01335 - Hossain et al., Observation of Spontaneous Ferromagnetism in a Two-Dimensional Electron System
    For many years, people have been discussing the ground state of a dilute 2D layer of electrons in the limit of low density and a very clean system.  This system is ideal for demonstrating one of the most unintuitive consequences of the Pauli principle:  As the electron density is lowered, and thus the average spacing between electrons increases, electron-electron interactions actually become increasingly dominant.  These investigators, working with electrons in an essentially 2D AlAs layer, show (through hysteresis in the electronic resistance as a function of applied magnetic field) the onset of ferromagnetism at sufficiently low electron densities.
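
    (The scaling behind that counterintuitive statement is quick to check.  In 2D the kinetic energy per electron grows linearly with the density, \(E_{\mathrm{kin}} \propto E_{\mathrm{F}} \propto n\), while the typical Coulomb energy grows only as the inverse interparticle spacing, \(E_{\mathrm{ee}} \propto 1/r \propto \sqrt{n}\).  Their ratio \(E_{\mathrm{ee}}/E_{\mathrm{kin}} \propto 1/\sqrt{n}\) therefore grows as the density falls - dilute means strongly interacting.)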

    arxiv:2011.02500 - Rodan-Legrain et al., Highly Tunable Junctions and Nonlocal Josephson Effect in Magic Angle Graphene Tunneling Devices
    Over the last couple of years, it's become clear that "magic angle" twisted bilayer graphene is pretty remarkable.  It's a superconductor.  It's an orbital magnet.  It's a correlated insulator.  It's a floor wax and a dessert topping.  Here, the authors demonstrate that it is possible to make devices with this material that are sufficiently free of disorder that they can be tuned into a wide variety of structures - Josephson junctions, single-electron transistors, etc.  Pretty remarkable.


    Sunday, November 01, 2020

    Science, policy-making, and the right thing to do

    I know people don't read this blog for politics, but the past week has seen a couple of very unusual situations, and I think it's worth having a discussion of science, its role in policy-making, and the people who work on these issues at the highest levels.   (If you want, view this as writing-therapy for my general anxiety and move on.)

    As a political reality, it's important to understand that science does not, itself, make policy.  Public policy is complicated and messy because it involves people, who as a rule are also complicated and messy. Deciding to set fuel standards for non-electric cars to 200 miles per gallon beginning next year and requiring that the fuel all be made from biological sources would be extremely bold, but it would also be completely unworkable and enormously disruptive.  That said, when policy must be made that has a science and technology aspect, it's critical that actual scientific and engineering knowledge be presented at the table.  If science isn't in the room where it happens, then we can make bad situations worse.  (It's been one of the great privileges of my career to have had the chance to meet and interact with some of the people who have worked on policy.  One of the most depressing aspects of the past four years has been the denigration of expertise, the suggestion that no one with detailed technical knowledge can be trusted because they're assumed to be on the make.)  The pandemic has shined a spotlight on this, as well as showing the (also complicated and messy) scientific process of figuring out how the disease works.

    A million years ago at the beginning of this week, the White House Office of Science and Technology Policy put out a press release, linking to a detailed report (pdf), about their science and technology accomplishments over the last four years.  The top highlight in the press release was "Ending the pandemic".  That language doesn't appear anywhere in the actual report, but it sure shows up front and center in the press release.  After this was met with, shall we say, great skepticism (almost 100,000 cases per day, about 1000 deaths per day doesn't sound like an ending to this), the administration walked it back, saying the release was "poorly worded".  The question that comes to mind:  How can Kelvin Droegemeier, the presidential science advisor and head of OSTP, continue in that position?  There is essentially zero chance that he approved that press release language.  It must have been added after he and OSTP staff produced and signed off on the report, and therefore it was either over his objections or without his knowledge.  Either way, under ordinary circumstances that would be the kind of situation that leads to an offer of resignation.  

    In a weird complement to this, yesterday evening, Dr. Anthony Fauci gave an interview to the Washington Post, where he stated a number of points with great frankness, including his opinion that the pandemic was in a very dangerous phase and that he disagreed in the strongest terms with Dr. Scott Atlas.  Dr. Atlas has seemingly become the chief advisor to the administration on the pandemic, despite having views that disagree with those of a large number of public health experts.  The White House in the same Post article takes Dr. Fauci to task for airing his grievances publicly.  Again, the question comes to mind:  How can Dr. Fauci continue to serve on the coronavirus policy task force, when he clearly disagrees with how this is being handled?

    As I alluded to back in late 2016, these situations remind me of this book, The Dilemmas of an Upright Man, about Max Planck and his difficult decision to remain in Germany and try to influence German science during WWII.  His rationale was that it was much better for German science if he stayed there, where he thought he could at least be a bit of a moderating influence, than for him to be completely outside the system.

    There are no easy answers here about the right course of action - to quit on principle when that might lead to more chaos, or to try to exert influence from within even in the face of clear evidence that such influence is minimal at best.  What I do know is that we face a complicated world filled with myriad challenges, and that science and engineering know-how is going to be needed in any credible effort to surmount those problems.  The cost of ignoring, or worse, actively attacking technical expertise is just too high.

    Saturday, October 24, 2020

    Silicon nanoelectronics is a truly extraordinary achievement.

    Arguably the greatest technical and manufacturing achievement in all of history is around us all the time, supporting directly or indirectly a huge fraction of modern life, and the overwhelming majority of people don't give it a second's thought.

    I'm talking about silicon nanoelectronics (since about 2003, "microelectronics" is no longer an accurate description).  As I was updating notes for a class I'm teaching, the numbers really hit me.  A high end microprocessor these days (say the AMD "Epyc" Rome) contains 40 billion transistors in a chip about 3 cm on a side.  These essentially all work properly, for many years at a time.  (Chips rarely die - power supplies and bad electrolytic capacitors are much more common causes of failure of motherboards.)  No other manufacturing process of components for any product comes close to the throughput and reliability of transistors.  

    The transistors on those chips are the culmination of many person-years of research.  They're FinFETs, made using what is labeled the 7 nm process.  Remember, transistors are switches, with the current flow between the source and drain electrodes passing through a channel the conductance of which is modulated by the voltage applied to a gate electrode.  The active channel length of those transistors, the distance between the source and drain, is around 16 nm, or about 50 atoms (!).  The positioning accuracy required for the lithography steps (when ultraviolet light and photochemistry are used to pattern the features) is down to about 3 nm.  These distances are controlled accurately across a single-crystal piece of silicon the size of a dinner plate.  That silicon is pure at the level of one atom out of every 10 trillion (!!).  
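
    The arithmetic behind those statements is worth a quick check (rough, order-of-magnitude numbers of my own):

    ```python
    # Back-of-envelope checks of the numbers quoted above (all approximate):
    atoms_across = 16e-9 / 0.3e-9        # ~0.3 nm per atom, rough spacing
    density_cm2 = 40e9 / (3.0 * 3.0)     # 40e9 transistors on a ~3 cm x 3 cm die
    si_per_um3 = 5e10                    # silicon has ~5e22 atoms per cm^3
    impurities_um3 = si_per_um3 / 1e13   # one impurity per 10 trillion atoms

    print(f"channel length: ~{atoms_across:.0f} atoms across")
    print(f"transistor density: ~{density_cm2:.1e} per cm^2")
    print(f"impurity atoms per cubic micron of Si: ~{impurities_um3:.3f}")
    ```

    Billions of working devices per square centimeter, channels a few dozen atoms long, and far fewer than one stray impurity atom per cubic micron.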

    This is not an accident.  It's not good fortune.  Science (figuring out the rules of the universe) and engineering (applying those rules to accomplish a task or address a need) have given us this (see here and here).  It's the result of an incredible combination of hard-earned scientific understanding, materials and chemistry acumen, engineering optimization, and the boot-strapping nature of modern technology (that is, we can do this kind of manufacturing because we have advanced computational tools for design, control, and analysis, and we have those tools because of our ability to do this kind of manufacturing.)   

    This technology would look like literal magic to someone from any other era of history - that's something worth appreciating.

    Thursday, October 15, 2020

    Emergent monopoles

    One of the truly remarkable things about condensed matter physics is the idea that, from a large number of interacting particles that obey comparatively simple rules, there can emerge new objects  (in the sense of having well-defined sets of parameters like mass, charge, etc.) with properties that are not at all obviously related to those of the original constituents.   (One analogy I like:  Think about fans in a sports stadium doing The Wave.  The propagating wave only exists because of the cooperative response of thousands of people, and its spatial extent and propagation speed are not obviously related to the size of individual fans.)

A fairly spectacular example of this occurs in materials called spin ices, insulating materials that have unusual magnetic properties. A prime example is Dy2Ti2O7.  The figure here shows a little snippet of the structure.  The dysprosium atoms (which end up having angular momentum \(J = 15/2\), very large compared to a spin-1/2 electron) sit at the corners of corner-sharing tetrahedra.  It's a bit hard to visualize, but the centers of those tetrahedra form the same spatial pattern as the locations of carbon atoms in a diamond crystal.  Anyway, because of some rather deep physics ("crystal field effects"), the magnetic moment of each Dy is constrained to point either radially inward toward or radially outward from the center of its tetrahedron.  Moreover, because of interactions between the magnetic moments, it is energetically favorable for each tetrahedron to have two moments (shown as little arrows) pointing inward and two pointing outward.  This is the origin of the "ice" part of the name: the two-in/two-out rule is the same thing seen in ordinary water ice, where each oxygen atom is coordinated by four hydrogen atoms, two strongly (closer, covalently bound) and two more weakly (farther away, hydrogen bonded).  The spin ice ordering in this material really kicks in at low temperatures, below 1 K.

So, what happens at somewhat warmer temperatures, say between 2 K and 15 K?  The lowest-energy excitations of this system act like magnetic monopoles (!).  In free space, there is no direct evidence for magnetic monopoles (isolated north or south poles that would interact with a Coulomb-like force law); the quantization of electric charge, which Dirac showed would follow from the existence of even a single monopole, is the only indirect hint we have.  In spin ice, though, you can create an effective monopole/antimonopole pair by flipping some moments so that one tetrahedron is 3-out/1-in and another is 1-out/3-in, as shown at right.  You can "connect" the monopole to the antimonopole by following a line of directed magnetic moments - a topological constraint, in the sense that you can see how having multiple m/anti-m pairs could get in each other's way.  This connection is the analog of a Dirac string (you can think of the m/anti-m pair as opposite ends of an infinitesimally skinny solenoid).
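To make the counting concrete, here's a toy enumeration in Python of the \(2^{4} = 16\) Ising configurations of a single tetrahedron.  In the standard nearest-neighbor caricature of spin ice, the energy of a tetrahedron grows with the square of its net "charge" \(Q\) (the imbalance between outward- and inward-pointing moments); the normalization of \(Q\) below is arbitrary, chosen just for illustration.

```python
from itertools import product
from collections import Counter

# sigma = +1: moment points out of the tetrahedron; -1: points in.
# In the nearest-neighbor spin ice model, the tetrahedron energy goes
# (up to a constant) as Q^2, with Q = sum(sigma)/2 the net "charge".
counts = Counter()
for config in product([+1, -1], repeat=4):
    counts[sum(config) // 2] += 1

labels = {0: "2-in/2-out (ice rule: ground states)",
          +1: "3-out/1-in (monopole)", -1: "3-in/1-out (anti-monopole)",
          +2: "all-out (double charge)", -2: "all-in (double charge)"}
for Q in sorted(counts):
    print(f"Q = {Q:+d}: {counts[Q]:2d} states  {labels[Q]}")
```

Six of the sixteen configurations obey the ice rule; flipping one spin shared between two ice-rule tetrahedra creates a \(Q = +1\)/\(Q = -1\) pair, and in this caricature subsequent flips can walk the pair apart without further nearest-neighbor energy cost.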

This is all fun to talk about, but is there really evidence for these emergent monopoles?  Yes.  A nice, very recent review of the subject is here.  There are a variety of experiments (starting with magnetization and neutron scattering, and ending up with more sophisticated measurements like THz optical properties and magnetic flux noise experiments looking at m/anti-m generation and recombination) that show evidence for monopoles and their interactions.  (Full disclosure: I have some thoughts on fun experiments to do in these and related systems.)  It's also possible to make two-dimensional arrays of nanoscale ferromagnets that mimic these kinds of properties, so-called artificial spin ice.  This kind of emergence, where you end up with excitations that act like exotic, interacting, topologically constrained (quasi)particles that seemingly don't exist elsewhere, is something that gets missed if one takes a purely reductionist view of physics.

    Wednesday, October 14, 2020

    Room temperature superconductivity!

    As many readers know, the quest for a practical room temperature superconductor has been going on basically ever since Kamerlingh Onnes discovered superconductivity over 100 years ago.  If one could have superconductivity with high critical currents and high critical fields in a material that could readily be made into wires, for example, it would be absolutely transformative to the world.  (Just one example:  we lose 5-10% of generated electricity just in transmission lines due to resistive heating.)  

One exotic possibility, suggested over 50 years ago by Neil Ashcroft (of textbook fame in addition to his scientific prestige), was that highly compressed metallic hydrogen could be a room temperature superconductor.  The basic ingredients for traditional superconductivity would be a high electronic density of states, light atoms (and hence a high sound speed, good for phonon-based pairing), and a strong electron-phonon coupling.
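To get a feel for how those ingredients push up the transition temperature, here's a sketch using the McMillan formula for conventional phonon-mediated superconductors (a standard estimate, though strictly trustworthy only for moderate coupling, \(\lambda \lesssim 1.5\)).  The Debye temperatures and coupling strengths below are illustrative guesses, not parameters extracted from any particular hydride.

```python
import math

def mcmillan_tc(theta_D, lam, mu_star=0.1):
    """McMillan estimate of Tc (in K) for a conventional superconductor.
    theta_D: Debye temperature (K); lam: electron-phonon coupling;
    mu_star: Coulomb pseudopotential (typically ~0.1)."""
    return (theta_D / 1.45) * math.exp(
        -1.04 * (1 + lam) / (lam - mu_star * (1 + 0.62 * lam)))

# Light atoms -> high phonon frequencies -> large effective Debye temperature.
print(f"Ordinary metal (theta_D=300 K, lam=0.5):  Tc ~ {mcmillan_tc(300, 0.5):.0f} K")
print(f"Hydride guess  (theta_D=1500 K, lam=2.0): Tc ~ {mcmillan_tc(1500, 2.0):.0f} K")
```

With cartoon inputs, the combination of stiff hydrogen-derived phonons and strong coupling takes you from a few kelvin to well over 100 K - the qualitative logic behind the whole hydride program.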

    In recent years, there have been striking advances in hydrogen-rich compounds with steadily increasing superconducting transition temperatures, including H2S (here and here) and LaH10 (here and here), all requiring very high (200+ GPa) pressures obtained in diamond anvil cells.  In those cool gadgets, tiny sample volumes are squeezed between the facets of cut gemstone-quality diamonds, and there is a great art in making electronic, optical, and magnetic measurements of samples under extreme pressures. 

Today, a new milestone has been reached and published.  Using these tools, the investigators (largely at Rochester) put carbon-, sulphur-, and hydrogen-containing compounds in the cell, zapped them with a laser to do some in situ chemistry, and measured superconductivity with a transition temperature up to 287.7 K (!) at a pressure of 267 GPa (!!).  The evidence for superconductivity is both a resistive transition to (as near as can be seen) zero resistance and an onset of diamagnetism (as seen through ac susceptibility).

This is exciting, and a milestone, though of course there are many questions:  What is the actual chemical compound at work here?  How does the superconductivity work - is it conventional or more exotic?  Is there any pathway to keeping these properties without enormous externally applied pressure?  At the very least, this shows experimentally what people have been saying for a long time: there is no reason in principle why there couldn't be room temperature (or above) superconductivity.



    Saturday, October 10, 2020

    How fast can sound be in a solid or liquid?

    There is a new paper here that argues through dimensional analysis for an upper limit to the speed of sound in solids and liquids (when the atoms bump up against each other).  The authors derive that the maximum speed of sound is, to within numerical factors of order 1, given by \(v_{\mathrm{max}}/c = \alpha \sqrt{m_{e}/(2m_{p})} \), where \(\alpha\) is the fine structure constant, and \(m_{e}\) and \(m_{p}\) are the masses of the electron and proton, respectively.  Numerically, that ends up being about 36 km/s.  
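The arithmetic is easy to verify (a minimal sketch using scipy's CODATA constants).  Note that the bound can be rewritten as \(v_{\mathrm{max}} = \sqrt{E_{\mathrm{Ry}}/m_{p}}\), the Rydberg energy scale per proton mass, which connects to the argument below.

```python
import math
from scipy.constants import c, alpha, m_e, m_p, Rydberg, h

# The paper's bound: v_max / c = alpha * sqrt(m_e / (2 m_p))
v_max = c * alpha * math.sqrt(m_e / (2 * m_p))
print(f"v_max        = {v_max / 1e3:.1f} km/s")        # ~36 km/s

# Same number another way: v_max = sqrt(Rydberg energy / proton mass)
E_ry = Rydberg * h * c                                 # Rydberg energy (J)
print(f"sqrt(Ry/m_p) = {math.sqrt(E_ry / m_p) / 1e3:.1f} km/s")
```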

It's a neat argument, and I agree with the final result, but I actually think there's a more nuanced way to think about this than the authors' approach.  Sound speed can be derived from some assumptions about continuum elasticity, and is given by \(v_{s} = \sqrt{K/\rho}\), where \(K\) is the bulk modulus and \(\rho\) is the mass density.  The bulk modulus, \(K = -V \left(\partial P/\partial V\right)\), is the increase in pressure divided by the resulting fractional decrease in volume.  So, a squishy, soft substance has a low bulk modulus, because when the pressure goes up, its volume shrinks comparatively a lot.
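As a sanity check of \(v_{s} = \sqrt{K/\rho}\) against the 36 km/s bound, here's the estimate for a few materials; the bulk moduli and densities below are rough handbook values, not precision data.

```python
import math

def bulk_sound_speed(K_GPa, rho):
    """Continuum estimate v_s = sqrt(K / rho); K in GPa, rho in kg/m^3."""
    return math.sqrt(K_GPa * 1e9 / rho)

# Rough handbook values (approximate, for illustration).
for name, K, rho in [("water", 2.2, 1000.0),
                     ("aluminum", 76.0, 2700.0),
                     ("diamond", 443.0, 3515.0)]:
    print(f"{name:8s}: v_s ~ {bulk_sound_speed(K, rho) / 1e3:5.2f} km/s")
```

Water comes out near the measured 1.5 km/s, and even diamond - about as stiff per unit mass as materials get - sits at roughly a third of the proposed bound.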

The authors make the statement "It has been ascertained that elastic constants are governed by the density of electromagnetic energy in condensed matter phases."  This is true, but for the bulk modulus I would argue it is true only indirectly, as a consequence of the Pauli principle.  I wrote about something similar previously, explaining why you can't push solids through each other even though the atoms are mostly empty space.  If you try to stuff two atoms into the volume of one atom, it's not the Coulomb repulsion of the electrons that directly stops this from happening.  Rather, the Pauli principle says that cramming those additional electrons into that tiny volume would require the electrons to occupy higher atomic energy levels.  The typical scale of those atomic energy levels is something like a Rydberg, and that establishes one cost of trying to compress solids or liquids; that Rydberg energy scale is how the authors arrive at the fine structure constant and the masses of the electron and proton in their result.

    I would go further and say that this is really the ultimate limiting factor on sound speed in dense material.  Yes, interatomic chemical bonds are important - as I'd written, they establish why solids deform instead of actually merging when squeezed.  It's energetically cheaper to break or rearrange chemical bonds (on the order of a couple of eV in energy) than to push electrons into higher energy states (several eV or more - real Rydberg scales).  

    Still, it's a cool idea - that one can do intelligently motivated dimensional analysis and come up with an insight into the maximum possible value of some emergent quantity like sound speed.  (Reminds me of the idea of a conjectured universal bound on diffusion constants for electrons in metals.)



    Thursday, October 08, 2020

    Postdoc opportunities

There is a postdoc opportunity coming up in my lab to look at light emission from molecular-scale plasmonic nanostructures.  It's going to be very cool, looking at (among other things) photon counting statistics (this kind of thing) and coupling plasmon-based emission to single fluorophores - all kinds of fun.  Please check it out and share with those who might be interested:  https://jobs.rice.edu/postings/24792

In addition:  The Smalley-Curl Institute is happy to announce that it is accepting applications for two J. Evans Attwell-Welch Postdoctoral Research Associate positions.  The highly competitive Attwell-Welch fellowship was established in 1998 to give Ph.D. recipients in nanoscience- and nanotechnology-related fields an opportunity to further their basic scientific research experience.

    The deadline for the Evans Attwell-Welch submissions is Monday Dec 7th, 2020.  Applications containing the candidate’s resume, a two-page research project, and a letter of support from an SCI member must be emailed to sci@rice.edu before the deadline.  Only applicants sponsored by an SCI Rice faculty member will be considered.   

    I would be happy to work with a potential applicant, particularly one interested in strongly correlated nanostructures and spin transport in magnetic insulators.  If you're a student finishing up and are interested, please contact me, and if you're a faculty member working with possible candidates, please feel free to point out this opportunity.   Rice University is an Equal Opportunity Employer with commitment to diversity at all levels, and considers for employment qualified applicants without regard to race, color, religion, age, sex, sexual orientation, gender identity, national or ethnic origin, genetic information, disability or protected veteran status.

    Saturday, October 03, 2020

    Annual Nobel speculation, + nanoscale views on twitter

    It's that annual tradition:  Who do people think will win the Nobel this year in physics?  Or chemistry?  On the physics side, I've repeatedly predicted (incorrectly) Aharonov and Berry for geometric phases.  Another popular suggestion from years past is Aspect, Zeilinger, and Clauser for Bell's inequality tests.   Speculate away in the comments.

    I've also finally taken the plunge and created @NanoscaleViews on twitter.  Hopefully this will help reach a broader audience, even if I don't have the time to fall down the twitter rabbit hole constantly.