Monday, January 20, 2020

Brief items

Here are some items of interest:

  • An attempt to lay out a vision for research in the US beyond Science: The Endless Frontier.  The evolving roles of the national academies are interesting, though I found the description of the future of research universities to be rather vague - I'm not sure growing universities to the size of Arizona State is the best way to provide high quality access to knowledge for a large population.  It still feels to me like an eventual successful endpoint for online education could be natural language individualized tutoring ("Alexa, teach me multivariable calculus."), but we are still a long way from there.
  • Atomic-resolution movies of chemistry are still cool.
  • Dan Ralph at Cornell has done a nice service to the community by making his lecture notes available on the arXiv.  The intent is for these to serve as a supplement to a solid state course such as one out of Ashcroft and Mermin, bringing students up to date on Berry curvature and topology at a similar level to that famous text.
  • This preprint tries to understand an extremely early color photography process developed by Becquerel (the photovoltaic one, who was the father of the radioactivity Becquerel).  It turns out that there are systematic changes in reflectivity spectra of the exposed Ag/AgCl films depending on the incident wavelength.  Why the reflectivity changes that way remains a mystery to me after reading this.
  • On a related note, this led me to this PNAS paper about the role of plasmons in the daguerreotype process.  Voila, nanophotonics in the 19th century.
  • This preprint (now out in Nature Nano) demonstrates incredibly sensitive measurements of torques on very rapidly rotating dielectric nanoparticles.  This could be used to see vacuum rotational friction.
  • The inventors of chemically amplified photoresists have been awarded the Charles Stark Draper prize.  Without that research, you probably would not have the computing device sitting in front of you....

Tuesday, January 14, 2020

The Wolf Prize and how condensed matter physics works

The Wolf Prize in Physics for 2020 was announced yesterday, and it's going to Pablo Jarillo-Herrero, Allan MacDonald, and Rafi Bistritzer, for twisted bilayer graphene.  This prize is both well-deserved and a great example of how condensed matter physics works.  

MacDonald and Bistritzer did key theory work (for example) highlighting how the band structure of twisted bilayer graphene would become very interesting for certain twist angles - how the moiré pattern from the two layers would produce a lateral periodicity, and that interactions between the layers would lead to very flat bands.  Did they predict every exotic thing that has been seen in this system?  No, but they had the insight to get key elements, and the knowledge that flat bands would likely lead to many competing energy scales, including electron-electron interactions, the weak kinetic energy of the flat bands, the interlayer coupling, effective magnetic interactions, etc.  Jarillo-Herrero was the first to implement this with sufficient control and sample quality to uncover a remarkable phase diagram involving superconductivity and correlated insulating states.  Figuring out what is really going on here and looking at all the possibilities in related layered materials will keep people busy for years.   (As an added example of how condensed matter works as a field, Bistritzer is in industry working for Applied Materials.)
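
(For a rough sense of scale: two identical lattices twisted by a small angle θ produce a moiré superlattice with period of about a/(2 sin(θ/2)).  Here's a minimal back-of-the-envelope sketch, assuming graphene's roughly 0.246 nm lattice constant; near the "magic" angle of 1.1°, the moiré period comes out to around 13 nm - enormously larger than the atomic spacing.)

```python
import math

a = 0.246  # graphene lattice constant in nm (assumed value)

def moire_period(theta_deg, a=a):
    """Moire superlattice period in nm for two identical lattices
    twisted by theta_deg degrees: L = a / (2 sin(theta/2))."""
    theta = math.radians(theta_deg)
    return a / (2 * math.sin(theta / 2))

for theta in (5.0, 2.0, 1.1, 0.5):
    print(f"twist {theta:4.1f} deg -> moire period ~ {moire_period(theta):5.1f} nm")
```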

All of this activity and excitement, thanks to feedback between well-motivated theory and experiment, is how the bulk of physics that isn't "high energy theory" actually works.  

Monday, January 13, 2020

Popular treatment of condensed matter - topics

I'm looking more seriously at trying to do some popularly accessible writing about condensed matter.  I have a number of ideas about what should be included in such a work, but I'm always interested in other people's thoughts on this.   Suggestions? 

Sunday, January 05, 2020

Brief items

Happy new year.  As we head into 2020, here are a few links I've been meaning to point out:

  • This paper is a topical review of high-throughput (sometimes called combinatorial) approaches to searching for new superconductors.   The basic concept is simple enough:  co-deposit multiple different elements in a way that deliberately produces compositional gradients across the target substrate.  This can be done via the geometry of deposition, or with stencils that move during the deposition process.  Then characterize the local properties in an efficient way across the various compositional gradients, looking for the target properties you want (e.g., maximum superconducting transition temperature).  Ideally, you combine this with high-throughput structural characterization and even annealing or other post-deposition treatment.  Doing all of this well in practice is a craft.  A toy version of the screening step is sketched just after this list.
  • Calling back to my post on this topic, Scientific American has an article about wealth distribution based on statistical mechanics-like models of economies.   It's hard for me to believe that some of these insights are really "new" - seems like many of these models could have been examined decades ago....
  • This is impressive.  Jason Petta's group at Princeton has demonstrated controlled entanglement between single-electron spins in Si/SiGe gate-defined quantum dots separated by 4 mm.  That may not sound all that exciting; one could use photons to entangle atoms separated by km, as has been done with optical fiber.  However, doing this on-chip using engineered quantum dots (with gates for tunable control) in an arrangement that is in principle scalable via microfabrication techniques is a major achievement.
  • Just in case you needed another demonstration that correlated materials like the copper oxide superconductors are complicated, here you go.  These investigators use an approach based on density functional theory (see here, here, and here), and end up worrying about energetic competition between 26 different electronic/magnetic phases.  Regardless of the robustness of their specific conclusions, that alone tells you the inherent challenge of those systems:  many possible ordered states, all with very similar energy scales.
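
Back to the first item above: here is a minimal sketch of the combinatorial screening step, assuming a linear composition gradient across the substrate and substituting a made-up, dome-shaped Tc versus composition curve for real measurements.  It's purely illustrative - the point is just that mapping local properties across the gradient and picking out the optimum is conceptually simple, even if doing it well is hard.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical composition gradient across a 50 mm substrate: the fraction x of
# element B in an A(1-x)B(x) film varies linearly with position.
positions = np.linspace(0, 50, 101)             # mm across the substrate
x = np.interp(positions, [0, 50], [0.0, 0.4])   # local composition fraction

# Stand-in "measurement": a made-up dome-shaped Tc(x) plus noise, playing the
# role of the locally measured superconducting transition temperature.
tc_measured = 30.0 * np.exp(-((x - 0.22) / 0.08) ** 2) + rng.normal(0, 0.5, x.size)

best = int(np.argmax(tc_measured))
print(f"best spot: {positions[best]:.1f} mm, x = {x[best]:.2f}, "
      f"Tc ~ {tc_measured[best]:.1f} K")
```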

Monday, December 30, 2019

Energy scales and crystals in science fiction

Crystals are fascinating.  Somehow, for reasons that don't seem at all obvious at first glance, some materials grow in cool shapes as solids, with facets and obvious geometric symmetries.  This was early support for the idea of atoms, and it's no wonder at all that people throughout history have looked upon obviously crystalline materials as amazing, possibly connected with magical powers.

In science fiction (or maybe more properly science fantasy), crystals show up repeatedly as having special properties, often able to control or direct energies that seem more appropriate for particle physics.  In Star Trek, dilithium crystals are able to channel and control the flow of matter-antimatter reactions needed for warp drive, the superluminal propulsion system favored by the Federation and the Klingon Empire.  In Star Wars, kyber crystals are at the heart of lightsabers, and were also heavily mined by the Empire for use in the planet-killing main weapon of the Death Star.

In real life, though, crystals don't fare so well when interacting with very high energy electromagnetic or particle radiation.  Yes, it is possible for crystals to scatter x-rays and high energy electrons - that's how x-ray diffraction and electron diffraction work.  On very rare occasions, crystals can lead to surprising nuclear processes, such as all the iron atoms in a crystal sharing the recoil when an excited iron nucleus spits out a gamma ray, as in the Mössbauer effect.   Much more typically, though, crystals are damaged by high energy radiation - if the energy of the photon or other particle is much larger than the few-eV chemical energy scales that hold atoms in place, or the few tens of eV that bind core electrons, then the cool look and spatial arrangement of the atoms really don't matter, and atoms get kicked around.  The result is the creation of vacancies or interstitial defects, some of which can even act as "color centers", so that otherwise colorless Al2O3, for example, can take on color after being exposed to ionizing radiation in a reactor.
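
To put rough numbers on that mismatch, here's a quick back-of-the-envelope comparison using E(eV) ≈ 1240/λ(nm); the ~3 eV bond energy below is just an order-of-magnitude stand-in for the chemical scales mentioned above.

```python
# Back-of-the-envelope photon energies, E(eV) ~ 1240 / wavelength(nm), compared
# to a ~3 eV chemical bond (an order-of-magnitude stand-in).
HC_EV_NM = 1239.84   # h*c in eV*nm
BOND_EV = 3.0        # rough chemical bonding / cohesive energy scale

for name, wavelength_nm in [("green light (550 nm)", 550.0),
                            ("Cu K-alpha x-ray (0.154 nm)", 0.154),
                            ("1 MeV gamma ray (0.00124 nm)", 0.00124)]:
    energy_ev = HC_EV_NM / wavelength_nm
    print(f"{name:28s}: {energy_ev:11.1f} eV  "
          f"(~{energy_ev / BOND_EV:9.1f}x a {BOND_EV} eV bond)")
```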

Ahh well.  Crystals are still amazing even if they can't propel starships faster than light.

(Happy new year to my readers!  I'm still trying to be optimistic, even if it's not always easy.)


Sunday, December 22, 2019

Condensed matter and Christmas decorations - 'tis the season

Modern outdoor decorations owe quite a bit to modern science - polymers; metallurgy; electric power for the lighting, fans, sensors, and motors which make possible the motion-actuated inflatable Halloween decorations that scare my dog....  Condensed matter physics has, as in many areas of society, had a big impact on Christmas decorations that is so ubiquitous and pervasive that no one even thinks about it.  In particular, I'm thinking about the light emitting diode and its relative, the diode laser.  I'm pretty sure that Nick Holonyak and Shuji Nakamura never imagined that LEDs would pave the way for animated multicolor icicle decorations.  Likewise, I suspect that the inventors discussed here (including Holonyak) never envisioned laser projected holiday lighting.  So, the next time someone asks if any of this quantum stuff or basic research is useful, remember that these inherently quantum devices have changed the world in all kinds of ways that everyone sees but few observe. 

Wednesday, December 18, 2019

Materials and neuromorphic computing

(In response to a topic suggestion from the Pizza Perusing Physicist....)

Neuromorphic computing is a trendy concept aimed at producing computing devices that are structured and operate like biological neural networks.  

In standard digital computers, memory and logic are physically separated and handled by distinct devices, and both are (nearly always) based on binary states and highly regular connectivity.  That is, logic gates take inputs that are two-valued (1 or 0), and produce outputs that are similarly two-valued; logic gates have no intrinsic memory of past operations that they've conducted; memory elements are also binary, with data stored as a 1 or 0; and everything is organized in a regular, immutable pattern - memory registers populated and read by clocked, sequential logic gates via a bus.

Natural neural networks, on the other hand, are very different.  Each neuron can be connected to many others via synapses.  Somehow memory and logic are performed by the same neuronal components.   The topology of the connections varies with time - some connections are reinforced by repeated use, while others are demoted, in a continuous rather than binary way.  Information traffic involves temporal trains of pulses called spikes.  

All of these things can be emulated with standard digital computers.  Deep learning methods do this, with nodes arranged in multiple layers playing the roles of neurons, and weighted links between the nodes modeling the connections and their strengths.  This is all a bit opaque and doesn't necessarily involve simulating the spiking dynamics at all.  Implementing neural networks on standard hardware also loses some of the perceived benefits of biological neural nets, like very good power efficiency.
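
As a cartoon of what those weighted links look like in software, a single artificial "neuron" is just a weighted sum of its inputs passed through a nonlinearity - no spiking, no dynamics.  The inputs and weights below are made up.

```python
import numpy as np

def neuron(inputs, weights, bias):
    """One artificial 'neuron': weighted sum of inputs plus a bias, passed
    through a sigmoid nonlinearity. The weights play the role of synaptic
    strengths and are continuous, not binary."""
    z = np.dot(weights, inputs) + bias
    return 1.0 / (1.0 + np.exp(-z))

# Toy example: three inputs feeding one neuron with made-up weights.
x = np.array([0.2, 0.9, 0.1])
w = np.array([1.5, -0.7, 0.3])
print(neuron(x, w, bias=0.1))
```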

In the last few years, as machine learning and big data have become increasingly important, there has been a push to implement, in hardware, device architectures that look a lot more like the biological analogs.  To do this, you might want nonvolatile memory elements that can also be used for logic, and that can have continuously graded values of "on"-ness determined by their history.  Resistive switching memory elements, sometimes called memristors (though that is a loaded term - see here and here), can fit the bill, as in this example.  Many systems can act as resistive switches, with conduction changes often set by voltage-driven migration of ions or vacancies in the material.
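
Here's a minimal sketch of the kind of history-dependent, continuously graded "on"-ness being described - a generic toy model in which the conductance drifts in proportion to the applied voltage and saturates at bounds, standing in for voltage-driven ion or vacancy motion.  It is not a model of any particular device.

```python
def update_conductance(g, v, dt, rate=1e-4, g_min=1e-6, g_max=1e-3):
    """Toy resistive-switching element: the conductance (siemens) drifts in
    proportion to the applied voltage, mimicking voltage-driven ion/vacancy
    motion, and saturates at physical bounds."""
    g = g + rate * v * dt
    return min(max(g, g_min), g_max)

# Repeated positive voltage pulses gradually increase the "on"-ness; the
# element remembers its history even after the voltage returns to zero.
g = 1e-5
for pulse in range(1, 6):
    for _ in range(100):                    # 100 time steps of a +1 V pulse
        g = update_conductance(g, v=1.0, dt=1e-3)
    print(f"after pulse {pulse}: G = {g:.2e} S")
```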

On top of this, there has been a lot of interest in using strongly correlated materials in such applications.  There are multiple examples of correlated materials (typically transition metal oxides) that undergo dramatic metal-insulator transitions as a function of temperature.  These materials then offer a chance to emulate spiking - driving a current can switch such a material from the insulating to the metallic state via local Joule heating or more nontrivial mechanisms, and the material can then relax back to the insulating state.  See the extensive discussion here.  
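
And here's a cartoon (in arbitrary units, with made-up parameters) of how a metal-insulator transition plus Joule heating can give spiking: a constant current heats the insulating state, the element switches metallic when the local temperature crosses a threshold, the heating collapses, it cools back down, and the cycle repeats.  This is meant only to illustrate the feedback loop, not any specific material.

```python
# Constant-current drive, Joule heating I^2 R, Newtonian cooling toward a bath
# at T = 0, and a resistance that switches at threshold "temperatures".
I = 1.0
R_INS, R_MET = 10.0, 0.1      # insulating / metallic resistance
T_UP, T_DOWN = 5.0, 2.0       # switching thresholds (with a little hysteresis)
dt, t_total = 0.001, 50.0

T, metallic, spikes = 0.0, False, 0
for _ in range(int(t_total / dt)):
    R = R_MET if metallic else R_INS
    T += dt * (I * I * R - T)                 # heating minus cooling
    if not metallic and T > T_UP:
        metallic, spikes = True, spikes + 1   # the "spike": insulator -> metal
    elif metallic and T < T_DOWN:
        metallic = False                      # cooled back to the insulator

print(f"{spikes} spikes over {t_total:.0f} time units")
```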

Really implementing all of this at scale is not simple.  The human brain involves something like 100,000,000,000 neurons, and connections run in three dimensions.  Getting large numbers of effective solid-state neurons with high connectivity via traditional 2D planar semiconductor-style fab (basically necessary if one wants to have many millions of neurons) is not easy, particularly if it requires adapting processing techniques to accommodate new classes of materials.

If you're interested in this and how materials physics can play a role, check out this DOE report and this recent review article.