Monday, December 30, 2019

Energy scales and crystals in science fiction

Crystals are fascinating.  Somehow, for reasons that don't seem at all obvious at first glance, some materials grow in cool shapes as solids, with facets and obvious geometric symmetries.  This was early support for the idea of atoms, and it's no wonder at all that people throughout history have looked upon obviously crystalline materials as amazing, possibly connected with magical powers.

In science fiction (or maybe more properly science fantasy), crystals show up repeatedly as having special properties, often able to control or direct energies that seem more appropriate for particle physics.  In Star Trek, dilithium crystals are able to channel and control the flow of matter-antimatter reactions needed for warp drive, the superluminal propulsion system favored by the Federation and the Klingon Empire.  In Star Wars, kyber crystals are at the heart of lightsabers, and were also heavily mined by the Empire for use in the planet-killing main weapon of the Death Star.

In real life, though, crystals don't do so well when interacting with very high energy electromagnetic or particle radiation.  Yes, it is possible for crystals to scatter x-rays and high energy electrons - that's how x-ray diffraction and electron diffraction work.  On very rare occasions, crystals can even enable surprising nuclear processes, such as all the atoms in a crystal sharing the recoil when an excited iron nucleus spits out a gamma ray, as in the Mössbauer effect.   Much more typically, though, crystals are damaged by high energy radiation - if the energy scale of the photon or other incident particle is much larger than the few-eV chemical energy scales that hold atoms in place, or the few tens of eV that bind core electrons, then the cool look and spatial arrangement of the atoms really doesn't matter, and atoms get kicked around.  The result is the creation of vacancies or interstitial defects, some of which can act as "color centers", so that otherwise colorless Al2O3, for example, can take on color after being exposed to ionizing radiation in a reactor.
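(If you want the back-of-the-envelope numbers behind that claim, here's a quick sketch in Python.  The 1239.84 eV·nm photon-energy conversion is standard; the bond and core-level values are just representative, not tied to any particular material.)

    # Rough comparison of energy scales, all in eV.
    hc = 1239.84  # photon energy: E [eV] = 1239.84 / wavelength [nm]

    visible_photon = hc / 500.0   # green light, ~500 nm -> ~2.5 eV
    xray_photon = hc / 0.1        # hard x-ray, ~0.1 nm (1 angstrom) -> ~12,400 eV

    bond_energy = 3.0             # typical chemical bond: a few eV
    core_level = 50.0             # shallow core-electron binding: tens of eV

    print(f"visible photon: {visible_photon:.1f} eV")  # comparable to bonds
    print(f"x-ray photon: {xray_photon:.0f} eV")       # vastly larger
    print(f"x-ray / bond ratio: {xray_photon / bond_energy:.0f}")

An x-ray photon carries thousands of times the energy holding any given atom in place, so the crystal's delicate geometric order is no protection at all.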

Ahh well.  Crystals are still amazing even if they can't propel starships faster than light.

(Happy new year to my readers!  I'm still trying to be optimistic, even if it's not always easy.)


Sunday, December 22, 2019

Condensed matter and Christmas decorations - 'tis the season

Modern outdoor decorations owe quite a bit to modern science - polymers; metallurgy; electric power for the lighting, fans, sensors, and motors which make possible the motion-actuated inflatable Halloween decorations that scare my dog....  As in many areas of society, condensed matter physics has had an impact on Christmas decorations so pervasive that no one even thinks about it.  In particular, I'm thinking about the light emitting diode and its relative, the diode laser.  I'm pretty sure that Nick Holonyak and Shuji Nakamura never imagined that LEDs would pave the way for animated multicolor icicle decorations.  Likewise, I suspect that the inventors discussed here (including Holonyak) never envisioned laser-projected holiday lighting.  So, the next time someone asks if any of this quantum stuff or basic research is useful, remember that these inherently quantum devices have changed the world in all kinds of ways that everyone sees but few observe.

Wednesday, December 18, 2019

Materials and neuromorphic computing

(In response to a topic suggestion from the Pizza Perusing Physicist....)

Neuromorphic computing is a trendy concept aimed at producing computing devices that are structured and operate like biological neural networks.  

In standard digital computers, memory and logic are physically separated and handled by distinct devices, and both are (nearly always) based on binary states and highly regular connectivity.  That is, logic gates take inputs that are two-valued (1 or 0), and produce outputs that are similarly two-valued; logic gates have no intrinsic memory of past operations that they've conducted; memory elements are also binary, with data stored as a 1 or 0; and everything is organized in a regular, immutable pattern - memory registers populated and read by clocked, sequential logic gates via a bus.
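(To make the contrast concrete, here's a minimal illustrative sketch in Python - mine, not any real architecture: the gate is a pure function of two-valued inputs, and all the state lives in a separate, explicitly addressed memory.)

    def nand(a: int, b: int) -> int:
        """A conventional logic gate: two-valued in, two-valued out, no internal state."""
        return 0 if (a and b) else 1

    memory = [0] * 8  # state lives in a separate, explicitly addressed store

    # A clocked update: read operands from memory, compute, write the result back.
    memory[2] = nand(memory[0], memory[1])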

Natural neural networks, on the other hand, are very different.  Each neuron can be connected to many others via synapses.  Somehow memory and logic are performed by the same neuronal components.   The topology of the connections varies with time - some connections are reinforced by repeated use, while others are demoted, in a continuous rather than binary way.  Information traffic involves temporal trains of pulses called spikes.  
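(A standard toy model of spiking is the leaky integrate-and-fire neuron - here's my quick Python sketch, with arbitrary illustrative parameters: input drive charges up a "membrane potential" that leaks away, and crossing a threshold emits a spike and resets the neuron.)

    import numpy as np

    def lif_spike_times(drive, dt=1e-4, tau=0.02, v_thresh=1.0, v_reset=0.0):
        """Leaky integrate-and-fire: dv/dt = (-v + I)/tau, spike and reset at threshold."""
        v, spike_times = 0.0, []
        for step, i_in in enumerate(drive):
            v += dt * (-v + i_in) / tau
            if v >= v_thresh:                 # threshold crossing -> emit a spike
                spike_times.append(step * dt)
                v = v_reset                   # reset after the spike
        return spike_times

    # A constant drive above threshold yields a regular spike train.
    times = lif_spike_times(np.full(5000, 1.5))   # 0.5 s of simulated time
    print(f"{len(times)} spikes in 0.5 s")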

All of these things can be emulated with standard digital computers.  Deep learning methods do this, with nodes arranged in multiple layers playing the role of neurons, and weighted links between nodes modeling the synaptic connections and their strengths.  This is all a bit opaque and doesn't necessarily involve simulating the spiking dynamics at all.  Implementing neural networks via standard hardware also loses some of the perceived benefits of biological neural nets, like very good power efficiency.
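(Here's what that emulation boils down to, in a minimal numpy sketch - illustrative only: continuously weighted links between nodes, static in time, and no spikes anywhere.)

    import numpy as np

    rng = np.random.default_rng(0)
    w1 = rng.normal(size=(4, 16))   # continuous "synaptic" weights, input -> hidden
    w2 = rng.normal(size=(16, 2))   # hidden -> output

    def forward(x):
        """Static weighted sums plus a nonlinearity: no time, no spikes, no local memory."""
        return np.tanh(np.tanh(x @ w1) @ w2)

    print(forward(rng.normal(size=4)))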

In the last few years, as machine learning and big data have become increasingly important, there has been a push to implement, directly in device hardware, architectures that look a lot more like their biological analogs.  To do this, you might want nonvolatile memory elements that can also be used for logic, and that can have continuously graded values of "on"-ness determined by their history.  Resistive switching memory elements, sometimes called memristors (though that is a loaded term - see here and here), can fit the bill, as in this example.  Many systems can act as resistive switches, with conduction changes often set by voltage-driven migration of ions or vacancies in the material.
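(As a cartoon of what such an element does, here is a toy model - my own illustrative sketch, not any particular device: the conductance drifts up or down with the applied voltage history, standing in for the ion/vacancy motion.)

    class ToyResistiveSwitch:
        """Illustrative resistive switch: conductance depends on voltage history."""
        def __init__(self, g_min=1e-6, g_max=1e-3, rate=1e-4):
            self.g, self.g_min, self.g_max, self.rate = g_min, g_min, g_max, rate

        def apply(self, voltage, dt=1e-3):
            # Voltage nudges the internal state (a stand-in for ion/vacancy
            # migration), so the element "remembers" what was done to it.
            self.g += self.rate * voltage * dt
            self.g = min(max(self.g, self.g_min), self.g_max)
            return self.g * voltage  # current through the element right now

    switch = ToyResistiveSwitch()
    for _ in range(100):
        switch.apply(+2.0)   # repeated positive pulses potentiate the element
    print(f"conductance after 100 pulses: {switch.g:.2e} S")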

On top of this, there has been a lot of interest in using strongly correlated materials in such applications.  There are multiple examples of correlated materials (typically transition metal oxides) that undergo dramatic metal-insulator transitions as a function of temperature.  These materials offer a chance to emulate spiking - driving a current can switch such a material from the insulating to the metallic state, via local Joule heating or more nontrivial mechanisms, and the material then reverts to the insulating state once the drive is removed.  See the extensive discussion here.
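(Here's a cartoon of how that yields spiking, loosely in the spirit of vanadium dioxide relaxation oscillators - my illustrative sketch, with arbitrary parameter values: a capacitor charges through a series resistor until the voltage across the device crosses a threshold, the device flips to its metallic state and dumps the charge, and then it relaxes back to insulating.)

    # Cartoon relaxation oscillator built from a threshold (Mott-like) switch.
    dt, v_supply, r_series, c = 1e-6, 5.0, 1e4, 1e-7
    r_ins, r_met = 1e6, 1e2        # insulating vs. metallic device resistance
    v_on, v_off = 3.0, 1.0         # hysteretic switching thresholds

    v_c, r_dev, spikes = 0.0, r_ins, 0
    for step in range(200000):     # 0.2 s of simulated time
        i_in = (v_supply - v_c) / r_series   # charging current into the node
        i_dev = v_c / r_dev                  # current dumped through the device
        v_c += dt * (i_in - i_dev) / c
        if r_dev == r_ins and v_c > v_on:    # insulator -> metal: a "spike"
            r_dev, spikes = r_met, spikes + 1
        elif r_dev == r_met and v_c < v_off: # relaxes back to the insulator
            r_dev = r_ins
    print(f"{spikes} spikes in 0.2 s")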

Really implementing all of this at scale is not simple.  The human brain involves something like 100,000,000,000 neurons, with connections running in three dimensions.  Getting large numbers of effective solid-state neurons with high connectivity out of traditional 2D planar semiconductor-style fabrication (basically necessary if one wants many millions of neurons) is not easy, particularly if it requires adapting processing techniques to accommodate new classes of materials.

If you're interested in this and how materials physics can play a role, check out this DOE report and this recent review article.

Sunday, December 08, 2019

Brief items

Here are some tidbits that came across my eyeballs this past week:

  • I just ran into this article from early in 2019.  It touches on my discussion about liquids, and is a great example of a recurring theme in condensed matter physics.  The authors look at the vibrational excitations of liquid droplets on surfaces.  As happens over and over in physics, the imposition of boundary conditions on the liquid motion (e.g., wetting conditions on the surface and an approximately incompressible liquid with a certain surface tension) leads to quantization of the allowed vibrations.  Discrete frequencies/mode shapes/energies are picked out by those constraints, leading to a "periodic table" of droplet vibrations; see the quick sketch after this list.  (This one looks moderately like atomic states, because spherical harmonics show up in the mode description, as they do when looking at atomic orbitals.)
  • Another article from the past, this one from 2014 in the IEEE Spectrum.  It talks about how we arrived at the modern form for Maxwell's equations.  Definitely a good read for those interested in the history of physics.  Maxwell's theory was developing in parallel with what became vector calculus, and Maxwell's original description (like Faraday's intuition) was very mechanistic rather than abstract.
  • Along those lines, this preprint came out recently promoting a graphical pedagogical approach to vector calculus.  The spirit at work here is that Feynman's diagrammatic methods were a great way to teach people perturbative quantum field theory, so perhaps a diagrammatic scheme for vector calculus could be similarly effective.  I'm a bit of a skeptic - I found the approach by Purcell to be very physical and intuitive, and this doesn't look simpler to me.
  • This preprint about twisted bilayer graphene and the relationship between superconductivity and strongly insulating states caught my eye, and I need to read it carefully.  The short version:  While phase diagrams showing superconductivity and insulating states as a function of carrier density make it tempting to think that SC evolves out of the insulating states via doping (as likely in the cuprates), the situation may be more complicated.
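(Regarding the droplet vibrations in the first item: for a free spherical drop, Rayleigh's classic result gives discrete mode frequencies omega_l^2 = l(l-1)(l+2) * sigma / (rho * R^3) for mode number l = 2, 3, ....  The sessile drops in the article obey modified boundary conditions, but this quick sketch conveys the flavor of a discrete spectrum; the water parameters below are just representative.)

    import numpy as np

    # Rayleigh modes of a free, incompressible drop (not the sessile-drop
    # problem in the paper, but the same discreteness of the spectrum).
    sigma, rho, R = 0.072, 1000.0, 1e-3   # water: surface tension [N/m], density [kg/m^3], 1 mm radius

    for l in range(2, 6):
        omega = np.sqrt(l * (l - 1) * (l + 2) * sigma / (rho * R**3))
        print(f"l = {l}: f = {omega / (2 * np.pi):.0f} Hz")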