Sunday, July 24, 2016

Dark matter, one more time.

There is strong circumstantial evidence that there is some kind of matter in the universe that interacts with ordinary matter via gravity, but is otherwise not readily detected - it is very hard to explain things like the rotation rates of galaxies, the motion of star clusters, and features of the large scale structure of the universe without dark matter.   (The most discussed alternative would be some modification to gravity, but given the success of general relativity at explaining many things including gravitational radiation, this seems less and less likely.)  A favorite candidate for dark matter would be some as-yet undiscovered particle or class of particles that would have to be electrically neutral (dark!) and would only interact very weakly if at all beyond the gravitational attraction.

There have been many experiments trying to detect these particles directly.  The usual assumption is that these particles are all around us, and very occasionally they will interact with the nuclei of ordinary matter via some residual, weak mechanism (say higher order corrections to ordinary standard model physics).  The signature would be energy getting dumped into a nucleus without necessarily producing a bunch of charged particles.   So, you need a detector that can discriminate between nuclear recoils and charged particles.  You want a lot of material, to up the rate of any interactions, and yet the detector has to be sensitive enough to see a single event, and you need pure enough material and surroundings that a real signal wouldn't get swamped by background radiation, including that from impurities.  The leading detection approaches these days use sodium iodide scintillators (DAMA), solid blocks of germanium or silicon (CDMS), and liquid xenon (XENON, LUX, PandaX - see here for some useful discussion and links).
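To get a feel for the energy scale of such a nuclear recoil, here is a back-of-envelope sketch; the WIMP mass (100 GeV/c\(^2\)) and speed (~230 km/s) are illustrative assumptions, not measured values:

```python
# Back-of-envelope: maximum elastic recoil energy a WIMP could deposit
# in a xenon nucleus.  The inputs are illustrative assumptions, not
# experimental values: a 100 GeV/c^2 WIMP moving at a typical galactic
# speed of ~230 km/s.
C = 299_792_458.0            # speed of light, m/s
M_WIMP = 100.0               # assumed WIMP mass, GeV/c^2
M_XE = 131 * 0.9315          # xenon-131 nuclear mass, GeV/c^2 (~A * 1 amu)
V = 230e3 / C                # WIMP speed as a fraction of c

mu = M_WIMP * M_XE / (M_WIMP + M_XE)          # reduced mass, GeV/c^2
# Head-on elastic collision: E_R(max) = 2 mu^2 v^2 / m_N
e_recoil_kev = 2 * mu**2 * V**2 / M_XE * 1e6  # convert GeV -> keV

print(f"max recoil energy ~ {e_recoil_kev:.0f} keV")
```

The answer comes out in the tens of keV, which is why these detectors have to resolve single, tiny energy deposits.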

I've been blogging long enough now to have seen rumors about dark matter detection come and go.  See here and here.  Now in the last week both LUX and PandaX have reported their latest results, and they have found nothing - no candidate events at all - after their recent experimental runs.  This is in contrast to DAMA, who have been seeing some sort of signal for years that seems to vary with the seasons.  See here for some discussion.  The lack of any detection at all is interesting.  There's always the possibility that whatever dark matter exists really does only interact with ordinary matter via gravity - perhaps all other interactions are somehow suppressed by some symmetry.  Between the lack of dark matter particle detection and the apparent lack of exotica at the LHC so far, there is a lot of head scratching going on....

Saturday, July 16, 2016

Impact factors and academic "moneyball"

For those who don't know the term:  Moneyball is the title of a book and a movie about the 2002 Oakland Athletics baseball team, a team with a payroll in the bottom 10% of major league baseball at the time.   They used a data-intensive, analytics-based strategy called sabermetrics to find "hidden value" and "market inefficiencies", to put together a very competitive team despite their very limited financial resources.   A recent (very fun if you're a baseball fan) book along the same lines is this one.  (It also has a wonderful discussion of confirmation bias!)

A couple of years ago there was a flurry of articles (like this one and the academic paper on which it was based) about whether a similar data-driven approach could be used in scientific academia - to predict success of individuals in research careers, perhaps to put together a better department or institute (a "roster") by getting a competitive edge at identifying likely successful researchers.

The central problems in trying to apply this philosophy to academia are the lack of really good metrics and the timescales involved in research careers.  Baseball is a paradise for people who love statistics.  The rules have been (largely) unchanged for over a hundred years; the seasons are very long (formerly 154 games, now 162), and in any game an everyday player can get multiple opportunities to show their offensive or defensive skills.   With modern tools it is possible to get quantitative information about every single pitched ball and batted ball.  As a result, the baseball stats community has come up with a huge number of quantitative metrics for evaluating performance in different aspects of the game, and they have a gigantic database against which to test their models.  They even have devised metrics to try and normalize out the effects of local environment (baseball park-neutral or adjusted stats).

(Fig. 1, top panel, from this article; x-axis = number of citations.  The mean of the distribution is strongly affected by the outliers.)
In scientific research, there are very few metrics (publications; citation count; impact factor of the journals in which articles are published), and the total historical record available on which to base some evaluation of an early career researcher is practically the definition of what a baseball stats person would call "small sample size".   An article in Nature this week highlights the flaws with impact factor as a metric.  I've written before about this (here and here), pointing out that impact factor is a lousy statistic because it's dominated by outliers, and now I finally have a nice graph (fig. 1 in the article; top panel shown here) to illustrate this.  
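To see just how badly a mean-based statistic like impact factor behaves on a heavy-tailed distribution, here is a toy simulation (synthetic lognormal data, not real citation counts):

```python
import random

random.seed(0)
# Toy model (not real data): per-paper citation counts drawn from a
# heavy-tailed lognormal distribution, mimicking the skew of actual
# citation distributions.
cites = [int(random.lognormvariate(mu=1.0, sigma=1.5)) for _ in range(10_000)]

mean = sum(cites) / len(cites)
median = sorted(cites)[len(cites) // 2]

# The "impact factor" analog is the mean; it sits far above the
# median because a handful of heavily cited papers drag it upward.
print(f"mean = {mean:.1f}, median = {median}")
```

The typical paper (the median) gets a small handful of citations, while the mean is several times larger - exactly the pathology in the figure above.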

So, in academia, the tantalizing fact is that there is almost certainly a lot of "hidden value" out there missed by traditional evaluation approaches.  Just relying on pedigree (where did so-and-so get their doctorate?) and high impact publications (person A must be better than person B because person A published a paper as a postdoc in a high impact glossy journal) almost certainly misses some people who could be outstanding researchers.  However, the lack of good metrics, the small sample sizes, the long timescales associated with research, and enormous local environmental influence (it's just easier to do cutting-edge work at Harvard than at Northern Michigan), all mean that it's incredibly hard to come up with a way to find these people via some analytic approach.  

Wednesday, July 06, 2016

Keeping your (samples) cool is not always easy.

Very often in condensed matter physics we like to do experiments on materials or devices in a cold environment.  As has been appreciated for more than a century, cooling materials down often makes them easier to understand, because at low temperatures there is not enough thermal energy bopping around to drive complicated processes.  There are fewer lattice vibrations.  Electrons settle down more into their lowest available states.  The spread in available electron energies is proportional to \(k_{\mathrm{B}}T\), so any electronic measurement as a function of energy gets sharper-looking at low temperatures.
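A quick numerical illustration of that energy scale:

```python
# Thermal energy scale k_B * T at a few temperatures, to show why
# "cold" measurements are sharper: the thermal smearing shrinks from
# ~26 meV at room temperature to below half a meV at liquid helium
# temperature.
K_B = 8.617333e-5   # Boltzmann constant, eV/K

for t in (300.0, 77.0, 4.2):   # room temp, liquid nitrogen, liquid helium
    print(f"T = {t:6.1f} K  ->  k_B T = {K_B * t * 1e3:7.3f} meV")
```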

Sometimes, though, you have to dump energy into the system to do the study you care about.  If you want to measure electronic conduction, you have to apply some voltage \(V\) across your sample to drive a current \(I\), and that \(I \times V\) power shows up as heat.  In our case, we have done work over the last few years trying to do simultaneous electronic measurements and optical spectroscopy on metal junctions containing one or a few molecules (see here).   What we are striving toward is doing inelastic electron tunneling spectroscopy (IETS - see here) at the same time as molecular-scale Raman spectroscopy (see here for example).   The tricky bit is that IETS works best at really low temperatures (say 4.2 K), where the electronic energy spread is small (hundreds of microvolts), but the optical spectroscopy works best when the structure is illuminated by a couple of mW of laser power focused into a ~ 1.5 micron diameter spot.
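For a rough sense of the optical side, the numbers above (a couple of mW into a ~1.5 micron diameter spot - both taken as assumptions for this estimate) correspond to a very substantial intensity:

```python
import math

# Rough intensity of the optical excitation described above, assuming
# ~2 mW of laser power focused into a ~1.5 micron diameter spot.
power = 2e-3        # W (assumed)
radius = 0.75e-6    # m (assumed spot radius, half the 1.5 um diameter)

intensity = power / (math.pi * radius**2)   # W/m^2
print(f"intensity ~ {intensity:.2e} W/m^2  (~{intensity * 1e-4 / 1e3:.0f} kW/cm^2)")
```

Over a hundred kW/cm\(^2\) - no wonder keeping the junction cold is a challenge.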

It turns out that the amount of heating you get when you illuminate a thin metal wire (which can be detected in various ways; for example, we can use the temperature-dependent electrical resistance of the wire itself as a thermometer) isn't too bad when the sample starts out at, say, 100 K.  If the sample/substrate starts out at about 5 K, however, even modest incident laser power directly on the sample can heat the metal wire by tens of Kelvin, as we show in a new paper.  How the local temperature changes with incident laser intensity is rather complicated, and we find that we can model this well if the main roadblock at low temperatures is the acoustic mismatch thermal boundary resistance.  This is a neat effect discussed in detail here.  Vibrational heat transfer between the metal and the underlying insulating substrate is hampered (like \(1/T^3\) at low temperatures) by the fact that the speed of sound is very different between the metal and the insulator.   There are a bunch of other complicated issues (this and this, for example) that can also hinder heat flow in nanostructures, but the acoustic mismatch appears to be the dominant one in our case.   The bottom line:  staying cool in the spotlight is hard.  We are working away on some ideas on mitigating this issue.  Fun stuff.
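A minimal sketch of the acoustic-mismatch bottleneck, using a purely illustrative boundary coefficient (not a fitted value from our paper):

```python
# Illustrative-only model of acoustic-mismatch boundary resistance:
# heat flux across the metal/substrate interface goes like
#     q = C * (T_metal^4 - T_substrate^4)
# (equivalently, the boundary conductance scales as T^3, so the
# resistance grows like 1/T^3 as things get cold).
# C is NOT a measured number; q/C = 1e5 K^4 is chosen purely to
# show the trend.
Q_OVER_C = 1e5   # (absorbed heat flux)/(boundary coefficient), K^4

for t_sub in (100.0, 5.0):
    t_metal = (t_sub**4 + Q_OVER_C) ** 0.25
    print(f"substrate at {t_sub:5.1f} K -> metal at {t_metal:6.2f} K "
          f"(dT = {t_metal - t_sub:.2f} K)")
```

The same absorbed flux that barely warms a sample anchored at 100 K raises it by more than ten kelvin starting from 5 K, which is the qualitative behavior we see.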

(Note:  I'm doing some travel, so posting will slow down for a bit.)

Thursday, June 30, 2016

The critical material nearly everyone overlooks

Condensed matter physics is tough to popularize, and yet aspects of it are absolutely ubiquitous in modern technologies.  For example:  Nearly every flat panel display, from the one on your phone to your computer monitor to your large television, takes advantage of an underappreciated triumph of materials development, a transparent conducting layer.  Usually, when a material is a good conductor of electricity, it tends to be (when more than tens of nm thick) reflective and opaque.  Remember, light is an electromagnetic wave.  If the electric field from the light can make the mobile charge in the material move, and if that charge can keep up with the rapid oscillations (\(10^{14}\) Hz and faster!) of the electric field, then the light tends to be reflected rather than transmitted.  This is why polished aluminum or silver can be used as a mirror.
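As a rough check on those frequencies, the free-electron plasma frequency of aluminum (using the standard textbook conduction-electron density) sits well above visible-light frequencies, which is why the metal reflects across the whole visible range:

```python
import math

# Free-electron estimate of aluminum's plasma frequency: below this
# frequency the conduction electrons can keep up with the light's
# oscillating field and the metal reflects.
E = 1.602176634e-19        # electron charge, C
EPS0 = 8.8541878128e-12    # vacuum permittivity, F/m
M_E = 9.1093837015e-31     # electron mass, kg
N = 1.81e29                # conduction-electron density of Al, m^-3

omega_p = math.sqrt(N * E**2 / (EPS0 * M_E))   # rad/s
f_p = omega_p / (2 * math.pi)                  # Hz
print(f"plasma frequency ~ {f_p:.1e} Hz")
```

That comes out to a few times \(10^{15}\) Hz, roughly an order of magnitude above visible light (~\(5 \times 10^{14}\) Hz).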

The dominant technology for transparent conductors is indium tin oxide (ITO), which manages to thread between two constraints.  It's a highly doped semiconductor.  The undoped indium oxide material has a band gap of over 3 eV, meaning that only ultraviolet light, with wavelengths shorter than about 350 nm, has enough photon energy to be absorbed by kicking electrons out of the filled valence band and into the conduction band.  Longer wavelength light (the visible spectrum) doesn't have enough energy to make those transitions, and thus the material is transparent for those colors.  ITO has had enough tin added to make the resulting material fairly conducting at low frequencies (say those relevant for electronics, but much lower than the frequency of visible light).  However, because of the way charge moves in ITO (see here or here for a nice article), it does not act reflective at visible frequencies.  This material is one huge enabling technology for displays!  I remember being told that the upper limit on LCD display size was, at one point, limited by the electrical conductivity of the ITO, and that we'd never have flat screens bigger than about a meter diagonal.  Clearly that problem was resolved.
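The photon-energy arithmetic here is just \(E = hc/\lambda\); a quick sketch (the gap values are round numbers for illustration):

```python
# Photon energy <-> wavelength via E = h c / lambda, with
# hc ~ 1240 eV nm.  For a gap in the mid-3 eV range, only ultraviolet
# wavelengths can be absorbed across the gap, so the visible range
# passes straight through.
HC_EV_NM = 1239.84   # h*c in eV nm

def cutoff_wavelength_nm(gap_ev):
    """Longest wavelength that band-to-band absorption allows."""
    return HC_EV_NM / gap_ev

for gap in (3.0, 3.5):
    print(f"gap {gap} eV -> absorption cutoff ~ {cutoff_wavelength_nm(gap):.0f} nm")
```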

Indium isn't cheap.  There are many people interested in making cheaper (yet still reasonably transparent) conducting layers.  Possibilities include graphene (though even at monolayer thickness it absorbs about 2.3% in the visible) and percolative networks of metal nanowires (or nanotubes).  Unfortunately, because of the physics described above, it would appear that transparent aluminum (in the sense of having true bulk-metal-like properties but optical transparency in the visible) must remain in the realm of science fiction.
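Graphene's monolayer absorption is famously set by fundamental constants - it's \(\pi \alpha\), where \(\alpha \approx 1/137\) is the fine-structure constant:

```python
import math

# Monolayer graphene's optical absorption in the visible is pi * alpha,
# where alpha is the fine-structure constant -- about 2.3% per layer,
# which is why even one atomic layer is not perfectly transparent.
ALPHA = 1 / 137.035999   # fine-structure constant

absorption = math.pi * ALPHA
print(f"graphene absorbs ~ {absorption * 100:.2f}% per layer")
```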

Tuesday, June 21, 2016

Short items

Here are a few items:

  • This is fantastic.  Eric Schlaepfer, a hardware engineer at Google, has built a "disintegrated circuit", making a 6502 processor (the CPU from the Apple II and also used in one of my favorite undergrad courses back when I took it) out of surface-mount transistors.  It can't run at MHz clock speeds because of the stray capacitance of the traces on the circuit board, but it's still amazing.  For a sense of scale: a version of the iPad Air 2's processor built this way would cover 82,000 m\(^2\).
  • This is a bit "meta", but here is Peter Woit's recent Quick Items link.  I've steered clear from the whole multiverse discussion, but wow, I find it very disturbing how much recent mass publicity has been given to an idea that is described, at best, as an extremely speculative notion.  It's like having Bayesian arguments about how many angels can dance on the head of a pin.
  • Speaking of absurdist speculative garbage, Michio Kaku in recent days has claimed that we will shortly be able to create avatars that will live after us based on uploaded memories, and that we are living in The Matrix, which proves the existence of God.   How has this person become one of the well-known faces of science popularization?
  • American Ninja Warrior really is a good way to illustrate some fun physics.
  • Geekwrapped has highlighted this blog as one of the 20 best science blogs out there.  Thanks!
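The 82,000 m\(^2\) figure in the first item above can be sanity-checked with a back-of-envelope scaling; the board size and both transistor counts below are rough assumptions, not official specs:

```python
# Back-of-envelope check of the "82,000 m^2" claim: scale the discrete
# 6502's board area up by the ratio of transistor counts.  All three
# inputs are rough assumptions -- a discrete-6502 board of roughly
# 0.30 m x 0.38 m holding ~4,200 transistors, and ~3 billion
# transistors in the iPad Air 2's A8X processor.
BOARD_AREA = 0.30 * 0.38        # m^2, assumed board size
BOARD_TRANSISTORS = 4_200       # assumed discrete transistor count
A8X_TRANSISTORS = 3e9           # widely quoted figure for the A8X

area = A8X_TRANSISTORS / BOARD_TRANSISTORS * BOARD_AREA
print(f"a discrete A8X would cover ~ {area:.0f} m^2")
```

With these inputs the estimate lands right around \(8 \times 10^{4}\) m\(^2\), consistent with the quoted figure.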

Thursday, June 16, 2016

Frontiers in Quantum Materials and Devices 2016 - day 2

Continuing with my very brief (and necessarily incomplete) summary of the FQMD 2016 meeting at RIKEN at the beginning of this week:

  • Eric Heller of Harvard gave a very interesting and provocative talk about two topics, Raman scattering in graphene and then the onset of optical absorption in semiconductors.  Regarding the former (see here), he makes a strong case that the "double resonance" theoretical treatment of Raman scattering in graphene that has been highly cited since 2000 is not the right way to think about the problem.  Rather, one should use the Kramers-Heisenberg-Dirac theory of Raman scattering c. 1925-27, and keep in mind the important role played by (crystal) momentum conservation, as explained in the paper linked above.   Regarding the latter topic, he went on to argue (persuasively, in my view) that the textbook approach (literally - I described it in my own book) to the onset of optical absorption in direct-gap semiconductors as the photon energy exceeds the band gap is incomplete and gets the functional dependence on frequency wrong.  This work isn't published yet, and it wouldn't be appropriate for me to present his argument before he does, but I will definitely be keeping an eye out for this.
  • Denis Maryenko of RIKEN spoke about measurements of the anomalous Hall effect in the 2d electron gas that is present at the interface between ZnO and MgZnO.  This system is pretty impressive, with disorder so small that it supports very clean fractional quantum Hall effect, but with larger Coulomb and Zeeman energies than the more traditional GaAs/AlGaAs interface because of the different dielectric functions and g factors, respectively, of the ZnO system.  Interesting (not yet published) magnetic physics appears to be taking place at the interface, due apparently to point defects that support unpaired spins.
  • Pertti Hakonen from Aalto presented a nice talk about the quantum Hall effect in suspended graphene.  They have (not yet published) measurements in suspended structures made in the Corbino geometry, where there is an electrode in the center of a disk, and a second contact around the disk's perimeter.  As you might imagine, making a structure like that where the graphene disk is suspended in space, yet there is a nice contact to the central electrode without disrupting the disk, is quite a fabrication tour de force, based on an approach from here.
  • Vincent Bouchiat from CNRS, Grenoble talked about using tin-decorated graphene as a system to explore the nature of the superconductor-insulator transition.  It's a flexible material system, in that you can control the coverage of the tin (the size and distribution of tin islands), the disorder in the graphene via damage, and the carrier density in the graphene via electrostatic gating.   An earlier paper is here, and a more recent one is here.
  • Steven Richardson of Howard University spoke about the challenges of trying to make germanene, the germanium analog to graphene.  One approach that has been used in graphene growth has been to start with small, polycyclic carbon ring molecules as seeds.  Doing this in germanium has proven difficult, and Prof. Richardson's group does quantum chemistry calculations with DFT to establish the relative energetic stability and properties of candidate molecules.  From his talk I learned something I had not appreciated, that treating dispersion forces (van der Waals interactions) in DFT is really nontrivial.  
  • James Analytis of Berkeley gave a very nice talk about Weyl fermions, where I actually felt like I had a grasp of this for a few minutes.  Up to now, most of the experiments on materials that are supposed to support Weyl-like band structure have been based on photoemission, rather than actual transport.  Prof. Analytis showed particular transport signatures (quantum oscillations of resistance as a function of magnetic field) that are consistent with what one would expect from electrons actually tracing out Weyl-expected trajectories (in both real space and reciprocal space).  This work relies on impressive nanofabrication, where a focused ion beam is used to carve Cd\(_3\)As\(_2\) into nanostructures + leads without killing the material quality.
  • Yoshinori Tokura from RIKEN surveyed his group's results looking at the interplay of magnetism, the quantum Hall effect, and the quantum anomalous Hall effect, built on high quality epitaxial structures based on a topological insulator (Bi\(_{1-x}\)Sb\(_{x}\))\(_2\)Te\(_3\) and its Cr-doped relative.  Relevant papers are here, here, and here.  This is a great example of how much scientific activity can spring forth when it becomes possible to grow a new material system with very high quality.
  • Jagadeesh Moodera from MIT presented work that is similar in spirit, involving Cr doping of Bi\(_2\)Se\(_3\), and then V doping of Sb\(_2\)Te\(_3\).  In systems like this it is possible to see robust, ballistic transport via chiral edge states over millimeters.  Again, excellent material quality + interesting choices of materials = impressive science.
  • Joe Checkelsky of MIT spoke about exploring electronic materials with magnetically frustrated lattices.  Many systems with magnetic frustration (where magnetic moments at different lattice sites have competing interactions so that it's not possible to satisfy all of them) are insulators.  In conducting versions of these systems, there can be really funky effects where the magnetic states interact with the electrons through mechanisms like Berry curvature.  This work is in press right now and I will come back and update this once it's available online.
  • Hajime Okamoto from NTT gave a neat talk about optomechanical effects (see here for a review) - where photogenerated carriers in an AlGaAs/GaAs cantilever can couple (via the piezoelectric properties of the material) to the mechanical oscillations of the cantilever.  This makes it possible to do an interesting kind of optical driving and optical cooling of such structures.   See here and here, for example.
Whew.  Overall, a fun, interesting, and dense two days!

Tuesday, June 14, 2016

Frontiers in Quantum Materials and Devices 2016 - day 1

There were a number of really interesting talks at the Harvard/MIT sponsored, RIKEN-co-sponsored FQMD workshop this week.   I'm very grateful for the invitation to come and present.  It was a very dense two days!  I have to be a bit careful in what I write, given that some of the work is not yet published.  Here are some highlights.  I'll try to use links to the arxiv versions of the papers so that people without paid access can see them.

  • Ania Bleszynski-Jayich of UCSB spoke about her group's impressive nanoscale magnetic imaging using single nitrogen-vacancy centers in diamond AFM tips.  The N-V centers are defects in the diamond lattice, where an N atom is substituted for a C atom, directly adjacent to a C-atom vacancy.  These defects play host to a single unpaired electronic spin and can be probed through optically detected magnetic resonance.  Brendan Shields at Basel gave a talk later in the day on this technique as well - impressive imaging of domains in antiferromagnetic (!) structures.
  • Naoto Nagaosa of RIKEN gave an overview of his group's work on nonlinear and nonreciprocal electronic and optical responses in special (topological) materials - see here, here, and here for examples.  The last of these is an example where because of funky topological band structure, you can have a material that is rectifying (resistance \( R(I) \neq R(-I)\) ) where the rectification is controlled by a magnetic field.
  • Dylan Maher of Bristol, most recently in the spotlight for cool quantum optics work with Aephraim Steinberg, gave a great overview of the impressive integrated photonics capabilities at Bristol - see here, here, and here.
  • Satoshi Iwamoto of Tokyo showed some neat results involving 3d chiral photonic materials (that is, materials with optical helicity built into their structure).  The wild thing here is that these materials in particular are constructed by manually stacking (!) individual nanoscale-thickness layers, using manipulation within an electron microscope - see here for an example.
  • Jason Petta from Princeton presented some really technically beautiful work involving SiGe quantum dots coupled to (and via) superconducting resonators.  These are gate-defined dots, where metal electrodes are used as capacitor electrodes to "suck in" and confine electrons.  It's hard to explain to a non-expert just how technically impressive the multiple gate structures are that they've developed.  See here.   Figure 1 just doesn't do it justice.
  • Makoto Kohda of Tohoku spoke very clearly about spin-orbit effects in GaAs 2d electron gas and in the layered semiconductor GaSe.  He showed very cool stuff - this paper showing coherent motion and precession of spin over long distances, and gate-controlled switching between weak localization and weak antilocalization in tape-exfoliated GaSe.
  • Bill Wilson, executive director of Harvard's CNS, gave an overview of their nanofab facility.  Truly, it is amazing how much internal investment Harvard has made in that facility, and I'm not even talking about the construction of the building itself.  It's very hard not to be jealous.  As often comes up when talking about Harvard, we again see that having a $40B endowment makes many problems faced by mere mortals simply evaporate.