A blog about condensed matter and nanoscale physics. Why should high energy and astro folks have all the fun?

Wednesday, December 29, 2010

I was just able to help out my postdoc by pulling an old Bell Labs notebook from 11.5 years ago off my bookshelf and showing him a schematic of an electrical measurement technique. This is an object lesson in why it is a good idea to keep a clear, complete lab notebook! I try very hard to impress upon undergrad and graduate students alike that it's critically important to keep good notes, even (perhaps especially) in these days of electronic data acquisition and analysis. I've never once looked back and regretted how much time I spent writing things down, or how much paper I used - good record-keeping has saved my bacon (and lots of time) on multiple occasions. Unfortunately, with rare exceptions, students come into the university (at the undergrad or grad levels) and seem determined to write down as little as possible, using as few sheets of paper as they can manage. Somewhere along the way (before grad school, though my thesis advisor was outstanding about this), it got pounded into my brain: if you didn't document it, you didn't do it. Perhaps we should make a Facebook-like or Twitter-like application that would sucker student researchers into obsessively updating their work status....
Tuesday, December 28, 2010
Statistical mechanics: still work to be done!
Statistical mechanics, the physics of many-particle systems, is a profound intellectual achievement. A statistical approach to systems with many degrees of freedom makes perfect sense. It's ridiculous to think about solving Newton's laws (or the Schroedinger equation, for that matter) for all the gas molecules in this room. Apart from being computationally intractable, it would be silly for the vast majority of issues we care about, since the macroscopic properties of the air in the room are approximately the same now as they were when you began reading this sentence. Instead of worrying about every molecule and its interactions, we characterize the macroscopic properties of the air by a small number of parameters (the pressure, temperature, and density). The remarkable achievement of statistical physics is that it places this on a firm footing, showing how one can go from the microscopic degrees of freedom, through a statistical analysis, and out the other side with the macroscopic parameters.
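As a minimal illustration of that microscopic-to-macroscopic step (standard kinetic theory, nothing new here): for a dilute gas of N molecules of mass m in a volume V,

p = (1/3) n m \langle v^2 \rangle = n k_B T,   with n = N/V,

where the first equality comes from averaging molecular impacts on the container walls, and the second uses equipartition, (1/2) m \langle v^2 \rangle = (3/2) k_B T. Notice that all reference to individual trajectories has dropped out; only statistical averages survive.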
Monday, December 20, 2010
Science's Breakthrough of the Year for 2010
Science Magazine has named the work of a team at UCSB directed by Andrew Cleland and John Martinis as their scientific breakthrough of the year for 2010. Their achievement: the demonstration of a "quantum machine". I'm writing about this for two reasons. First, it is extremely cool stuff that has a nano+condensed matter focus. Second, this article and this one in the media have so many things wrong with them that I don't even know where to begin, and upon reading them I felt compelled to try to give a better explanation of this impressive work.
One of the main points of quantum mechanics is that systems tend to take in or emit energy in "quanta" (chunks of a certain size) rather than in any old amount. This quantization is the reason for the observation of spectral lines, and mathematically is rather analogous to the fact that a guitar string can ring at a discrete set of harmonics and not at any arbitrary frequency. The idea that a quantum system at low energies can have a very small number of states, each corresponding to a certain specific energy, is familiar (in slightly different language) to every high school chemistry student who has seen s, p, and d orbitals and talked about the Bohr model of the atom. The quantization of energy shows up not just in electronic transitions (the case we've discussed so far), but also in mechanical motion. Vibrations are quantized: in quantum mechanics, a perfect ball-on-a-spring mechanical oscillator with mechanical frequency f can only emit or absorb energy in chunks of size hf, where h is Planck's constant. Furthermore, there is some lowest-energy allowed state of the oscillator, called the "ground state". Again, this is all old news, and such vibrational quantization is clear as a bell in many spectroscopy techniques (infrared absorption and Raman spectroscopy, for example).
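In symbols (standard quantum mechanics, not specific to this experiment): the allowed energies of a harmonic oscillator of frequency f are

E_n = (n + 1/2) h f,   n = 0, 1, 2, ...

so the oscillator can only exchange energy with the outside world in steps of hf, and even the n = 0 ground state retains a zero-point energy of hf/2.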
The first remarkable thing done by the UCSB team is to manufacture a mechanical resonator containing millions of atoms, and to put that whole object into its quantum ground state (by cooling it so that the thermal energy scale is much smaller than hf for that resonator). In fact, that's the comparatively easy part. The second (and truly remarkable) thing that the UCSB team did was to confirm experimentally that the resonator really was in its ground state, and to deliberately add and take away single quanta of energy from the resonator. This is very challenging to do, because quantum states can be quite delicate - it's very easy to have your measurement setup mess with the quantum system you're trying to study!
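A rough way to quantify "much smaller than hf" (my back-of-the-envelope numbers, for illustration): in thermal equilibrium, the mean number of quanta sitting in an oscillator is the Bose-Einstein occupation

\bar{n} = 1/(e^{hf/k_B T} - 1),

which drops below 0.1 once k_B T < hf/2.4 or so. For a microwave-frequency resonator with f of order 6 GHz, hf/k_B is roughly 0.3 K, so dilution-refrigerator temperatures of a few tens of mK leave the resonator in its ground state essentially all the time.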
What is the point? Well, on the basic science side, it's of fundamental interest to understand how complicated many-particle systems behave when they are placed in highly quantum situations. That's where much of the "spookiness" of quantum physics lurks. On the practical side, the tools developed to do these kinds of experiments are one way that people like Martinis hope to build quantum computers. I strongly encourage you to watch the video on the Science webpage (should be free access w/ registration); it's a thorough discussion of this impressive achievement.
Tuesday, December 14, 2010
Taking temperatures at the molecular scale
As discussed in my previous post, temperature may be associated with how energy is distributed among microscopic degrees of freedom (like the vibrational motion of atoms in a solid, or how electrons in a metal are placed into the allowed electronic energy levels). Moreover, it takes time for energy to be transferred (via "inelastic" processes) among and between the microscopic degrees of freedom, and during that time electrons can actually move pretty far, on the nano scale of things. This means that if energy is pumped into the microscopic degrees of freedom somehow, it is possible to drive those vibrations and electronic distributions way out of their thermal equilibrium configurations.
So, how can you tell if you've done that? With macroscopic objects, you can still describe the nonequilibrium situation with an effective temperature, and measure that temperature with a thermometer. For example, when cooking a pot roast in the oven (this example has a special place in the hearts of many Stanford graduate physics alumni), the roast is out of thermal equilibrium but in an approximate steady state. The outside of the roast may be brown, crisp, and at 350 F, while the inside may be pink, rare, and at 135 F. You could find these effective temperatures (effective because, strictly speaking, temperature is an equilibrium parameter) by sticking a probe thermometer at different points in the roast, and as long as the thermometer is small (little heat capacity compared to the roast), you can measure the temperature distribution.
What about nanoscale systems? How can you look at the effective temperature or how the energy is distributed in microscopic degrees of freedom, since you can't stick in a thermometer? For electrons, one approach is to use tunneling (see here and here), which is a topic for another time. In our newest paper, we use a different technique, Raman spectroscopy.
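The standard idea behind Raman thermometry, in outline (the paper's specifics aside): Stokes scattering creates a vibrational quantum while anti-Stokes scattering destroys one, so anti-Stokes scattering can only happen if the mode is already excited. Apart from frequency-dependent prefactors, the intensity ratio for a vibrational mode of frequency \omega is

I_{AS}/I_{S} \approx e^{-\hbar\omega/k_B T_{eff}},

so measuring that ratio reads out the mode's effective temperature T_{eff} without ever touching the system with a physical thermometer.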
Monday, December 13, 2010
Temperature, thermal equilibrium, and nanoscale systems
In preparation for a post about a new paper from my group, I realized that it will be easier to explain why the result is cool if I first write a bit about temperature and thermal equilibrium in nanoscale systems. I've tried to write about temperature before, and in hindsight I think I could have done better. We all have a reasonably good intuition for what temperature means on the macroscopic scale: temperature tells us which way heat flows when two systems are brought into "thermal contact". A cool coin brought into contact with my warm hand will get warmer (its temperature will increase) as my hand cools down (its temperature will locally decrease). Thermal contact here means that the two objects can exchange energy with each other via microscopic degrees of freedom, such as the vibrational jiggling of the atoms in a solid, or the particular energy levels occupied by the electrons in a metal. (This is in contrast to energy in macroscopic degrees of freedom, such as the kinetic energy of the overall motion of the coin, or the potential energy of the coin in the gravitational field of the earth.)
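For the more formally inclined: statistical mechanics makes "which way heat flows" precise by defining

1/T = \partial S/\partial E,

where S is the entropy and E the energy in those microscopic degrees of freedom. When two systems can exchange energy, the total entropy grows if energy flows from the one with the smaller \partial S/\partial E (the hotter one) to the one with the larger (the colder one); equilibrium is reached when the two temperatures match.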
We can turn that around, and try to use temperature as a single number to describe how energy is distributed among the (microscopic) degrees of freedom. This is not always a good strategy. In the coin I was using as an example, you can conceive of many ways to distribute vibrational energy. Number all the atoms in the coin, and have the even-numbered atoms moving to the right and the odd-numbered atoms moving to the left at some speed at a given instant. That certainly would have a bunch of energy tied up in vibrational motion. However, that weird and highly artificial arrangement of atomic motion is not what one would expect in thermal equilibrium. Likewise, you could imagine looking at all the electronic energy levels possible for the electrons in the coin, and popping every third electron up to some high, otherwise unoccupied energy level. That distribution of energy in the electrons is allowed, but not the sort of thing that would be common in thermal equilibrium. There are certain vibrational and electronic distributions of energy that are expected in thermal equilibrium (when the system has sat long enough that it has reached steady-state as far as its statistical properties are concerned).
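Those expected distributions have standard forms. In equilibrium at temperature T, the probability that a vibrational mode of frequency f holds n quanta is proportional to the Boltzmann factor

e^{-n h f/k_B T},

and the average electron occupation of a level at energy E is the Fermi-Dirac function

f(E) = 1/(e^{(E - \mu)/k_B T} + 1),

with \mu the chemical potential. Quantitatively, "out of equilibrium" means the actual occupations deviate from these forms.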
How long does it take a system to reach thermal equilibrium? That depends on the system, and this is where nanoscale systems can be particularly interesting. For example, there is some characteristic timescale for electrons to scatter off each other and redistribute energy. If you could directly dump in electrons with an energy 1 eV (one electron volt) above the highest occupied electronic level of a piece of metal, it would take time, probably tens of femtoseconds, before those electrons redistributed their energy by sharing it with the other electrons. During that time period, those energetic electrons can actually travel rather far. A typical (classical) electron velocity in a metal is around 10^6 m/s, meaning that the electrons could travel tens of nanometers before losing their energy to their surroundings. The scattering processes that transfer energy from electrons into the vibrations of the atoms can be considerably slower than that!
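Putting numbers to that estimate (taking \tau \approx 30 fs as a representative electron-electron scattering time, purely for illustration):

\ell \approx v \tau \approx (10^6 m/s) \times (3 \times 10^{-14} s) \approx 30 nm.

Tens of nanometers is enormous compared to atomic dimensions, which is the whole point for nanostructures.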
The take-home messages:
1) It takes time for electrons and vibrations to arrive at a thermal distribution of energy described by a single temperature number.
2) During that time, electrons and vibrations can have energy distributed in a way that can be complicated and very different from thermal distributions.
3) Electrons can travel quite far during that time, meaning that it's comparatively easy for nanoscale systems to have very non-thermal energy distributions, if driven somehow out of thermal equilibrium.
More tomorrow.
Saturday, December 11, 2010
NSF grants and "wasteful spending"
Hat tip to David Bacon for highlighting this. Republican whip Eric Cantor has apparently decided that the best way to start cutting government spending is to have the general public search through NSF awards and highlight "wasteful" grants that are a poor use of taxpayer dollars.
Look, I like the idea of cutting government spending, but I just spent two days in Washington DC sitting around a table with a dozen other PhD scientists and engineers arguing about which 12% of a large group of NSF proposals were worth trying to fund. I'm sure Cantor would brand me as an elitist for what I'm about to write, but there is NO WAY that the lay public is capable of making a reasoned critical judgment about the relative merits of 98% of NSF grants - they simply don't have the needed contextual information. Bear in mind, too, that the DOD budget is ONE HUNDRED TIMES larger than the NSF budget. Is NSF really the poster child of government waste? Seriously?
Tuesday, December 07, 2010
The tyranny of reciprocal space
I was again thinking about why it can be difficult to explain some solid-state physics ideas to the lay public, and I think part of the problem is what I call the tyranny of reciprocal space. Here's an attempt to explain the issue in accessible language. If you want to describe where the atoms are in a crystalline solid and you're not a condensed matter physicist, you'd either draw a picture, or say in words that the atoms are, for example, arranged in a periodic way in space (e.g., "stacked like cannonballs", "arranged on a square grid", etc.). Basically, you'd describe their layout in what a condensed matter physicist would call real space. However, physicists look at this and realize that you could be much more compact in your description. For example, for a 1d chain of atoms a distance a apart from each other, a condensed matter physicist might describe the chain by a "wavevector" k = 2π/a instead. This k describes a spatial frequency; a wave (quantum matter has wavelike properties) described by cos(kr) would go through a complete period (peak of wave to peak of wave, say) and start repeating itself over a distance a. Because k has units of 1/length, this wavevector way of describing spatially periodic things is often called reciprocal space. A given point in reciprocal space (k_x, k_y, k_z) implies particular spatial periodicities in the x, y, and z directions.
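To make that compactness concrete (standard textbook notation): any atomic density that repeats with period a can be written as a sum over a discrete set of wavevectors,

\rho(x) = \sum_n \rho_n e^{i G_n x},   G_n = 2πn/a   (n an integer),

so an infinite periodic structure is fully encoded by the amplitudes \rho_n at those special reciprocal-space points G_n.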
Why would condensed matter physicists do this - purely to be cryptic? No, not just that. It turns out that a particle's momentum (classically, the product of mass and velocity) in quantum mechanics is proportional to k for the wavelike description of the particle. Larger k (shorter spatial periodicity), higher momentum. Moreover, trying to describe the interaction of, e.g., a wave-like electron with the atoms in a periodic lattice is done very neatly by worrying about the wavevector of the electron and the wavevectors describing the lattice's periodicity. The math is very nice and elegant. I'm always blown away when scattering experts (those who use x-rays or neutrons as probes of material structure) can glance at some insanely complex diffraction pattern, and immediately identify particular peaks with obscure (to me) points in reciprocal space, thus establishing the symmetry of some underlying lattice.
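The two facts doing the heavy lifting there, in symbols: the de Broglie relation

p = \hbar k

ties wavevector to momentum, and elastic diffraction peaks appear exactly when the change in the probe's wavevector equals one of the lattice's reciprocal-space points (the Laue condition),

k' - k = G.

That's why an expert can read a diffraction pattern as a direct map of reciprocal space.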
The problem is, from the point of view of the lay public (and even most other branches of physics), essentially no one thinks in reciprocal space. One of the hardest things you (as a condensed matter physicist) can do to an audience in a general (public or colloquium) talk is to start throwing around reciprocal space without some preamble or roadmap. It just shuts down many nonexperts' ability to follow the talk, no matter how pretty the viewgraphs are. Extreme caution should be used in talking about reciprocal space to a general audience! Far better to have some real-space description for people to hang onto.
Friday, December 03, 2010
A seasonal abstract
On the anomalous combustion of oleic and linoleic acid mixtures
J. Maccabeus et al., Hebrew University, Jerusalem, Judea
Olive-derived oils, composed primarily of oleic and linoleic fatty acids, have long been used as fuels, with well characterized combustion rates. We report an observation of anomalously slow combustion of such a mixture, with a burn rate suppressed relative to the standard expectations by more than a factor of eight. Candidate explanations for these unexpectedly slow exothermic reaction kinetics are considered, including the possibility of supernatural agencies intervening to alter the local passage of time in the vicinity of the combustion vessel.
(Come on, admit it, this is at least as credible as either this or this.)