Sunday, December 28, 2008

More about insulators

I've been thinking more about explaining what we mean by "insulators", in light of some of the insightful comments. As I'd said, we can think about three major classes of insulators: band insulators (a large gap due to single-particle effects (more below) exists in the ladder of electronic states above the highest occupied state); Anderson insulators (the highest occupied electronic states are localized in space, rather than extending over large distances; localization happens because of disorder and quantum interference); and Mott insulators (hitherto neglected electron-electron interactions make the energetic cost of moving electrons prohibitively high).

The idea of an energy gap (a big interval in the ladder of states, with the states below the gap filled and the states above the gap empty) turns out to be a unifying concept that can tie all three of these categories together. In the band insulator case, the states are pretty much single-particle states (that is, the energy of each state is dominated by the kinetic energies of single electrons and their interactions with the ions that supply the electrons). In the Anderson insulator case, the gap is really the difference in energy between the highest occupied state and the nearest extended state (called the mobility edge). In the Mott case, the states in question are many-body states that have a major contribution due to electron-electron interactions. The electron-electron interaction cost associated with moving electrons around is again an energy gap (a Mott gap), in the ladder of many-body (rather than single-particle) states.
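The gapped "ladder of states" picture can be made concrete with a toy model. Below is a minimal numerical sketch (my own illustration, not anything from the discussion above): a 1D tight-binding chain with alternating hopping amplitudes t1 and t2, whose two bands E±(k) = ±sqrt(t1² + t2² + 2 t1 t2 cos k) are separated by a band gap of 2|t1 − t2|. The parameter values are arbitrary choices.

```python
import numpy as np

# Toy dimerized 1D chain: alternating hopping amplitudes t1, t2 (arbitrary values).
# The two bands are E±(k) = ±sqrt(t1^2 + t2^2 + 2*t1*t2*cos k); states below the
# gap get filled, states above stay empty -- the band-insulator picture.
t1, t2 = 1.0, 0.6
k = np.linspace(-np.pi, np.pi, 1001)
E = np.sqrt(t1**2 + t2**2 + 2 * t1 * t2 * np.cos(k))  # upper band; lower band is -E

gap = 2 * E.min()   # bottom of the upper band minus top of the lower band (-E)
print(f"band gap = {gap:.3f}")   # -> 0.800, i.e. 2*|t1 - t2|
```

If t1 = t2 (a uniform chain) the gap closes and the two bands merge into one, which is the metallic limit of this cartoon.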

I could also turn this around and talk in terms of the local vs. extended character of the highest occupied states (as Peter points out). In the ideal (infinite periodic solid) band insulator case, all (single-particle) electronic states are extended, and it's the particular lattice arrangement and electronic population that determines whether the highest occupied state is far from the nearest unoccupied state. In the Anderson case, quantum interference + disorder leads to the highest occupied states looking like standing waves - localized in space. In the Mott case, it's tricky to try to think about many-body states in terms of projections onto single-particle states, but you can do so, and you again find that the highest relevant states are localized (due, it turns out, to interactions). Like Peter, I also have been meaning to spend more time thinking hard about insulators.
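The localized-vs-extended distinction is also easy to see numerically. Here is a minimal sketch of Anderson localization in a toy 1D tight-binding chain with random on-site energies; the chain length, hopping, and disorder strength are arbitrary values I picked for illustration. The inverse participation ratio (IPR) distinguishes the two cases: roughly 1/N for an extended state spread over N sites, and of order one over the localization length for a localized state.

```python
import numpy as np

rng = np.random.default_rng(0)
N, t, W = 200, 1.0, 3.0   # chain length, hopping, disorder strength (toy values)

def mean_ipr(disordered):
    # 1D tight-binding chain: hopping t between neighbors, plus (optionally)
    # random on-site energies in [-W/2, W/2] -- the "hills and holes".
    eps = rng.uniform(-W / 2, W / 2, N) if disordered else np.zeros(N)
    H = np.diag(eps) - t * (np.eye(N, k=1) + np.eye(N, k=-1))
    _, psi = np.linalg.eigh(H)           # columns of psi are the eigenstates
    # Inverse participation ratio, averaged over all eigenstates:
    # ~1/N for extended states, much larger for localized ones.
    return np.mean(np.sum(np.abs(psi) ** 4, axis=0))

print("clean chain, mean IPR:     ", mean_ipr(False))  # small (~1/N): extended
print("disordered chain, mean IPR:", mean_ipr(True))   # much larger: localized
```

Even this weak, featureless disorder localizes every state in 1D, which is the quantum-interference point above: no deep "holes in the field" are required.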

Coming soon: a discussion of "metals".


It's a small world. Just last week I finished reading this book, a very nice biography of Michael Faraday, possibly the greatest experimental physicist ever. Lo and behold, this week there are two long blog postings (here and here) also talking about Faraday. What an impressive scientist. Now I need to find a good bio of Maxwell....

Friday, December 26, 2008

What does it mean for a material to be an "insulator"?

I've been thinking for a while about trying to explain some physics concepts on, well, a slightly more popular level. This is a first pass at this, focusing on electrical insulators. Feedback is invited. I know that this first cut won't be perfect for nonscientists.
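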

Very often we care about the electrical properties of materials. Conceptually, we imagine hooking the positive terminal of a battery up to one end of a material, hooking the negative terminal up to the other end, and checking to see if any current is flowing. We broadly lump solids into two groups, those that conduct electricity and those that don't. Materials in the latter category are known as insulators, and it turns out that there are at least three different kinds.
  • Band insulators. One useful way of thinking about electrons in solids is to think about the electrons as filling up single-particle states (typically with two electrons per state). This is like what you learn in high school chemistry, where you're taught that there are certain orbitals within atoms that get filled up, two electrons per orbital. Helium has two electrons in the 1s orbital, for example. In solids, there are many, many states, each one with an associated energy cost for being occupied by an electron, and the states are grouped into bands separated by intervals of energy (band gaps) with no states. (Picture a ladder with groups of closely spaced rungs, and each rung has two little divots where marbles (the electrons) can sit.) Now, in clean materials, you can think of some states as corresponding to electrons moving to the left. Some states correspond to electrons moving to the right. In order to get a net flow of electrons when a battery is used to apply a voltage difference across a slab of material, there have to be transitions that, for example, take electrons out of left-moving states and put them into right-moving states, so that more electrons are going one way than the other. For this to happen, there have to be empty states available for the electrons to occupy, and the net energy cost of shifting the electrons around has to be low enough that it's supplied by the battery or by thermal energy. In a band insulator, all of the states in a particular band (usually called the valence band) are filled, and the closest empty states are too far away energetically to be reached. (In the ladder analogy, the next empty rung is waaay far up the ladder.) This is the situation in materials like diamond, quartz, and sapphire.
  • Anderson insulators. These are materials where disorder is responsible for insulating behavior. In the ladder analogy above, each rung of the ladder corresponded to what we would call an "extended" state. To get a picture of what this means, consider looking at a smooth, grooved surface, like a freshly plowed field, and filling it partially with water. Each furrow would be an extended state, since on a level field water would extend along the furrow from one end of the field to the other. Now, a disordered system in this analogy would look more like a field pockmarked with hills and holes. Water (representing the electrons) would pool in the low spots rather than forming a continuous line from one end of the field to the other. These local low spots are defects, and the puddles of water correspond to localized states. In the real quantum situation things are a bit more complicated. Because of the wavelike nature of electrons, even weak disorder (shallow dips rather than deep holes in the field) can lead to reflections and interference effects that cause states to be localized on a big enough "field". Systems like this are insulating (at least at low temperatures) because it takes energy to hop electrons from one puddle to another. For small applied voltages, nothing happens (though clearly if one imagines tilting the whole field enough, all the water will run downhill - this would correspond to applying a large electric field). Examples of this kind of insulating behavior include doped polymer semiconductors.
  • Mott insulators. Notice that nowhere in the discussion of band or Anderson insulators did I say anything at all about the fact that electrons repel each other. Electron-electron interactions were essentially irrelevant to those two ways of having an insulator. To understand Mott insulators, think about trying to pack ping-pong balls closely in a 2d array. The balls form a triangular lattice. Now the repulsion of the electrons is represented by the fact that you can't force two ping-pong balls to occupy the same site in the 2d lattice. Even though you "should" be able to put two balls (electrons) per site, the repulsion of the electrons prevents you from doing so without comparatively great energetic cost (associated with smashing a ping-pong ball). The result is that, with exactly one ball (electron) per site (a "half-filled band") in this situation dominated by ball-ball interactions ("on-site repulsion"), no ball is able to move in response to an applied push (electric field). To get motion (conduction) in this case, one approach is to remove some of the balls (electrons) to create vacancies in the lattice. This can be done via chemical doping. Examples of Mott insulators are some transition metal oxides like V2O3 and the parent compounds of the high temperature superconductors.
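The interaction-driven gap in the Mott case can be illustrated with the standard textbook two-site Hubbard model (a toy model I'm bringing in for illustration, not something from the discussion above): two sites, hopping t between them, and an energy cost U for putting two electrons on the same site. The charge gap Δ = E(N+1) + E(N−1) − 2E(N) at half filling (N = 2 electrons) is exactly the cost of moving an electron, and the closed-form ground-state energies are simple enough to check by hand.

```python
import numpy as np

# Two-site Hubbard toy model: hopping t, on-site repulsion U.
# Charge gap Delta = E(3) + E(1) - 2*E(2) at half filling (2 electrons).
t = 1.0

def charge_gap(U):
    E1 = -t                                      # one electron: bonding orbital
    E2 = 0.5 * (U - np.sqrt(U**2 + 16 * t**2))   # two electrons: singlet ground state
    E3 = U - t                                   # three electrons: one forced double occupancy
    return E3 + E1 - 2 * E2

# Note: even at U = 0 this two-site model has a trivial gap of 2t (finite-size
# level spacing); the part that grows with U is the interaction-driven Mott gap.
for U in (0.0, 4.0, 20.0):
    print(f"U = {U:5.1f}  charge gap = {charge_gap(U):.3f}")
```

For U much larger than t the gap approaches U itself: the interactions alone make the half-filled system insulating, which is the Mott mechanism in miniature.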
Often when speaking of "metals" vs. "insulators", we are interested in the ground state of the material, the state that would describe the material in equilibrium as T approaches 0. Materials with an electrical resistivity that tends toward infinity as T approaches 0 are insulators in this sense.

Tuesday, December 23, 2008

Quantum dots in graphene

The progress in graphene experiments continues. Unsurprisingly, many people are interested in using graphene, a natural 2d electronic system, to make what some would call quantum dots: localized puddles of electrons separated from "bulk" leads by tunnel barriers. Step one in making graphene quantum dots is to etch graphene into a narrow constriction. You can end up with localized electronic states due to disorder (from the edges and the underlying substrate). This Nano Letter shows examples of this situation. Alternately, you can make structures using gates that can define local regions of p-type (local chemical potential biased into the valence band) and n-type (conduction band) conduction, separated by tunnel junctions formed when p and n regions run into each other. That situation is described here. Neat stuff. I would imagine that low temperature transport measurements through such structures in the presence of applied magnetic fields should be very revealing, given the many predictions about exotic magnetic properties in edge states of graphene ribbons.

Saturday, December 20, 2008

The new science and technology team

The President-elect has named his science team. Apart from the fact that these folks are all highly qualified (the new administration will have two Nobel laureates advising it directly), I'm told by a senior colleague well-versed in policy that the real good news is the re-promotion of the science advisor position back to the level of authority that it had prior to 2001, and the reinvigoration of PCAST.

Friday, December 19, 2008

At the risk of giving offense....

I see that New Scientist has an article effectively making fun of the Department of Defense for asking their major scientific advisory panel, JASON, to look into a company's claim that it could use gravitational waves as an imaging tool. JASON rightly determined that this was not something to worry about. Seems like a non-story to me. Thank goodness New Scientist has never actively promoted something manifestly scientifically wacky on their front cover, like a microwave cavity that violates conservation of momentum. Oh wait.

Wednesday, December 17, 2008

Outside shot

Since the President-Elect has not yet named his science advisor (though his transition team has named point people, and the nominee for Secretary of Energy has impeccable credentials), I thought I'd point out another crucial way that I would fit in. Sure, I'm under 5' 8" tall, but I can (sometimes) shoot; as some of my college friends can attest, I won a gift certificate in undergrad days by sinking a shot from the top of the key at a women's basketball game halftime promo. (For the humor-impaired: I'm not really in the running to be part of the Obama administration.)

Let them fail.

Please explain to me why we should give AIG another penny.

A couple of ACS papers

Two recent papers in the ASAP section of Nano Letters caught my eye.

The first is van der Molen et al., "Light-controlled conductance switching of ordered metal−molecule−metal devices". I've written a blurb about this for the ACS that will eventually show up here. The Schönenberger group has been working for a while on an approach for measuring molecular conductances that is based on networks of metal nanoparticles linked by molecules of interest. The idea is to take metal nanoparticles and form an ordered array of them with neighbors linked by molecules of interest covalently bound to the particle surfaces. The conductance of the array tells you something about the conductance of the particle-molecule-particle junctions. This is simple in concept and extremely challenging in execution, in part because when the metal nanoparticles are made by chemical means they are already coated with some kind of surfactant molecules to keep them suspended in solution. Performing the linking chemistry in a nice way and ending up with an ordered array of particles rather than a blob of goo requires skill and expertise. These folks have now made arrays incorporating molecules that can reversibly change their structure upon exposure to light of the appropriate wavelength. The structural changes show up in photo-driven changes in the array conductance.

The second is Ryu et al., "CMOS-Analogous Wafer-Scale Nanotube-on-Insulator Approach for Submicrometer Devices and Integrated Circuits Using Aligned Nanotubes". Lots of people talk a good game about trying to make large-scale integrated circuits using nanotubes, but only a couple of groups have made serious progress. This paper by Chongwu Zhou's group shows that they can take arrays of tubes (grown by chemical vapor deposition on quartz or sapphire substrates), transfer them to Si wafers via a clever method involving gold, pattern the tubes, put down electrodes for devices, burn out the metallic tubes, and dope the semiconducting tubes chemically for either p- or n-type conduction. They are also working on fault-tolerant architectures to deal with the fact that each transistor (which in this case incorporates an ensemble of tubes) has slightly different characteristics.

Saturday, December 13, 2008

Manhattan and Apollo project metaphors

Relatively regularly these days there are calls for a Manhattan or Apollo style project to address our energy challenges. While this may sound good, it's worth considering what such a project would actually mean. I found this article (pdf) to be helpful. For the Manhattan project, peak annual funding reached about 1% of total federal outlays and 0.4% of GDP. For Apollo, the peak annual numbers were 2.2% of the federal budget and also 0.4% of GDP. What would that mean in today's numbers? Well, the federal budget is on the order of $3T, and the GDP is around $14T. If we use the GDP numbers, such a commitment of resources would be $56B per year. That's approximately the combined budgets of DOE, NIH, and NSF. Bear in mind that when we did Apollo, it's not like all other efforts stopped, so really pulling something like this off would mean a significant new allotment of money on top of existing research budgets.
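The arithmetic in that paragraph is easy to check (2008 ballpark figures, as quoted above):

```python
# Back-of-the-envelope check of the Apollo/Manhattan scale-up (2008 numbers).
gdp = 14e12                    # US GDP ~ $14T
federal_budget = 3e12          # federal outlays ~ $3T

apollo_share_of_gdp = 0.004    # ~0.4% of GDP at peak, per the cited article
commitment = apollo_share_of_gdp * gdp
print(f"${commitment / 1e9:.0f}B per year")   # -> $56B per year

# Same commitment expressed as a fraction of the federal budget:
print(f"{commitment / federal_budget:.1%} of federal outlays")
```

So an Apollo-scale commitment would be roughly 1.9% of federal outlays, comparable to the Apollo-era 2.2% figure, which is a useful sanity check on the numbers.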

It's worth pointing out that the government has given three times this amount to AIG alone. It's also worth mentioning that communications technologies are vastly superior to those of the past. Presumably large collaborations can be managed more easily and would eliminate the need to uproot the top researchers in the world from their homes and relocate them in a single central location.

Anyway, those are at least some real numbers, and they're not crazy or unattainable given a strong lead in national priorities from the top. The real challenge is figuring out what the true goal is. It's fine to say "energy independence" or some target number for renewables, but the global energy challenge is a lot more diffuse and multidimensional than either putting people on the moon or developing the atomic bomb.

Thursday, December 11, 2008


I know that the journal is called Nano Letters, but using the "nano" prefix three times in the title of a paper is a little extreme.

Wednesday, December 10, 2008


As US readers have probably heard by now, Steve Chu has been selected by President-Elect Obama to be the new Secretary of Energy. I think that this is very good news. He's extremely smart, and he's been involved in pretty much the entire research enterprise, from his days at Bell Labs to running a research group at Stanford to serving as department chair to acting as director at LBL. He's been actively worrying about both basic and applied research and understands the actual energy demands of the country and the world. As the finance folks say, past performance is not necessarily indicative of future results, but this appointment gives me real cause for optimism. Steve Chu is extremely qualified for this.

What to do about public perception of science

From the comments on my last post, it's clear that there are a number of science types out there who view the situation as hopeless: the public is poorly informed and has bigger things to worry about; despite having direct evidence every day of the importance of science (ubiquitous computers, lasers, GPS, MRI, DNA testing) the public feels that science is somehow not relevant to their lives and finds even the basic concepts inaccessible; because there is no financial incentive for people to learn about science they won't; etc. While there is a grain of truth to these comments, there is plenty of evidence that is more hopeful. There is no question that certain science topics capture the public imagination: Mars rovers, using genetic technology to cure diseases or identify relationships between individuals or species, the LHC (talk about an effective marketing job, at least to some segment of the population).

Chad Orzel has many good things to say about what scientists can do to help improve the situation, and I won't repeat them here. If you are personally trying to do outreach, I do have one suggestion. Remember that people like a compelling story and interesting characters. The story can be a scientific one (Longitude), a personal one (Genius), or a large-scale drama (The Making of the Atomic Bomb), but it is possible to capture and hold people's attention on scientific subjects. I'm not suggesting that everyone should go out and try to write a popular book on science, but try to remember what makes the best of those books successful.

Friday, December 05, 2008

Do people just not care about science and technology?

CNN seems to think that's the case. As others have pointed out (1, 2), they've decided to close down their science and technology division and fold that reporting back into their general news category. So, what is the message here? That the ad revenue CNN can generate from having good science journalism doesn't justify the expense? (I'm sure that they'll claim quality of reporting won't change, but realistically you're not going to get good science journalism if it's a part-time gig for people who are spending more of their time reporting on Jennifer Aniston's latest romantic travails.) What does it say about our society that we're completely dependent on technology (and therefore science), and yet pursuing science or engineering is viewed as "nerdy" and even accurately reporting on it is judged not worth the expense? Sorry for the buzz-kill of a Friday post, but this really depresses me.

Tuesday, December 02, 2008

This week in cond-mat

Two brief mentions this week.

arxiv:0811.4491 - M. Häfner et al., Anisotropic magnetoresistance in ferromagnetic atomic-sized metal contacts
Over the last few years there's been quite an interest in the effect of magnetic fields on the electrical resistance of atomic-scale contacts between ferromagnetic metals (e.g., Ni). As last year's Nobel Prize in physics highlighted, if one can create nanostructures with very large magnetoresistive effects, there can be immediate applications in magnetic data storage. Recent investigations (for example, here and here) have shown dramatic variations in the magnetoresistance of such contacts from device to device. It's pretty clear that the atomic-scale details of the structures end up mattering (which is pretty cool, but rather discouraging from the technological side). This paper is a theoretical look at this issue, emphasizing that this sensitivity to details likely results from the fact that those last few atoms at the contact are undercoordinated - that is, they have fewer neighbors than atoms in the bulk of the magnetic metal.

- Masubuchi et al., Fabrication of graphene nanoribbon by local anodic oxidation lithography using atomic force microscope
I've been waiting for a paper like this to show up. It's been known for about a decade now that one can use the tip of an atomic force microscope to do very local electrochemistry. This has been exploited to make interesting metal/metal oxide structures, designer surface details on Si, and impressive quantum dot systems in GaAs. These folks have done the same on graphene. Nice.
UPDATE: As was pointed out in the comments, I had overlooked two earlier reports of AFM oxidation of graphene, here and here.

You've got to be kidding.

Don't take this as a comment one way or the other on D-wave's actual product claims about building adiabatic quantum computers, but check out the image at right from their presentation at a conference on supercomputing about "disruptive technologies". This may be great for raising [edited in response to comment] enthusiasm, but it's almost self-parody. Remember, true disruptive technologies reshape the world. Examples include fire, agriculture, the wheel, the transistor, the digital computer, the laser, and arguably the internet (if you consider it separately from the computer). Must be nice to claim, on the basis of one real data point and one projected data point, that a product is on a disruptive trajectory (in productivity (arbitrary units) vs. time (arbitrary units)). Wow.