Thursday, July 30, 2009

More musing about phase transitions

Everyone has seen phase transitions - water freezing and water boiling, for example. These are both examples of "first-order" phase transitions, meaning that there is some kind of "latent heat" associated with the transition. That is, it takes a certain amount of energy to convert 1 g of solid ice into 1 g of liquid water while the temperature remains constant. The heat energy is "latent" because as it goes into the material, it's not raising the temperature - instead it's changing the entropy, by making many more microscopic states available to the atoms than were available before. In our ice-water example, at 0 C there are a certain number of microscopic states available to the water molecules in solid ice, including states where the molecules are slightly displaced from their equilibrium positions in the ice crystal and rattling around. In liquid water at the same temperature, there are many more possible microscopic states available, since the water molecules can, e.g., rotate all over the place, which they could not do in the solid state. (This kind of transition is "first order" because the entropy, which can be thought of as the first derivative of some thermodynamic potential, is discontinuous at the transition.) Because this kind of phase transition requires an input or output of energy to convert material between phases, there really aren't big fluctuations near the transition - you don't see pieces of ice bopping in and out of existence spontaneously inside a glass of ice water.
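For concreteness, the size of that entropy jump is easy to estimate. At a first-order transition the latent heat L enters at constant temperature, so the entropy discontinuity is just dS = L/T. Using standard textbook values for ice (latent heat of fusion about 334 J/g, melting at 273.15 K):

```python
# Entropy jump when 1 g of ice melts at 0 C.
# At a first-order transition the latent heat L goes in at constant
# temperature T, so the entropy changes discontinuously by dS = L / T.
L_fusion = 334.0   # J per gram, latent heat of fusion of ice (textbook value)
T_melt = 273.15    # K, melting point at atmospheric pressure

dS = L_fusion / T_melt  # J/(g K), the discontinuous entropy jump
print(f"entropy jump on melting: {dS:.2f} J/(g K)")
```

That works out to about 1.2 J/(g K): a direct, macroscopic measure of how many more microstates the liquid has available than the solid at the same temperature.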

There are other kinds of phase transitions. A major class of much interest to physicists is that of "second-order" transitions. If one goes to high enough pressure and temperature, the liquid-gas transition becomes second order, right at the critical point where the distinction between liquid and gas vanishes. A second order transition is continuous - that is, while there is a change in the collective properties of the system (e.g., in the ferro- to paramagnetic transition, you can think of the electron spins as many little compass needles; in the ferromagnetic phase the needles all point the same direction, while in the paramagnetic phase they don't), the number of microscopic states available doesn't change across the transition. However, the rate at which microstates become available with changes in energy is different on the two sides of the transition. In second order transitions, you can get big fluctuations in the order of the system near the transition. Understanding these fluctuations ("critical phenomena") was a major achievement of late 20th century theoretical physics.
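Those critical fluctuations are easy to see in a toy simulation. Below is a deliberately unoptimized Metropolis Monte Carlo sketch of the 2D Ising model (the standard theorist's playground for second-order transitions, with exact critical temperature T_c ≈ 2.27 in units of J/k_B); the lattice size, temperatures, and sweep counts here are arbitrary illustrative choices. Near T_c the magnetization fluctuates far more strongly than deep in the ordered phase:

```python
import math
import random

def magnetization_fluctuations(L=12, T=2.3, sweeps=4000, burn=1000, seed=1):
    """Metropolis Monte Carlo for the 2D Ising model (J = 1, k_B = 1).

    Returns the variance of the magnetization per spin, a crude
    measure of how strongly the order fluctuates at temperature T.
    """
    rng = random.Random(seed)
    spins = [[1] * L for _ in range(L)]  # start fully ordered
    samples = []
    for sweep in range(sweeps):
        for _ in range(L * L):
            i, j = rng.randrange(L), rng.randrange(L)
            # Sum of the four nearest neighbors (periodic boundaries).
            nb = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
                  + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
            dE = 2 * spins[i][j] * nb  # energy cost of flipping this spin
            if dE <= 0 or rng.random() < math.exp(-dE / T):
                spins[i][j] *= -1
        if sweep >= burn:
            samples.append(sum(map(sum, spins)) / (L * L))
    mean = sum(samples) / len(samples)
    return sum((m - mean) ** 2 for m in samples) / len(samples)

var_cold = magnetization_fluctuations(T=1.2)   # deep in the ordered phase
var_near = magnetization_fluctuations(T=2.3)   # close to the transition
print(var_cold, var_near)  # fluctuations grow dramatically near T_c
```

Deep in the ordered phase the magnetization barely budges; near the transition whole patches of spins flip back and forth, which is exactly the "order bopping in and out of existence" that first-order transitions don't show.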

Here's an analogy to help with the distinction: as you ride a bicycle along a road, the horizontal distance you travel is analogous to increasing the energy available to one of our systems, and the height of the road corresponds to the number of microscopic states available to the system. If you pedal along and come to a vertical cliff, and the road continues on above your head somewhere, that's a bit like the 1st order transition. With a little bit of energy available, you can't easily go back and forth up and down the cliff face. On the other hand, if you are pedaling along and come to a change in the slope of the road, that's a bit like the 2nd order case. Now with a little bit of energy available, you can imagine rolling back and forth over that kink in the road. This analogy is far from perfect, but maybe it'll provide a little help in thinking about these distinctions. One challenge in trying to discuss this stuff with the lay public is that most people only have everyday experience with first-order transitions, and it's hard to explain the subtle distinction between 1st and 2nd order.

Wednesday, July 22, 2009

The Anacapa Society

Hat tip to Arjendu for pointing this out. The Anacapa Society is a national society that promotes and encourages research in computational and theoretical physics at primarily undergrad institutions. They've had a good relationship with the KITP at UCSB, and have just signed an agreement that gives them a real home at Amherst College. (I've had a soft spot for Amherst since back in the day when I was struggling to decide whether to do the tier-1 research route vs. the undergrad education trajectory.) The nice thing about promoting this kind of research is that, particularly on the computational side of things, well-prepared undergrads at smaller institutions can make real contributions to science without necessarily the expensive infrastructure required for some hardcore experimental areas.

Cute optics demo

This YouTube video is something that I'll have to remember for a future demo. It shows that cellophane tape makes (one-sided) frosted glass appear to be transparent. Quite striking! The reason this works is pretty straightforward from the physics perspective. Frosted glass looks whitish because its surface has been covered (by sandblasting or something analogous) with little irregularities that have a typical size scale comparable to the wavelengths of visible light. Because of the difference in index of refraction between glass and air, these little irregularities diffusely scatter light, and they do a pretty equitable job across the visible spectrum. (This is why clouds are white, too, by the way.) By coating the glass intimately with a polymer layer (with an index of refraction closer to the glass than that of the air), one is effectively smoothing out the irregularities to a large degree. As far as I know, this is essentially the same physics behind why wet fabrics often appear darker than dry fabrics. Some of the apparent lightness of the dry material is due to diffuse scattering by ~ wavelength-sized stray threads and fibers. A wetting liquid acts as an index-matching medium, effectively smoothing out those inhomogeneities and reducing that diffuse scattering.
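The index-matching argument can be made semi-quantitative with the normal-incidence Fresnel formula for the reflectance at an interface, R = ((n1 - n2)/(n1 + n2))^2. The indices below are representative round numbers I'm assuming (glass about 1.5, tape adhesive about 1.47), not measured values for any particular tape:

```python
def reflectance(n1, n2):
    """Normal-incidence Fresnel reflectance at an n1/n2 interface."""
    return ((n1 - n2) / (n1 + n2)) ** 2

n_glass, n_air, n_adhesive = 1.5, 1.0, 1.47  # assumed representative indices

R_air = reflectance(n_glass, n_air)        # rough glass/air facet: ~4%
R_tape = reflectance(n_glass, n_adhesive)  # glass/adhesive facet: ~0.01%
print(R_air / R_tape)  # each scattering facet weakens by a factor of several hundred
```

Each little surface facet scatters a few hundred times less light once the adhesive fills it in, which is why the frosting seems to vanish.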

Tuesday, July 21, 2009

Phase transitions and "mean field theory"

One truly remarkable feature of statistical physics (and condensed matter physics in particular) is the emergence of phase transitions. When dealing with large numbers of particles one often finds that, as a function of some parameter like temperature or pressure, the whole collection of particles can undergo a change of state. For example, as liquid water is warmed through 100 C at atmospheric pressure, it boils into a vapor phase of much lower density, even though it is still made up of the same water molecules as before. Understanding how and why phase transitions take place has kept many physicists occupied for a long time.

Of particular interest is understanding how microscopic interactions (e.g., polar attraction between individual water molecules) connect to the phase behavior. A classic toy model of this is used to examine magnetism. It's a comparatively simple statistical physics problem to understand how a single magnetic spin (in real life, something like one of the d electrons in iron) interacts with an external magnetic field. The energy of a magnetic moment is lowered if the magnetic moment aligns with a magnetic field - this is why it's energetically favorable for a compass needle to point north. So, one does the statistical physics problem of a single spin in a magnetic field, and there's a competition between this alignment energy on the one hand, and thermal fluctuations on the other. At large enough fields and low enough temperatures, the spin is highly likely to align with the field. Now, in a ferromagnet (think for now about a magnetic insulator, where the electrons aren't free to move around), there is some temperature, the Curie temperature, below which the spins spontaneously decide to align with each other, even without an external field. Going from the nonmagnetic to the aligned (ferromagnetic) state is a phase transition. A toy model for this is to go back to the single spin treatment, and instead of thinking about the spin interacting with an externally applied magnetic field, say that the spin is interacting with an average (or "mean") magnetic field that is generated by its neighbors. This is an example of a "mean field theory", and may be solved self-consistently to find out, in this model, the Curie temperature and how the magnetization behaves near there.
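In the simplest version of this mean field problem (spin-1/2 Ising spins, each feeling the average field of its neighbors), the self-consistency condition reduces to m = tanh(m T_C / T), which is easy to solve by fixed-point iteration. This is a generic textbook calculation, not specific to any particular material; here t = T/T_C is the reduced temperature:

```python
import math

def mean_field_magnetization(t, iterations=5000):
    """Solve the mean field self-consistency condition m = tanh(m / t).

    Here t = T / T_C is the reduced temperature, and m is the
    magnetization in units of its zero-temperature value.
    """
    m = 1.0  # start from the fully aligned state
    for _ in range(iterations):
        m = math.tanh(m / t)  # iterate until self-consistent
    return m

for t in (0.5, 0.9, 0.99, 1.1):
    print(f"T/T_C = {t}: m = {mean_field_magnetization(t):.4f}")
# Below T_C a nonzero solution appears spontaneously; above T_C only m = 0 survives.
```

The spontaneous onset of a nonzero solution at t = 1 is the Curie transition in this toy model, and expanding the tanh near the transition is what gives the square-root behavior of the magnetization just below T_C.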

Mean field theories are nice, but it is comparatively rare that real systems are well described in detail by mean field treatments. For example, in the magnetism example the magnetization (spontaneous alignment of the spins, in appropriate units) goes like (1 - T/T_C)^(1/2) at temperatures just below T_C. This is not the case for real ferromagnets - the exponent is different. Because of the nature of the approximations made in mean field theory, it is expected to be best in higher dimensionality (that is, when there are lots of neighbors!). Here's a question for experts: what real phase transitions are well described by mean field theory? I can only think of two examples: superconductivity (where the superconducting gap scales like (1 - T/T_C)^(1/2) near the transition, just as mean field theory predicts) and a transition between liquid crystal phases. Any others?


Wednesday, July 15, 2009

The elevator message

I had a conversation today that made me think about the following. These days we're told countless times that it's essential for a scientist to have an "elevator message". That is, we need to be able to describe what we're doing in a pitch accessible to a lay person, ideally in something like a single sentence. Some people have a comparatively easy time of this. They can say "I'm trying to cure cancer", or "I'm trying to solve the energy crisis", and have that be a reasonable description of their overarching research goals. Condensed matter physicists in general often have trouble with this, and tend to fall back on things like "My work will eventually enable faster computers" or "...better sensors". I'm all in favor of brief, accessible descriptions of what scientists do, but there are times when I think the elevator message idea is misguided. Not every good research program can be summed up in one sentence.

In the case of my group, we are trying to understand the (electronic, magnetic, and optical) properties of matter on the smallest scales, with an eye toward eventually engineering these properties to do useful things. It's basic research. Sometimes we can test existing theoretical ideas or address long-standing questions; sometimes, because we're working in previously unexplored regimes, we find surprises, and that can be really fun. I know that this italicized section is more sophisticated and therefore less pithy than "it'll give us faster computers". Still, I feel like this longer description does a much better job of capturing what we're actually doing. Our work is much more like puzzle-solving and exploring than it is a focused one-goal pursuit. I don't think that this means I lack vision, but I'm sure others would disagree.

On a separate note: Thanks, Arjendu, for pointing me to this, Microsoft Research's hosting of the series of lectures that Feynman gave at Cornell in 1964. Very cool, even if I had to install MS's plug-in for the video.

Thursday, July 09, 2009

We need more papers like this.

Somehow I had missed this paper when it came out on the arxiv last November, but I came across it the other day while looking for something else in the literature. It's all about the challenges and hazards of trying to measure the magnetization of either tiny samples or those with extremely small magnetic responses. Some of the cautions are rather obvious (e.g., don't handle samples with steel tools, since even tiny amounts of steel contamination will give detectable magnetic signals), and others are much more subtle (e.g., magnetic signatures from Kapton tape - due to dust! I learned about this one firsthand a few years ago - and from deformed plastic straws, commonly used as sample holders in a popular brand of magnetometer). Papers like this are incredibly valuable, and usually hard to publish. Still, I much prefer this style - writing a substantive, cautionary paper that is informative and helpful - to the obvious alternative of writing aggressive comments in response to papers that look suspect to you. The paper is so good that I'm even willing to forgive them their choice of font.

Wednesday, July 08, 2009

Figures and permissions - Why, AAAS?

Perhaps someone out there can enlighten me. For review articles, if you want to reproduce a figure from someone's published work, you are required to get permission from the copyright holder (e.g., APS for Physical Review, ACS for Nano Letters, etc.). As far as I can tell, the professional societies (APS, ACS) are cool about this, and won't charge you for permission. Even Nature, a for-profit magazine, does not charge for this if all you're doing is using a figure here and there. However, Science, run by the non-profit AAAS, wants to charge $31.25 per figure for permission to reproduce that figure in a review article. Why is Science doing this? Is this some attempt to recoup publication costs? Anyone got an explanation?

arxiv failure

It would appear that the arxiv is having some issues. Bizarrely, this seems to affect cond-mat, but not (for example) astro-ph. In cond-mat, asking for "recent" papers points you to October, 2008. Asking for "new" papers gets you things like:

New submissions for Wed, 8 Jul 09

Error with 0907.1092
Error with 0907.1096
Error with 0907.1111
Very odd. Hopefully this will be fixed soon. Come to think of it, this is the first problem I've seen like this in a decade of reading cond-mat.

Wednesday, July 01, 2009

This week in cond-mat

There have been a number of exciting (to me, anyway) papers on the arxiv this past week. One in particular, though, seems like a neat illustration of a physical principle that crops up a lot in condensed matter physics.

arxiv:0906.5206 - Tanda et al., Aharonov-Bohm Effect at liquid-nitrogen temperature: Frohlich superconducting quantum device

There are several examples in condensed matter physics of "special" (I'll explain what I mean in a second) electronic ground states that are "gapped", meaning that the lowest energy excited states for the many-electron system are separated from the ground state by an energy range where there are no allowed states. When I say that a ground state is special, I mean that it has some particular order parameter (or broken symmetry) that is distinct from that of the excited states. In this sense, a band insulator or semiconductor is not special - the many-body filled valence band states really don't have any different symmetries than the empty conduction band states. However, the superconducting ground state is special, with broken gauge symmetry (when compared to the normal metallic state) and a minimum energy (the gap energy) required to make any excitations (in this case, by breaking apart a Cooper pair). Fractional quantum Hall states are similarly gapped. The consequence of that energy gap is that the ground state can be very robust. In particular, the gap means that low energy (compared to the gap) inelastic processes cannot perturb the system, since there are no allowed final states around. This is one reason why it is possible to see macroscopic quantum effects in superconductors, as long as T is small compared to the gap.
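The protective power of the gap is essentially the Boltzmann factor exp(-Δ/k_B T), the probability of thermally exciting the system across a gap Δ. The two gap values below are rough illustrative numbers I'm assuming (a superconducting gap of about 0.2 meV, the aluminum scale, and a CDW gap of order 100 meV), not figures taken from this paper:

```python
import math

k_B = 8.617e-5  # Boltzmann constant in eV/K

def boltzmann_suppression(gap_eV, T_K):
    """Rough thermal-activation factor exp(-gap / k_B T) across an energy gap."""
    return math.exp(-gap_eV / (k_B * T_K))

# Conventional superconductor: a sub-meV gap demands sub-kelvin temperatures.
s_sc = boltzmann_suppression(2e-4, 0.3)   # strongly suppressed: gap >> k_B T
# An assumed ~0.1 eV CDW gap still dwarfs k_B T even at 77 K.
s_cdw = boltzmann_suppression(0.1, 77)
print(s_sc, s_cdw)
```

The point is that a gap of order 100 meV is proportionally even larger compared to liquid nitrogen temperature than a superconducting gap is compared to dilution-refrigerator temperatures, which is why hoping for robust interference at 77 K (or higher) is not crazy.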

The authors of this paper have decided to see whether such macroscopic quantum effects (detectable via quantum interference measurements analogous to the two-slit experiment) can survive in another gapped system. The distinction here is that the special state is something called a charge density wave (CDW), where the electronic density in a material (in this case tantalum trisulfide) spontaneously takes on a spatially periodic modulation. This gapped state kicks in at much higher temperatures than typical superconducting transitions. The authors have been able to measure quantum interference robustly in their device at liquid nitrogen temperatures, which is pretty impressive, and there is reason to believe that this could be extended to room temperature. The sample fabrication is very impressive, by the way. You can't just take a sheet of this stuff and punch a hole in it to make your ring-shaped interferometer. Instead, you have to actually curl a sheet up into a tube. Neat stuff, and quite surprising to me. I need to read up more about CDWs....

Saturday, June 27, 2009

A cool result

There's a new ASAP paper in Nano Letters that is very slick. There has been a lot of interest in the last few years in plasmonics - the controlled manipulation of plasmons, collective oscillations of the electronic fluid in metals. Plasmons are pretty remarkable excitations. Because they involve displacement of the electron density, they necessarily result in local electric fields near metal surfaces (useful for optical antenna sorts of effects), and they can (under the right circumstances) couple efficiently to electromagnetic radiation. Plasmon response to light can be very pronounced, ranging from resonant scattering or absorption (for example, why certain types of glass are colored) to more complex dispersive effects, including negative (effective) indices of refraction. Plasmons are also responsible for helping light to transmit through sub-wavelength apertures. However, as far as I know, until now none of these effects have depended in any significant way on the angular momentum of light. In this new result, researchers from the Technion in Israel have designed aperture structures that can couple selectively to left- or right-circularly polarized light. The trick is in finding a situation such that the angular momentum of the light (essentially the spin of the photons) couples selectively to plasmon modes in the apertures that have matching orbital angular momentum. I don't fully understand how the two experiments described in the paper work, but it's a neat, clever result.

Monday, June 22, 2009

Four items

Four items, and a physics post later in the week.
  • Is "just-in-time" supply chain management truly the work of the devil, or merely incredibly annoying? We've had a problem with a gate valve on a piece of cleanroom equipment at my institution, and the vendor (a) has no spare valves; (b) has no spare parts for the valves; and (c) says it'll take around 4 weeks to fab a replacement valve. Now, I understand why a business wouldn't want a huge inventory sitting on shelves, and that there are real fixed costs associated with inventories. Still, how hard would it be to have some spare parts, particularly when these things don't go bad when stored? I can tell you that it doesn't make me predisposed to ever buy anything from this supplier again. So, while it may be penny-wise, it sure feels pound-foolish for companies to alienate customers by having no backup supplies at all.
  • Ahh, scientific publishing. Two folks from Cornell used an amusing computer program to generate a grammatically correct but completely nonsensical fake paper (pdf). They then got that paper accepted to an open-access journal, without the knowledge of the editor (!), with the strong implication being that this publisher was willing to publish literally anything as long as the authors are willing to pay the fees. Wonderful. I've suspected for a while (basically when a couple of publishers spammed me about being a contributing editor on journals I'd never heard of, back when I was a brand new assistant prof) that there are some shady practices out there.
  • Also regarding scientific publishing, I was shocked and appalled (ok, not really, but certainly surprised) when I got the proofs of an article that we have coming out in Phys Rev B. Why? Because it was clear from the marked-up "author query" version of the manuscript that the AIP production office had converted our beautiful LaTeX manuscript into Microsoft Word format for editing. What is the world coming to?!
  • Lastly, I was fortunate enough to receive a new iPod Touch as a gift. Anyone out there have suggestions for must-have apps?

Monday, June 15, 2009

The revolution will be twittered.

Not a physics post, but an observation. There is a major event going on in Iran right now - protests involving many thousands of people; rioting; the most political upheaval since the 1979 revolution. I hope that everything works out for the best - any country with a Supreme Leader needs a new governance structure, IMO. Anyway, twitter is being used as a major tool by the Iranian protesters. So much for my general perception that twitter was only for people more self-indulgent than bloggers (ahem.). It's fascinating and alarming to watch events unfold from halfway around the world, while CNN reports on things like Sarah Palin/David Letterman feuds. It's as though the "news" network has forgotten what real news is....

Thursday, June 11, 2009

Nanoscale, the book

No, I have not compiled my blog postings into dead-tree format. Nor have I finished my textbook based on my graduate nanoscale physics course sequence. Instead, I wanted to point out this book, which is a cute volume with lots of computer-rendered pictures of crystal structures and the like. It's an admirable attempt to give the reader a sense of the atomic-scale composition of materials, along with brief, informative, often fun descriptions. While there are a few minor typos that seem to be caused by autocorrection run amok, the book remains entertaining and educational, with very well crafted illustrations. The book has its own website, too.

Tuesday, June 09, 2009

This week in cond-mat

Two papers appeared on the arxiv in the last couple of days concerning the very hot topic of quantum-limited measurement. I'm no expert in the area, but here's a quick summary of the idea.... Anyone who's read anything about quantum mechanics is familiar with the popular "gamma-ray microscope" thought experiment meant to highlight the Heisenberg uncertainty relation. In lay terms, trying to use light to determine the location of a particle with arbitrarily high precision requires, in a simple thought experiment, light of a correspondingly short wavelength. Shorter wavelength = higher energy photons = higher momentum photons = big momentum transfer to the particle. Thus, the more precisely you localize the particle, the less you know about its momentum. This is an adequate handwave for the popular press, but the real situation can be more subtle. Still, in the general problem of quantum measurement, one is often concerned about "back action" - the fact that coupling your system to a detector (thus enabling you to make some kind of measurement of an observable) generally perturbs the equations of motion of the system itself. It turns out, under certain very special circumstances, it is possible to design a measurement and pick observables such that the effect of back action is essentially confined to some variable that you don't care about. The net result in that case is that you can measure your particular observable to higher precision than a simplified uncertainty argument would suggest is possible.
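For a sense of scale, the standard quantum limit for continuous position detection is set by the resonator's zero-point motion, x_zp = sqrt(ħ / (2 m ω)). The mass and frequency below are illustrative guesses for a generic nanomechanical beam, not the actual parameters of either experiment:

```python
import math

hbar = 1.0546e-34  # reduced Planck constant, J s

def zero_point_motion(mass_kg, freq_Hz):
    """Zero-point position spread x_zp = sqrt(hbar / (2 m omega))."""
    omega = 2 * math.pi * freq_Hz
    return math.sqrt(hbar / (2 * mass_kg * omega))

# An assumed generic nanomechanical beam: ~1e-16 kg oscillating at ~10 MHz.
x_zp = zero_point_motion(1e-16, 10e6)
print(f"x_zp ~ {x_zp * 1e15:.0f} fm")  # of order 100 femtometers
```

Resolving displacements at or below that scale, of order a hundred femtometers for these assumed numbers, is what makes beating the standard quantum limit such a striking technical feat.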

Two groups, those of Keith Schwab at Caltech (paper here) and Konrad Lehnert at Boulder/JILA (paper here), have managed to do this type of measurement, looking at the position of a nanoscale mechanical resonator. In both cases, they are able to couple the resonator to a microwave LC resonator in such a way that they can measure the mechanical displacement better than the standard quantum limit. These measurements are very technologically impressive, and they open up the path toward really exciting possibilities, including entanglement of different nanomechanical systems, clever cooling schemes, and genuinely quantum-mechanical measurements of macroscopic objects.

Thursday, May 28, 2009

Random tidbits

Several minor things....
  • I've got an article on single-molecule electronics coming out in the June issue of Physics World. It's reasonably accessible, and I'm pretty happy with how it turned out, though I wish there had been more space to discuss the theoretical challenges.
This is damned cool. I had an undergrad course that was like the baby version of this - building up transistors into logic gates; then using logic gates to build a shift register; then building and programming a little 6502-based computer to run a model train network. This guy's work puts all that to shame.
  • The pseudonymous Kyle Finchsigmate, always entertaining and clever (often profane), has started a wiki site devoted to chemistry experimental techniques. In comments about that I came across this site from Rochester. I think it would be great to have a site like this about experimental physics, though clearly it would take a lot of work from many people to have it be any good....
  • I've been asked by a reader to solicit discussion and opinions about the various journal online manuscript submission/review systems. Which ones are good, and which ones are lousy? From what I can tell, the APS system is decent (though it always seems to complain erroneously about mistakes in my references and article lengths), and the Paragon system from ACS is quite good. The Nature publishing group one also seems to be put together well. I'm not a fan of "Manuscript Central" or whatever it is that Elsevier and IEEE use. What do you all think?
  • Thank goodness McLeroy was not confirmed as head of the TX board of education.
  • This'll be the last update for about the next 9 days or so, since I'll be traveling with very limited 'net access.

Tuesday, May 26, 2009

Plastic Fantastic thoughts

Reading Eugenie Reich's Plastic Fantastic brought me right back to the heady days of my postdoc, job search, and nearly a year spent with a student chasing what turned out to be fabricated results. In hindsight I learned an awful lot about human nature and the sociology of science, and some of that is conveyed to readers of this book, though not all.

First, the book review. I think Reich writes well, and I think she did a good job simplifying the science where appropriate for a more general audience. Criticizing the details (e.g., I wasn't a big fan of her definition of "polaron") misses the larger point (you don't need to know what a polaron is to appreciate the fact that Schon didn't fully get what polarons are either). Personally I think it would have been useful to spend more time on standard scientific practice at Bell Labs - Schon's claims (going back to his doctoral work in Germany) that he didn't keep notebooks or save primary data aren't just damning - they're completely outside what I saw essentially everyone else do, both at Bell and in grad school. How on earth did this happen? How did no one immediately supervising Schon ever notice that he had no notebooks?! The idea that researchers at Bell were so independent that no one would ever notice this is crazy. I also think it would have been good to spend a bit more time on the denouement, at least discussing further the major issues raised by this whole affair: what are the responsibilities of co-authors? What are the responsibilities of managers? There were also some nuances of what happened as the scandal broke that I didn't see (though I could've missed them on a quick read), including some choice remarks by Batlogg that were rather striking at the time. [One other point: Reich points out that the Departments of Defense and Energy don't have central offices of research integrity. Strictly speaking, that's right, but the way it's written makes it sound like DOD and DOE never even consider the matter, which is not true. Since 2000, anyway, DOE has used the following (pdf) policy regarding research misconduct, which is basically the blanket federal policy applied at DOD as well.] In the end, the book is very effective at what it does, though it raises many more questions than it answers.

Regarding specific comments of others.... I don't think management was dealt with unfairly here in general. I didn't feel like Cherry was particularly singled out. Also, the book doesn't convey well one factor that I think is important to remember: most of the immediate managers (e.g., Rogers, Capasso) were running large, active research programs of their own. There's no question that between that and the corporate turmoil from the collapse of telecom, these people had other things on their minds than trying to manage Schon. Now, that being said, how in the name of all that is holy did these people not realize that Schon's publication rate was simply unphysical? NO ONE can write a paper every two weeks for two years. Didn't this raise questions at the journals, too? One other comment about management that was raised only indirectly.... There were a number of people who were thrilled to claim (effectively) some share of the credit for this stuff when things looked good, but were quick to disavow all responsibility when things went bad. You can't have it both ways.

(One final point that has nothing to do with the author: the choice to put a silhouette of Icarus on the cover is deeply flawed. Icarus actually flew.)

Saturday, May 23, 2009

Anyone read this yet?

I was in Barnes & Noble yesterday evening and saw a copy of Plastic Fantastic in their science section. This is Eugenie Reich's telling of the Schön saga. Anyone out there had a chance to read this yet? Steve? Don? I'll have to pick up a copy at some point.

Tuesday, May 19, 2009

Wolfram|Alpha: not too impressive.

By now many of you have run across Wolfram|Alpha, billed by its creator as a "computational knowledge engine". I've been goofing around with it a little over the past two days, and I'm not too impressed, though there are some cute things in there. The demonstration video, narrated by Wolfram himself, is very slick, and gives you the impression that Wolfram|Alpha can take even minimalistic requests (e.g., "Germany US GDP") and provide lots of computed output (US and German GDPs side by side as a function of time, in various different currency units and normalizations, for example). That is sort of true, for a very limited subset of queries. As one might expect from the people who developed Mathematica, Wolfram|Alpha can also do some symbolic math, including graphing of functions.

Unfortunately, it would appear that their model is to have these kinds of limited queries templated by hand on their side. Trying to ask well-defined questions about comparatively simple things ("What is the resistance of a wire?"), which you might expect from the demo to call up a pretty set of dialog boxes, etc., instead gives you "Wolfram|Alpha isn't sure what to do with your input." In this particular example, just "resistance of a wire" calls up dialog boxes about US and UK wire gauges and is at least somewhat useful. For a parser to do fine with "resistance of a wire" and gag on "what is the resistance of a wire" is pretty sad these days.
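The fix the engine apparently needed is trivial; even a toy front end can strip question boilerplate before template matching. A throwaway sketch (emphatically not how Wolfram|Alpha actually works; the filler patterns are just my own guesses):

```python
import re

# Leading question boilerplate to discard before template matching.
FILLER = re.compile(r"^(what\s+is|what's|how\s+much\s+is)\s+(the\s+)?", re.IGNORECASE)

def normalize_query(q):
    """Strip leading question words so both phrasings hit the same template."""
    return FILLER.sub("", q.strip().rstrip("?")).strip()

print(normalize_query("What is the resistance of a wire?"))  # -> "resistance of a wire"
print(normalize_query("resistance of a wire"))               # -> "resistance of a wire"
```

Both phrasings reduce to the same template key, which is all it would take for the two queries above to behave identically.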

Bottom line: the idea of Wolfram|Alpha is cute, but right now it's entirely too much like playing an old text adventure game:
----
You are facing a brown, wooden door set in a dark green frame. There is a doorbell button here.

>Ring the doorbell.

I do not know how to do that.

>ring doorbell

I do not know how to do that.

>push button

You push the button, and from within the house you hear a distant chime.


Wednesday, May 13, 2009

Faking APS email not a good way to be taken seriously

Many of us know the joy of getting email from, err, enthusiastic amateurs claiming to have solved all of the great problems of modern physics (often involving the invalidation of quantum mechanics, relativity, or both). This morning's allotment was particularly amusing, though. Subject line: Giant Revolution in the Physics Science. From: [allegedly] aps@aps.org. (Really from someone in Hungary.) It explicitly claims to be a message on behalf of about a dozen physicists (presumably not with their actual permission), including last year's Nobel Laureates. Even better, it asks us all to contact the Royal Swedish Academy (complete with contact information) and pressure them to award the Nobel in physics to a Hungarian physicist who "reinterprets the total known experimental results and uses solely the mathematical apparatus of dynamics and electrodynamics". Amateurishly spoofing email from people is no way to promote yourself....

Tuesday, May 12, 2009

This week in cond-mat

Two recent arxiv papers caught my eye. I'm not working on graphene, but these are both pretty interesting results.

arxiv:0905.0923 - Mak et al., Observation of an Electric-Field Induced Band Gap in Bilayer Graphene by Infrared Spectroscopy
The authors, from Tony Heinz's group at Columbia, make a field-effect device out of bilayer graphene (identified optically thanks to its particular Raman spectrum) and an electrolyte. As I'd mentioned once before, by using electrolytes it is possible to achieve very large gated charge densities in transistor-style devices. In this case, the authors find that they can turn bilayer graphene from a semimetal-like system (with touching valence and conduction bands at the charge neutrality point) to a semiconductor (as determined via optical measurements), with a band gap induced and controlled by the gate. I need to read more carefully just how this works, but it shows how these kinds of experiments (moving a good fraction of a charge per unit cell around) can alter band structure profoundly.

arxiv:0905.1712
- Li et al., Large-Area Synthesis of High-Quality and Uniform Graphene Films on Copper Foils
This paper, published this week online in Science, may end up being quite important. The authors show that they can grow mostly single-layer graphene on copper supports. Copper can be annealed to produce large (several mm) crystallites, so significant areas of graphene can be made this way, templated with comparatively few defects. The big step here compared to earlier work on growing graphene using Ru or Ni substrates is that the resulting material seems to be self-limiting in thickness because of the very low solubility of C in Cu. The authors can also transfer the graphene to other substrates, including Si chips, a necessary step for any would-be electronics applications.