First, following up on my earlier post about the field effect.... I'd seen some of this before, but I had some really good conversations at the workshop in Japan last week about electrochemical gating. The field effect, as I'd said, is a great way of tuning the density of charge carriers at a surface without the disorder associated with chemical doping. By cranking up a gate voltage you can, in some sense, just rely on the attraction of opposite charges to accumulate carriers. One major limitation of this technique, though, is the amount of charge that you can really get in there using reasonable dielectrics between the gate and the surface of interest. A given insulator can only take a certain amount of electric field across it before leakage current (and eventual breakdown due to damage from "hot" electrons) starts. Calling that limiting field E_max, the maximum gated surface charge density works out to ε_0 κ E_max (divide by the electron charge e to get carriers per unit area), where ε_0 is the permittivity of free space in SI units (8.85 × 10^-12 Farad/m) and κ is the (unitless) relative dielectric constant. If you've got a really good oxide you can get this up near 10^13 carriers per square cm. Usually there is a tradeoff: materials with a big κ tend to have a smaller breakdown field. One way around this is to use electrochemical gating. Instead of a dielectric, use either a polymer electrolyte or an ionic liquid. If you don't care about speed of response, this is a great idea, because you can get a layer of counterions right next to the surface of interest. As a result, you can accumulate carrier densities exceeding 10^14/cm^2. That's a huge density that can let you do some fun things even in strongly correlated materials, where you're now talking about adding or removing more than one carrier per unit cell.
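To make the ε_0 κ E_max estimate above concrete, here's a back-of-the-envelope sketch. The specific numbers (SiO2 with κ ≈ 3.9 and a generous breakdown field of ~10 MV/cm) are illustrative assumptions on my part, not anything from a particular device:

```python
# Maximum field-effect carrier density before dielectric breakdown:
# sigma_max = eps0 * kappa * E_max  (charge per unit area),
# n_max = sigma_max / e             (carriers per unit area).

EPS0 = 8.854e-12      # permittivity of free space, F/m
E_CHARGE = 1.602e-19  # electron charge, C

def max_gated_density(kappa, e_max_v_per_m):
    """Carriers per cm^2 achievable at the dielectric's limiting field."""
    sigma = EPS0 * kappa * e_max_v_per_m   # surface charge density, C/m^2
    n_per_m2 = sigma / E_CHARGE
    return n_per_m2 / 1e4                  # convert m^-2 to cm^-2

# Assumed example: SiO2, kappa ~ 3.9, breakdown near 1e9 V/m (10 MV/cm)
print(f"{max_gated_density(3.9, 1e9):.1e}")  # ~2e13 carriers/cm^2
```

That lands right around the 10^13/cm^2 figure quoted above for a really good oxide, and shows why the 10^14/cm^2 accessible with ionic liquids is such a big deal.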
A very brief follow-up to my post about the weather: Aww, not this crap again. Hurricane Gustav is really starting to look like a potential annoyance. Here's hoping that it hits neither New Orleans nor Houston.
A blog about condensed matter and nanoscale physics. Why should high energy and astro folks have all the fun?
Friday, August 29, 2008
Wednesday, August 20, 2008
Sad. Just sad.
According to a news article in Nature, there are now only four PhD scientists left at Bell Labs doing basic research. Four. Leaving aside that I know those guys personally, we should all be saddened by this.
I know that in industry there's always a tension between the immediate company bottom line and longer term investment. I can't help but wonder, though, in these days of institutional investors, mutual funds, and huge executive compensation, if we've really screwed up. Let me put it this way.... Once upon a time there were long-term investors who bought, e.g., AT&T, and really cared about whether AT&T was going to be competitive in ten or fifteen or twenty years. Now most stock is held by institutional investors and mutual funds who really don't care whether AT&T exists in ten years - they just care that there's something with a risk/reward profile like that in ten years. Furthermore, the executive compensation system is designed to massively reward short-term results (how much is this quarter's rate of growth greater than last year's? Note that making a big profit isn't enough - you have to be increasing your profit rate, not just the absolute amount of money the company makes.). As far as I can tell, we have effectively removed much of the economic incentive for long-term investment. No wonder any research and development with a 10-year horizon is almost gone from the American technological landscape. The only exceptions, as is often the case, are companies with so much money that a small research investment is negligible and can give decent PR, like Intel, Google, and Microsoft. Yeah, I know that HP Labs still exists, and I know that IBM still has people playing with STM, and I know that Exxon and Dupont and 3M have lots of talented chemists, but it looks like the days are largely gone of having a staff with a critical mass of tens of physics and chemistry PhDs doing cutting edge long-term research in an industrial setting.
Sunday, August 17, 2008
Slow blogging....
I'm off to Japan for this week, primarily to speak here, and secondarily to guarantee that I'm a jetlagged wreck for the first week of classes. So, unless I do some Lost in Translation blogging, it'll be a quiet week here.
Thursday, August 14, 2008
Cryptophysicists
I think that we need to coin an official term, "cryptophysicist", to describe people who do physics research outside the mainstream. Ronald Mallett is an example of a credentialed cryptophysicist - he wants to build a time machine using circulating optical beams. His tragic motivations aside, this is a scientifically wacky idea - the energy density that you would need in the beams to produce any significant distortion of spacetime is completely unachievable with foreseeable equipment. On the theory side, Harold Puthoff is another example. Puthoff wants to explain things like inertia in terms of interactions between matter and zero-point fluctuations of the electromagnetic field. Mainstream theorists consider this to be a wacky idea for a long list of reasons.
One major difference between cryptophysicists and cryptozoologists is that the public is generally able to perceive that the latter are outside the mainstream. Everyone knows from daily experience that there probably aren't yeti or sea monsters hanging around. Modern physics is abstracted enough from everyday lives and intuition, though, that many people, including some journalists, honestly can't tell when someone's waaay out there. Also, the concept of the lone genius toiling away in obscurity fighting The Scientific Establishment, which makes for good TV, sounds better when applied to a garage tinkerer than to someone camping out looking for the chupacabra. Still, occasionally the biologists do get to have fun with media coverage of this stuff.
Tuesday, August 12, 2008
The US election and science
For anyone interested in the US presidential candidates' positions on science, I recommend this NPR piece from today's "All Things Considered". What I find most revealing is near the end. Obama supports the goals of the America COMPETES act, which (among other things) was intended to double support for basic science research over the next several years. McCain does not consider science research to be as high a priority, given fiscal constraints. Bear in mind that as a percentage of GDP the US spending on physical science research has been falling for decades, and we've also witnessed the near-death of industrial long-term research in the sciences. I'm not trying to start a political flamewar here - I just wanted to point people to this information.
Update: From the Department of the Obvious, this article from USA Today about general science literacy and cultural perception in the US. How depressing. 44% of those surveyed couldn't name a single scientist, living or dead, that could be a role model for young people. Wow.
Thursday, August 07, 2008
Whoa.
This article (thanks, Incoherent Ponderer, for pointing this out in your insightful post) is really distressing. Prof. Plummer is a class act and chaired a session of the APS Focus Topic that Eric Isaacs and I put together for this past March Meeting. I hope Tennessee is able to find the resources to support him fully.
Tuesday, August 05, 2008
Random samples
Three little items....
First, the Foundational Questions folks have announced their funding recipients. These grants, intended for fundamental questions in physics and cosmology, are supposed to target topics "unlikely to be otherwise funded by conventional sources". I'm a bit skeptical about that, since it sure looks like a number of these people are getting funded for topics that sound just like their regular research. Still, I do think it's a good thing to put some resources into real foundational questions. For the condensed matter people in the audience, our token representation on the list is Subir Sachdev, who is extending his theory formalism to cover black holes (!), and Keith Schwab, who is going to do true quantum mechanics by building micromechanical systems that exist in a quantum regime.
Second, this paper is one of the more bizarre things I've seen on the arxiv lately. It's from some folks trying to find efficient ways to compute pagerank numbers (useful in search algorithms for the web). They claim to have a Schroedinger-like equation that describes the pagerank. I was intrigued, in part because of all the lame joke opportunities this suggested. Funky.
Lastly, it's disappointing to read articles like this one and this one. I don't know what's more sad: that the Holmdel Bell Labs site is being sold to developers, or that the Wall Street Journal can run multiple long articles about the discord and tribulations at the top of Alcatel-Lucent and not even mention Bell Labs. I can't help but think that the US will really miss industrial long-term research.
Monday, August 04, 2008
Ahh, the weather.
It's that time of year again, when we get a brief break from high-90s (F) heat and high humidity by having a tropical weather system blow through the area. Hopefully this will bring us some much needed rain, but not too much. It's interesting to watch the evolution of computer models here, and also to keep track of historical trends. I never had to think about this stuff growing up in Pennsylvania....
Sunday, August 03, 2008
Why the field effect is important.
I'm probably overdue for making a post introducing some physics topic for a nonspecialist audience, so here you go....
What is the field effect? Well, it's the physical process that is used in field effect transistors (FETs), the hundreds of millions of little three-terminal switches that are the basis of the computer you're using right now. In a FET the electrical conduction between two electrodes (a source and a drain) is modulated by applying a voltage to a third electrode (the gate) that is capacitively coupled to (though separated by a thin insulating dielectric layer from) the semiconductor material between the source and drain (the channel). In an ideal (flat band, for the experts) FET, if the gate voltage is adjusted to be positive relative to the source and drain electrodes, a layer of electrons will be capacitively attracted from the source and drain into the channel region. Once there are carriers accumulating in the channel, conduction can take place from the source to the drain. Alternately, one could imagine cranking up the gate voltage to be negative relative to the source and drain, in which case one would expect to accumulate a layer of holes rather than electrons. A device that can really be modulated to have either electrons or holes as the dominant carriers is said to be ambipolar. Of course, real devices are more complicated than this. For example, there can be surface states that live at the semiconductor/dielectric interface close to the gate; the source and drain electrodes may have preferential alignment to either the conduction or valence bands depending on the materials involved; etc. Still, the FET is the basis for modern electronics, and techniques exist for fabricating these little gadgets by the billions in Si with incredible reliability.
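The capacitive coupling described above can be put into numbers with the standard parallel-plate model (a common textbook approximation, not anything specific claimed here): the gate, dielectric, and channel form a capacitor, so the induced sheet density is n = ε_0 κ V_g / (e d). The example values below (300 nm of SiO2 at 30 V, typical of back-gated graphene-style devices) are assumptions for illustration:

```python
# Parallel-plate estimate of the carrier density induced by a gate:
# capacitance per area C = eps0 * kappa / d, induced density n = C * Vg / e.

EPS0 = 8.854e-12      # permittivity of free space, F/m
E_CHARGE = 1.602e-19  # electron charge, C

def gated_density(kappa, thickness_m, vg_volts):
    """Induced carriers per cm^2 for gate voltage vg across a dielectric."""
    c_per_area = EPS0 * kappa / thickness_m         # capacitance, F/m^2
    n_per_m2 = c_per_area * vg_volts / E_CHARGE
    return n_per_m2 / 1e4                           # m^-2 -> cm^-2

# Assumed example: 300 nm SiO2 (kappa ~ 3.9), Vg = 30 V
print(f"{gated_density(3.9, 300e-9, 30):.1e}")  # ~2e12 carriers/cm^2
```

Flipping the sign of V_g in this picture accumulates holes instead of electrons, which is the ambipolar operation mentioned above.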
That's why the FET is important for industry. Why is the FET important for physics? The FET idea can be rephrased this way: the FET is a way of changing the chemical potential of charge carriers (or, in a related sense, the density of charge carriers) without altering the composition of the material. Think about that for a sec. Chemists understand doping very well. If you're doing chemistry and want to remove some electrons, you add a halogen atom somewhere, since those are hugely electronegative and grab an electron. Alternately, you want an extra electron in there? Throw in an alkali metal atom - they love to give up electrons. This kind of doping is commonly done in, e.g., the high temperature superconductors, where the parent copper oxide compound is actually an (antiferromagnetic) insulator that only becomes conducting (and superconducting) if doped chemically. The problem with chemical doping is that adding new atoms to a solid changes its structure and usually increases disorder, since we generally can't control precisely how the extra atoms are distributed. In FET structures, conversely, one can try to tune the carrier density just by turning up a voltage, no chemistry required. In graphene, for example, people have gated readily from having lots of electrons to having lots of holes and back, all in one device. Of course, FET structures do have some serious limitations: they only really put charge onto a surface or interface (rather than the bulk), and it is extremely challenging to get field effect charge densities comparable to those achieved easily with chemical doping. Still, the field effect is a tremendous tool for tuning electronic properties without mucking up the structure or adding disorder. If you're interested in this in more detail, please check out this Rev Mod Phys paper.