A blog about condensed matter and nanoscale physics. Why should high energy and astro folks have all the fun?
Tuesday, July 31, 2012
A new big prize
So there's another big scientific prize around now, the Milner Prize for Fundamental Physics. Interesting. Clearly wealthy Russian billionaires can do what they like with their money, whether that means physics prizes or miniature giraffes. However, I am not terribly thrilled with the idea that "fundamental physics" includes (to a significant degree so far) theoretical ideas that simply have not been tested experimentally. It would be very unfortunate if this ends up as a high-profile media splash that reinforces the erroneous public perceptions of fundamental physics as (1) virtually entirely high energy theory; and (2) exotic ideas unconnected to experiment (e.g., the string multiverse).
Monday, July 30, 2012
A few items
This editorial from yesterday's NY Times is remarkable for just how off-base it is. The author, a retired social science professor, argues that we should stop teaching algebra to so many people. He actually thinks that teaching algebra to everyone is bad for society: "Making mathematics mandatory prevents us from discovering and developing young talent." Basically, his reasoning is that (1) it's hard for many non-math-inclined people, so it takes up a lot of time that could be spent on other things; and (2) it's really not useful for most people, so it's doubly a waste of time. How anyone can argue publicly that what society really needs is less math literacy is completely beyond me. Like all of these sorts of things, there is a grain of reason in his argument: Most people do not need to solve cubic equations with professional-grade rigor. However, algebra and algebraic ideas are absolutely essential to understanding many, many things, from financial literacy to probability and statistics. Moreover, algebra teaches real quantitative reasoning, rather than just arithmetic. The fact that this even got printed in the Times is another example of the anti-science/math/engineering bias in our society. If I tried to get an editorial into the Washington Post advocating that we stop teaching history and literature to everyone because it takes away time from other things and writing is hard for many people, I would rightly be decried as an idiot.
This editorial from today's NY Times is also remarkable. The author had historically been a huge skeptic of the case for anthropogenic global warming. Funded in part by the oil magnates (and totally unbiased about this issue, I'm sure) the Koch brothers, he did a study based much more on statistics and data gathering than on particular models of climate forecasting. Bottom line: he's now convinced that global warming is real, and that human CO2 emissions are very significant drivers. Funding to be cut off and his name to be publicly excoriated on Fox News in 5...4...3.... See? Quantitative reasoning is important.
Rumor has it that Bill Nye the Science Guy is considering making new episodes of his show. Bill, I know you won't read this, but seven years ago you visited Rice and posed for a picture with my research group. Please make this happen! If there is anything I can do to increase the likelihood that this takes place, let me know.
Finally, from the arxiv tonight, this paper is very interesting. These folks grew a single layer of FeSe on a strontium titanate substrate, and by annealing it under different conditions they can alter its electronic structure (as studied by angle-resolved photoemission). The important point is that they find conditions where this layer superconducts with a transition temperature of 65 K. That may not sound so impressive, but if it holds up, it beats the nearest Fe-based material by a good 10 K, and beats bulk FeSe by more like a factor of 1.5 in transition temperature. Stay tuned. Any upward trend in Tc is worth watching.
Thursday, July 26, 2012
Memristor or not - discussion in Wired.
Earlier in the month, Wired reported that HP is planning to bring their TiO2-based resistive memory to market in 2014. Resistive memory is composed of a bunch of two-terminal devices that function as bits. Each device has a resistance that is determined by the history of the voltage (equivalently, current) applied to the device, and can be toggled between a high resistance state and a low resistance state. In HP's case, their devices are based on the oxidation and reduction of TiO2 and the diffusion of oxygen vacancies.
This announcement and reporting apparently raised some hackles. Wired has finally picked up on the fact that HP's use of the term "memristor" to describe their devices is more of a marketing move than a rigorous scientific claim. As I pointed out almost two years ago, memristors are (in my view) not really fundamental circuit elements in the same way as resistors, capacitors, and inductors; and just because some widget has a history-dependent resistance, that does not make it a memristor in the sense of the original definition.
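For concreteness, here is a minimal numerical sketch of the kind of idealized state-variable model often used to describe such history-dependent resistors (loosely in the spirit of the HP/Strukov picture; the parameter values are made up for illustration, not HP's actual device physics):

```python
# Idealized state-variable model of a history-dependent resistor (a sketch, not
# HP's actual device physics): the resistance interpolates between an "on" and
# an "off" value according to an internal state w in [0, 1] that drifts with
# the current that has flowed through the device.

def simulate_memristor(voltages, dt=1e-3, r_on=100.0, r_off=16e3, mu=1e4, w0=0.5):
    """Integrate dw/dt = mu * i(t) (mu is a made-up mobility-like parameter)
    and return the resistance at each time step."""
    w = w0
    resistances = []
    for v in voltages:
        r = r_on * w + r_off * (1.0 - w)          # history-dependent resistance
        i = v / r
        w = min(1.0, max(0.0, w + mu * i * dt))    # state drifts with current
        resistances.append(r)
    return resistances

# A sustained positive bias drives w toward 1 (the low-resistance state); a
# negative bias would drive it back toward the high-resistance state.
rs = simulate_memristor([1.0] * 1000)
print(rs[0], rs[-1])   # resistance drops as the device "remembers" the bias
```

The toggling between high and low resistance states described above is just this state variable saturating at one end or the other of its allowed range.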
Tuesday, July 17, 2012
Replacement for Virtual Journals?
I notice that the APS Virtual Journals are going away. I was a big fan of the nano virtual journal, since the people who ran it generally did a really nice job of aggregating articles from a large number of journals (APS, AIP, IOP, Nature, Science, PNAS, etc.) on a weekly basis. True, they never did manage to work out a deal to coordinate with the ACS, and they'd made a conscious decision to avoid journals dedicated to nano (e.g., Nature Nano). Still, it will be missed.
Now I will show my age. Is there a nice web 2.0 way to replace the virtual journal? In the announcement linked above, they justify the end of these virtual journals by saying that there are new and better tools available for gathering this sort of information. What I'd like to do, I think, is look at the RSS feeds of the tables of contents of a bunch of journals, filter on certain keywords, and aggregate together links to all the articles. Is there a nice way to do this without much fuss? Suggestions would be appreciated.
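For what it's worth, here is a rough standard-library-only sketch of the sort of thing I have in mind; the feed URL and keywords below are placeholders, not real endpoints:

```python
# A sketch of a keyword-filtered table-of-contents aggregator using only the
# Python standard library. Feed URLs and keywords are hypothetical placeholders.
import urllib.request
import xml.etree.ElementTree as ET

def filter_feed(rss_xml, keywords):
    """Return (title, link) pairs from an RSS 2.0 feed whose titles
    contain any of the given keywords (case-insensitive)."""
    root = ET.fromstring(rss_xml)
    hits = []
    for item in root.iter("item"):
        title = item.findtext("title", default="")
        link = item.findtext("link", default="")
        if any(k.lower() in title.lower() for k in keywords):
            hits.append((title, link))
    return hits

def aggregate(feed_urls, keywords):
    """Fetch each journal's table-of-contents feed and pool the matches."""
    articles = []
    for url in feed_urls:
        with urllib.request.urlopen(url) as resp:
            articles.extend(filter_feed(resp.read(), keywords))
    return articles

# Example (hypothetical feed URL):
# aggregate(["https://journals.example.org/toc.rss"], ["nanotube", "graphene"])
```

Run weekly from a cron job, something like this would approximate the old virtual journal, minus the human curation.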
Thursday, July 12, 2012
Exotic (quasi)particles, and why experimental physics is challenging
There was a very large amount of press, some of it rather breathless, earlier in the year about the reported observation of (effective) Majorana fermions in condensed matter systems. Originally hypothesized in the context of particle physics, Majorana fermions are particles with rather weird properties. Majorana looked hard at the Dirac equation (which is complex), and considered particles "built out of" linear combinations of components of solutions to the Dirac equation. These hypothesized particles would obey a real (not complex) wave equation, and would have the rather odd property that they are their own antiparticles (!) In the language of quantum field theory, if the operator \( \gamma^{\dagger} \) creates a Majorana particle, then \( \gamma^{\dagger} = \gamma \), so \( \gamma^{\dagger} \gamma^{\dagger}\) creates and destroys one, leaving nothing behind. In the context of condensed matter, it has been theorized (here and here, for example) that it's possible to take a superconductor and a semiconductor wire with strong spin-orbit coupling, and end up with a composite system that has low energy excitations (quasiparticles) that have properties like those of Majorana fermions.
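One can check this operator algebra with a toy model: represent a single fermionic mode by 2x2 matrices, form the Majorana combination \( \gamma = c + c^{\dagger} \), and verify that it is Hermitian (its own "antiparticle") and squares to the identity:

```python
# Checking the Majorana algebra from the paragraph above with 2x2 matrices for
# a single fermionic mode: c annihilates, c_dag creates, and the combination
# gamma = c + c_dag is self-adjoint with gamma^2 = identity.
import numpy as np

c = np.array([[0.0, 1.0],
              [0.0, 0.0]])        # annihilation operator: |1> -> |0>
c_dag = c.conj().T                # creation operator: |0> -> |1>

gamma = c + c_dag                 # a Majorana combination

print(np.allclose(gamma, gamma.conj().T))     # gamma is its own adjoint: True
print(np.allclose(gamma @ gamma, np.eye(2)))  # applying it twice gives the identity: True
```

So "creating" a Majorana twice really does leave the system exactly as it started, which is the sense in which the particle is its own antiparticle.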
So, if you had these funky quasiparticles in your system, how could you tell? What experiment could you do that would give you the relevant information? What knob could you turn and what could you measure that would confirm or deny their presence? That's the challenge (and the fun and the frustration) of experimental physics. There are only so many properties that can be measured in the lab, and only so many control parameters that can be tuned. Is it possible to be clever and find an experimental configuration and a measurement that give an unambiguous result, one that can only be explained in this case by Majorana modes?
In the particular experiment that received the lion's share of attention, the experimental signature was a "zero-bias peak" in the electrical conductance of these structures. The (differential) conductance is the slope of the \(I-V\) curve of an electrical device - at any given voltage (colloquially called "bias"), the (differential) conductance tells you how much more current you would get if you increased the voltage by a tiny amount. In this case, the experimentalists found a peak in the conductance near \( V = 0 \), and that peak stayed put at \(V = 0\) even when a magnetic field was varied quite a bit, and a gate voltage was used to tune the amount of charge in the semiconductor. This agreed well with predictions for the situation when there is a Majorana-like quasiparticle bound to the semiconductor/superconductor interface.
The question is, though, is that, by itself, sufficient to prove the existence of Majorana-like quasiparticles experimentally? According to this new paper, perhaps not. It looks like it's theoretically possible to have other (boring, conventional) quasiparticles that can form bound states at that interface that also give a zero-bias peak in the conductance. Hmm. Looks like it may well be necessary to look at other measurable quantities besides just the conductance to try to settle this once and for all. This is an important point that gets too little appreciation in popular treatments of physics. It's rare that you can measure the really interesting property of a system directly. Instead, you have to use the tools at your disposal to test the implications of the various possibilities.
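As a toy illustration of what "zero-bias peak" means in practice (with invented numbers, not the published data), here is the numerical extraction of \( dI/dV \) from an \( I-V \) curve whose conductance has a peak at zero bias:

```python
# Extracting the differential conductance dI/dV from an I-V curve numerically,
# using toy data: a Lorentzian conductance peak at V = 0 on a flat background
# (illustrative values, not the published measurement).
import numpy as np

v = np.linspace(-1e-3, 1e-3, 2001)            # bias voltage (V)
g0, g_peak, width = 1e-5, 5e-5, 1e-4          # background (S), peak height (S), width (V)
conductance = g0 + g_peak / (1 + (v / width) ** 2)   # Lorentzian peak at V = 0
current = np.cumsum(conductance) * (v[1] - v[0])     # I(V) = integral of G dV

didv = np.gradient(current, v)                # numerical differential conductance
v_peak = v[np.argmax(didv)]
print(v_peak)   # the recovered peak sits at (numerically near) zero bias
```

The experimental claim is that this peak stays pinned at \( V = 0 \) as field and gate voltage are varied; the new paper's point is that a pinned peak alone does not uniquely identify the quasiparticle responsible.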
Sunday, July 08, 2012
What is effective mass?
Given my previous post and the Higgs excitement, it's worth thinking a bit about what we mean by "effective mass" for charge carriers in solids. At the root of the concept is the idea that it is meaningful to describe the low energy (compared with the bandwidth, which turns out to be on the order of electron-volts) electronic excitations of (the many electrons in) solids as electron-like quasiparticles - quantum objects well described as occupying particular states of definite energy and momentum (for the experts, these states are approximate eigenstates of energy and momentum). One can look at those allowed states, and ask how energy \(E \) varies as a function of momentum \(\mathbf{p} \). If the leading variation is quadratic, then we can define an effective mass by \(E \approx p^{2}/2m^{*} \). Note that this doesn't have to be the situation. Near the "Dirac point" in graphene, where the occupied \(\pi \) electron band has its maximum energy and the unoccupied \(\pi \) band has its minimum energy, the energy of the quasiparticles goes linearly in momentum, analogous to what one would expect for ultrarelativistic particles in free space.
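As a concrete example, one can extract \( m^{*} \) from the curvature of a model band. For a generic 1D tight-binding dispersion \( E(k) = -2t\cos(ka) \) (a textbook band, not any specific material), expanding near the band bottom gives \( m^{*} = \hbar^{2}/(2ta^{2}) \), which a quick numerical sketch reproduces:

```python
# Extracting an effective mass from band curvature: for a 1D tight-binding band
# E(k) = -2 t cos(k a), expanding near k = 0 gives E ~ const + t a^2 k^2, and
# comparing with E = (hbar k)^2 / (2 m*) yields m* = hbar^2 / (2 t a^2).
import numpy as np

hbar = 1.054571817e-34      # J s
eV = 1.602176634e-19        # J
t = 1.0 * eV                # hopping energy ~ 1 eV (bandwidth scale); illustrative
a = 3e-10                   # lattice constant ~ 3 Angstroms; illustrative

def band(k):
    return -2.0 * t * np.cos(k * a)

# numerical second derivative of E(k) at the band bottom (k = 0)
dk = 1e6
curvature = (band(dk) - 2.0 * band(0.0) + band(-dk)) / dk**2   # ~ 2 t a^2
m_eff = hbar**2 / curvature

m_electron = 9.1093837015e-31
print(m_eff / m_electron)   # of order unity for eV-scale hopping
```

Narrow bands (small \( t \)) give heavy carriers and wide bands give light ones, which is the sense in which the "mass" here is a property of the band structure rather than of the electron itself.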
The actual situation is richer than this. In real space, we believe the universe to be invariant under continuous translational symmetry - that is, the properties of the universe don't depend on where we are. Translating ourselves a little to the right doesn't change the laws of nature. That invariance is what actually implies strict conservation of momentum. In the case of a periodic solid, we have a lower symmetry situation, with discrete translational symmetry - move over one lattice spacing, and you get back to the same physics. In that case, while true momentum is still conserved (the universe is what it is), the parameter that acts like momentum in describing the electronic excitations in the solid is only conserved if one allows the solid as a whole the chance to pick up momentum in certain amounts (proportional to the inverse of the lattice spacing).
More complicated still, when electron-electron interactions are important, say between the mobile electrons and others localized to the lattice, the spectrum of low energy states can be modified quite a bit. This can lead to the appearance of "heavy fermions", with effective masses hundreds of times larger than the free electron mass. Note that this doesn't mean that the real electrons are actually more massive. Pull one out of the solid and it's like any other electron. Rather, it means that the relationship between the energy of the electronic states and their momentum in the solid differs quite a bit from what you'd see in a free electron.
So, knowing this, how fundamental is mass? Could there be some underlying degrees of freedom of the universe, such that our standard model of particle physics is really an effective low-energy theory, and what we think of as mass really comes from the energy and momentum spectrum of that theory? In a sense that's one aspect of what something like string theory is supposed to do.
On a more nano-related note, this discussion highlights why certain abuses of the term effective mass annoy me. For example, it doesn't really make sense to talk about the effective mass of an electron tunneling through a dodecane molecule - 12 carbons do not make an infinite periodic system. You can use models where effective mass is a parameter in this sort of problem, but you shouldn't attach deep physical meaning to the number at the end of the day.
Friday, July 06, 2012
What is mass?
The (most likely) Higgs boson discovery brings up a distinction in my mind that seems to be getting overlooked in most of the popular press discussions of the CERN work. What do we mean as physicists when we talk about "mass"? In classical mechanics, there are in some sense two types of mass. There is gravitational mass - in Newtonian gravity, the gravitational force between two particles of masses \( m \) and \( M \) has the magnitude \(G m M/r^{2} \), where \( G \) is the gravitational constant and \(r \) is the distance between the particles. The force is attractive and acts along the line between the two particles. The Higgs boson has no (direct) connection to this at all.
There is also inertial mass, and this can be described in a couple of ways. The way we usually teach beginning students is that a total force of magnitude \(F \) exerted on an object (in an inertial frame of reference, but that's a detail) produces an acceleration \(a \equiv d^{2}r/dt^{2}\) that is linearly proportional to \(F \). Exert twice as much force and get twice as much acceleration. The constant of proportionality is the inertial mass \( m \), and we write all this in one form of Newton's Second Law of Motion, \( \mathbf{F} = m \mathbf{a} \). The more (inertially) massive something is, the smaller the acceleration for a given amount of force.
A more subtle way to define this would be to say that there is this thing called momentum, \(\mathbf{p} \), which we believe to be a conserved quantity in the universe. Empirically, momentum is connected with velocity. At low speeds (compared with \(c \), the speed of light), momentum is directly proportional to velocity, and the constant that connects them is the mass: \( \mathbf{p} = m \mathbf{v} \). (The full relativistic expression is \( \mathbf{p} = m \mathbf{v}/ \sqrt{1-v^{2}/c^{2}} \) ). The more massive something is, for a given speed, the more momentum it has (and the more it's going to pack a wallop when it hits you).
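A quick numerical sketch shows how the full relativistic expression departs from \( p = mv \) as the speed approaches \( c \):

```python
# Comparing the low-speed momentum p = m v with the full relativistic
# expression p = m v / sqrt(1 - v^2/c^2), using an electron as the example.
import math

c = 2.99792458e8            # speed of light (m/s)
m = 9.1093837015e-31        # electron mass (kg)

def momentum(v):
    """Relativistic momentum; reduces to m*v when v << c."""
    return m * v / math.sqrt(1.0 - (v / c) ** 2)

for v in (0.01 * c, 0.5 * c, 0.99 * c):
    ratio = momentum(v) / (m * v)   # factor by which p = m v underestimates
    print(f"v = {v/c:.2f} c  ->  p / (m v) = {ratio:.3f}")
```

At everyday speeds the correction factor is indistinguishable from 1, which is why \( \mathbf{p} = m \mathbf{v} \) serves perfectly well in introductory mechanics.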
The coupling of elementary particles to the Higgs field is supposed to determine this mass, the relationship between momentum and velocity (or equivalently, between momentum and energy). As far as we know, the inertial mass and the gravitational mass appear to be identical - this is the Equivalence Principle, and it's supported by a wealth of experiment (though there are always ideas out there for new tests and new limits).
Thursday, July 05, 2012
congratulations to my high energy colleagues
I'll post something else later now that I'm back from travel, but in the meantime, congratulations to my high energy colleagues on the discovery of what is most likely some form of the Higgs boson. A monumental effort. Let the Nobel speculation (about who should share it w/ Higgs, and whether it should be this year or wait for further confirmation) begin! Oh, and see the image at right (not an original) for my feelings about Leon Lederman's term for this particle. (If you don't get it, and don't mind bad language, see here at around the 4:50 mark.)