Once again the Breakthrough Prize and New Horizons Prize in fundamental physics are seeking nominations. See here. I have very mixed feelings about these prizes, given how the high energy theory components seem increasingly disconnected from experiment (and seem to consider that a feature rather than a bug).
On a related note, the Kavli Prizes are being awarded this Thursday. Past nanowinners are Millie Dresselhaus (2012), Don Eigler (love his current affiliation) and Nadrian Seeman (2010), and Louis Brus and Sumio Iijima (2008). Not exactly a bunch of underachievers. Place your bets. Whitesides? Alivisatos and Bawendi?
Update: Thomas Ebbesen (extraordinary transmission through deep subwavelength apertures, thanks to plasmons), Stefan Hell (stimulated emission depletion microscopy, for deep subwavelength fluorescence microscopy resolution), and John Pendry (perfect lenses and cloaking). Congratulations all around - all richly deserved. I do think that the Kavli folks are in a sweet spot for nano prizes, as there is a good-sized pool of outstanding people that has built up, few of whom have been honored already by the Nobel. This is a bit like the early days of the Nobel prize, though hopefully with much less political infighting (see this book if you really want to be disillusioned about the Nobel process in the early years).
Tuesday, May 27, 2014
Thursday, May 22, 2014
Workshop on structural and electronic instabilities in oxide nanostructures
I've spent the last two days at a fun "Physics at the Falls" workshop at the University at Buffalo. It's been cool learning about the impressive variety of physics at work in these systems. A few takeaways:
- With enough stainless steel and high tech equipment you can grow (and in situ characterize with everything from electron diffraction to photoemission, angle-resolved and otherwise) just about anything these days!
- There's a lot of pretty work getting done growing epitaxial complex oxides down to the single unit cell level, and a lot of accompanying extremely high resolution transmission electron microscopy.
- Untangling thermal effects from optical effects in nonequilibrium experiments can be tricky. Interesting to see that lower energy photons can be more efficient at kicking systems from one phase to another than photons much more energetic than any energy gap.
- There does seem to be some convergence on understanding LaAlO3/SrTiO3 (LAO/STO) oxide heterojunctions.
- We still don't understand superconductivity in strontium titanate, even though it's been known for decades.
- Orbitals really matter, when you are dealing with relatively localized electrons.
- Niagara Falls is very impressive!
Tuesday, May 20, 2014
Slow blogging + interesting links
The end of our academic year + travel + some major writing has cramped my blogging of late. Things should pick back up to a more regular pace in a couple of weeks. In the meantime, here are some links that caught my eye lately:
- On Sir Harold Kroto's website, here are some interesting lectures by Richard Feynman. It's absolutely worth browsing around the rest of the site, too - lots of cool videos.
- This preprint by Sean Hartnoll looks very interesting. There are materials out there that act like metals (in the sense of having lots of low energy excitations available, and an electrical resistivity that falls with decreasing temperature), but the electrons interact so strongly and in such a complex way that it no longer makes sense to think about "quasiparticles" that act basically like ordinary electrons. The challenge is, if the quasiparticle picture (which works spectacularly well for materials like gold, copper, aluminum, doped semiconductors) fails, what's the right way to treat these systems? This paper tries to look at what features would have to be there in such a system.
- This video is cute. The material used in this LED has a bandgap that apparently increases a fair bit upon cooling. As a result, the light emitted from the diode shifts toward the blue when the device is dipped in liquid nitrogen, and comes back toward the red when it's warmed.
- We still really don't understand triboelectricity, the "static electricity" you see when you rub a balloon against your hair or rub a glass rod with rabbit fur. News story here. It's amazing to me that we still don't know how this kind of charge transfer works, given that it was discovered thousands of years ago. (As Pauli said, "God made the bulk; surfaces were the work of the devil.")
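On the LED video a couple of links up: the size of that blueshift can be estimated from the empirical Varshni relation for the temperature dependence of a semiconductor bandgap, \(E_{g}(T) = E_{g}(0) - \alpha T^{2}/(T + \beta)\). A quick sketch below uses the standard GaAs parameters purely as a stand-in (the device in the video is presumably some other, visible-gap material), but the trend is generic:

```python
# Empirical Varshni form for a semiconductor bandgap vs. temperature:
#   Eg(T) = Eg(0) - alpha * T^2 / (T + beta)
# Parameters below are the standard GaAs values, used only as a stand-in;
# the actual LED material in the video is not specified.
EG0, ALPHA, BETA = 1.519, 5.405e-4, 204.0   # eV, eV/K, K

def eg(T):
    return EG0 - ALPHA * T**2 / (T + BETA)

def emission_nm(E):
    return 1239.84 / E   # hc = 1239.84 eV*nm

for T in (300.0, 77.0):
    print(T, round(eg(T), 4), round(emission_nm(eg(T)), 1))
# cooling from 300 K to 77 K opens the gap by ~85 meV,
# shifting the emission toward the blue, just as in the video
```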
Tuesday, May 06, 2014
What are the Kramers-Kronig relations, physically?
Let me pose a puzzle. Suppose you are in a completely dark room. You know that at some point in the future, someone will turn on a light in that room for a few minutes, and then turn it off later. Being a mathematically sophisticated person, it occurs to you that you could think about the time dependence of the electric field in the room. It's zero for a while, oscillating (because that's what light is) for a few minutes, and then zero again. Being clever, you think about Fourier transforming that time dependence and considering all the frequencies in there - the fact that the room right now is dark is actually due to the amazing cancellation of a whole bunch of frequency components! Therefore, you should be able to put on frequency-filtering glasses, block out some of those components, and suddenly be able to see in a dark room! Except that totally doesn't work, even in a completely classical world without photons. Why not?
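You can see numerically what goes wrong: the "glasses" would have to be an ideal frequency filter, and an ideal frequency filter is non-causal - removing spectral components makes the field nonzero at times when the room was actually dark, which no physical filter acting on light that hasn't arrived yet can do. A toy numpy sketch (arbitrary units, made-up pulse parameters):

```python
import numpy as np

# Toy model of the dark-room puzzle: a field that is zero except while the
# light is on, and "frequency-filtering glasses" applied to its spectrum.
t = np.linspace(-50.0, 50.0, 4096)
dt = t[1] - t[0]
E = np.where(np.abs(t) < 5.0, np.cos(2 * np.pi * t), 0.0)  # light on for |t| < 5

# Notch the carrier frequency out of the spectrum and transform back.
f = np.fft.fftfreq(len(t), d=dt)
spec = np.fft.fft(E)
spec_filtered = np.where(np.abs(np.abs(f) - 1.0) < 0.05, 0.0, spec)
E_filtered = np.fft.ifft(spec_filtered).real

# Long before the light turns on, the true field is exactly zero...
print(np.max(np.abs(E[t < -20.0])))           # 0.0
# ...but the filtered field is not: the ideal notch filter is non-causal,
# so no physical pair of glasses can implement it.
print(np.max(np.abs(E_filtered[t < -20.0])))
```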
Think about a material placed in a time-varying (say, harmonically varying, because that's what physicists like) electric field. The material responds in some way - electrons rearrange themselves within the material in response to that electric field; if the field is slow enough, atoms or groups of atoms can even shift their positions. The result is a polarization density (electric dipole moment per unit volume) \(\mathbf{P} = \epsilon_{0}\chi_{e}\mathbf{E}\). Here \( \chi_{e}\) is the electric susceptibility (generally a tensor, meaning that \(\mathbf{P}\) and \(\mathbf{E}\) don't have to point in the same direction). The dielectric function of a material is defined \(\epsilon \equiv \epsilon_{0}(1 + \chi_{e})\). In general, the response of the material depends on the frequency \(\omega\) of the electric field, and it can be out of phase with the external electric field. This is described in mathematical shorthand by considering \(\epsilon(\omega)\) to be complex, having real and imaginary components.
The Kramers-Kronig relations are fairly intimidating-looking integral expressions that describe relationships that must be obeyed between the real and imaginary components of \(\epsilon(\omega)\). These relationships come from the fact that \(\mathbf{P}\) now can only depend on \(\mathbf{E}\) in the past, up until now. This restriction of causality, plus the properties of Fourier transforms, is what leads to the K-K integrals. The wikipedia page about this actually has a very nice description here. So, while the math is not something that most people would think of as obvious, the basic idea (electromagnetic fields influence materials in a causal way, and that places constraints on how materials can respond as a function of frequency) is not too surprising.
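If you want to see the K-K machinery actually work, here's a small numerical check using a Lorentz oscillator (a standard causal model for the susceptibility, with toy parameter values): reconstruct the real part of \(\chi_{e}(\omega)\) from its imaginary part alone, via a crude principal-value integral, and compare with the exact answer.

```python
import numpy as np

# Lorentz oscillator: a standard causal model for chi_e(omega).
w0, gamma, wp = 1.0, 0.2, 1.0          # resonance, damping, "plasma" frequency (toy units)
w = np.linspace(1e-4, 20.0, 20001)     # positive-frequency grid
chi = wp**2 / (w0**2 - w**2 - 1j * gamma * w)

def kk_real_from_imag(w, im_chi, w_eval):
    """Re chi(w_eval) from Im chi alone via the K-K integral
    (2/pi) P-int of w' Im chi(w') / (w'^2 - w_eval^2) dw'."""
    dw = w[1] - w[0]
    with np.errstate(divide="ignore", invalid="ignore"):
        integrand = w * im_chi / (w**2 - w_eval**2)
    # crude principal value: drop grid points too close to the singularity
    mask = np.abs(w - w_eval) > 2 * dw
    return (2.0 / np.pi) * np.sum(integrand[mask]) * dw

# Compare the K-K reconstruction to the exact real part at a few frequencies.
for w_eval in (0.5, 1.5, 3.0):
    exact = (wp**2 / (w0**2 - w_eval**2 - 1j * gamma * w_eval)).real
    print(w_eval, exact, kk_real_from_imag(w, chi.imag, w_eval))
```

The agreement is at the percent level even with this crude quadrature; the point is that the imaginary (absorptive) part really does pin down the real (reactive) part, as causality demands.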
Monday, May 05, 2014
National Nano Infrastructure Network - feedback requested
I wrote before about the saga of the NNIN and how painful the outcome was this year - no awards made, after thousands of person-hours invested in the writing and reviewing of proposals. Well, NSF is requesting input, in part because they want guidance on structuring the new solicitation to come out this autumn. So, please give your input if you're in the US and think this kind of support for shared infrastructure is valuable. By the way, if it seems like NSF had already gone through an exercise like this before the last solicitation (the one where they didn't fund anyone), you're right - they even had a two-day workshop and produced a report.
Friday, May 02, 2014
Recurring themes in (condensed matter/nano) physics: Fermi's Golden Rule
Very often in condensed matter (or atomic) physics we are interested in trying to calculate the rate of some quantum process - this could be the absorption of photons by an isolated atom or a solid, for example. In (advanced) undergraduate quantum mechanics, we can apply time-dependent perturbation theory to do such a calculation. Typically you assume that the system starts in some initial state \( |i\rangle \), is subjected to some perturbation \(V\) that turns on at time \(t = 0\), and ends up in final state \( |f\rangle \). If \(V\) has a harmonic time dependence with some (angular) frequency \(\omega\), then you can do a nice bit of math that calculates the rate at which this process happens. You discover that at long times the only allowed transitions are the ones where the energies of the initial and final states differ by \(\hbar \omega\), and that the rate of that process is \( (2\pi/\hbar) |\langle i |V| f\rangle|^{2} \rho \), where \(\rho\) is the number of states per unit energy per unit volume that satisfy the energy constraint.
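As a sanity check on where that rate comes from, you can sum the first-order perturbation theory probability \(P_{f}(t) = |V_{fi}/\hbar|^{2}\,[\sin(\Delta t/2)/(\Delta/2)]^{2}\) (with \(\Delta\) the detuning from the energy-conserving condition) over a dense band of final states and recover the linear-in-\(t\) growth of the golden rule. A toy numerical version, with all parameter values made up and units chosen so \(\hbar = 1\):

```python
import numpy as np

hbar = 1.0        # units with hbar = 1
V = 0.01          # coupling matrix element, assumed the same for all final states
rho = 200.0       # density of final states per unit energy
t = 50.0          # elapsed time, long compared to the inverse bandwidth

# Quasi-continuum of final states, labeled by detuning delta = (E_f - E_i)/hbar - omega.
delta = np.linspace(-5.0, 5.0, 40001)
dE = hbar * (delta[1] - delta[0])

# First-order TDPT: P_f(t) = |V/hbar|^2 [sin(delta t/2)/(delta/2)]^2
# (np.sinc(x) = sin(pi x)/(pi x), hence the 2*pi in the argument).
P_f = (V * t / hbar)**2 * np.sinc(delta * t / (2 * np.pi))**2

rate_numeric = np.sum(P_f) * rho * dE / t      # total probability / elapsed time
rate_golden = (2 * np.pi / hbar) * V**2 * rho  # Fermi's golden rule

print(rate_numeric, rate_golden)               # agree to ~1% or better
```

The sharply peaked \(\sin^{2}\) factor is what enforces energy conservation at long times, and integrating it over the quasi-continuum is exactly where the \(2\pi/\hbar\) and the \(\rho\) come from.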
This result, associated with Enrico Fermi, shows up over and over, with some common motifs in condensed matter and nanoscale physics, at least in spirit (that is, sometimes people apply it heuristically even though the perturbation may not be harmonic, for example). First, the \( |\langle i |V| f\rangle|^{2} \) term is what gives us selection rules. If you think about optical transitions in atoms, this is why you get electric dipole transitions from the 2\(p\) state of hydrogen to the 1\(s\) state, rather than from the 2\(s\) state; in the latter, this quantity is zero. In crystalline solids, if the initial and final states are Bloch waves, it's the periodicity of the lattice that makes this quantity zero unless (crystal) momentum is conserved. This is the root of the idea that processes ordinarily forbidden in macroscopic crystals can sometimes take place in nanocrystals or at surfaces.
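The hydrogen example is easy to check directly: the dipole matrix element factorizes into radial and angular pieces, and for \(2s \rightarrow 1s\) the angular integral of \(\cos\theta \sin\theta\) vanishes by parity, while for \(2p_{z} \rightarrow 1s\) it does not. A quick numerical sketch (unnormalized wavefunctions, lengths in Bohr radii, since only whether the integral vanishes matters here):

```python
import numpy as np

# Unnormalized hydrogen wavefunctions (lengths in Bohr radii); we only care
# whether the dipole matrix element vanishes, not its absolute scale.
r = np.linspace(1e-6, 40.0, 200001)
th = np.linspace(0.0, np.pi, 2001)
dr, dth = r[1] - r[0], th[1] - th[0]

R_1s = np.exp(-r)
R_2s = (2.0 - r) * np.exp(-r / 2)
R_2p = r * np.exp(-r / 2)

def dipole_z(Ra, Rb, ang_b):
    """<a| z |b> with z = r cos(theta); radial and angular factors separate."""
    radial = np.sum(Ra * r * Rb * r**2) * dr
    angular = np.sum(np.cos(th) * ang_b * np.sin(th)) * dth
    return radial * angular

m_2s_1s = dipole_z(R_1s, R_2s, np.ones_like(th))  # angular integral of cos*sin -> 0
m_2p_1s = dipole_z(R_1s, R_2p, np.cos(th))        # allowed: nonzero

print(m_2s_1s, m_2p_1s)  # ~0 and ~2.1
```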
Similarly, meso- and nanoscale systems can greatly constrain \( \rho \). One reason that you can get very long mean free paths for charge carriers in semiconductor nanowires, carbon nanotubes, graphene, at the edges of quantum Hall systems, etc., is that the density of states available into which carriers can scatter is very restricted. Similarly, enhancing \(\rho\) can pay dividends - this is the source of the Purcell effect, where radiative transition rates can be greatly enhanced by increasing the photon density of states, and is part of the reason for enhanced rates of optical processes near plasmonic nanostructures.
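To put a number on the Purcell effect mentioned above: the standard on-resonance enhancement factor is \(F_{P} = (3/4\pi^{2})(\lambda/n)^{3}(Q/V)\), with \(Q\) the cavity quality factor and \(V\) the mode volume. The values below are illustrative choices, not from any particular experiment:

```python
import math

# Purcell factor for an emitter on resonance with a cavity mode:
#   F_P = (3 / (4 pi^2)) * (lambda / n)^3 * (Q / V)
# All numbers below are illustrative, not from the post.
lam = 600e-9             # free-space emission wavelength (m)
n = 1.5                  # refractive index of the cavity medium
Q = 1000.0               # cavity quality factor
V = 0.1 * (lam / n)**3   # mode volume of a tight, wavelength-scale cavity

F_P = (3.0 / (4.0 * math.pi**2)) * (lam / n)**3 * Q / V
print(F_P)  # ~760: the radiative rate is enhanced by the boosted photon DOS
```

Even a modest-\(Q\) cavity gives enormous enhancement once the mode volume shrinks toward \((\lambda/n)^{3}\), which is why wavelength-scale (and plasmonic, even smaller) structures are so effective at modifying emission rates.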