Wednesday, May 24, 2017

The two recent posts about the Seebeck effect and hot electrons give some context so that I can talk about a paper we published last month.
We started out playing around with metal nanowires, measuring the open-circuit voltage across those wires (that is, hooking up a voltmeter across the device, which nominally doesn't allow current to flow) as a function of where we illuminated them with a near-IR laser. Because the metal absorbs some of the light, the laser spot acts like a local heat source (though figuring out the temperature profile requires some modeling of the heat transfer processes). As mentioned here, particles tend to diffuse from hot locations to cold locations; in an open circuit, a voltage builds up to balance out this tendency, because in the steady state no net current flows in an open circuit; and in a metal, the way electron motion and scattering depend on the energy of the electrons sets the magnitude and sign of this process. If the metal is sufficiently nanoscale that boundary scattering matters, you end up with a thermoelectric response that depends on the metal geometry. The end result is shown in the left portion of the figure. If you illuminate the center of the metal wire, you measure no net voltage - you shouldn't, because the whole system is symmetric. The junction where the wire fans out to a bigger pad acts like a thermocouple because of that boundary scattering, and if you illuminate it you get a net thermoelectric voltage (the sign depends on how you pick ground and which end you're illuminating). Bottom line: Illumination heats the electrons a bit (say a few kelvin), and you get a thermoelectric voltage that offsets the tendency of the electrons to diffuse due to the temperature gradient. In this system, the size of the effect is small - microvolts at our illumination conditions.
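To get a sense of the scale (with illustrative numbers, not the actual device parameters), the single-metal "thermocouple" voltage is set by the difference in Seebeck coefficients between the narrow wire and the wide pad times the local temperature rise:
\[ \Delta V \approx \left(S_{\mathrm{wire}} - S_{\mathrm{pad}}\right) \Delta T. \]
If boundary scattering shifts the wire's Seebeck coefficient by a fraction of a \(\mu\mathrm{V/K}\) relative to the pad, say \(0.5~\mu\mathrm{V/K}\), and the laser warms the electrons by a few kelvin, you get a signal of order a microvolt - consistent with the microvolt scale mentioned above.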
Now we can take that same nanowire and break it to make a tunnel junction somewhere in there - a gap between the two electrodes where the electrons are able to "tunnel" across from one side to the other. When we illuminate the tunnel junction, we now see open-circuit photovoltages that are much larger, and very localized to the gap region. So, what is going on here? The physics is related, but not true thermoelectricity (which assumes that it always makes sense to define a temperature everywhere). What we believe is happening is something that was discussed theoretically here, and was reported in molecule-containing junctions here. As I said when talking about hot electrons, when light gets absorbed, it is possible to kick electrons way up in energy. Usually that energy gets dissipated by being spread among other electrons very quickly. However, if hot electrons encounter the tunnel junction before they've lost most of that energy, they have a higher likelihood of getting across, because quantum tunneling is energy-dependent. Producing more hot electrons on one side of the junction than the other will therefore drive a tunneling current. We still have an open circuit, though, so some voltage has to build up so that the net current in the steady state adds up to zero. Bottom line: Illumination here can drive a "hot" electron tunneling current, and you get a photovoltage to offset that process. This isn't strictly a thermoelectric effect, because the electrons aren't thermally distributed - it's the short-lived, high-energy tail that matters most.
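To see why the energy dependence of tunneling matters so much, here is a minimal toy model (my own sketch with illustrative numbers, not the analysis from the paper): treat the gap as a rectangular barrier of height \(\Phi\) above the Fermi level and width \(d\), and use the WKB estimate for the transmission, \(\mathcal{T}(E) \sim e^{-2\kappa d}\) with \(\kappa = \sqrt{2m(\Phi - E)}/\hbar\). An electron 1-2 eV above the Fermi level gets across far more easily than one at the top of the thermal distribution:

import math

# Toy WKB transmission through a rectangular tunneling barrier.
# All numbers are illustrative, not parameters from the actual devices.
HBAR = 1.055e-34   # J*s
M_E = 9.109e-31    # electron mass, kg
EV = 1.602e-19     # J per eV

def transmission(energy_ev, barrier_ev=4.0, width_nm=1.0):
    """Approximate WKB transmission exp(-2*kappa*d) for an electron at
    energy_ev (measured from the Fermi level) hitting a rectangular
    barrier of height barrier_ev (eV) and width width_nm (nm)."""
    deficit = max(barrier_ev - energy_ev, 0.0)          # how far below the barrier top
    kappa = math.sqrt(2 * M_E * deficit * EV) / HBAR    # decay constant, 1/m
    return math.exp(-2 * kappa * width_nm * 1e-9)

# A "hot" electron well above the Fermi level tunnels far more readily
# than one at the top of the thermal distribution:
for energy in (0.0, 0.5, 1.0, 1.5):
    print(f"E = {energy:.1f} eV above E_F: transmission ~ {transmission(energy):.2e}")

With more hot electrons generated on one side of the gap than the other, that exponential energy dependence skews the net tunneling current in one direction, and in an open circuit the photovoltage grows until the current balances out.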
It's fun to think about ways to try to better understand and maximize such effects, perhaps for applications in photodetection or other technologies....
Friday, May 19, 2017
What are "hot" electrons?
In basic chemistry or introductory quantum mechanics, you learn about the idea of energy levels for electrons. If you throw a bunch of electrons into some system, you also learn about the ground state, the lowest energy state of the whole system, where the electrons fill up* the levels from the bottom up, in accord with the Pauli principle. In statistical physics, there are often a whole lot of energy levels and a whole lot of electrons (like \(10^{22}\) per cc), so we have to talk about distribution functions, and how many electrons are in the levels with energies between \(E\) and \(E + dE\). In thermal equilibrium (meaning our system of interest is free to exchange energy in the form of heat with some large reservoir described by a well-defined temperature \(T\)), the distribution of electrons as a function of energy is given by the Fermi-Dirac distribution.
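For reference, the Fermi-Dirac distribution gives the average occupation of a level at energy \(E\) as
\[ f(E) = \frac{1}{e^{(E-\mu)/k_{\mathrm{B}}T} + 1}, \]
where \(\mu\) is the chemical potential (essentially the Fermi energy for a metal at low temperatures). At \(T = 0\) this is a step function - every level below \(\mu\) filled, every level above it empty - and at finite \(T\) the step is smeared out over an energy window of a few \(k_{\mathrm{B}}T\).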
So, what are "hot" electrons? If we have a system driven out of equilibrium, it's possible to have the electrons arranged in a non-thermal (non-FD distribution!) way. Two examples are of particular interest at the nanoscale. In a transistor, say, or other nanoelectronic device, it is possible to apply a voltage across the system so that \(eV \gg k_{\mathrm{B}}T\) and inject charge carriers at energies well above the thermally distributed population. Often electron-electron scattering on the 10-100 fs timescale redistributes the energy across the electrons, restoring a thermal distribution at some higher effective temperature (and on longer timescales, that energy cascades down into the vibrations of the lattice). Electrons in a metal like Au at the top of the distribution are typically moving at speeds of \(\sim 10^{6}\) m/s (!!), so that means that near where the current is injected, on distance scales like 10-100 nm, there can be "hot" electrons well above the FD distribution.
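That distance scale follows directly from the two numbers above: the hot carriers travel at roughly the Fermi velocity for roughly one electron-electron scattering time before thermalizing,
\[ \ell_{\mathrm{hot}} \sim v_{\mathrm{F}}\,\tau_{ee} \approx \left(10^{6}~\mathrm{m/s}\right) \times \left(10\text{-}100~\mathrm{fs}\right) \approx 10\text{-}100~\mathrm{nm}. \]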
The other key way to generate "hot" electrons is by optical absorption. A visible photon (perhaps a green one with an energy \(\hbar \omega\) of about 2.3 eV) can be absorbed by a metal or a semiconductor, and this can excite an electron to an energy \(\hbar \omega\) above the top of the FD distribution. Often, on the 10-100 fs timescale, as above, that energy gets redistributed among many electrons, and then later into the lattice. That's heating by optical absorption. In recent years, there has been an enormous amount of interest in trying to capture and use those hot electrons or their energy before that energy has a chance to become converted to heat. See here, for instance, for thoughts about solar energy harvesting, or here for a discussion of hot electron photochemistry. Nanoscale systems are of great interest in this field for several reasons, including the essential fact that hot electrons generated in them can reach the system surface or boundary in the crucial timespan before energy relaxation.
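(A handy conversion: the photon energy in eV is roughly \(1240/\lambda\) with \(\lambda\) in nm, so a 532 nm green photon carries about 2.3 eV - nearly a hundred times \(k_{\mathrm{B}}T \approx 25\) meV at room temperature, which sets the scale of thermal smearing in the FD distribution.)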
(Talking about this and thermoelectricity now sets the stage so I can talk about our recent paper in an upcoming post.)
*Really, the whole many-body electron wavefunction has to be antisymmetric under the exchange of any two electrons, so it's wrong to talk as if one particular electron is sitting in one particular state, but let's ignore that for now. Also, in general, the energy levels of the many-electron system actually depend on the number and arrangement of the electrons in the system (correlation effects!), but let's ignore that, too.
Tuesday, May 16, 2017
More coming, soon.
I will be posting more soon. I'm in the midst of finally shifting my group webpage to a more modern design. In the meantime, if there are requests for particular topics, please put them in the comments and I'll see what I can do.
Update: Victory. After a battle with weird permissions issues associated with the way Rice does webhosting, it's up here: natelson.web.rice.edu/group.html
Still a few things that should be updated and cleaned up (including my personal homepage), but the major work is done.
Tuesday, May 09, 2017
Brief items
Some interesting items of note:
- Gil Refael at Caltech has a discussion going on the Institute for Quantum Information and Matter blog about the content of "modern physics" undergraduate courses. The dilemma, as usual, is how to get exciting, genuinely modern physics developments into an already-packed undergrad curriculum.
- The variety and quality of 3D-printed materials continues to grow and impress. Last month a team of folks from Karlsruhe demonstrated very nice printing of fused silica (after some processing). Then last week I ran across this little toy. I want one. (Actually, I want to know how much they cost without getting on their sales engineer call list.) We very recently acquired one of these at Rice for our shared equipment facility, thanks to the generous support of the NSF MRI program. There are reasons to be skeptical that additive manufacturing will scale in such a way as to have enormous impact, but it sure is cool and it's making impressive progress.
- There is a news release about our latest paper that has been picked up by a few places, including the NSF's electronic newsletter. I'll write more about that very soon.
- The NSF and the SRC are having a joint program in "SemiSynBio", trying to work at the interface of semiconductor devices and synthetic biology to do information processing and storage. That's some far out stuff for the SRC - they're usually pretty conservative.
- Don Lincoln has won the AIP's 2017 Gemant Award for his work presenting science to the public - congratulations! You have likely seen his videos put out by Fermilab - they're frequently featured on ZapperZ's blog.
Friday, May 05, 2017
What is thermoelectricity?
I noticed I'd never written up anything about thermoelectricity, and while the wikipedia entry is rather good, it couldn't hurt to have another take on the concept. Thermoelectricity is the mutual interaction of the flow of heat and the flow of charge - this includes creating a voltage gradient by applying a temperature gradient (the Seebeck effect) and driving a heating or cooling thermal flow by pushing an electrical current (the Peltier effect). Recently there have been new generalizations, like using a temperature gradient to drive a net accumulation of electronic spin (the spin Seebeck effect).
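In equations (using a common sign convention), the Seebeck effect relates the electric field that develops in an open circuit to the temperature gradient, and the Peltier effect relates the heat current carried along with a charge current:
\[ \mathbf{E} = S\,\nabla T, \qquad \dot{Q} = \Pi I, \qquad \Pi = S\,T, \]
where \(S\) is the Seebeck coefficient, \(\Pi\) is the Peltier coefficient, and the last relation (due to Kelvin and Onsager) ties the two effects together.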
First, the basic physics. To grossly oversimplify, all other things being equal, particles tend to diffuse from hot locations to cold locations. (This is not entirely obvious in generality, at least not to me, from our definitions of temperature or chemical potential, and clearly in some situations there are still research questions about this. There is certainly a hand-waving argument that hotter particles, be they molecules in a gas or electrons in a solid, tend to have higher kinetic energies, and therefore tend to diffuse more rapidly. That's basically the argument made here.)
Let's take a bar of a conductor and impose a temperature gradient across it. The mobile charge carriers will tend to diffuse away from the hot end. Moreover, there will be a net flux of lattice vibrations (phonons) away from the hot end. Those phonons can also tend to scatter charge carriers - an effect called phonon drag. For an isolated bar, though, there can't be any net current, so a voltage gradient develops such that the drift current balances out the diffusion tendency. This is the Seebeck effect, and the Seebeck coefficient is the constant of proportionality between the temperature gradient and the voltage gradient. If you hook up two materials with different (known) Seebeck coefficients as shown, you make a thermocouple and can use the thermoelectric voltage it generates as a thermometer.
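Concretely, for a thermocouple made of legs A and B with the junction at \(T_{\mathrm{hot}}\) and the measurement leads at \(T_{\mathrm{cold}}\), the open-circuit voltage is
\[ V = \int_{T_{\mathrm{cold}}}^{T_{\mathrm{hot}}} \left(S_{A} - S_{B}\right) dT \approx \left(S_{A} - S_{B}\right)\,\Delta T \]
when the Seebeck coefficients don't vary much over that range. For a standard type K (chromel-alumel) thermocouple, \(S_{A} - S_{B} \approx 41~\mu\mathrm{V/K}\) near room temperature, so a 10 K difference gives roughly 0.4 mV.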
Ignoring the phonon drag bit, the Seebeck coefficient depends on particular material properties - the sign of the charge carriers (thermoelectric measurements are one way to tell if your system is conducting via electrons or holes, leading to some dramatic effects in quantum dots), and the energy dependence of their conductivity (which wraps up the band structure of the material along with extrinsic factors like the mean free path for scattering off impurities and boundaries).
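For a degenerate conductor, that energy dependence shows up explicitly in the Mott formula for the Seebeck coefficient (written here in a common sign convention):
\[ S = -\frac{\pi^{2} k_{\mathrm{B}}^{2} T}{3 e}\, \left. \frac{\partial \ln \sigma(E)}{\partial E}\right|_{E = E_{\mathrm{F}}}, \]
where \(\sigma(E)\) is the energy-dependent conductivity and \(e\) is the elementary charge. Anything that modifies how the conductivity varies with energy near the Fermi level - band structure, impurity scattering, boundary scattering - modifies \(S\).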
Because of this dependence on extrinsic factors, it is possible to manipulate the Seebeck coefficient through nanoscale structuring or alteration of materials. Using boundary scattering as a tuning parameter for the mean free path is enough to let you make thermocouples just by controlling the geometry of a single metal. This has been pointed out here and here, and in our own group we have seen those effects here. Hopefully I'll have time to write more on this later....
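As a toy illustration of that tuning knob (a sketch of my own with made-up, free-electron-like parameters, not the modeling from the papers linked above), you can combine Matthiessen's rule for the effective mean free path, \(1/\ell_{\mathrm{eff}} = 1/\ell_{\mathrm{bulk}}(E) + 1/w\), with the Mott formula above: when the wire width \(w\) limits the mean free path, the energy dependence of the scattering is partly washed out, and \(S\) shifts relative to the wide-pad value.

import math

K_B = 1.381e-23       # Boltzmann constant, J/K
E_CHARGE = 1.602e-19  # elementary charge, C
E_F = 5.5 * E_CHARGE  # Fermi energy ~5.5 eV, loosely gold-like (illustrative)
T = 300.0             # temperature, K

def mean_free_path(E, width, l_bulk_at_ef=40e-9, p=1.5):
    # Toy bulk mean free path that grows with energy as E^p, combined with
    # an energy-independent boundary term (the wire width) via Matthiessen's rule.
    l_bulk = l_bulk_at_ef * (E / E_F) ** p
    return 1.0 / (1.0 / l_bulk + 1.0 / width)

def seebeck(width):
    # Mott-formula estimate of S for a toy sigma(E) proportional to E * l_eff(E),
    # using a symmetric numerical derivative of ln(sigma) at the Fermi energy.
    dE = 1e-3 * E_F
    sigma = lambda E: E * mean_free_path(E, width)
    dln_sigma_dE = (math.log(sigma(E_F + dE)) - math.log(sigma(E_F - dE))) / (2 * dE)
    return -(math.pi ** 2 * K_B ** 2 * T / (3 * E_CHARGE)) * dln_sigma_dE

for w in (15e-9, 100e-9, 1e-6):  # narrow wire, wider wire, essentially bulk
    print(f"width = {w * 1e9:7.1f} nm: S ~ {seebeck(w) * 1e6:+.2f} uV/K")

The absolute numbers (and even the sign) shouldn't be taken seriously for a real metal like gold, where band structure details matter, but the trend is the point: two pieces of the same metal with different geometries can have measurably different Seebeck coefficients, which is all you need to make a single-metal thermocouple.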
(By the way, as I write this, Amazon is having some kind of sale on my book, at $19 below publisher list price. No idea why or how long that will last, but I thought I'd point it out. I'll delete this text when that expires.)