A blog about condensed matter and nanoscale physics. Why should high energy and astro folks have all the fun?
Sunday, November 30, 2008
Words of advice about giving talks
I know that there are many, many resources out there on the web about how to give scientific talks (see here (pdf), here, here, and here, for example). Still, I have a few pointers to suggest, based on some recent talks that I've seen.
- Know your audience. If you're giving a seminar, remember that you need to give an introduction that is appropriate for first-year graduate students. If you're giving a colloquium, remember that you're facing a diverse crowd that could include (in a physics department) astrophysicists, biophysicists, high energy physicists, etc., as well as their graduate students. Pitch your talk appropriately. This is (at least) doubly important if you're giving a job talk, and as my postdoctoral mentor used to point out, every talk you give is potentially a job talk.
- Know your time constraints. Don't bring 140 slides for a 50 minute talk, and don't go way over the allotted time. In fact, for an hour talk slot I'd say aim for 50 minutes.
- Avoid jargon; if acronyms are necessary, define them. Just because an acronym or term may be common in your sub-field, don't assume that everyone knows it. Just as most condensed matter people don't know what pseudorapidity means to a high energy physicist, most high energy physicists don't know what ARPES or XAFS are.
- Minimize equations, even if (especially if) you're a theorist. You can always have a backup slide with enough math on it to make people's eyes bleed, if you want. For a main slide in a talk, no one (not even the experts) is going to get much out of a ton of equations. If you do have to show equations, give a physical interpretation for them.
- Don't show big scanned text passages from papers. No one is going to read them.
- Explain the big picture. Why is this work interesting? You'd better have an answer that will be intelligible to a non-specialist. Even better, think about how you would explain your work and the point behind it to a sophomore.
- If you're giving a ten-minute talk, don't spend two minutes showing and explaining an outline.
- Avoid technology party fouls. Make sure that your technology works. Make sure that your fonts are readable and correct. Too many colors, too much animation, too many cutesy transitions - all of these things are distracting.
- Make sure to repeat a question back to the questioner. This helps everyone - the semi-sleeping audience gets to hear what was asked, and you get to make sure that you're actually understanding the question correctly. No one wins when the speaker and questioner are talking at cross-purposes.
Wednesday, November 26, 2008
Hard times.
Wow. I guess those rumors about Harvard getting burned playing hedge fund games are true. They're putting in place a staff hiring freeze and getting ready to cancel faculty searches. Steps like that aren't surprising at, e.g., public universities in states hit hard by the housing crunch or ensuing economic crisis, but for a university with an endowment bigger than the GDP of some countries, this seems rather drastic. Still, it's hard to have too much sympathy for them, since their endowment is coming off a high of nearly $37B.
Tuesday, November 25, 2008
A (serious) modest proposal
Hopefully someone in the vast (ahem.) readership of this blog will pass this along to someone with connections in the Obama transition team. I've already submitted this idea to change.gov, but who knows the rate at which that gets read.
As part of the forthcoming major economic stimulus package, I propose that the Obama administration fully fund the America Competes initiative immediately. If the goal of the package is to stimulate the economy while doing something for the long-term health of the country (e.g., creating jobs while fixing roads, bridges, etc.), then funding basic research via the various agencies is a great thing to do. Think about it: the US spends less on science research, as a percentage of GDP, than most of the rest of the developed world. Rectifying that to some degree would (a) help the long-term prospects for technological innovation in the US; (b) create jobs; (c) support the goal of developing energy-related technologies; (d) support our universities, many of which are getting hammered by falling state revenues and/or poor endowment returns. Best of all, you could do all of this and it would be a freakin' bargain! You could double the research funding in NSF, NIH, DOE, NASA, and NIST, and not even come close to the amount of money we've already given to AIG. I'm suggesting something far more modest and much less disruptive. Seriously, ask yourself what's better for the long-term health of the country. Cutting basic science to pay for propping up Goldman Sachs is perverse.
Update: If you think that this is a good idea, I encourage you to submit your suggestion here, here, and/or here.
Monday, November 24, 2008
Spin
Many particles possess an internal degree of freedom called "spin" that is an intrinsic amount of angular momentum associated with that particle. The name is meant to evoke a spinning top, which has some rotational angular momentum about its axis when, well, spinning. Electrons have "spin 1/2", meaning that if you pick a convenient axis of reference ("quantization axis") that we'll call z, the z-component of the electron's spin angular momentum is either +1/2 hbar or -1/2 hbar. All too often we treat spin in a rather cavalier way.

When people talk about "spintronics", they are interested in using the spin degree of freedom of electrons to store and move information, rather than using the charge as in conventional electronics. One complication is that while charge is strictly conserved, spin is not. If you start off with a population of spin-aligned electrons and inject them into a typical solid, over time the spin orientation of those electrons will become randomized. Now, angular momentum is strictly conserved, so this relaxation of the electron spins must coincide with a transfer of angular momentum to the rest of the solid.

Feynman pointed this out (somewhere in vol. III of his lectures on physics) - if you fire a stream of spin-polarized electrons into a brick hanging on the end of a thread, you are really applying a torque to the brick since you are supplying a flow of angular momentum into it, and the thread will twist to apply a balancing torque. Well, Zolfagharkhani et al. have actually gone and done this experiment. They use a ferromagnetic wire to supply a polarized spin current and an extremely sensitive nanomechanical torsional oscillator to measure the resulting torque. Very nice stuff.
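To get a sense of why the detection has to be so sensitive, here's a minimal back-of-the-envelope sketch (my own illustrative numbers, not values from the paper): each electron carries spin angular momentum hbar/2, so a fully polarized current I delivers angular momentum at a rate - that is, a torque - of (I/e)(hbar/2).

```python
# Back-of-the-envelope estimate (my own, not from the Zolfagharkhani et al. paper):
# the torque carried by a spin-polarized electron current is the rate of angular
# momentum flow, tau = P * (I / e) * (hbar / 2), with P the degree of polarization.

HBAR = 1.054571817e-34      # J*s
E_CHARGE = 1.602176634e-19  # C

def spin_torque(current_amps: float, polarization: float = 1.0) -> float:
    """Torque (N*m) delivered by a spin-polarized charge current."""
    electrons_per_second = current_amps / E_CHARGE
    return polarization * electrons_per_second * HBAR / 2.0

if __name__ == "__main__":
    for current in (1e-9, 1e-6):  # 1 nA and 1 uA, purely illustrative
        print(f"I = {current:.0e} A  ->  torque ~ {spin_torque(current):.2e} N*m")
```

Even a microamp of fully polarized current corresponds to a torque of only ~3e-22 N*m, which is why a nanomechanical torsional oscillator is the right tool for the job.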
Thursday, November 20, 2008
Nature Journal Club
My media onslaught continues. This past week I had a Journal Club contribution in Nature, which was fun and a nice opportunity to reach a wider audience. Here's a version of it before it was (by necessity) trimmed and tweaked, with added hyperlinks....
Tunable charge densities become very large, with super consequences
The electronic properties of materials depend dramatically on the density of mobile charge carriers. One way to tune that density is through doping, the controlled addition of impurity atoms or molecules that either donate or take up an electron from the rest of the material. Unfortunately, doping also leads to charged dopants that can act as scattering sites.
Fortunately, there is a way to change the carrier concentration without doping. In 1925 J. E. Lilienfeld first proposed what is now called the “field effect”, in which the sample material of interest is used as one electrode of a capacitor. When a voltage is applied to the other (“gate”) electrode, equal and opposite charge densities accumulate on the gate and sample surfaces, provided charge can move in the sample without getting trapped. While the density of charge that can be accumulated this way is rather limited by the properties of the insulating spacer between the gate and the sample, the field effect has been incredibly useful in transistors, serving as the basis for modern consumer electronics.
Recently it has become clear that another of Lilienfeld’s inventions, the electrolytic capacitor, holds the key to achieving much higher field effect charge densities. The dramatic consequences of this were made clear by researchers at Tohoku University in Sendai, Japan (K. Ueno et al., Nature Mater. 7, 856-858 (2008)), who used a polymer electrolyte to achieve gated charge densities at a SrTiO3 surface sufficiently large to produce superconductivity. While superconductivity had been observed previously in highly doped SrTiO3, this new approach allows the exploration of the 2d superconducting transition without the disorder inherent in doping.
The most exciting aspect of this work is that this approach, using mobile ions in an electrolyte for gating, can reach charge densities approaching those in chemically doped, strongly correlated materials such as the high temperature superconductors. As an added bonus, this approach should also be very flexible, not needing special substrates. Tuning the electronic density in strongly correlated materials without the associated pain of chemical doping would, indeed, be super.
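As a rough aside (not part of the Journal Club text above), here's a minimal sketch, with assumed illustrative numbers rather than values from the Ueno et al. paper, of why electrolyte gating wins so decisively: the accumulated sheet density is n = C V / e, and an electric double layer acts like a capacitor whose "dielectric" is only ~1 nm thick, compared with hundreds of nm for a typical oxide gate.

```python
# Rough comparison (illustrative, assumed numbers -- not from Ueno et al.) of the sheet
# carrier density n = C * V / e achievable with a conventional oxide gate versus an
# electric-double-layer (electrolyte) gate modeled as a very thin parallel-plate capacitor.

EPS0 = 8.854e-12      # F/m
E_CHARGE = 1.602e-19  # C

def sheet_density(eps_r: float, thickness_m: float, voltage: float) -> float:
    """Carriers per cm^2 accumulated by a parallel-plate gate capacitor."""
    capacitance_per_m2 = EPS0 * eps_r / thickness_m
    return capacitance_per_m2 * voltage / E_CHARGE / 1e4  # convert 1/m^2 -> 1/cm^2

if __name__ == "__main__":
    # 300 nm SiO2 back gate at 100 V -- a standard field-effect geometry
    print(f"SiO2 gate:        n ~ {sheet_density(3.9, 300e-9, 100):.1e} per cm^2")
    # Electrolyte double layer: ~1 nm thick, eps_r ~ 5 (assumed), only a few volts
    print(f"Electrolyte gate: n ~ {sheet_density(5.0, 1e-9, 3):.1e} per cm^2")
```

With these assumed parameters the double-layer gate reaches densities more than an order of magnitude higher than the oxide gate, at a tiny fraction of the voltage.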
Tuesday, November 18, 2008
This week in cond-mat
One paper today in the arxiv:
arxiv:0811.2914 - Zwanenburg et al., Spin states of the first four holes in a silicon nanowire quantum dot
This is another typically exquisite paper by the Kouwenhoven group at Delft, in collaboration with Charlie Lieber at Harvard. The Harvard folks have grown a Si wire segment in the middle of a long NiSi wire. The NiSi ends act as source and drain electrodes for conduction measurements, and the Si segment acts as a quantum dot, with the underlying substrate acting as a gate electrode. As usual, the small size of the Si segment leads to a discrete level spectrum, and the weak electronic coupling of the Si segment to the NiSi combined with the small size of the Si segment results in strong charging effects (Coulomb blockade, which I'll explain at length for nonexperts sometime soon). By measuring very carefully at low temperatures, the Delft team can see, in the conductance data as a function of source-drain voltage and gate voltage, the energy level spectrum of the dot. By looking at the spectrum as a function of magnetic field, they can deduce the spin states of the ground and excited levels of the dot for each value of dot charge.

That's cute, but the part that I found most interesting was the careful measurement of excited states of the empty dot. The inelastic excitations that they see are not electronic in nature - they're phonons. They have been able to see evidence for the launching (via inelastic tunneling) of quantized acoustic vibrations. Figure 5 is particularly nice.
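For nonexperts, here's a minimal sketch of the energy scale behind Coulomb blockade (with assumed, illustrative capacitances - not numbers from the paper): adding one extra electron to a dot of total capacitance C costs a charging energy e^2/2C, and you only resolve discrete charging when that energy dwarfs the thermal energy k_B T.

```python
# Minimal sketch (illustrative capacitances only, not taken from Zwanenburg et al.)
# of why small dots show Coulomb blockade at low temperature: the charging energy
# E_C = e^2 / (2C) must greatly exceed the thermal energy k_B * T.

E_CHARGE = 1.602e-19  # C
K_B = 1.381e-23       # J/K

def charging_energy_meV(capacitance_farads: float) -> float:
    return E_CHARGE**2 / (2.0 * capacitance_farads) / E_CHARGE * 1000.0

if __name__ == "__main__":
    for cap in (1e-18, 1e-17, 1e-15):  # 1 aF, 10 aF, 1 fF -- assumed dot capacitances
        e_c = charging_energy_meV(cap)
        t_equiv = e_c * 1e-3 * E_CHARGE / K_B  # equivalent temperature scale in kelvin
        print(f"C = {cap:.0e} F:  E_C ~ {e_c:.1f} meV  (~{t_equiv:.0f} K)")
```

A sub-attofarad to few-attofarad dot has a charging energy of tens of meV, enormous compared with k_B T at cryogenic temperatures, which is why the charge on the dot changes one electron (or hole) at a time.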
Sunday, November 16, 2008
Workshop on new iron arsenide superconductors
This weekend is a big workshop at the University of Maryland on the new iron arsenide high temperature superconductors. Since it's not really my area, I didn't go. Anyone want to give a little update? Any cool news?
Tuesday, November 11, 2008
Poor Doug's Almanack
Welcome, readers of Discover Magazine! Thanks for coming by, and I hope that you find the discussion here interesting. The historical target audience of this blog has been undergrads, grad students, and faculty interested in condensed matter (solid state) physics and nanoscience. The readership also includes some science journalists and other scientific/engineering professionals. I would like very much to reach a more general lay audience as well, since I think we condensed matter types historically have been pretty lousy at explaining the usefulness and intellectual richness of our discipline. Anyway, thanks again.
(By the way, I don't compare in any serious way with Ben Franklin - that was a bit of hyperbole from Discover that I didn't know was coming. Fun science fact: Franklin's to blame that the electron charge is defined to be negative, leading to the unfortunate annoyance that current flow and electron flow point in opposite directions. He had a 50/50 chance, and in hindsight his choice of definition could've been better.)
Sunday, November 09, 2008
This week in cond-mat
This week the subject is boundary conditions. When we teach about statistical physics (as I am this semester), we often need to count allowed states of quantum particles or waves. The standard approach is to show how boundary conditions (for example, the idea that the tangential electric field has to go to zero at the walls of a conducting cavity) lead to restrictions on the wavelengths allowed. Boundary conditions = discrete list of allowed wavelengths. We then count up those allowed modes, converting the sum to an integral if we have to count many. The integrand is the density of states.

One remarkable feature crops up when doing this for confined quantum particles: the resulting density of states is insensitive to the exact choice of boundary conditions. Hard wall boundary conditions (all particles bounce off the walls - no probability for finding the particle at or beyond the walls) and periodic boundary conditions (particles that leave one side of the system reappear on the other side, as in Asteroids) give the same density of states. The statistical physics in a big system is then usually relatively insensitive to the boundaries.
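Here's a toy numerical check of that insensitivity (a sketch of the standard textbook argument, not tied to any of the papers below): count the 1D plane-wave states with |k| below some cutoff for hard-wall and for periodic boundary conditions, and watch the difference become negligible as the box grows.

```python
# Quick numerical check (a toy 1D model) that hard-wall and periodic boundary
# conditions give the same number of states below a momentum cutoff, up to
# corrections that become unimportant as the system gets large.

import math

def count_states(k_max: float, length: float, periodic: bool) -> int:
    """Count 1D plane-wave states with |k| <= k_max for a box of size `length`."""
    if periodic:
        # k_n = 2*pi*n / L, n = ..., -1, 0, 1, ... (both propagation directions)
        n_max = math.floor(k_max * length / (2.0 * math.pi))
        return 2 * n_max + 1
    # hard wall: k_n = pi*n / L, n = 1, 2, 3, ... (standing waves only)
    return math.floor(k_max * length / math.pi)

if __name__ == "__main__":
    k_max = 10.0  # arbitrary cutoff in units where hbar = m = 1
    for length in (10.0, 100.0, 1000.0):
        hw = count_states(k_max, length, periodic=False)
        pbc = count_states(k_max, length, periodic=True)
        print(f"L = {length:6.0f}:  hard wall {hw:5d}   periodic {pbc:5d}")
```

The periodic case has half the spacing in k but includes both signs of k, so the totals agree to within a state or two, and that discrepancy matters less and less as L grows.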
There are a couple of physical systems where we can really test the differences between the two types of boundary conditions.
arxiv:0811.1124 - Pfeffer and Zawadzki, "Electrons in superlattices: birth of the crystal momentum"
This paper considers semiconductor superlattices of various sizes. These structures are multilayers of nanoscale thickness semiconductor films that can be engineered with exquisite precision. The authors consider how the finite superlattice result (nonperiodic potential; effective hardwall boundaries) evolves toward the infinite superlattice result (immunity to details of boundary conditions). Very pedagogical.
arxiv:0811.0565, 0811.0676, 0811.0694 all concern themselves with graphene that has been etched laterally into finite strips. Now, we already have a laboratory example of graphene with periodic boundary conditions: the carbon nanotube, which is basically a graphene sheet rolled up into a cylinder. Depending on how the rolling is done, the nanotube can be metallic or semiconducting. In general, the larger the diameter of a semiconducting nanotube, the smaller the bandgap. This makes sense, since the infinite diameter limit would just be infinite 2d graphene again, which has no band gap. So, the question naturally arises, if we could cut graphene into narrow strips (hardwall boundary conditions transverse to the strip direction), would these strips have an electronic structure resembling that of nanotubes (periodic boundary conditions transverse to the tube direction), including a bandgap? The experimental answer is yes, etched graphene strips do act like they have a bandgap, though it's clear that disorder from the etching process (and from having the strips supported by an underlying substrate) can dominate the electronic properties.
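To put a number on the "gap shrinks with diameter" statement, here's the standard tight-binding scaling estimate (my own sketch using commonly quoted parameters, not taken from the arxiv papers above): for a semiconducting tube, E_g is roughly 2*gamma0*a_cc/d.

```python
# Rough scaling sketch (standard tight-binding estimate with commonly quoted
# parameters): for a semiconducting nanotube the gap goes like
# E_g ~ 2 * gamma0 * a_cc / d, which vanishes as the diameter d -> infinity,
# recovering gapless 2D graphene.

GAMMA0_EV = 2.7   # nearest-neighbor hopping energy, ~2.7 eV (typical quoted value)
A_CC_NM = 0.142   # carbon-carbon bond length in nm

def nanotube_gap_eV(diameter_nm: float) -> float:
    return 2.0 * GAMMA0_EV * A_CC_NM / diameter_nm

if __name__ == "__main__":
    for d in (0.8, 1.5, 3.0, 10.0):  # diameters in nm, just illustrative
        print(f"d = {d:4.1f} nm  ->  E_g ~ {nanotube_gap_eV(d):.2f} eV")
```

A ~1 nm tube has a gap near 0.8 eV, while a 10 nm tube is nearly gapless; narrow graphene strips are expected to follow a similar inverse-width trend.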
Thursday, November 06, 2008
Two new papers in Nano Letters
Two recent papers in Nano Letters caught my eye.
Kuemmeth et al., "Measurement of Discrete Energy-Level Spectra in Individual Chemically Synthesized Gold Nanoparticles"
One of the first things that I try to teach students in my nano courses is the influence of nanoscale confinement on the electronic properties of metals. We learn in high school chemistry about the discrete orbitals in atoms and small molecules, and how we can think about filling up those orbitals. The same basic idea works reasonably well in larger systems, but the energy difference between subsequent levels becomes much smaller as system size increases. In bulk metals the single-particle levels are so close together as to be almost continuous. In nanoparticles at low temperatures, however, the spacing can be large enough compared to the available thermal energy that one can do experiments which probe this discrete spectrum. Now, in principle the detailed spectrum depends on the exact arrangement of metal atoms, but in practice one can look at the statistical distribution of levels and compare that distribution with a theory (in this case, "random matrix theory") that averages in some way over possible configurations. This paper is a beautiful example of fabrication skill and measurement technique. There are no big physics surprises here, but the data are extremely pretty.
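If you want a feel for the energy scale involved, here's a free-electron back-of-the-envelope sketch (my own assumed numbers, not values from the paper): the mean level spacing is roughly the inverse of the density of states at the Fermi energy times the particle volume.

```python
# Back-of-the-envelope sketch (free-electron estimate, not from Kuemmeth et al.):
# the mean single-particle level spacing in a metal grain is roughly
# delta ~ 1 / (g(E_F) * Volume), with g(E_F) = 3n / (2 E_F) for a free-electron metal.

import math

E_CHARGE = 1.602e-19  # C (also J per eV)
K_B = 1.381e-23       # J/K

# Free-electron parameters for gold (textbook values)
N_DENSITY = 5.9e28         # conduction electrons per m^3
E_FERMI_J = 5.5 * E_CHARGE

def level_spacing_meV(diameter_nm: float) -> float:
    volume = (4.0 / 3.0) * math.pi * (diameter_nm * 1e-9 / 2.0) ** 3
    dos_at_fermi = 3.0 * N_DENSITY / (2.0 * E_FERMI_J)   # states per J per m^3
    return 1.0 / (dos_at_fermi * volume) / E_CHARGE * 1000.0

if __name__ == "__main__":
    for d in (3.0, 5.0, 10.0):  # particle diameters in nm, illustrative
        delta = level_spacing_meV(d)
        print(f"d = {d:4.1f} nm:  delta ~ {delta:.2f} meV  (~{delta*1e-3*E_CHARGE/K_B:.1f} K)")
```

A few-nm gold particle has a mean spacing of order a meV or more, so sub-kelvin temperatures are plenty cold enough to resolve individual levels, while a 100 nm grain would look essentially continuous.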
Xiao et al., "Flexible, stretchable, transparent carbon nanotube thin film loudspeakers"
This is just damned cool. The authors take very thin films of carbon nanotubes and are able to use them as speakers even without making the films vibrate directly. The idea is very simple: convert the acoustic signal into current (just as you would to send it through an ordinary speaker) and run that current through the film. Because of the electrical resistance of the film (low, but nonzero), the film gets hot when the current is at a maximum. Because the film is so impressively low-mass, it has a tiny heat capacity, meaning that small energy inputs result in whopping big temperature changes. The film locally heats the air adjacent to the film surface, launching acoustic waves. Voila. A speaker with no moving parts. This is so simple it may well find real practical application. Very clever.
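Here's a very crude sketch of the thermoacoustic idea with assumed numbers (not taken from the paper): if the Joule heating per unit area is modulated at frequency f and heat loss is ignored, the temperature swing is roughly the heat deposited per radian of the cycle divided by the areal heat capacity, dT ~ P/(2*pi*f*c_area). The point is how decisively an extremely low-mass film wins over, say, a piece of aluminum foil.

```python
# Crude estimate (assumed, illustrative numbers -- not taken from Xiao et al.) of the
# temperature oscillation of a resistively heated sheet whose heating power is
# modulated at audio frequency f, ignoring heat loss: dT ~ P_area / (2*pi*f * c_area),
# where c_area is the heat capacity per unit area.  (For a pure AC drive the heating
# actually oscillates at 2f; that factor is glossed over here.)

import math

def temp_swing_K(power_per_m2: float, freq_hz: float, heat_cap_per_m2: float) -> float:
    return power_per_m2 / (2.0 * math.pi * freq_hz * heat_cap_per_m2)

if __name__ == "__main__":
    power = 1e3   # W/m^2 of Joule heating, assumed
    freq = 1e4    # 10 kHz modulation, assumed
    # nanotube film: areal density ~1e-5 kg/m^2 (assumed), specific heat ~700 J/(kg K)
    c_film = 700.0 * 1e-5
    # for contrast, 10-micron aluminum foil: 2700 kg/m^3 * 1e-5 m thick, c ~ 900 J/(kg K)
    c_foil = 900.0 * 2700.0 * 1e-5
    print(f"nanotube film: dT ~ {temp_swing_K(power, freq, c_film):.2f} K")
    print(f"Al foil:       dT ~ {temp_swing_K(power, freq, c_foil):.2e} K")
```

With these assumed numbers the nanotube film's temperature swings by kelvins at audio frequencies, thousands of times more than the foil, which is why it can pump the adjacent air hard enough to act as a speaker.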
Wednesday, November 05, 2008
To quell speculation....
Yes, if asked, I would serve as President Obama's science advisor. (Come on - you would, too, right? Of course, it's easy for me to joke about this since it's about as probable as me being asked to serve as head of the National Science Board.)
Monday, November 03, 2008
This one's easy.
Has Bush been good for science? I agree with ZapperZ: No. How Marburger can argue that research funding has kept pace with inflation is beyond me, given the last three years of continuing resolutions, unless one (a) fudges the definition of research to include a lot of military development, and (b) fudges the definition of inflation to ignore things like food, fuel, and health care costs.
Could Bush have been even worse? Yes.
Statistical physics
This fall I'm teaching Statistical and Thermal Physics, a senior (in the most common Rice physics curriculum, anyway) undergraduate course, and once again I'm struck by the power and profundity of the material. Rather like quantum, stat mech can be a difficult course to teach and to take; from the student perspective, you're learning a new vocabulary, a new physical intuition, and some new mathematical tools. Some of the concepts are rather slippery and may be difficult to absorb at a first hearing.

Still, the subject matter is some of the best intellectual content in physics: you learn about some of the reasons for the "demise" of classical physics (the Ultraviolet Catastrophe; the heat capacity problem), major foundational issues (macroscopic irreversibility and the arrow of time; the precise issue where quantum mechanics and general relativity are at odds (or, as I like to call it, "Ultraviolet Catastrophe II: Electric Boogaloo")), and the meat of some of the hottest topics in current physics (Fermi gases and their properties; Bose Einstein condensation). Beyond all that you also get practical, useful topics like thermodynamic cycles, how engines and refrigerators work, chemical equilibria, and an intro to phase transitions.

Someone should write a popular book about some of this, along the lines of Feynman's QED. If only there were enough hours in the day (and my nano book was further along). Anyway, I bring this up because over time I'm thinking about doing a series of blog posts at a popular level about some of these topics. We'll see how it goes.
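As a taste of the kind of thing I have in mind, here's a tiny sketch (standard textbook formulas, nothing specific to my course) of the ultraviolet catastrophe: the classical Rayleigh-Jeans spectral energy density grows without bound at high frequency, while Planck's form rolls over and stays finite.

```python
# Small illustration of the "ultraviolet catastrophe" using the standard textbook
# formulas: Rayleigh-Jeans u(nu) = 8*pi*nu^2*kT/c^3 diverges at high frequency,
# while Planck's u(nu) = (8*pi*h*nu^3/c^3)/(exp(h*nu/kT) - 1) remains finite.

import math

H = 6.626e-34     # J*s
K_B = 1.381e-23   # J/K
C = 2.998e8       # m/s

def rayleigh_jeans(nu: float, temp: float) -> float:
    return 8.0 * math.pi * nu**2 * K_B * temp / C**3

def planck(nu: float, temp: float) -> float:
    return (8.0 * math.pi * H * nu**3 / C**3) / math.expm1(H * nu / (K_B * temp))

if __name__ == "__main__":
    T = 5800.0  # roughly the solar surface temperature, as an example
    for nu in (1e12, 1e13, 1e14, 1e15):
        print(f"nu = {nu:.0e} Hz:  RJ {rayleigh_jeans(nu, T):.2e}   "
              f"Planck {planck(nu, T):.2e}   (J s / m^3)")
```

The two agree at low frequency (where h*nu << kT), and then the classical result keeps climbing while the Planck result collapses - exactly the discrepancy that helped kill classical physics.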
Sunday, November 02, 2008
Ahh, Texas, again.
Stories like this one depress me. Is it really any wonder that our state has a difficult time attracting large high-tech companies from, e.g., California and Illinois, even though corporate taxation policies are very friendly here?