Tuesday, September 25, 2007

Revised: Primer on faculty searches, part I

It's that time of year again, with Chad Orzel and the Incoherent Ponderer both posting about the faculty job market and job hunting. So, I'm recycling a post of mine from last year describing the search process, at least the way it's done at Rice. I'm going to insert some revisions that are essentially tips to would-be candidates, though I think the IP has already done a good job on this, and some are basically common sense. An obvious disclaimer: this is based on my experience, and may not generalize well to other departments with vastly differing cultures or circumstances.

Here are the main steps in a search:
  • The search gets authorized. This is a big step - it determines what the position is, exactly: junior only vs. open to junior or senior candidates; a new faculty line vs. a replacement vs. a bridging position (i.e., we'll hire now, and when X retires in three years, we won't look for a replacement then).
  • The search committee gets put together. In my dept., the chair asks people to serve. If the search is in condensed matter, for example, there will be several condensed matter people on the committee, as well as representation from the other major groups in the department, and one knowledgeable person from outside the department (in chemistry or ECE, for example). The committee chair (or co-chairs) meets with the committee, or at least with those in the focus area, and comes up with draft text for the ad.
  • The ad gets placed, and canvassing of lots of people who might know promising candidates begins. A special effort is made to make sure that all qualified women and underrepresented minority candidates know about the position and are asked to apply (the APS has mailing lists to help with this, and direct recommendations are always appreciated - this is in the search plan). Generally, the ad really does list what the department is interested in. It's a huge waste of everyone's time to have an ad that draws a large number of inappropriate applicants (i.e., ones that don't fit the dept.'s needs). The exception to this is the generic ad typically placed by MIT and Berkeley: "We are looking for smart folks. Doing good stuff. In some area." They run the same ad every year, trolling for talent. They seem to do ok. The other exception is when a university already knows who they want to get for a senior position, and writes an ad so narrow that only one person is really qualified. I've never seen this personally, but I've heard anecdotes.
  • In the meantime, a search plan is formulated and approved by the dean. The plan details how the search will work, what the timeline is, etc. This plan is largely a checklist to make sure that we follow all the right procedures and don't screw anything up. It also brings to the fore the importance of "beating the bushes" - see above. A couple of people on the search committee will be particularly in charge of oversight on affirmative action/equal opportunity issues.
  • The dean meets with the committee and we go over the plan, including a refresher for everyone on what is or is not appropriate for discussion in an interview (for an obvious example, you can't ask about someone's religion).
  • Applications come in and are sorted; rec letters are collated. Each candidate has a folder.
  • The committee begins to review the applications. Generally the members of the committee who are from the target discipline do a first pass, to at least weed out the inevitable applications from people who are not qualified according to the ad (i.e. no PhD; senior people wanting a senior position even though the ad is explicitly for a junior slot; people with research interests or expertise in the wrong area). Applications are roughly rated by everyone into a top, middle, and bottom category. Each committee member comes up with their own ratings, so there is naturally some variability from person to person. Some people are "harsh graders". Some value high-impact publications more than numbers of papers. Others place more of an emphasis on the research plan, the teaching statement, or the rec letters. Yes, people do value the teaching statement - we wouldn't waste everyone's time with it if we didn't care. Interestingly, often (not always) the people who are the strongest researchers also have very good ideas and actually care about teaching. This shouldn't be that surprising. As a friend of mine at a large state school once half-joked to me: 15% of the faculty in any department do the best research; 15% do the best teaching; 15% do the most service and committee work; and it's often the same 15%.
  • Once all the folders have been reviewed and rated, a relatively short list (say 20-25 or so out of 120 applications) is arrived at, and the committee meets to hash that down to, in the end, five or so to invite for interviews. In my experience, this happens by consensus, with the target discipline members having a bit more sway in practice since they know the area and can appreciate subtleties - the feasibility and originality of the proposed research, the calibration of the letter writers (are they first-rate folks? Do they always claim every candidate is the best postdoc they've ever seen?). I'm not kidding about consensus; I can't recall a case where there really was a big, hard argument within the committee. I know I've been lucky in this respect, and that other institutions can be much more feisty. The best, meaning most useful, letters, by the way, are the ones that say things like "This candidate is very much like CCC and DDD were at this stage in their careers." Real comparisons like that are much more helpful than "The candidate is bright, creative, and a good communicator." Regarding research plans, the best ones (for me, anyway) give a good sense of near-term plans, medium-term ideas, and the long-term big picture, all while being relatively brief and written so that a general committee member can understand much of it (why the work is important, what is new) without being an expert in the target field. It's also good to know that, at least at my university, if we come across an applicant who doesn't really fit our needs, but meshes well with an open search in another department, we send over the file. This, like the consensus stuff above, is a benefit of good, nonpathological communication within the department and between departments.
That's pretty much it up to the interview stage. No big secrets. No automated ranking schemes based exclusively on h numbers or citation counts.

Tips for candidates:
  • Don't wrap your self-worth up in this any more than is unavoidable. It's a game of small numbers, and who gets interviewed where can easily be dominated by factors extrinsic to the candidates - what a department's pressing needs are, what the demographics of a subdiscipline are like, etc. Every candidate takes job searches personally to some degree because of our culture, but don't feel like this is some evaluation of you as a human being.
  • Don't automatically limit your job search because of geography unless you have some overwhelming personal reasons. I almost didn't apply to Rice because neither my wife nor I were particularly thrilled about Texas, despite the fact that neither of us had ever actually visited the place. Limiting my search that way would've been a really poor decision.
  • Really read the ads carefully and make sure that you don't leave anything out. If a place asks for a teaching statement, put some real thought into what you say - they want to see that you have actually considered teaching seriously, or they wouldn't have asked for it.
  • Research statements are challenging because you need to appeal to both the specialists on the committee and the people who are way outside your area. My own research statement back in the day was around three pages. If you want to write a lot more, I recommend having a brief (2-3 page) summary at the beginning followed by more details for the specialists. It's good to identify near-term, mid-range, and long-term goals - you need to think about those timescales anyway. Don't get bogged down in specific technique details unless they're essential. You need committee members to come away from the proposal knowing "These are the Scientific Questions I'm trying to answer", not just "These are the kinds of techniques I know".
  • Be realistic about what undergrads, grad students, and postdocs are each capable of doing. If you're applying for a job at a four-year college, don't propose to do work that would require an experienced grad student putting in 60 hours a week.
  • Even if they don't ask for it, you need to think about what resources you'll need to accomplish your research goals. This includes equipment for your lab as well as space and shared facilities. Talk to colleagues and get a sense of what the going rate is for start-up in your area. Remember that four-year colleges do not have the resources of major research universities. Start-up packages at a four-year college are likely to be 1/4 of what they would be at a big research school (though there are occasional exceptions). Don't shave pennies - this is the one prime chance you get to ask for stuff! On the other hand, don't make unreasonable requests. No one is going to give a junior person a start-up package comparable to a mid-career scientist.
  • Pick letter-writers intelligently. Actually check with them that they're willing to write you a nice letter - it's polite and it's common sense. Beyond the obvious two (thesis advisor, postdoctoral mentor), it can sometimes be tough finding an additional person who can really say something about your research or teaching abilities. Sometimes you can ask those two for advice about this. Make sure your letter-writers know the deadlines and the addresses.
I'll revise more later if I have the time.

Monday, September 24, 2007

2007 Nobel Prize in Physics

Time for pointless speculation. I suggest Michael Berry and Yakir Aharonov for the 2007 physics Nobel, because of their seminal work on nonclassical phase factors in quantum mechanics. Thoughts?

Saturday, September 22, 2007

Two seminars this past week

I've been remiss by not posting more interesting physics, either arxiv or published. I'll try to be better about that, though usually those aren't the posts that actually seem to generate comments. For starters, I'll write a little about two interesting condensed matter seminars that we had this week. (We actually ended up with three in one week, which is highly unusual, but I was only able to go to two.)

First, my old friend Mike Manfra from Bell Labs came and gave a talk about the interesting things that one sees in two-dimensional hole systems (2dhs) on GaAs (100). Over the last 25 years, practically a whole subdiscipline (including two Nobel prizes) has sprung up out of our ability to make high quality two-dimensional electron systems (2des). If you have a single interface between GaAs below and AlxGa(1-x)As above, and you put silicon dopants in the AlGaAs close to the interface, charge transfer plus band alignment plus band bending combine to give you a layer of mobile electrons confined in a roughly triangular potential well at the interface. Those electrons are free to move within the plane of the interface, but they typically have no ability to move out of the plane. (That is, the energy to excite momentum in the z direction is greater than their Fermi energy.) Now it's become possible to grow extremely high quality 2dhs, using carbon as a dopant rather than silicon. The physics of these systems is more complicated than the electron case, because holes live in the valence band and experience strong spin-orbit effects (in contrast to electrons in the conduction band). In the electron system, it's known that at relatively low densities, low temperatures, and moderate magnetic fields, there is a competition between different possible ground states, including ones where the electron density is spatially complicated ("stripes", "bubbles", "nematics"). Manfra presented some nice work on the analogous case with holes, where the spin-orbit complications make things even more rich.

Then yesterday we had a talk by Satoru Nakatsuji from the ISSP at the University of Tokyo. He was talking about an extremely cool material, Pr2Ir2O7. This material is a metal, but because of its structure it has very complicated low temperature properties. For example, the Pr ions live on a pyrochlore lattice, which consists of corner-sharing tetrahedra. The ions are ferromagnetically coupled (they want to align their spins), but the lattice structure is a problem because it results in geometric frustration - not all the spins can be satisfied. As a result, the spins never order at nonzero temperature (at least, down to the millikelvin range) despite having relatively strong couplings. This kind of frustration is important in things like water ice, too. In water ice, the hydrogens can be thought of as being at the corners of such tetrahedra, but the O-H bond lengths can't all be the same. For each tetrahedron, two are short (the covalent O-H bonds) and two are long (hydrogen bonds). The result is a ground state for water ice that is highly degenerate, leading to an unusual "extra" residual entropy at T = 0 of (R/2) ln(3/2) per mole (in contrast to the classical third law of thermodynamics, which says entropy goes to zero at T = 0). The same kind of thing happens in Pr2Ir2O7 - the spins on the tetrahedron corners have to be "two-in" and "two-out" (see the link above), leading to the same kind of residual entropy as in water ice. This frustration physics is just the tip of the iceberg (sorry) of what Nakatsuji discussed. Very neat.
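Pauling's counting argument behind that (R/2) ln(3/2) number is short enough to check numerically. A minimal back-of-envelope sketch of my own (not anything from the talk):

```python
import math

R = 8.314  # gas constant, J/(mol K)

# Pauling-style counting for the "two-in, two-out" ice rules: N Ising-like
# spins (2 states each) sit on corner-sharing tetrahedra, so there are N/2
# tetrahedra, and only 6 of a tetrahedron's 16 states are allowed.
# W = 2^N * (6/16)^(N/2)  =>  S per mole of spins = R [ln 2 + (1/2) ln(6/16)]
s_pauling = R * (math.log(2) + 0.5 * math.log(6 / 16))

print(s_pauling)                  # ~1.69 J/(mol K)
print(0.5 * R * math.log(3 / 2))  # identical: (R/2) ln(3/2)
```

The second print just confirms that the counting collapses to the compact (R/2) ln(3/2) form.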

Friday, September 14, 2007

The secret joys of running a lab II: equipment

The good news is that we're getting a cool new piece of equipment to be installed here next week. The bad news (apart from the fact that it uses liquid helium - see previous post) is that I've been spending my morning sifting through US import tariff codes trying to come up with a number that will make the shipping agent happy. You might think that the tariff code supplied by the vendor would be good enough. Apparently you'd be wrong. You might think that this would be the job of a customs broker. Again, apparently you'd be wrong. As the Incoherent Ponderer pointed out, there are many aspects of our jobs for which we never receive formal training. Customs agent is one. By the way: can anyone explain to me why US tariff codes are maintained by the US Census Bureau? Ok, so they're part of the Department of Commerce, but this is just odd.

Thursday, September 13, 2007

The secret joys of running a lab: helium.

In my lab, and in many condensed matter physics labs around the world, we use liquid helium to run many of our experiments. At low temperatures, many complicating effects in condensed matter systems are "frozen out", and it becomes easier to understand the effects that remain. Often we are interested in the ground state of some system and want to reduce thermal excitations. Quantum effects are usually more apparent at low temperatures because the inelastic processes that lead to decoherence are suppressed as T approaches zero. For example, the quantum coherence length (the distance scale over which the phase of an electron's wavefunction is well defined before it gets messed up due to inelastic effects of the environment) of an electron in a metal like silver at room temperature is on the order of 1 nm, while that length can be thousands of times longer at 4.2 K, the boiling point of liquid helium at atmospheric pressure. Those kinds of temperatures are also necessary for running good superconducting magnet systems.

The downside of liquid helium is that it's damned expensive, and getting more so by the minute. Running at full capacity I could blow through several thousand liters in a year, and at several dollars a liter minimum plus overhead, that's real money. As a bonus, lately our supplier of helium has become incredibly unreliable, missing orders and generally flaking out, while simultaneously raising prices because of actual production shortages. I just had to read the sales guy the riot act, and if service doesn't improve darn fast, we'll take our business elsewhere, as will the other users on campus. (Helium comes from the radioactive decay of uranium and other alpha emitters deep in the earth, and comes out of natural gas wells.) The long-term solutions are (a) set up as many cryogen-free systems as possible, and (b) get a helium liquefier to recycle the helium that we do use. Unfortunately, (a) requires an upfront cost comparable to about 8 years of a system's helium consumption per system, and (b) also necessitates big capital expenses as well as an ongoing maintenance issue. Of course none of these kinds of costs are the sort of thing that it's easy to convince a funding agency to support. Too boring and pedestrian.
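The payback arithmetic for option (a) is easy to sketch. The specific numbers below are illustrative placeholders consistent with the rough ranges above, not actual quotes:

```python
# Illustrative helium-budget arithmetic (assumed numbers, not real prices):
liters_per_year = 3000    # "several thousand liters in a year"
dollars_per_liter = 6     # "several dollars a liter minimum plus overhead"

annual_helium_cost = liters_per_year * dollars_per_liter
cryofree_premium = 8 * annual_helium_cost  # upfront cost ~ 8 years of consumption

print(annual_helium_cost)  # 18000
print(cryofree_premium)    # 144000
```

In other words, with these placeholder numbers a cryogen-free system only pays for its premium after nearly a decade of full-tilt running, which is why the upfront cost is such a hard sell.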

Fortunately, when you work at nanometer scales, interesting physics often happens at higher temperatures. I've been lucky that two major things going on in my lab right now don't require helium at all. Still, it's bad enough worrying about paying students without the added fun of helium concerns.

UPDATE: See here.

Sunday, September 09, 2007

Other Packard meeting highlights

I'm back from California, and the remainder of the Packard meeting was just as much intellectual fun as the first day. It's great to see so much good science and engineering outside my own discipline. Some fun things I learned:
  • Plants really can communicate by smell (that is, by giving off and detecting volatile compounds).
  • Many flying insects have evolutionarily found wing flap patterns that optimize for minimum energy consumption when hovering.
  • Most of the huge number of insect species in tropical rainforests (at least in New Guinea) are specialist feeders, preferring to eat only one type of plant.
  • When you split a molecular ion (say I2-) into a neutral atom and an atomic ion, the coherent superposition (in this case, \(1/\sqrt{2}\,[(I + I^-) + (I^- + I)]\)) can persist even when the atom and ion are separated by more than 10 atomic diameters.
  • Super fancy mass spec plus amazing statistical capabilities can let you do serious proteomics.
  • There may have been as many as four supercontinent phases and two "snowball earth" phases in the last three billion years.
  • If you come up with a computationally efficient way to model viscoelastic materials (e.g. jello, human skin), you can develop virtual surgery tools for reconstructive surgeons, and win an Oscar for special effects by modeling Davey Jones for POTC II.
  • If you develop a DNA microarray chip that lets you cheaply and reliably identify any known virus or the nearest relative of an unknown virus, and you want to use this clinically, the established medical testing companies will react in a very negative way (because they're afraid that if you're successful, they won't be able to keep charging insurers $3K per possibly unnecessary blood test). The fact that you can save lives won't be of interest to them.
  • Comparing different measurement techniques can really tell you a lot about how cells sense and respond to touch.
  • You can design a Si photonic crystal to act as a superprism and show negative refraction and negative diffraction, all at the same time, over a useful bandwidth near 1.55 microns wavelength (the standard telecommunications band).
I know I'm leaving some out, too. Very fun stuff.

Friday, September 07, 2007

Packard meeting

I'm currently in Monterey thanking the Packard Foundation for their generous support. They're fantastic, and their fellowship has been a godsend that's really given me the flexibility in my research that I've needed. The best part about their annual meetings is that it's a chance for me to listen to good talks pitched to a general audience on an enormously broad set of science and engineering subjects. Some things that I learned yesterday:
  • It's possible to do successful astronomical planet-hunting surveys using 300mm camera lenses to make a telescope array.
  • There are molecules and molecular ions in astronomical gas clouds that are extremely difficult to make and study on earth (e.g., CH5-; C6H7+).
  • The human brain is 2% of the body's mass but uses 20% of the body's oxygen. It also has roughly 10x the concentration of iron, copper, and zinc as other soft tissues of the body.
  • Chemical "noise" (e.g., concentration fluctuations) is essential for some kinds of cell differentiation.
  • There are other photoactive parts in your eye besides rods and cones, and if those other parts are intact, your body clock can still re-set itself even in the absence of vision.
  • Soft tissue can (pretty convincingly) survive inside fossil bones dating back tens of millions of years.
  • Viral phylogeny shows convincingly that HIV did not start from contaminated polio vaccines grown in monkeys, and that HIV came from Africa first to Haiti, and then from Haiti to the US in the late 1960s.
  • Lots of microbes live as biofilms on the ocean floor via chemical energy gained from the decomposition of basaltic rock.

Wednesday, August 29, 2007

Invited talk suggestions, APS March Meeting 2008

Along with Eric Isaacs, I am co-organizing a focus topic at the March Meeting of the APS this year on "Fundamental Challenges in Transport Properties of Nanostructures". The description is:
This focus topic will address the fundamental issues that are critical to our understanding, characterization and control of electronic transport in electronic, optical, or mechanical nanostructures. Contributions are solicited in areas that reflect recent advances in our ability to synthesize, characterize and calculate the transport properties of individual quantum dots, molecules and self-assembled functional systems. Resolving open questions regarding transport in nanostructures can have a huge impact on a broad range of future technologies, from quantum computation to light harvesting for energy. Specific topics of interest include: fabrication or synthesis of nanostructures involved with charge transport; nanoscale structural characterization of materials and interfaces related to transport properties; advances in the theoretical treatment of electronic transport at the nanoscale; and experimental studies of charge transport in electronic, optical, or mechanical nanostructures.
The sorting category is 13.6.2, if you would like to submit a contributed talk. Until Friday August 31, we're still soliciting suggestions for invited speakers for this topic, and I would like to hear what you out there would want to see. If you've got a suggestion, feel free either to post below in the comments, or to email me with it, including the name of the suggested speaker and a brief description of why you think they'd be appropriate. The main restriction is that suggested speakers can't have given an invited talk at the 2007 meeting. Beyond that, while talks by senior people can be illuminating, it's a great opportunity for postdocs or senior students to present their work to an audience. Obviously space is limited, and I can make no promises, but suggestions would be appreciated. Thanks.

Tuesday, August 28, 2007

Quantum impurities from Germany II

A recurring theme at the workshop in Dresden last week was quantum impurities driven out of equilibrium. In general this is an extremely difficult problem! One of the approaches discussed was that of Natan Andrei's group, presented here and here. I don't claim to understand the details, but schematically the idea is to remap the general problem into a scattering language. You set up the nonequilibrium aspect (in the case of a quantum dot under bias, this corresponds to setting the chemical potentials of the leads at unequal values) as a boundary condition. By recasting things this way, you can use a clever ansatz to find eigenstates of the scattering form of the problem, and if you're sufficiently clever you can do this for different initial conditions and map out the full nonequilibrium response. Entropy production and eventual relaxation of the charge carriers far from the dot happens "at infinity". Andrei gives a good (if dense) talk, and this formalism seems very promising, though it also seems like actually calculating anything for a realistic system requires really solving for many-body wavefunctions for a given system.

Tuesday, August 21, 2007

Quantum impurities from Germany

I'm currently at a workshop on quantum impurity problems in nanostructures and molecular systems, sponsored by the Max Planck Institute for Complex Systems here in Dresden. A quantum impurity problem is defined by a localized subsystem (the impurity) with some specific quantum numbers (e.g. charge; spin) coupled to nonlocal degrees of freedom (e.g. a sea of delocalized conduction electrons; spin waves; phonons). The whole coupled system of impurity (or impurities) + environment can have extremely rich properties that are very challenging to deduce, even if the individual subsystems are relatively simple.

A classic example is the Kondo problem, with a localized impurity site coupled via tunneling to ordinary conduction electrons. The Coulomb repulsion is strong enough that the local site can really be occupied by only one electron at a time. However, the total energy of the system can be reduced if the localized electron can undergo high order virtual processes where it can pop into the conduction electron sea and back. The result is an effective magnetic exchange between the impurity site and the conduction electrons, as well as an enhanced density of states at the Fermi level for the conduction electrons. The ground state of this coupled system involves correlations between many electrons, and results in a net spin singlet. Like many impurity problems, the Kondo problem can't be solved by perturbation theory.

The point is, with nanostructures it is now possible to implement all kinds of impurity problems experimentally. What is really exciting is the prospect of using these kinds of tunable model systems to study strong correlation physics (e.g. quantum phase transitions in heavy fermion compounds; non-Fermi liquid "bad metals") in a very controlled setting, or in regimes that are otherwise hard to probe (e.g., impurities driven out of equilibrium). This workshop is about 70 or 80 people, a mix of theorists and experimentalists, all interested in this stuff. When I get back I'll highlight a couple of the talks.

Thursday, August 16, 2007

Superluminality



Today this blurb from the New Scientist caused a bit of excitement around the web. While it sounds at first glance like complete crackpottery, and is almost certainly a case of terrible science journalism, it does involve an interesting physics story that I first encountered back when I was looking at grad schools. I visited Berkeley as a prospective student and got to meet Ray Chiao, who asked me how long it takes a particle with energy E to tunnel through a rectangular barrier of energetic height U > E and thickness d. He went to get a glass of water, and wanted me to give a quick answer when he got back a couple of minutes later. Well, if I wasn't supposed to do a real calculation, I figured there were three obvious guesses: (1) \(d/c\); (2) \(d/(\hbar k/m)\), where \(k = \sqrt{2 m (U-E)}/\hbar\) - basically solving for the (magnitude of the imaginary) classical velocity and using that; (3) 0. It turns out that this tunneling time controversy is actually very subtle. When you think about it, it's a funny question from the standpoint of quantum mechanics. You're asking, of the particles that successfully traversed the barrier, how long were they in the classically forbidden region? This has a long, glorious history that is discussed in detail here. Amazingly, the answer is that the tunneling velocity (d / the tunneling time) can exceed c, the speed of light in a vacuum, depending on how it's defined. For example, you can consider a Gaussian wave packet incident on a barrier, and ask how fast the packet makes it through. There will be some (smaller than incident) transmitted wavepacket, and if you look at how long it takes the center of the transmitted wave packet to emerge from the barrier after the center of the incident packet hits the barrier, you can get superluminal speeds for the center of the wavepacket. (You can build up these distributions statistically by doing lots of single-photon counting experiments.)
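Just to make guesses (1) and (2) concrete, here's a quick numerical evaluation for an electron and a barrier of my own choosing (1 eV above E, 1 nm thick - illustrative numbers, not part of Chiao's original question):

```python
import math

hbar = 1.054571817e-34   # J s
m_e = 9.1093837015e-31   # electron mass, kg
c = 2.99792458e8         # m/s
eV = 1.602176634e-19     # J

d = 1e-9                 # barrier thickness, 1 nm (assumed)
U_minus_E = 1.0 * eV     # barrier height above the particle energy (assumed)

# k = sqrt(2 m (U - E)) / hbar, the magnitude of the imaginary wavevector
k = math.sqrt(2 * m_e * U_minus_E) / hbar

t_light = d / c                    # guess (1): light-crossing time, ~3.3 attoseconds
t_semiclass = d / (hbar * k / m_e) # guess (2): d over the "imaginary velocity", ~1.7 fs
t_zero = 0.0                       # guess (3): no time at all

print(t_light, t_semiclass, t_zero)
```

The three guesses differ by orders of magnitude even for a mundane barrier, which is part of why the question is such a good one to spring on a prospective student.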
Amazingly, you can actually have a situation where the exiting pulse leaves the barrier before the entering pulse peak hits the barrier. This would correspond to negative (average) velocity (!), and has actually been demonstrated in the lab. So, shouldn't this bother you? Why doesn't this violate causality and break special relativity? The conventional answer is that no information is actually going faster than light here. The wavepackets we've been considering are all smooth, analytic functions, so that the very leading tail of the incident packet contains all the information. Since that leading tail is, in Gaussian packets anyway, infinite in extent, all that's going on here is some kind of pulse re-shaping. The exiting pulse is, in some sense, just a modified version of information that was already present there. It all comes down to how one defines a signal velocity, as opposed to a phase velocity, group velocity, energy velocity, or any of the other concepts dreamed up by Sommerfeld back in the early 20th century when people first worried about this. Now, this kind of argument from analyticity isn't satisfying to everyone - Prof. Nimtz in particular. He has long argued that something more subtle is at work here - that superluminal signalling is possible, but tradeoffs between bandwidth and message duration ensure that causality can't be violated. Well, according to his quotes in today's news, apparently related to this 2-page thing on the arxiv, he is making very strong statements now about violating special relativity. The preprint is woefully brief and shows no actual data - for such an extraordinary claim in the popular press, this paper is completely inadequate. Anyway, it's a fun topic, and it really forces you to think about what causality and information transfer really mean.

Sunday, August 12, 2007

Kinds of papers

I've seen some recent writings about how theory papers come to be, and it got me thinking a bit about how experimental condensed matter papers come about, at least in my experience. Papers, or more accurately, scientific research projects and their results, seem to fall into three rough groupings for me:
  • The Specific Question. There's some particular piece of physics in an established area that isn't well understood, and after reading the literature and thinking hard, you've come up with an approach for getting the answer. Alternately, you may think that previous approaches that others have tried are inadequate, or are chasing the wrong idea. Either way, you've got a very specific physics goal in mind, a well-defined (in advance) set of experiments that will elucidate the situation, and a plan in place for the data analysis and how different types of data will allow you to distinguish between alternative physics explanations.
  • The New Capability. You've got an idea about a new experimental capability or technique, and you're out to develop and test this. If successful, you'll have a new tool in your kit for doing physics that you (and ideally everyone else) have never had before. While you can do cool science at this stage (and often you need to, if you want to publish in a good journal), pulling off this kind of project really sets the stage for a whole line of work along the lines of The Specific Question - applying your new skill to answer a variety of physics questions. The ideal examples of this would be the development of the scanning tunneling microscope or the atomic force microscope.
  • The (Well-Motivated) Surprise. You're trying to do either The Specific Question or The New Capability, and then all of a sudden you see something very intriguing, and that leads to a beautiful (to you, at least, and ideally to everyone else) piece of physics. This is the one that can get people hooked on doing research: you can know something about the universe that no one else knows. Luck naturally can play a role here, but "well-motivated" means that you make your own luck to some degree: you're much more likely to get this kind of surprise if you're looking at a system that is known to be physically interesting or rich, and/or using a new technique or tool.
Hopefully sometime in the future I'll give an anecdote or two about these. In the meantime, does anyone have suggestions on other categories that I've missed?

Behold the power of google

I am easily amused. They just put up google street-view maps of Houston, and while they didn't do every little road, they did index the driving routes through Rice University. In fact, you can clearly see my car here (it's the silver Saturn station wagon just to the right of the oak tree). Kind of cool, if a bit disturbing in terms of privacy.

Tuesday, August 07, 2007

This week in cond-mat

Another couple of papers that caught my eye recently....

arxiv:0707.2946 - Reilly et al., Fast single-charge sensing with an rf quantum point contact
arxiv:0708.0861 - Thalakulam et al., Shot-noise-limited operation of a fast quantum-point-contact charge sensor
It has become possible relatively recently to use the exquisite charge sensitivity of single-electron transistors (SETs) to detect motion of single electrons at MHz rates. The tricky bit is that a SET usually has a characteristic impedance on the order of tens of kOhms, much higher than either free space (377 Ohms) or typical radio-frequency hardware (50 Ohms). The standard approach that has developed is to terminate a coax line with an rf-SET; as the charge environment of the rf-SET changes, so does its impedance, and therefore so does the rf power reflected back up the coax. One can improve signal-to-noise by making an LC resonant circuit down at the rf-SET that has a resonance tuned to the carrier frequency used in the measurement. With some work, one can use a 1 GHz carrier wave and detect single charge motion near the rf-SET with MHz bandwidths. Well, these two papers use a gate-defined quantum point contact in a 2d electron gas instead of an rf-SET. See, rf-SETs are tough to make, are fragile, and have stability problems, all because they rely on ultrathin (2-3 nm) aluminum oxide tunnel barriers for their properties. In contrast, quantum point contacts (formed when a 2d electron gas is laterally constricted down to a size scale comparable to the Fermi wavelength of the electrons) are tunable, and like rf-SETs can be configured to have an impedance (typically 13 kOhms) that can be strongly dependent on the local charge configuration. Both the Harvard and Dartmouth groups have implemented these rf-QPCs, and the Dartmouth folks have demonstrated very nicely that theirs is as optimized as possible - its performance is limited by the fact that the current flowing through the QPC is composed of discrete electrons.
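To see roughly how the LC matching helps, here's a back-of-the-envelope sketch. The component values below are invented but representative: a series inductor resonating with the parasitic capacitance transforms a ~13 kOhm device toward 50 Ohms, so a modest change in the device resistance produces a measurable change in the reflected power:

```python
import numpy as np

Z0 = 50.0      # coax characteristic impedance, ohms
C = 0.5e-12    # assumed parasitic capacitance, F
L = 325e-9     # chosen so that L/(R*C) ~ Z0 near resonance

def refl(f, R):
    """Reflection coefficient looking into a series-L matching network
    terminated by the device resistance R shunted by C."""
    w = 2 * np.pi * f
    Z_dev = 1 / (1 / R + 1j * w * C)
    Z_in = 1j * w * L + Z_dev
    return (Z_in - Z0) / (Z_in + Z0)

f0 = 1 / (2 * np.pi * np.sqrt(L * C))   # ~400 MHz carrier for these values
g_matched = abs(refl(f0, 13e3))         # QPC near 2e^2/h: nearly matched
g_shifted = abs(refl(f0, 15e3))         # charge motion shifts the impedance
print(f0 / 1e6, g_matched, g_shifted)
```

The change in the magnitude of the reflection coefficient (and hence in reflected rf power) is what the electronics up the coax actually detect.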

arxiv:0708.0646 - Hirsch, Does the h-index have predictive power?
*sigh*. The h-index is, like all attempts to quantify something inherently complex and multidimensional (in this case, scientific productivity and impact) in a single number, of limited utility. Here, Hirsch argues that the h-index is a good predictor of future scientific performance, and takes the opportunity to rebut criticisms that other metrics (e.g. average citations per paper) are better. This paper is a bit depressing to me. First, I think things like the citation index, etc. are a blessing and a curse. It's great to be able to follow reference trails around and learn new things. It's sociologically and psychologically of questionable good to be able to check on the impact of your own work and any competitor whose name you can spell. Second, Hirsch actually cites wikipedia as an authoritative source on how great the h-index is in academic fields beyond physics. I love wikipedia and use it all the time, but citing it in a serious context is silly. Ahh well. Back to trying to boost my own h-index by submitting papers.
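For reference, the index itself is trivial to compute: h is the largest n such that at least n of your papers have n or more citations each. A quick sketch:

```python
def h_index(citations):
    """h = largest n such that at least n papers have >= n citations each."""
    h = 0
    for i, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= i:
            h = i
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # -> 4
print(h_index([100, 2, 1]))       # -> 2: one blockbuster paper barely moves h
```

The second example shows one well-known quirk: a single hugely cited paper contributes almost nothing to h, which is exactly the kind of information a single number is bound to throw away.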

Tuesday, July 31, 2007

Recent ACS + cond-mat

A couple of interesting recent results - a busy summer has really cut into my non-essential paper-reading, unfortunately.

One sideline that has popped up with the recent graphene feeding frenzy is trying to understand its optical properties. I don't mean anything terribly exotic - I mean just trying to get a good understanding of why it is possible, in a simple optical microscope, to see any optical contrast from atomically thin single layers of graphene. Papers that have looked at this include:
arxiv:0705.0259 - Blake et al., Making graphene visible
arxiv:0706.0029 - Jung et al., Simple approach for high-contrast optical imaging and characterization of graphene-based sheets
doi:10.1021/nl071254m (Nano Lett., in press) - Ni et al., Graphene thickness determination using reflection and contrast spectroscopy
UPDATE: Here's another one:
doi:10.1021/nl071158l (Nano Lett., in press) - Roddaro et al., The optical visibility of graphene: interference colors of ultrathin graphite on SiO2
It all comes down to the dielectric function of graphene sheets, how that evolves with thickness, and how that ultrathin dielectric layer interacts optically with the oxide coating on the substrate.
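That interference picture is straightforward to sketch with a standard thin-film (characteristic matrix) calculation. The optical constants below are illustrative values near 550 nm, not fits to any of these papers:

```python
import numpy as np

def layer(n, d, lam):
    # characteristic matrix of one film at normal incidence
    delta = 2 * np.pi * n * d / lam
    return np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                     [1j * n * np.sin(delta), np.cos(delta)]])

def reflectance(stack, n_sub, lam, n0=1.0):
    """Normal-incidence reflectance of a film stack on a substrate;
    stack is a list of (complex index, thickness) from the top down."""
    M = np.eye(2, dtype=complex)
    for n, d in stack:
        M = M @ layer(n, d, lam)
    B, C = M @ np.array([1.0, n_sub])
    r = (n0 * B - C) / (n0 * B + C)
    return abs(r) ** 2

lam = 550e-9
n_gr, d_gr = 2.6 - 1.3j, 0.34e-9   # graphene (illustrative index)
n_ox, d_ox = 1.46, 300e-9          # SiO2
n_si = 4.15 - 0.044j               # Si substrate near 550 nm

R_bare = reflectance([(n_ox, d_ox)], n_si, lam)
R_gr = reflectance([(n_gr, d_gr), (n_ox, d_ox)], n_si, lam)
contrast = (R_bare - R_gr) / R_bare
print(R_bare, R_gr, contrast)
```

Sweeping d_ox in a sketch like this shows why particular oxide thicknesses (like the ubiquitous 300 nm) give usable few-percent contrast from a single atomic layer.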

Another paper that looks important at a quick read is:
doi: 10.1021/nl071486l (Nano Lett., in press) - Beard et al., Multiple exciton generation in colloidal silicon nanocrystals
To excite the charge carriers in a (direct gap) semiconductor optically typically requires a photon with an energy exceeding the band gap, Eg, between the top of the valence band and the bottom of the conduction band. If an incident photon has excess energy, say 2Eg, what ordinarily happens is that a single electron-hole pair is produced, but that pair has excess kinetic energy. It's been shown recently that in certain direct-gap semiconductor nanocrystals, it's possible to generate multiple e-h pairs with single photons. That is, a photon with energy 3Eg might be able to make three e-h pairs. That's potentially big news for photovoltaics. In this new paper, Beard and coauthors have demonstrated the same sort of effect in Si nanocrystals. This is even more remarkable because bulk Si is an indirect gap semiconductor (this means that, because of the crystal structure of Si, taking an electron from the top of the valence band to the bottom of the conduction band requires more momentum than can be provided by just a photon with energy Eg). At a quick read, I don't quite get how this works in this material, but the data are pretty exciting.
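The headline arithmetic here is simple energy bookkeeping: conservation of energy alone caps the number of pairs per absorbed photon at floor(E_photon/Eg). A trivial sketch, using the bulk Si gap (nanocrystal gaps are somewhat larger because of confinement):

```python
import math

def max_pairs(E_photon, E_gap):
    """Energy-conservation upper bound on e-h pairs generated per photon."""
    return max(0, math.floor(E_photon / E_gap))

Eg = 1.12  # eV, bulk Si at room temperature
for E in (1.0, 1.5, 2.5, 3.5):
    print(E, "eV ->", max_pairs(E, Eg), "pair(s) at most")
```

Real materials fall well short of this bound, which is why demonstrated quantum yields above one are such a big deal for photovoltaics.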

Thursday, July 26, 2007

Texas and education

Governor Perry, why did you have to go and ruin my week? It's bad enough that the Texas Republican Party platform explicitly declares that "America is a Christian nation" - so much for not establishing a preferred religion. Now our governor has gone and appointed a creationist anti-intellectual to be the head of the state board of education. Frankly I don't care what his personal religious beliefs are, but I am extremely bothered that the governor has appointed a man who believes that education and intellectualism are essentially useless ("The belief seems to be spreading that intellectuals are no wiser as mentors, or worthier as exemplars, than the witch doctors or priests of old. I share that scepticism.") to run the state educational system. Great move, Governor. Ever wonder why it's hard to convince high tech industry to create jobs here?

Wednesday, July 25, 2007

Ob: Potter

This is the obligatory Harry Potter post. Yes, I read the 7th book, and while it's got a few narrative problems (characters sometimes behaving in deliberately obtuse ways for dramatic necessity - like nearly every episode of Lost), on the whole it was a satisfying wrap-up of the series. If you don't care about spoilers, here is a great parody of the whole thing (via Chad Orzel).

Thursday, July 19, 2007

This week in cond-mat

It's been a busy summer, hence the sparseness of my recent postings. Here are a couple of papers that caught my eye this past week.

arxiv:0707.1923 - Hogele et al., Quantum light from a carbon nanotube
Here the authors do careful time-resolved photoluminescence experiments on individual single-walled carbon nanotubes. By studying the time distribution of photon production, they can get insights into the exciton (bound electron-hole) dynamics that lead to light emission. They find evidence that photons are produced one-at-a-time in these structures, and that multiphoton processes are strongly suppressed. Perhaps nanotubes could be useful as sources of single photons, strongly desired for quantum cryptography applications.
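The signature being sought is antibunching in a Hanbury Brown-Twiss-type coincidence measurement: for a true single-photon emitter, the normalized zero-delay coincidence rate g2(0) drops below one. Here's a toy sketch with simulated photon arrival times (the emitter model is deliberately idealized):

```python
import numpy as np

rng = np.random.default_rng(0)

def g2_zero(times, tau, T):
    """Crude g2(0): pairs of arrivals closer than tau, normalized by the
    rate expected for completely uncorrelated (Poissonian) photons."""
    gaps = np.diff(np.sort(times))
    pairs = np.sum(gaps < tau)
    n = len(times)
    return pairs / (n * (n - 1) * tau / T)

T, n = 1.0, 20000
poisson = rng.uniform(0, T, size=n)                             # coherent source
single = np.arange(n) * (T / n) + rng.uniform(0, 1e-5, size=n)  # one photon per cycle

g2_p = g2_zero(poisson, 2e-6, T)
g2_s = g2_zero(single, 2e-6, T)
print(g2_p, g2_s)   # ~1 for the Poissonian source, ~0 for the single emitter
```

A multiphoton-suppressed source like the nanotubes here should sit well below the Poissonian value of one at zero delay.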

arxiv:0707.2091 - Quek et al., Amine-gold linked single-molecule junctions: experiment and theory
This is a nice example of a mixed experiment/calculation paper in molecular electronics that actually has an interesting point. Very pretty experimental work by Venkataraman et al. at Columbia has shown that NH2-terminated molecules form better-defined contacts with Au electrodes than the conventional thiol (sulfur)-based chemistry. For example, looking at huge data sets from thousands of junction configurations, benzene diamine glommed into a Au break junction has a well-defined most likely conductance of around 0.0064 x 2e^2/h. Now theory collaborators have done a detailed examination via density functional theory of more than a dozen likely contact geometries and configurations for comparison. The calculations do show a well-defined junction conductance that's robust - however, the calculations overestimate the conductance by a factor of seven compared to experiment. The authors say that this shows that DFT likely misses important electronic correlation effects. Hmmm. It's a neat result, and now that they mention it, almost every non-resonant molecular conduction calculation I've ever seen based on DFT overestimates the conduction by nearly an order of magnitude. The only underestimates of molecular conduction that come to mind are in the case of Kondo-based mechanisms, which can strongly boost conductance and are always missed by ordinary DFT.

Friday, July 13, 2007

This is just silly.

I got an email about an audio conference about faculty recruiting titled "How to Recruit Gen X Faculty Members". I shouldn't pre-judge, and I should be glad that anyone is trying to improve the faculty recruiting process, but it's sad that anyone needs to be told this stuff. The premise is this:
The era when colleges and universities could rely on prestige and a little cash to recruit top academic talent is gone. Increasingly, up-and-coming faculty talent is from Generation X, the much derided and little understood generation that is much more than the Gap-employee stereotype you heard about a decade ago. This generation has a different set of work priorities, and colleges that understand these priorities stand a better chance of landing the best candidates and keeping them.
Riiiggght. It must be because of their generational culture, not the fact that two income families are vastly more common now, and there are many more women faculty candidates than forty years ago, etc. The topics to be covered include:
  • Why prestige and tenure may not matter as much to this generation as previous generations, and what that means for recruiting.

  • The importance of being "family friendly" and how job candidates judge that now that all colleges are claiming that they are.

  • How Gen X professors view hierarchy and what that means in the context of departments.

  • The importance of transparency and collegiality.
So, basically we can sum this up in a few words that generalize beyond the university setting: People don't want to work at places where they will be treated poorly. People may want to actually have lives outside of their jobs, and like to work at places that understand that. Smart, educated people don't like being told what to do by people who are clueless just because the clueless have seniority. People don't like it when their employers are rude or have obscure, byzantine policies. My goodness, those Gen X slackers are totally unreasonable.


Tuesday, July 10, 2007

Organic Microelectronics workshop

I just spent two days at the 3rd Annual Organic Microelectronics Workshop, meeting this year in Seattle. The workshop, sponsored jointly by the ACS, MRS, IEEE, and APS, was really very good - about 90 participants, and most of the big movers in the field. The talks were a great mix from the very applied (e.g. trying to optimize solvent conditions to avoid the coffee ring problem when inkjet or gravure printing solution-processable organic semiconductors) to the basic physics and chemistry of these materials. Among the things that I learned:
  • Among the single-crystal organic semiconductors, rubrene is truly special in a number of ways. The most important point from the perspective of understanding electronic transport is that it can be made particularly pure, and oxidation in this material is reversible, unlike, e.g., pentacene.
  • With polymer electrolytes, it is possible to make field-effect devices with gated surface charge densities exceeding 10^14 carriers/cm^2. I'd seen a couple of papers on this, and it's looking very impressive as a technique.
  • Clever phase separation tricks can produce self-assembling organic devices that encapsulate themselves within a protective coating.
  • RFID tags from Si are very very cheap.
  • When developing a manufacturing process, "'Good enough' is good enough, and 'better' is not necessarily better."

Wednesday, July 04, 2007

This ought to be fun.

Looks like those folks at Steorn are going to do a 'demo' of their alleged free energy machine. I think I can safely predict (a) Steorn will claim success; (b) the reporting will generally give them the benefit of the doubt and "report the controversy"; and (c) we will not cure all the world's energy needs with magnet-based machines that violate the first law of thermodynamics.

UPDATE: Wow - it turns out that I'd overestimated Steorn. They couldn't get their demo to work. Apparently they'd decided to ignore back-ups, rehearsals, and contingency planning in addition to the laws of physics. So, was this self-deception, the long con, a postmodern publicity stunt designed to show how effectively they could market vaporware, or something else?

Tuesday, July 03, 2007

four interesting ACS journal articles

Here are four recent articles from ACS journals, two from Nano Letters and two from JACS, that made an impression on me.

Dattoli et al., Fully transparent thin-film transistor devices based on SnO2 nanowires
The authors of this paper have made fully functional n-type FETs based on lightly doped tin oxide nanowires with indium tin oxide source, drain, and gate electrodes, and the performance of these FETs is reasonable when compared with the ones currently driving the pixels in your flat panel display. Since the entire FET structure is very transparent in the visible, this could have some significant applications in display technologies.

Angus et al., Gate-defined quantum dots in intrinsic silicon
People have been making Coulomb blockade devices out of puddles of gate-confined two-dimensional electron gas for nearly two decades now. Mostly this has been done at the GaAs/AlGaAs interface, and more recently it's been achieved in nanotubes, semiconductor nanowires, and SiGe heterostructures. The authors of this work have managed to do this nicely at the Si/SiO2 interface in a MOSFET. What this really shows is how well the interface states at that junction are passivated, how nicely the authors can make gates without messing up the surrounding material, and that properly made Ohmic contacts in Si FETs can operate well down to cryogenic temperatures. This could be a very important paper if one can build on it to manipulate electron spins in these dots - unlike GaAs structures, there should be many fewer nuclear spins to worry about for effects like hyperfine-induced decoherence of electron spins.
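The relevant energy scale for Coulomb blockade is the charging energy E_C = e^2/C, which has to dominate k_B T. A crude estimate for a ~50 nm dot (using a disk self-capacitance formula, which ignores the extra capacitance to the gates in a real device) shows why these experiments live at cryogenic temperatures:

```python
e = 1.602176634e-19      # C
kB = 1.380649e-23        # J/K
eps0 = 8.8541878128e-12  # F/m

def charging_energy(radius, eps_r):
    """E_C = e^2 / C with the self-capacitance of a disk, C = 8*eps0*eps_r*r.
    Real gated dots have additional capacitance, so this is a rough upper bound."""
    C = 8 * eps0 * eps_r * radius
    return e**2 / C

E_C = charging_energy(50e-9, 11.9)   # ~50 nm dot in Si (eps_r ~ 11.9)
T_blockade = E_C / (10 * kB)         # want kB*T << E_C; take a factor of 10
print(E_C / e * 1e3, "meV; blockade needs T well below", T_blockade, "K")
```

A few-meV charging energy translates into operation at a few kelvin or below, consistent with the cryogenic Ohmic-contact point above.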

Albrecht et al., Intrinsic multistate switching of gold clusters through electrochemical gating
Lots of people in the molecular electronics community have pointed out the similarities and differences between three-terminal (electrostatically gated) molecular devices and solution-based electrochemical oxidation/reduction experiments in electron transfer. These authors are some of the only experimentalists out there that I have seen really delving into this, trying to unravel how the electrochemical case really works. This experiment is analogous to the Coulomb blockade experiment of the preceding paper, but performed using an STM in an electrochemical medium, with ligand-protected gold clusters playing the role of the quantum dot.

Shim et al., Control and measurement of the phase behavior of aqueous solutions using microfluidics
This isn't particularly deep, but it sure is cool. Microfluidics has come a long way, and the extremely nice properties of polydimethylsiloxane (PDMS) have been a big help. That's the transparent silicone rubber used for many microfluidic applications, as well as being related to the silicone used for soft contact lenses and breast implants. The authors here have carefully used the water and gas permeability of thin PDMS layers to control the concentrations of solutes in water-based solutions, allowing them to do things like gently make supersaturated conditions to control crystallization of proteins. We're just at the leading edge of the potential applications for these kinds of systems.

Tuesday, June 26, 2007

This week in cond-mat

Two good review articles appeared on cond-mat in the last week....

arxiv:0706.3015 - Bibes et al., Oxide spintronics
This is a nice overview of recent developments in using transition metal oxides, which often exhibit strong electronic correlations, for measurements and devices involving spin. This includes materials like the manganites (colossal magnetoresistance oxides), half-metals (magnetite, CrO2), magnetically doped oxides (TiO2, ZnO) as wide-band gap dilute magnetic semiconductors, and new multiferroic materials (ferroelectricity + magnetic order all wrapped up in one system). Good stuff.

arxiv:0706.3369 - Saminadayar et al., Equilibrium properties of mesoscopic quantum conductors
Despite being rendered in some species of pdf that my viewer finds nearly unreadable, this is a very nice article all about equilibrium quantum effects in nanostructures comparable in size to the electronic phase coherence length. This includes persistent currents in small metal and semiconductor loops. These persistent currents (flowing without dissipating!) result in part from the requirement that the electronic phase be single-valued when traversing a loop trajectory in a coherent manner. The persistent currents are very challenging to measure, and as far as I know there continues to be controversy about whether the magnitude and sign of the resulting magnetic moments are consistent with theory.

Thursday, June 21, 2007

ACS journal articles

One reason why I've been writing up arxiv preprints rather than published articles in PRL/APL/Science/Nature is that the APS Virtual Journals do a very good job of aggregating articles from those sources. The Virtual Journal of Nanoscale Science and Technology in particular is one of my favorite places to look for nano-themed condensed matter work. One unfortunate flaw of the virtual journals, however, is that they do not have a nice agreement in place to let them include links to articles published in ACS journals. That's really too bad, since an awful lot of very neat results have been showing up there, particularly in Nano Letters, and I suspect that the new longer-paper ACS Nano is going to be of similar high quality. So, I'm going to try pointing out a couple of JACS/Nano Lett/ACS Nano articles that catch my eye every week or two.

Monday, June 18, 2007

Prolific theorists

How do they do it? No, really. How can some theorists be so prolific? I know they're not constrained by little things like having to get experiments to work, but surely it takes a certain amount of intellectual effort and creativity (or at least, supervision of students and postdocs, or correspondence with collaborators at other institutions) to produce a decent paper. At a little before the midpoint of the year, I can think of two CM theorists who have already produced, between the two of them, 23 preprints on the arxiv. That's something like one paper every 2.5 weeks for each of these people. Wow.

Sunday, June 17, 2007

Grand challenges

As a condensed matter blogger, I am obligated to comment on the new report out from the National Research Council, titled "Condensed-Matter and Materials Physics: the Science of the World Around Us". This report is intended to list grand challenges for the discipline in the coming decade(s). I agree with the title, of course. As I wrote when I started this blog, while high energy physics and astrophysics grab much of the cachet and popular attention, it's very hard to dispute that condensed matter physics has had a much more direct impact on the daily lives of people living in developed societies. The transistor, the solid-state laser, and magnetic data storage are three prime examples of technologies that originated from condensed matter physics.

I haven't read the full report yet, but I had read the interim report and know several of the people who put this thing together. I think the substance is definitely there, though I do wonder if the summary suffers because of the decision to write the grand challenges in language for the consumption of the lay public. The challenges are:
  1. How do complex phenomena emerge from simple ingredients? Phrased this way, this challenge sounds rather naive; the whole point of condensed matter physics is that rich phenomena can be emergent from systems with many (simply) interacting degrees of freedom. Still, this gets to the heart of the discipline and many outstanding questions. Why can one material system exhibit metallic behavior, superconductivity, and antiferromagnetic insulating order with only minor tweaks in composition? Figure that one out, and win a trip to Stockholm.
  2. How will the energy demands of future generations be met? This is clearly not the purview of condensed matter alone, but there is little doubt that our discipline can contribute here. Photovoltaic materials, supercapacitor and battery electrodes, catalytically active materials, light/strong composites, novel superconductors for transmission.... There are any number of reasons why investing in CMMP is an intelligent component of a sound energy policy.
  3. What is the physics of life? This is really a biophysics question, though certainly condensed matter physics is closely relevant. At the very least, the principles and methods of condensed matter physics are highly likely to play roles in unraveling some of the basic questions in living systems (e.g., How does the chemical energy released in the conversion of ATP to ADP actually get translated into mechanical motion in the protein motor that turns the flagellum of a bacterium?).
  4. What happens far from equilibrium and why? This is a good one. Equilibrium statistical mechanics and its quantum form are tremendously useful, but nonequilibrium problems are very important and there exists no general formulation for treating them. Heck, any electronic transport measurement is a nonequilibrium experiment, and beyond linear response theory life can get very complicated. Add in strong electronic correlations, and you are at the frontiers of some of the most interesting work (to me, anyway) going on right now.
  5. What new discoveries await us in the nanoworld? Wow - this one really sounds like a sixth-grade filmstrip title. I would've preferred something like, "What new physics will be found when we control materials on the nanoscale?" The ability to manipulate and engineer systems with precision approaching the atomic scale lets us examine systems (e.g., single quantum impurities; candidate qubits) that can reveal rich physics as well as possible applications to technology.
  6. How will the information technology revolution be extended? I don't know.... While this is certainly a useful goal of CMMP, and this point clearly encompasses exciting physics relevant in quantum computation as well as things like plasmonics and nanophotonics, I'm not sure that this is really a physics grand challenge per se - more of an engineering challenge.
So, what's missing? Well, I'm sure people will make suggestions in the comments, but here's one from me (though I'm sure that the NRC panelists consider this to be subsumed under point 1 above): Is there an efficient and exact general computational method for finding the ground state of the general strongly-interacting, strongly correlated many-electron problem? Basically I want something better than DFT that handles strong correlations. That would definitely be a grand challenge, though it's way too detailed ("physicsy") to fit the structure used in the above list.

The report also emphasizes the fact that research funding in the physical sciences, particularly CMMP, is lagging that in other nations these days, and that this is probably not to our competitive advantage. The demise of long-term industrial R&D in the US has not helped matters. None of this is news, really, but one major purpose of reports like this one is to send a message to Congress. Hence the use of non-physicsy language for the challenges, I'm sure.

Wednesday, June 13, 2007

Albany Nanotech

I returned today from a 1-day visit to Albany Nanotech, the absolutely enormous joint venture between SUNY Albany and a whole slew of collaborators, including International Sematech. In terms of facilities, this place is unparalleled. They have multiple photolithography tools for 300mm wafer processing, including standard (in-air, capable of 65 nm features), immersion (using the refractive index of very pure water to shrink the wavelength, allowing features down to 33 nm), EUV (reflective optics, 13.6 nm wavelength source, one of only two such systems in the world), and e-beam. They have every etching, deposition, polishing, and characterization tool you can think of. 80000 ft^2 of cleanroom space. I confess: I have facility envy. No other university could pull this off - this is an unprecedented confluence of industrial investment, educational initiative, and gobs of state funding, and it seems to me like a sustainable model, at least for the next decade or more. No wonder Sematech is shifting lots (most?) of their operations to Albany.
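The resolution numbers above follow from the Rayleigh criterion, minimum feature ~ k1*lambda/NA; immersion helps because water's refractive index boosts the effective numerical aperture. A quick sketch (the k1 and NA values are assumptions, roughly typical of production tools of that era):

```python
def min_feature(lam_nm, NA, k1=0.3):
    """Rayleigh resolution estimate: k1 * lambda / NA, in nm."""
    return k1 * lam_nm / NA

lam = 193.0  # ArF excimer laser wavelength, nm
print(min_feature(lam, 0.93))         # dry tool: ~62 nm
print(min_feature(lam, 0.93 * 1.44))  # immersion, water n = 1.44: ~43 nm
```

With a more aggressive k1 (and the higher-NA optics that immersion enables), one gets down toward the ~33 nm figure quoted above.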

Saturday, June 09, 2007

This week in cond-mat

Two more papers that look interesting.

arxiv:0706.0792 - Koop et al., Persistence of the 0.7 anomaly of quantum point contacts in high magnetic fields
One of the neatest results (in my opinion) in mesoscopic physics is the appearance of conductance quantization in quantum point contacts, first shown in the late 1980s. The basic idea is simple. Start with a two-dimensional electron gas such as that formed at the interface between GaAs and modulation-doped AlGaAs. Metal gates on top of such a structure can be used to deplete the electron gas in particular places. Two closely spaced gates may be used to create a narrow constriction between two large reservoirs of 2d electron gas. As the constriction width is reduced until it is comparable to the Fermi wavelength of the confined electrons, the conductance through the constriction is quantized (at zero magnetic field) in integer multiples of G0 = 2e^2/h, the quantum of conductance (about 1/(13 kOhms)). That is, each spatial mode (each transverse subband of the constriction) can transport e^2/h worth of conductance per spin degree of freedom. Indeed, at very large magnetic fields, the conductance is quantized as integer multiples of G0/2, as one would expect if the different subbands are spin-split due to the Zeeman effect. This is all well explained by single-particle theory and the Landauer-Buttiker picture of conduction through small systems. In very clean quantum point contacts, additional structure is seen at 0.7 G0 - this is the so-called 0.7 anomaly. In the presence of a little bit of in-plane magnetic field, this approaches 0.5 G0, and therefore it looks like there is some spontaneous spin splitting; this is a many-body effect that is the result of some kind of electron-electron correlation physics. This paper is an extensive study of 14 such point contacts, fully mapping out their magnetic field dependence and nonequilibrium (large bias voltage) properties.
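For concreteness, the numbers in this picture are easy to reproduce: the conductance quantum G0 = 2e^2/h is the ~13 kOhm figure quoted above, and an ideal QPC simply counts occupied subbands:

```python
e = 1.602176634e-19  # C
h = 6.62607015e-34   # J s

G0 = 2 * e**2 / h    # conductance quantum, siemens
print(1 / G0)        # ~12.9 kOhm

def qpc_conductance(n_modes, spin_split=False):
    """Ideal QPC: each transverse subband carries 2e^2/h when spin-degenerate,
    or e^2/h per (spin-resolved) subband at large Zeeman splitting."""
    return n_modes * (G0 / 2 if spin_split else G0)
```

The 0.7 anomaly is interesting precisely because it falls outside this simple mode-counting picture.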

arxiv:0706.0906 - Clark et al., Nonclassical rotational inertia in single crystal helium
The controversy over whether 4He has a true supersolid phase continues. This week this article appeared in Science, summarizing a number of recent experiments, and strongly suggesting that single crystals of pure 4He should not show a real supersolid phase - basically the claim is that the effects ascribed to such a phase are really due to disorder (glassy 4He at grain boundaries between crystals? 3He impurities somehow?). Now comes this paper from Moses Chan's group, arguing from new experiments that even carefully nucleated and grown single crystals of 4He show evidence of supersolid behavior (in the form of a nonclassical moment of rotational inertia). Hmmm. Neat, clever experimental design. It'll be interesting to see how this all pans out.

Monday, June 04, 2007

Link plus a couple of papers

The Incoherent Ponderer has a fascinating analysis up of the statistics of the PhD-to-faculty pipeline in physics. The one thing missing (for lack of a good source of statistics) is how many physics PhDs go on to become faculty in a different discipline. This is increasingly common in this age of interdisciplinary work. For example, while by the IP's rankings Rice only places 1.9 percent of its PhDs as faculty members in top-50 physics departments, I can think of a few who are now faculty in, e.g., EE, Mat Sci, BioE, Chemistry, etc. It would be very interesting to look at the trends over the last twenty or thirty years. One reason for the pedigree effect is that good science is correlated with having cutting-edge resources - as fancier facilities (at least in condensed matter) have trickled down to the masses, so to speak, have things become more egalitarian?

Two more points.... First, I have some nagging doubts about the validity of some of those numbers. I can already count 7 Stanford PhD alumni that I know who have assistant/assoc. faculty positions in top-50 universities. According to the AIP numbers, that's 25% of all of the ones out there. That seems hard for me to believe. Second, Chad Orzel has a very valid observation that goes to the heart of a pathology in our field. 93% of all colleges and universities are not in the top 50. As a discipline I think we do real sociological damage to our students when we brain-wash them into thinking that the only successful outcome of a graduate degree is a tenured job at Harvard. That kind of snobbery is harmful, and probably has something to do with attrition rates. People should not decide that they're failures because R1 academia isn't what they want to do. I thought hard about taking a job offer from a college, and I still resent the fact that some people clearly thought I was loopy for even considering that path.

    arxiv:0706.0381 - Fiebig et al., Conservation of energy in coherent backscattering of light
    This paper is at once a very nice piece of experimental work and an example of the kind of argument that I really don't like. In mesoscopic physics, there is a phenomenon known as weak localization for electrons. Consider an electron moving through a disordered medium, and look at one particular trajectory that contains a closed loop (made up of straight propagation pieces and elastic scattering events). Feynman says that the amplitude corresponding to this trajectory is a complex number whose phase is found by adding up the phase from propagation along the straight segments plus the phase shifts from the scattering events. Now consider a second trajectory, identical to the first, but traversing the loop in the opposite direction. It turns out that the amplitudes of these two trajectories interfere constructively for backscattering by the loop. That is, the quantum probability for getting through the loop is below the classical value, and the quantum probability for getting reflected by the loop exceeds the classical value. It turns out something very analogous to this can happen for light propagating through a diffusive medium, and this can be the basis for some really cool things, like random lasers (where the back-scattering itself acts like an effective cavity!). The authors of this paper show the physics of this beautifully, but they present it in the form of a straw man argument, saying that the coherent scattering result (with greater than classical backscattering) looks at first glance like it violates conservation of energy. No, it doesn't. It looks like coherent scattering. It doesn't look like a violation of conservation of energy any more than typical diffraction does.
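    A toy calculation makes the constructive interference explicit: the forward and time-reversed traversals accumulate exactly the same total phase, so adding the amplitudes always doubles the backscattering probability relative to adding the probabilities, no matter what the random, disorder-dependent segment phases are. A quick sketch in Python (the six phases are arbitrary illustrative values):

```python
import cmath
import math
import random

random.seed(0)

# Phases from the straight segments and scattering events of one
# closed loop (arbitrary, disorder-dependent values).
segment_phases = [random.uniform(0, 2 * math.pi) for _ in range(6)]

# Amplitude for traversing the loop in one direction...
amp_forward = cmath.exp(1j * sum(segment_phases))
# ...and for the time-reversed traversal: same segments in the
# opposite order, so the accumulated phase is identical.
amp_reversed = cmath.exp(1j * sum(reversed(segment_phases)))

coherent = abs(amp_forward + amp_reversed) ** 2              # add amplitudes (quantum)
classical = abs(amp_forward) ** 2 + abs(amp_reversed) ** 2   # add probabilities

print(coherent / classical)  # -> 2.0: backscattering is doubled, not energy non-conserving
```

    The factor of two survives averaging over disorder precisely because the two paths are time-reversed partners; unrelated path pairs have random relative phases and average away.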

    arxiv:0705.4260 - Huang et al., Experimental realization of a silicon spin field-effect transistor
    For nearly 17 years people have been trying to make a spin transistor of the type discussed here. The idea is that spins are injected from a magnetically polarized source, traverse a channel region, and then try to leave through a magnetically polarized drain. Depending on the gate electric field, the moving spins precess and either get out of the system or not depending on their eventual alignment relative to the drain magnetization. This has historically been extremely difficult for many reasons, not the least of which are the difficulty in injecting highly polarized carriers into a semiconductor and the annoying fact that spin polarization, unlike charge, can relax away to nothing. Well, this is a pretty convincing demo of a device quite close in concept to the original idea, though it's not a field-effect geometry as first conceived. Very pretty data.
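    For a rough sense of scale, here's a back-of-the-envelope Datta-Das-style estimate. The Rashba coefficient, effective mass, and channel length below are illustrative stand-ins, not numbers from this paper:

```python
import math

hbar = 1.054571817e-34       # J*s
m_e = 9.1093837015e-31       # kg

# Illustrative (assumed) device parameters:
alpha = 1e-11 * 1.602e-19    # Rashba coefficient, ~1e-11 eV*m, in J*m
m_eff = 0.19 * m_e           # effective mass (assumed value)
L = 1e-6                     # channel length, 1 micron

# Net spin precession angle accumulated crossing the channel
# (Datta-Das: theta = 2 * m_eff * alpha * L / hbar^2):
theta = 2 * m_eff * alpha * L / hbar**2

# Transmission through the analyzing drain contact, for parallel
# source/drain magnetizations, goes like cos^2(theta/2):
transmission = math.cos(theta / 2) ** 2
print(theta, transmission)
```

    With those assumed numbers the spin winds through many full precessions while crossing a one-micron channel, which is why the alignment at the drain can be so sensitive to the gate field.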

    Thursday, May 31, 2007

    Hype. Again.

    Remember this post, where I reported on interesting Shubnikov-deHaas oscillations in very pure high-Tc material? Well, that paper has now come out in Nature. Unsurprisingly, there has been an associated flurry of press, including this article. In case that link doesn't work, I'll spoil the punchline for you:
    Canadian physicists have cracked a decades-old mystery surrounding metals that carry electricity without resistance, opening the door for everyday trains that levitate on magnetic fields, ultrapowerful quantum computers and big savings for utilities.
    ...
    Taillefer predicted the discovery would lead to room-temperature superconductors within 10 years, triggering a technological revolution similar to the invention of the transistor.

    One of the most promising applications for such superconducting metals is in magnetic levitation trains, which can theoretically run at speeds of up to 500 km/h.
    ...
    Other possible superconducting applications include shrinking MRI machines to the size of laptops, eliminating the 10 to 20 per cent electricity lost from resistance inside power stations and building quantum computers, machines so powerful they would make today's supercomputers resemble mere pocket calculators.

    Wow. They get from Shubnikov-deHaas oscillations to room temperature superconductors to maglev trains and quantum computers.
    I had no idea that getting clean samples could do so much. I'm presuming that most of the fault for this lies in the journalism rather than the scientists, but let this be a cautionary tale.

    Annoying conventions

    What do you find to be the most annoying conventions in physics? The classic example is the choice (darn you, Ben Franklin) of sign for the charge of the electron. Franklin had a 50/50 chance, and we ended up with the often confusing situation that current flow and particle flow are oppositely directed, that "up" on energy level diagrams corresponds to more negative voltages, etc. [EDIT: I think my earlier statements here about UPS conventions stem from a particular paper that isn't representative; never mind.... ]. Do any of you out there have other examples of really bad/misleading conventions?

    Thursday, May 24, 2007

    This week in cond-mat

    Only one quick blurb for now - there have been a number of neat looking papers on the arxiv lately, but I just haven't had time to read them. I am actually making some progress on my book, though.

    arxiv:0705.2180 - Martin et al., Observation of electron-hole puddles in graphene using a scanning single electron transistor
    A single-electron transistor (SET) consists of an "island" (in this case, a patch of aluminum film) weakly connected by tunnel barriers (in this case, aluminum oxide) to source and drain electrodes (also aluminum films here). Defining the total capacitance of the island to be C, the Coulomb energy cost of adding another electron to the island is E_c ~ e^2/C. If E_c >> kT, the thermal energy scale, and the tunneling resistances of the barriers are >~ h/e^2 (~ 26 kOhms), then the number of electrons on the island is fixed to be an integer. By varying the voltage on a nearby gate electrode coupled capacitively to the island, it is possible to change the average population of the island by one electron at a time. When the gate is set such that the island is just on the cusp of going from an electronic population of n to n+1, the source-island-drain conductance of the device has a peak and is very strongly dependent on that gate voltage. Instead of using a gate electrode, one could use the local electronic environment near the island to modulate the island potential (and hence the conductance). SETs are incredibly good electrometers, able to sense tiny fractions of an electronic charge nearby. Now consider sticking such an SET electrometer on the end of a scanned probe tip (in this case, fabricate it directly on the end of a tapered optical fiber). This is the scanning SET, a wonderful imaging tool developed and refined originally at Bell Labs by people like Harald Hess, Ted Fulton, Bob Willett, Mike Yoo, Amir Yacoby, and Nicolai Zhitenev.
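    To put numbers on the operating conditions, here's a quick estimate of the charging energy and the temperature scale it implies, assuming an illustrative island capacitance of ~1 fF (real scanning SET islands may well be smaller, which only raises E_c):

```python
e = 1.602176634e-19    # elementary charge, C
kB = 1.380649e-23      # Boltzmann constant, J/K

C = 1e-15              # total island capacitance, farads (assumed)
E_c = e**2 / C         # charging energy, joules

# Temperature at which kT equals E_c; for sharp single-electron
# charging the device must operate well below this.
T_c = E_c / kB
print(E_c / e * 1000, "meV;", T_c, "K")  # ~0.16 meV; ~1.9 K
```

    That ~2 K scale (and the need to be well below it for good charge sensitivity) is consistent with the 3He-fridge temperatures mentioned below.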

    In this paper Amir and colleagues (von Klitzing and company) use the scanning SET to look at graphene near the charge neutrality point as well as in the quantum Hall regime. They can see how the system breaks up into puddles of electron-rich and hole-rich regions with ~ 100 nm spatial resolution. This is a nice application of the S-SET technique, which can be extremely arduous - meeting the temperature requirement for good charge sensitivity requires working at very low temperatures (a 3He fridge at minimum); the SET itself is very fragile and static sensitive; and the scanned probe setup is easy to crash into the sample surface. All in all, a tour de force tool that is unlikely to make its way into common usage any time soon.

    Thursday, May 17, 2007

    FOIA

    I got a very surprising email this morning from the NSF. Someone made a Freedom of Information Act request to get a copy of one of my NSF grant proposals. Now, I know that technically this is allowed - in principle, if someone wanted to, they could get (via FOIA) copies of their direct competitor's federal grants (with certain privacy information like social security numbers redacted). However, I've never actually heard of anyone doing this in practice - it's just not cricket, so to speak. The NSF gave me the name of the person, and I'm left to wonder: did they do this just to see an example of a funded proposal? Why didn't they contact me directly? Did they know that NSF was going to tell me about this? It's all perfectly legal, but I find it unsettling, and I can't pinpoint the precise reason. Has this ever happened to anyone else out there?

    SCES '07 day 4

    Back to the SCES conference this afternoon, after my campus commitments, for the second session on strong correlations in mesoscopic systems. Some highlights:

    David Goldhaber-Gordon gave a nice talk about his group's work on using semiconductor nanostructures to engineer the two-channel Kondo effect. The work has been published here and is available on the arxiv here. In the single-channel Kondo problem, a free spin is coupled via tunneling to a single electronic bath. Antiferromagnetic exchange between the spin and the conduction electrons leads to the formation of a singlet at low temperatures - the spin is screened, and the ground state of the system is a Fermi liquid. In the two-channel Kondo problem, a single spin is coupled via tunneling to two independent electronic baths. Each bath tries to "screen" the spin via antiferromagnetic exchange, with the result that the spin is overscreened. The ground state of that system is supposed to be a non-Fermi-liquid, meaning that its low energy excitations don't look like weakly interacting quasiparticles. The hard part about testing this is actually making two truly independent electronic baths. The paper shows a clever implementation that effectively does this, at least over a limited temperature range.

    Yong Chen, one of Randy Hulet's postdocs, gave a talk about using cold atoms to study Anderson localization. By sending a laser through frosted glass, they can use the resulting speckle pattern to provide a disordered potential for trapped cold bosonic atoms. They can dial around the strength of the disorder potential by changing that laser's intensity. Then they can play games with the trap potential to test how delocalized the Bose-condensed atoms are (kick the trap and look for resulting oscillations), and independently check for coherence by looking for interference fringes. The preliminary data are pretty exciting.

    Ravin Bhatt talked about (theoretical) ways to try and produce ferromagnetism in doped semiconductors containing only nonmagnetic atoms, at very low carrier densities. The trick is to somehow get the system to have more electrons than there are donors. One can imagine doing this with clever modulation doping schemes. No one's pulled it off yet, but it sounds cool and the numerical results look suggestive.

    Unfortunately more Rice commitments mean that I won't make it to the last day of the conference tomorrow. Ahh well. It was an interesting meeting.

    Tuesday, May 15, 2007

    SCES '07 day 2

    Again, I was only able to see the morning session today (and will be at Rice until Thursday pm). This means I'll miss the big "BCS@50" plenary session. However, here are a couple of talks that I did get to see....

    First, T. Senthil started the day with a talk about spin liquids. This is a theoretically deep concept that I would love to understand better. The basic idea is that one can recast the interacting many-body problem in terms of new excitations of spinons (chargeless spin 1/2 excitations). The cost of doing this is that the spinons have "infinitely nonlocal" statistical correlations. However, these interactions can be made to look simple by introducing some effective gauge "charge" for the spinons and some effective gauge "magnetic field" - then the correlations look like the Aharonov-Bohm effect in this gauge language. If this sounds vague, it's partly because I don't really understand it. The upshot is that the spinons can be fermionic, and therefore have a Fermi surface, and this leads to nontrivial low temperature properties, particularly in systems where the whole weakly interacting quasiparticle picture falls apart. If anyone can point me to a good review article about this, I'd appreciate it.

    There were a couple of other strong theory talks. Natan Andrei talked about a general approach to quantum impurities driven out of equilibrium (e.g., as in a quantum dot in the Kondo regime at large source-drain bias). Strong correlations + nonequilibrium is a tough nut to crack. Andrei argued that one can rewrite the problem in terms of scattering of initial states via simple phase shifts, provided that one picks the right (nasty, complicated) basis for the initial states that somehow wraps up the strong correlation effects. This choice of basis is apparently a form of the Bethe Ansatz, which I also need to understand better.

    On the experimental side, besides my talk, Gleb Finkelstein from Duke gave a very nice talk about Kondo physics in carbon nanotube quantum dots. The really clever aspect of the work is that, through careful engineering of the contacts to the tube, the actual leads to the dot + the tunnel barriers + the dot itself are all formed out of the same nanotube. As a result the tunnel barriers preserve the special band structure symmetry (SO(4)) of the tube and the leads, leading to profoundly neat effects in transport.

    Monday, May 14, 2007

    SCES '07, Day 1

    Today was the first day of the 2007 International Conference on Strongly Correlated Electron Systems (SCES), hosted this year in Houston jointly by Rice and UH. It's a pretty big meeting, typically with between 600 and 700 participants. Traditionally the meeting has had a very strong European and Asian participation, with a focus on heavy fermion compounds and high-Tc superconductivity. This year, there's an increased inclusion of strongly correlated physics in mesoscopic systems (quantum dots, nanotubes, graphene, single-molecule devices), as well as discussion of model correlated systems based on ultracold trapped atoms and molecules.

    Because I'm on an internal search committee for a dean, my semester still hasn't really ended, which means that I'm going to miss a fair bit of the meeting. However, I'll still try to blog a couple of highlights daily. Here are two neat, new (and as yet unpublished) results that I saw this morning.

    First, Louis Taillefer spoke about new measurements done in extremely high quality hole-doped YBCO at high magnetic fields (pulsed up to 60 Tesla). This work has been focused on trying to suppress superconductivity with a field in this, the grand-daddy of the high-Tc compounds, and to understand the ground state and possible quantum phase transitions (as a function of doping) of the normal phase. The exciting new result is that Taillefer and collaborators have been able to see Shubnikov-deHaas oscillations in this material for the first time. This is a big deal. First, it tells you that there are some sort of excitations in the normal state that can execute closed cyclotron orbits in the presence of a magnetic field. Since the validity of weakly interacting quasiparticles in the normal state is in significant doubt, this is interesting. Second, the frequency of the oscillations in 1/B reveals the area enclosed by those orbits in k-space - essentially it tells you how big the hole pockets are in the Brillouin zone, and therefore how many mobile holes there are per copper atom. Third, the temperature dependence of the S-deH oscillations lets you infer an effective mass (in this case, about 1.9 free electron masses) for whatever's doing the cyclotron motion. Very neat!
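    The second point can be made quantitative with the Onsager relation, which converts the oscillation frequency F (in tesla, from the periodicity in 1/B) into the k-space area of the orbit: A_k = 2*pi*e*F/hbar. The frequency and lattice constant below are round illustrative numbers, not the actual values from the talk:

```python
import math

e = 1.602176634e-19     # C
hbar = 1.054571817e-34  # J*s

F = 500.0               # oscillation frequency in tesla (illustrative round number)
A_k = 2 * math.pi * e * F / hbar   # orbit area in k-space, 1/m^2

# Compare to the Brillouin zone area of a square CuO2 lattice with an
# assumed in-plane lattice constant a ~ 3.85 Angstrom: A_BZ = (2*pi/a)^2.
a = 3.85e-10
A_BZ = (2 * math.pi / a) ** 2
print(A_k / A_BZ)       # fraction of the zone enclosed by the pocket
```

    With these assumed inputs the orbit encloses only a couple of percent of the Brillouin zone - a small pocket, which is exactly the kind of information the oscillations hand you directly.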

    Second, Abhay Pasupathy from Ali Yazdani's group at Princeton showed some beautiful new STM data on BSCCO. The neat thing here is that their superfancy STM is absurdly stable over a big temperature range for days. That means that they can map out the tunneling density of states of the material on the atomic scale as a function of temperature, from deep within the superconducting state to well above the resistively detected Tc. They see that the gap in the density of states indicative of pair formation vanishes nonuniformly over the surface, with local bits persisting to well above the average Tc. They also show that the temperature dependence of the gap as a function of gap size is very different from that in low-Tc materials.

    Sunday, May 13, 2007

    Tenure - some advice.

    This past week there was a flurry of science blogging regarding tenure, such as these posts at Cosmic Variance (here and here), Rob Knop's post about his situation, Chad Orzel's commentary, and the Incoherent Ponderer's take on the tenure process here. The IP's take on things summarizes my general thoughts on the tenure process pretty well. In terms of the job pipeline, the biggest cut in population happens when trying to get a faculty position, not at the tenure stage. In reasonable departments, no one is happy when a tenure promotion case fails. Good departments (and schools and universities) try very hard to filter at the hiring level and give their faculty the resources they need to succeed. I can only think of two or three places (in physics anyway) that historically have had a "sink or swim" attitude (that is, hiring a junior person in an area today means that seven years from now the university wants the best senior person in the world in that area - being in-house is no advantage), and I'm not sure that's even true anymore.

    In the links above, people are mostly focused on the process and outcomes; I think it would be useful to give some suggestions about how to approach the tenure process from the position of the junior candidate. I am hardly in a position to give too much sage advice about tenure, and what follows below is largely common sense. Obviously the situation is different in various disciplines and at different universities, but here are some basic points that I think should be considered. I'm sure I'll leave things out - feel free to chide me in the comments.

    Understand the process. Find out how the tenure process works at your institution. This should be written down in a faculty handbook. Talk to your department chair, your faculty mentor (if your department has such a thing) or senior colleagues. Understand the timeline. Get a sense of the weight that your institution places on the different components of the job (see below). Does the departmental vote carry a lot of weight (as it usually does at Rice, for example), or are the deans or the university promotions and tenure (P&T) committee commonly overriding departmental decisions?

    The process probably goes something like this: the candidate is hired for a 4-year tenure-track appointment, with some kind of annual reviews and a more major renewal review in year 3 or 4. (This gives the university a chance to end the process early if there's a major problem with an assistant prof, and forces departments to give some concrete feedback to the assistant prof about how they stand.) In the summer before year 6 (at most places) the candidate is asked to put together a dossier (complete CV, reprints of papers, a summary of funding, a statement about university service, a statement about teaching, a summary of research accomplishments, etc.) and suggest names for external evaluators. The department comes up with additional names for external evaluation, and sends the full dossier to some mix of the external people. Eventually these external letters come back, and the department reads them, puts the whole package together, and there's a vote of the tenured faculty (in October or November) about whether to recommend the assistant prof for tenure. The departmental recommendation then goes to the cognizant dean, and from there to the university P&T committee (which generally would have people from all sorts of disciplines on there, from bio to French lit). Sometimes P&T committees or deans can request more external letters, and they get copies of teaching evaluations, etc., and may meet directly with department chairs. Eventually the P&T committee makes its decisions (in late spring) and the candidate finds out. That decision is finally signed off by the president of the university and the board of trustees.

    The research component. To get tenure, you actually need to be getting science done. There's no sure-fire recipe for success here, but let me make a few suggestions:
    • Have a mix of projects that range from easier to high-risk/high-reward. Having only one major project can be very risky, particularly if it takes five years to get any results. One key element of getting tenure is that people in your community need to know who you are, what you've done, and what you've been doing that's really yours - new stuff from your professorial position, not rehash of your thesis or postdoc work.
    • Make sure that your colleagues know what you're doing. Your colleagues are going to need to understand your work at least on some level, and particularly for hard projects, they will need to have some idea why it may take four years before a paper comes out.
    • Have backup plans. High risk things may not succeed (no kidding). Make sure, for your students' sake and yours, that you have thought out the projects well, so that even if you don't achieve the BIG goal, you are still learning useful things that are worth publishing.
    • Have a high attempt frequency for funding. If there's literally only one agency in the world that funds your work, that's risky and unfortunate. Make sure that you know what your options are for funding sources. Call up program officers. Ask to get a chance to serve on review panels - you'll learn a huge amount about writing proposals that way! Know if there are state funding opportunities. Think ahead about private foundations (e.g., Research Corporation).
    • Do some self-promotion but don't sell your soul. If your external evaluators don't know who you are, that's the kiss of death. Make sure you give talks at meetings. See what you can do about getting invited to give seminars at other schools. Yes, this is one issue where "well-connected" people really benefit, but if you go to meetings and get to know the people in your field, it's not that bad. Get involved in your own department's seminar series, and invite in people that you'd like to meet and talk to.
    • Publish good stuff. This is always the tricky bit, and people joke about the "least publishable unit". Still, holding back everything for the one big Nature paper that may not happen is not necessarily the best strategy, for you or your students.
    The teaching component. Do a good job teaching. Most universities have resources available to help you - teaching centers that will videotape your lectures, offer suggestions for improved technique, etc. Good teaching can only help tenure in limited ways at a research university, but poor teaching can certainly hurt a borderline case.

    The service component. Do a decent job in departmental and university service. Don't let it eat all your time, but get involved in things that matter to you. It's also a good way to get to know your administrators and people in other departments. I'm not suggesting currying favor - just be a solid citizen.

    Common sense. People argue about whether blogging can hurt your tenure chances. Blogging is only one example of a public forum, though. Use some common sense. Publicly badmouthing your institution, colleagues, administrators, etc. is not a good idea. (I'm not talking about hushing up legitimate grievances - I'm saying don't antagonize people gratuitously.)

    I'm sure I could write more, and will probably update this later. This is some food for thought for now, at least.

    Thursday, May 10, 2007

    The trouble with mercenaries

    The trouble with mercenaries is that they can be bought. For example, the state of Texas bribed fair and square - errr, gave $40M and lots of tax incentives - to International Sematech, the big consortium of semiconductor manufacturers, in 2004 in exchange for them staying in Austin. That worked out really well for all concerned: today Sematech announced that they're picking up and moving to Albany because New York offered them more money. If I were Gov. Perry, I'd be pretty annoyed.
    Update: Some Sematech people came to Rice yesterday and were grilled a bit about this. They say that the New York business is a parallel operation and won't affect their Texas activities; they also said that the reporting on this was pretty awful. Interesting. I guess time will tell about how much the focus of their work shifts to Albany. Given that the state of NY put over $3B into that setup, it seems like Texas will either have to do something similar, or face a possible eventual slide into secondary status.

    Thursday, May 03, 2007

    An article I'd missed

    While I was traveling, the Wall Street Journal ran this article about Bob Laughlin and his tenure as president of KAIST, one of the premiere research universities in South Korea. The article is definitely worth a read. It has some classic understatements:
    Dr. Laughlin, a Stanford physicist, is a talkative man quick to express his opinions.
    KAIST hired Laughlin to come in and shake things up. When he did, he did so in characteristic Bob fashion, and they reacted negatively. Things went south from there:
    In an attempt to assert his control, Dr. Laughlin in December 2005 set out to personally interview and evaluate every one of Kaist's 400 or so faculty members, focusing mainly on the quality of their research projects and academic work. For those professors who agreed to the interviews, he gave them a one-paragraph summary grading their work from "unimportant" to "very important."
    Yeah, that may have rubbed people the wrong way.

    To be fair to Bob, the leaders of KAIST were crazy to hire him - all issues of personality clashes aside, he'd never managed a group of more than a handful of people, let alone an enormous research institution with a complex bureaucracy, large staff, and huge budget. Surprise: a Nobel prize in physics doesn't automatically imply success in extremely sophisticated management problems. It's also entirely possible that his assigned task was essentially impossible by design. An interesting read, anyway.

    Wednesday, May 02, 2007

    NSF grantees conference post mortem

    It was useful to network with fellow NSF grantees for the last two days in Reno - an interesting mix of people and projects. A few observations:
    • Rice's webmail client is so pathetically slow that it's nearly unusable.
    • Hotels in Nevada walk a fine line between needing to compete with other hotels on the one hand, and trying to make sure you'd rather sit in the casino than your room on the other.
    • "Nuggets" = out. "Highlights" = in.
    • "Disruptive technology" = out. "Transformative" = in.
    • 4 hours of 5-minute talks in one day is too many, at least for me.
    • Some people don't understand the meaning of "Your talk should be three slides."
    • Most of the NER projects (high-risk, high-reward one-year single investigator grants) from FY04 in the ECS part of the engineering directorate of NSF actually worked, and were pretty cool.
    • The free wifi in the Reno airport makes up for the beeping, blinking slot machines in the terminal.
    UPDATE: Rice upgraded their webmail service today. How's that for timing?