## Friday, December 30, 2011

### Tidbits.

First, I have a guest post on the Houston Chronicle's science blog today.  Thanks for the opportunity, Eric.

Second, here is a great example of science popularization from the BBC.  We should do things like this on US television, instead of having Discovery Channel and TLC show garbage about "alien astronauts" and "ghost hunting".

Third, if you see the latest Sherlock Holmes flick, keep an eye out for subtle details about Prof. Moriarty - there's some fun math/physics stuff hidden in there (pdf) for real devotees of the Holmes canon.

## Wednesday, December 28, 2011

### Shifting gears

One of the most appealing aspects of a career in academic science and engineering is the freedom to choose your area of research. This freedom is extremely rare in an industrial setting, and becoming more so all the time. Taking myself as an example, I was hired as an experimental condensed matter physicist, presumably because my department felt that this was a fruitful area in which they would like to expand and in which they had teaching needs. During the application and interview process, I had to submit a "research plan" document, meant to give the department a sense of what I planned to do. However, as long as I was able to produce good science and bring in sufficient funding to finance that research, the department really had no say-so at all about what I did - no one read my proposals before they went out the door (unless I wanted proposal-writing advice), no one told me what to do scientifically. You would be very hard-pressed to find an industrial setting with that much freedom.

So, how does a scientist or engineer with this much freedom determine what to do and how to allocate intellectual resources? I can only speak for myself, but it would be interesting to hear from others in the comments. I look for problems where (a) I think there are scientific questions that need to be answered, ideally tied to deeper issues that interest me; (b) my background, skill set, or point of view give me what I perceive to be either a competitive advantage or a unique angle on the problem; and (c) there is some credible path for funding. I suspect this is typical, with people weighting these factors variously. Certainly those who run giant "supergroups" in chemistry and materials science by necessity have more of a "That's where the money is" attitude; however, I don't personally know anyone who works in an area in which they have zero intellectual interest just because it's well funded. Getting resources is hard work, and you can't do it effectively if your heart's not in it.

A related question is, when and how do you shift topics? These days, it's increasingly rare to find a person in academic science who picks a narrow specialty and sits there for decades. Research problems actually get solved. Fields evolve. There are competing factors, though, particularly for experimentalists. Once you become invested in a given area (say, scanned probe microscopy), you acquire a lot of inertia - new tools are expensive and hard to get. It can also be difficult to get into the mainstream of a new topic from the outside, in terms of grants and papers. Jumping on the latest bandwagon is not necessarily the best path to success. On the other hand, remaining in a small niche isn't healthy. All of these are "first-world problems", of course - for someone in research, it's far better to be wrestling with these challenges than the alternative.

## Saturday, December 17, 2011

### students and their mental health

There was an interesting article earlier this week in the Wall Street Journal, on mental health concerns in college students. It's no secret that mental illness often has an onset in the late teens and early twenties. It's also not a surprise that there are significant stressors associated with college (or graduate school), including being in a new environment with a different (possibly much smaller) social support structure, the pressure to succeed academically, the need to budget time much more self-sufficiently than at previous stages of life, and simple things like lack of sleep. As a result, sometimes as a faculty member you come across students who have real problems.

In undergrads, often these issues manifest as persistent erratic or academically self-destructive behavior (failure to hand in assignments, failure to show up for exams). Different faculty members have various ways to deal with this. One approach is to be hands-off - from the privacy and social boundaries perspective, it's challenging to inquire about these behaviors (is a student just having a tough time in college or in a particular class, is the student afflicted with a debilitating mental health issue, or is the student somewhere on the continuum in between?). The sink-or-swim attitude doesn't really sit well with me, but it's always a challenge to figure out the best way to handle this stuff.

In grad students, these issues can become even more critical - students are older, expectations of self-sufficiency are much higher, and the interactions between faculty and students are somewhere between teacher/student, boss/employee, and collaborator/collaborator. The most important thing, of course, is to ensure that at the end of the day the student is healthy, regardless of degree progress. If the right answer is that a student should take time off or drop out of a program for treatment or convalescence, then that's what has to happen. Of course, it's never that simple, for the student, for the advisor, for the university.

Anyway, I suggest reading the WSJ article if you have access. It's quite thought-provoking.

## Friday, December 16, 2011

### Universality and "glassy" physics

One remarkable aspect of Nature is the recurrence of certain mathematically interesting motifs in different contexts.  When we see a certain property or relationship that shows up again and again, we tend to call that "universality", and we look for underlying physical reasons to explain its reappearance in many apparently disparate contexts.  A great review of one such type of physics was posted on the arxiv the other day.

Physicists commonly talk about highly ordered, idealized systems (like infinite, perfectly periodic crystals), because often such regularity is comparatively simple to describe mathematically.  The energy of such a crystal is nicely minimized by the regular arrangement of atoms.   At the other extreme are very strongly disordered systems.  These disordered systems are often called "glassy" because structural glasses (like the stuff in your display) are an example.  In these systems, disorder dominates completely; the "landscape" of energy as a function of configuration is a big mess, with many local minima - a whole statistical distribution of possible configurations, with a whole distribution of energy "barriers" between them.  Systems like that crop up all the time in different contexts, and yet share some amazingly universal properties.  One of the most dramatic is that when disturbed, these systems take an exceedingly long time to respond completely.  Some parts of the system respond fast, others more slowly, and when you add them all together, you get total responses that look logarithmic in time (not exponential, which would indicate a single timescale for relaxation).  For example, the deformation response of crumpled paper (!) shows a relaxation that is described by constant*log(t) for more than 6 decades in time!  Likewise, the speed of sound or dielectric response in a glass at very low temperatures also shows logarithmic decays.  This review gives a great discussion of this - I highly recommend it (even though the papers they cite from my PhD advisor's lab came after I left :-)  ).
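To see concretely how a broad distribution of barriers generates log(t) relaxation, here is a minimal numerical sketch (my own toy illustration, not taken from the review): each channel relaxes exponentially with an Arrhenius rate, but summing over a flat distribution of barrier energies yields an aggregate response that is linear in log(t) over many decades.

```python
import numpy as np

# Toy model of glassy relaxation: many independent channels, each with
# an Arrhenius rate r = r0 * exp(-E / kT). A flat distribution of
# barriers E gives a flat distribution of log(rates), and the summed
# response then decays logarithmically over many decades in time.
rng = np.random.default_rng(0)
r0, kT = 1e9, 1.0                     # attempt rate (1/s); temperature (a.u.)
E = rng.uniform(5, 30, size=20000)    # flat distribution of barriers (a.u.)
rates = r0 * np.exp(-E / kT)

for t in np.logspace(-3, 3, 7):       # six decades in time (s)
    response = np.mean(np.exp(-rates * t))
    print(f"t = {t:9.3e} s   remaining response = {response:.3f}")
# The response drops by a nearly constant ~0.09 per decade of time,
# i.e., response ~ const - slope * log(t), just like the crumpled paper.
```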

## Monday, December 12, 2011

### Higgs or no

The answer is going to be, to quote the Magic 8-Ball, "Ask again later." Sounds like the folks at CERN are on track to make a more definitive statement about the Higgs boson in about one more Friedman Unit. That won't stop an enormous surge of media attention tomorrow, as CERN tries very hard to have their cake and eat it, too ("We've found [evidence consistent with] the God Particle! At least, it's [evidence not inconsistent with] the God Particle!"). What this exercise will really demonstrate is that many news media figures are statistically illiterate.

I should point out that, with the rumors of a statistically not yet huge bump in the data near 125 GeV, there has suddenly been an uptick in predictions of Higgs bosons with just that mass. How convenient.
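To make "statistically not yet huge" concrete, here's a toy counting exercise (numbers entirely invented, not the actual ATLAS/CMS analyses): a modest bump over a known background gives a local significance of a couple of sigma, which is interesting but far from the 5-sigma discovery convention, especially once you account for having searched many mass bins (the "look-elsewhere effect").

```python
import math

# Toy counting experiment (made-up numbers, NOT real LHC data):
# expected background b in one mass bin, observed n events.
b = 100.0         # expected background events in the bin
n = 125           # observed events
excess = n - b

# Simple Gaussian approximation to the local significance:
z_local = excess / math.sqrt(b)
print(f"local significance ~ {z_local:.1f} sigma")   # 2.5 sigma

# Crude look-elsewhere correction: if you searched N independent
# mass bins, the chance of a fluctuation this big *somewhere* grows.
N_bins = 50
p_local = 0.5 * math.erfc(z_local / math.sqrt(2))    # one-sided p-value
p_global = 1 - (1 - p_local) ** N_bins
print(f"p_local ~ {p_local:.2e}, p_global ~ {p_global:.2f}")
# A 2.5 sigma local bump can correspond to a ~25% global probability
# of a background fluke -- exactly why "evidence" is not "discovery".
```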

Update - Interesting.  For the best write-up I've seen about this, check out Prof. Matt Strassler.  Seems like the central question is, are the two detectors both seeing something in the same place, or not?  That is, is 123-ish GeV the same as 126-ish GeV?  Tune in next year, same Stat-time, same Stat-channel!  (lame joke for fans of 1960s US TV....)

## Saturday, December 10, 2011

### Nano book recommendation

My colleague in Rice's history department, Cyrus Mody, has a new book out called Instrumental Community, about the invention and spread of scanned probe microscopy (and microscopists) that's a very interesting read. If you've ever wondered how and why the scanning tunneling microscope and atomic force microscope took off, and why related ideas like the topografiner (pdf) did not, this is the book for you. It also does a great job of giving a sense of the personalities and work environments at places like IBM Zurich, IBM TJ Watson, IBM Almaden, and Bell Labs.

There are a couple of surprising quotes in there. Stan Williams, these days at HP Labs, says that the environment at Bell Labs was so cut-throat that people would sabotage each others' experiments and steal each others' data. Having been a postdoc there, that surprised me greatly, and doesn't jibe with my impressions or stories I'd heard. Any Bell Labs alumni readers out there care to comment?

The book really drives home what has been lost with the drastic decline of long-term industrial R&D in the US. You can see it all happening in slow motion - the constant struggle to explain why these research efforts are not a waste of shareholder resources, as companies become ever more focused on short term profits and stock prices.

## Wednesday, September 14, 2011

### Lab habits + data management

The reason I had been looking for that Sydney Harris cartoon is that I was putting together a guest lecture for our university's "Responsible Conduct of Research" course. I was speaking today about data management and retention, a topic I've come to know well over the last year through university service work on policies in that area. After speaking, it occurred to me that it's not a bad idea to summarize important points on this for the benefit of student readers of this blog.  In brief:
• Everything is data.  Not just raw numbers or images, but also the final analyzed graphs, the software used to do the analysis, the descriptions of the instrument settings used to acquire the raw numbers - everything.
• The data are the science.  The data are the foundation for all the analysis, model-building, papers, arguments, further refinements, patents, etc.  Protect the data!
• If you didn't document it, you didn't do it.
• Write down everything.  Fill up notebooks.  Annotate liberally, including false starts and what you were thinking when you set up the little sub-experiments or trials that go into any major research endeavor.  I guarantee, you will never, ever in your life look back and say, "I regret that I was so thorough, and I wish I had written down less."  After years of observation, I am convinced that good notebook skills genuinely reduce mean time to thesis completion in many cases.  If you actually keep track of what you've been doing, and really write down your logic, you are less likely to go down blind alleys or have to repeat mistakes.
• You may think that you own your data.  You don't, technically.  In an academic setting, the university has legal title to the data (that gives them the legal authority that they need to adjudicate disputes about access to data, including those that arise in the rare but unfortunate cases of research misconduct), while investigators are shepherds or custodians of the data.  Both have their own responsibilities and rights.  Some of those responsibilities are inherent in good science and engineering (e.g., the duty to do your best to make sure that the published results are accurate and correct, as much as possible), and others are imposed externally (e.g., federal funding agencies require preservation of data for some number of years beyond the end of an award).
• Back everything up.  In multiple ways.  With the advent of scanners, digital cameras, cheap external hard drives, laptops, thumbdrives, "the cloud" (as long as it's better than this), etc., there is absolutely no excuse for not properly backing up data.  To repeat, back everything up.  No, seriously.  Have a backup copy at an off-site location, as a sensible precaution against disaster (fire, hurricane, earthquake, zombie apocalypse).  (For one minimal approach, see the sketch after this list.)
• Good habits are habits, and must be habituated.  It took me more than 25 years to get in the habit of really flossing.  Do yourself a favor, and get in the habit of properly caring for your data.  Please.
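As a concrete example of the "back everything up" point above, here's a minimal Python sketch (my own hypothetical illustration - adapt it to your lab's setup) that mirrors a data directory and records SHA-256 checksums, so that later you can verify nothing has silently rotted:

```python
import hashlib, shutil
from pathlib import Path

def backup_with_manifest(src: str, dest: str) -> None:
    """Copy every file under src to dest and write a checksum manifest.

    Hypothetical minimal example -- a real lab should layer this with
    off-site copies and versioning (e.g., a nightly cron job + cloud sync).
    """
    src_dir, dest_dir = Path(src), Path(dest)
    dest_dir.mkdir(parents=True, exist_ok=True)
    manifest = []
    for f in sorted(src_dir.rglob("*")):
        if f.is_file():
            rel = f.relative_to(src_dir)
            target = dest_dir / rel
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)            # copy, preserving timestamps
            digest = hashlib.sha256(f.read_bytes()).hexdigest()
            manifest.append(f"{digest}  {rel}")
    (dest_dir / "MANIFEST.sha256").write_text("\n".join(manifest) + "\n")

# Example (hypothetical paths):
# backup_with_manifest("/data/nanolab_run42", "/backup/nanolab_run42")
```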

## Monday, September 12, 2011

### Help finding a Sydney Harris cartoon

I am trying to find a particular Sydney Harris physics cartoon, and google has let me down. The one I'm picturing has an obvious experimentalist at a workbench strewn with lab equipment. There's an angel on one shoulder, and a devil on the other. Anyone who has this cartoon, I'd be very grateful for a link to a scanned version! Thanks.

## Wednesday, September 07, 2011

### Single-molecule electric motor

As a nano person, I feel like I'm practically obligated to comment on this paper, which has gotten a good deal of media attention. In this experiment, the authors have anchored a single small molecule down to a single-crystal copper surface, in such a way that the molecule can pivot about the single anchoring atom, rotating in the plane of the copper surface. Because of the surface atom arrangement and its interactions with the molecule, the molecule has six energetically equivalent ways that it can be oriented on the metal surface. It's experimentally impressive that the authors came up with a way to track the rotation of the molecule one discrete hop between orientations at a time. This is only do-able when the temperature is sufficiently low that thermally driven orientational diffusion is suppressed. When a current of electrons is properly directed at the molecule, the electrons can dump enough energy into the molecule (inelastically) to kick the molecule around rotationally. In that sense, this is an electric motor. (Of course, while the rotor is a single small molecule, the metal substrate and scanning tunneling microscope tip are macroscopic in size.) The requirements for this particular scheme to work include cryogenic temperatures, ultrahigh vacuum, and ultraclean surfaces. In that sense, talk in the press release about how this will be useful for pushing things around and so forth in, e.g., medical devices is a bit ridiculous. Still a nice experiment, though.  I continue to find the whole problem of nanoscale systems driven out of thermal equilibrium (e.g., by the flow of "hot" electrons) to be fascinating - how is a steady state established, where does the energy go, where does irreversibility come into play, etc.
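To put a number on why cryogenic temperatures matter here, consider a toy Arrhenius estimate (the attempt frequency and barrier height below are my illustrative assumptions, not the paper's values): the thermally activated hopping rate between the six orientations collapses exponentially as the temperature drops, so at a few kelvin the molecule sits still until an electron kicks it.

```python
import math

# Toy Arrhenius estimate of thermal hopping between orientations.
# Illustrative numbers only -- not taken from the actual paper.
nu0 = 1e12          # attempt frequency (Hz), typical molecular scale
E_b = 0.02          # orientational barrier (eV), assumed for illustration
kB = 8.617e-5       # Boltzmann constant (eV/K)

for T in (300, 77, 5):
    rate = nu0 * math.exp(-E_b / (kB * T))
    print(f"T = {T:3d} K: thermal hop rate ~ {rate:.2e} Hz")
# At 300 K the molecule reorients ~1e11 times per second (a thermal
# blur); at 5 K the thermal rate is ~1e-8 Hz (one hop every few years),
# so every observed hop can be attributed to an injected electron.
```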

## Friday, September 02, 2011

### Playing with interfaces for optical fun and profit

A team at Harvard has published in Science a fun and interesting result.  When light passes from one medium to another, there are boundary conditions that have to be obeyed by the electromagnetic field (that is, light still has to obey Maxwell's equations, even when there's a discontinuity in the dielectric function somewhere).  Because of those boundary conditions, we end up with the familiar rules of reflection and refraction.  Going up a level in sophistication and worrying about multiple interfaces, we are used to having to keep track of the phase of the electromagnetic waves and how those phases are affected by the interfaces.  In fact, we have gotten good at manipulating those phases, to produce gadgets like antireflection coatings and dielectric mirrors (and on a more sophisticated level, photonic band gap materials).  What the Harvard team does is use plasmonic metal structures to pattern phase effects at a single interface.  The result is that they can engineer some bizarre reflection and refraction properties when they properly stack the deck in terms of phases.  Very cute.  I must confess, though, that since Federico Capasso was once my boss's boss at Bell Labs, I'm more than a little disturbed by the photo accompanying the physorg article.
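The headline relation, as I read the paper, is a generalized Snell's law: an engineered interface phase profile Φ(x) adds an extra term to the usual momentum-matching condition. Here's a sketch of that relation with made-up numbers (wavelength, indices, and phase gradient below are purely illustrative):

```python
import math

# Generalized Snell's law with an interface phase gradient dPhi/dx:
#   n_t * sin(theta_t) - n_i * sin(theta_i) = (lambda0 / (2 pi)) * dPhi/dx
# (My reading of the result; the numbers below are illustrative.)
lambda0 = 8e-6                  # free-space wavelength (m), mid-infrared
n_i, n_t = 1.0, 1.5             # incident (air) and transmitted (glass) indices
dphi_dx = 2 * math.pi / 20e-6   # assumed: 2 pi of phase over 20 microns

theta_i = math.radians(10.0)
ordinary = math.degrees(math.asin(n_i * math.sin(theta_i) / n_t))
rhs = n_i * math.sin(theta_i) + (lambda0 / (2 * math.pi)) * dphi_dx
theta_t = math.degrees(math.asin(rhs / n_t))
print(f"ordinary refraction:  theta_t = {ordinary:.1f} deg")
print(f"with phase gradient:  theta_t = {theta_t:.1f} deg")
# Even at fixed incidence, the engineered phase gradient steers the
# refracted beam far from the ordinary Snell's-law angle -- the
# "bizarre" refraction described above.
```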

## Tuesday, August 30, 2011

### Supersymmetry, the Higgs boson, the LHC, and all that

Lately there has been a big kerfuffle (technical term of art, there) in the blog-o-sphere about what the high energy physics experimentalists are finding, or not finding, at the LHC. See, for example, posts here and here, which reference newspaper articles and the like. Someone asked me what I thought about this the other day, and I thought it might be worth a post.

For non-experts (and in high energy matters, that's about the right level for me to be talking anyway), the main issues can be summarized as follows. There is a theoretical picture, the Standard Model of particle physics, that does an extremely good job (perhaps an unreasonably good job) of describing what appear to be the fundamental building blocks of matter (the quarks and leptons) and their interactions. Unfortunately, the Standard Model has several problems. First, it's not at all clear why many of the parameters in the model (e.g., the masses of the particles) have the values that they do. This may only be a problem with our world view, meaning the precise values of parameters may come essentially from random chance, in which case we'll just have to deal with it. However, it's hard to know that for sure. Moreover, there is an elegant (to some) theoretical idea called the Higgs mechanism that is thought to explain at the same time why particles have mass at all, and how the electroweak interaction has the strength and symmetry that it does. Unfortunately, that mechanism predicts at least one particle which hasn't been seen yet, the Higgs boson. Second, we know that the Standard Model is incomplete, because it doesn't cover gravitational interactions. Attempts to develop a truly complete "theory of everything" have, over the last couple of decades, become increasingly exotic, encompassing ideas like supersymmetry (which would require every particle to have a "superpartner" with the other kind of quantum statistics), extra dimensions (perhaps the universe really has more than 3 spatial dimensions), and flavors of string theory, multiverses, and whatnot. There is zero experimental evidence for any of those concepts so far, and a number of people are concerned that some of the ideas aren't even testable (or falsifiable) in the conventional science sense.

So, the LHC has been running for a while now, the detectors are working well, and data is coming in, and so far, no exotic stuff has been seen. No supersymmetric partners, no Higgs boson over the range of parameters examined, etc. Now, this is not scientifically unreasonable or worrisome. There are many possible scales for supersymmetric partners and we've only looked at a small fraction (though this verges into the issue of falsifiability - will theorists always claim that the superpartners are hiding out there just beyond the edge of what's measurable?). The experts running the LHC experiments knew ahead of time that the most likely mass range for the Higgs would require a *lot* of data before any strong statement can be made. Fine.

So what's the big deal? Why all the attention? It's partly because the LHC is expensive, but mostly it's because the hype surrounding the LHC and the proposed physics exotica has been absolutely out of control for years. If the CERN press office hadn't put out a steady stream of news releases promising that extra dimensions and superpartners and mini black holes and so forth were just around the corner, the reaction out there wouldn't be nearly so strong. The news backlash isn't rational scientifically, but it makes complete sense sociologically. In the mean time, the right thing to do is to sit back and wait patiently while the data comes in and is analyzed. The truth will out - that's the point of science. What will really be interesting from the history and philosophy of science perspective will be the reactions down the line to what is found.

## Wednesday, August 24, 2011

### great post by ZZ

Before I go to teach class this morning, I wanted to link to this great post by ZapperZ about the grad student/research adviser relationship.  Excellent.

## Saturday, August 20, 2011

### Gating and "real" metals.

Orientation week has kept me very busy - hence the paucity of posts.  I did see something intriguing on the arxiv recently (several things, actually, but time is limited at the moment), though.

Suppose I want to make a capacitor out of two metal plates separated by empty space.  If I apply a voltage, V, across the capacitor using a battery, the electrons in the two plates shift their positions slightly, producing a bit of excess charge density at the plate surfaces.  One electrode ends up with an excess of electrons at the surface, so that it has a negative surface charge density.  The other electrode ends up with a deficit of electrons at the surface, and the ion cores of the metal atoms lead to a positive surface charge density.  The net charge on one plate is Q, and the capacitance is defined as C = Q/V.

So, how deep into the metal surfaces is the charge density altered from that in the bulk metal?  The relevant distance is called the screening length, and it's set in large part by the density of mobile electrons.  In a normal metal like copper or gold, which has a high density of mobile (conduction) electrons on the order of 10^22 per cm^3, the screening length is comparable to an atomic diameter!  That's very short, and it tells you that it's extremely hard to alter the electronic properties of a piece of normal metal by capacitively messing about with its surface - you just don't mess with the electronic density in most of the material.  (This is in contrast to the situation in semiconductors or graphene, by the way, where a capacitive "gate" electrode can change the number of mobile electrons by orders of magnitude.)

That's why this paper was surprising.  The authors use ionic liquids (essentially a kind of salt that's molten at room temperature) to modulate the surface charge density of gold films by something like 10^15 electrons per cm^2.  The surprising thing is that they claim to see large (e.g., 10%) changes in the conductance of quite thick (40 nm) gold films as a result of this.  This is weird.  For example, the total number of electrons per cm^2 already in such a film is something like (6 x 10^22/cm^3) x (4 x 10^-6 cm) = 2.4 x 10^17 per cm^2.  That means that the gating should only be changing the 2d electron density by a few tenths of a percent.  Moreover, only the top 0.1 nm of the Au should really be affected.  The data are what they are, but boy this is odd.  There's no doubt that these ionic liquids are an amazing enabling tool for pushing the frontiers of high charge densities in CM physics....
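Here's that back-of-the-envelope skepticism in runnable form (the electron density of gold is a standard textbook value; everything else comes from the numbers quoted above):

```python
# Back-of-the-envelope check on gating a 40 nm gold film.
n3d = 5.9e22          # conduction electron density of Au (cm^-3), textbook value
thickness_cm = 40e-7  # 40 nm, expressed in cm
n2d_film = n3d * thickness_cm   # total sheet density of the film
delta_n2d = 1e15                # gate-induced sheet density (cm^-2), per the paper

print(f"film sheet density    ~ {n2d_film:.1e} cm^-2")   # ~2.4e17
print(f"induced sheet density ~ {delta_n2d:.1e} cm^-2")
print(f"fractional change     ~ {100 * delta_n2d / n2d_film:.2f} %")

# And the induced charge lives within ~ one screening length (~0.1 nm
# in a good metal) of the surface, not through the full 40 nm -- hence
# the puzzle of a ~10% change in the conductance of the whole film.
```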

## Sunday, August 14, 2011

### Topological insulator question

I have a question, and I'm hoping one of my reader experts might be able to answer it for me.  Let me set the stage.  One reason 3d topological insulators are a hot topic these days is the idea that they have special 2d states that live at their surfaces.  These surface states are supposed to be "topologically protected" - in lay terms, this means that they are very robust; something deep about their character means that true back-scattering is forbidden.  What this means is, if an electron is in such a state traveling to the right, it is forbidden by symmetry for simple disorder (like a missing atom in the lattice) to scatter the electron into a state traveling to the left.  Now, these surface states are also supposed to have some unusual properties when particle positions are swapped around.  These unconventional statistics are supposed to be of great potential use for quantum computation.  Of course, to do any experiments that are sensitive to these statistics, one needs to do quantum interference measurements using these states.   The lore goes that since the states are topologically protected and therefore robust, this should be not too bad.

Here's my question.  While topological protection suppresses 180 degree backscattering, it does not suppress (as far as I can tell) small angle scattering, and in the case of quantum decoherence, it's the small angle scattering that actually dominates.  It looks to me like the coherence of these surface states shouldn't necessarily be any better than that in conventional materials.  Am I wrong about this?  If so, how?  I've now seen multiple papers in the literature (here, here, and here, for example) that show weak antilocalization physics at work in such materials.  In the last one in particular, it looks like the coherence lengths in these systems (a few hundred nanometers at 1 K) are not even as good as what one would see in a conventional metal film (e.g., high purity Ag or Au) at the same temperatures.  That doesn't seem too protected or robust to me....  I know that the situation is likely to be much more exciting if superconductivity is induced in these systems.  Are the normal state coherence properties just not that important?
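Since the discussion above leans on coherence lengths extracted from weak antilocalization, here is a sketch of the simplified Hikami-Larkin-Nagaoka form those papers typically fit. I'm writing it from memory, so treat the signs and prefactors with appropriate suspicion:

```python
import numpy as np
from scipy.special import digamma

# Simplified Hikami-Larkin-Nagaoka (HLN) magnetoconductance, as commonly
# fit in the TI weak-antilocalization literature (from memory; sign and
# prefactor conventions vary between papers):
#   dSigma(B) = alpha * (e^2 / (2 pi^2 hbar))
#               * [ digamma(1/2 + B_phi/B) - ln(B_phi/B) ]
# with B_phi = hbar / (4 e l_phi^2); alpha (~ -1/2 per coherent channel)
# and the coherence length l_phi are the fit parameters.
e = 1.602e-19      # electron charge (C)
hbar = 1.055e-34   # reduced Planck constant (J s)

def hln(B, alpha, l_phi):
    B_phi = hbar / (4 * e * l_phi**2)
    x = B_phi / B
    return alpha * (e**2 / (2 * np.pi**2 * hbar)) * (digamma(0.5 + x) - np.log(x))

B = np.array([0.01, 0.05, 0.2, 1.0])       # field (T)
print(hln(B, alpha=-0.5, l_phi=300e-9))    # magnetoconductance (S per square)
# With l_phi ~ 300 nm (roughly what the papers above report near 1 K),
# B_phi ~ 2 mT, so fields of tens of mT already kill most of the WAL
# correction -- consistent with the "not that protected" point above.
```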

## Tuesday, August 09, 2011

### DOE BES CMX PI mtg

Went for the cryptic headline.  I'm off for a Department of Energy Basic Energy Sciences Condensed Matter Experiment principal investigator meeting (the first of its kind, I believe) in the DC area.  This should be really interesting, getting a chance to get a perspective on the variety of condensed matter and materials physics being done out there.  This looks like it will be much more useful than a dog-and-pony show that I went to for one part of another agency a few years ago....

## Monday, August 08, 2011

### Evolution of blogger spam

Over the last couple of weeks, new forms of spam comments have been appearing on blogger. One type takes a sentence or two from the post itself, and feeds them through a parser reminiscent of ELIZA, to produce a vaguely coherent statement in a comment. Another type that I've noticed grabs a sentence or two from an article that was linked in the original post. A third type combines these two, taking a sentence from a linked article, and chewing on it with the ELIZA-like parser. A few more years of this, and we'll have the spontaneous evolutionary development of generalized natural-language artificial intelligence from blogger spam....

## Friday, August 05, 2011

### Summer colloquium

Every year at Rice in early August, the Rice Quantum Institute (old website) (shorthand: people who care about interdisciplinary science and engineering involving hbar) has its annual Summer Colloquium. Today is the twenty-fifth such event. It's a day-long miniconference, featuring oral presentations by grad students, and posters by both grad students and undergrad researchers from a couple of REU programs (this year, the RQI REU and the NanoJapan REU). It's a full day, with many talks. It's a friendly way for students to get more presentation experience, and a good way for faculty to learn what their colleagues are doing. I'd be curious to know if other institutions have similar things - my impression has been that this is comparatively rare, particularly its very broad interdisciplinary nature (e.g., talks on spectroscopy for pollution monitoring, topological insulators, plasmons, carbon nanotube composites, batteries) and combination of undergrads and grad students.

## Thursday, July 28, 2011

### Plutonium: a case study in why CM physics is rich

At the heart of condensed matter physics are two key concepts: the emergence of rich phenomena (including spontaneously occurring order - structural, magnetic, or otherwise) in the many-particle limit; and the critical role played by quantum mechanics in describing the many-body states of the system. I've tried to explain this before to lay persons by pointing out that while complicated electronic structure techniques can do an adequate job of describing the electronic and vibrational properties of a single water molecule at zero temperature, we still have a difficult time predicting really emergent properties, such as the phase diagram of liquid, solid, and vapor water, or the viscosity or surface tension of liquid water.

Plutonium is an even more striking example, given that we cannot even understand its properties from first principles when we have only a single type of atom to worry about. The thermodynamic phase diagram of plutonium is very complicated, with seven different crystal structures known, depending on temperature and pressure. Moreover, as a resident of the actinide row of the periodic table, Pu has unpaired 5f electrons, though it is not magnetically ordered. At the same time, Pu is very heavy, with 94 total electrons, so that relativistic spin-orbit effects can't be neglected in trying to understand its structure. The most sophisticated electronic structure techniques out there can't handle this combination of circumstances. It's rather humbling that more than 70 years after its discovery/synthesis, we still can't understand this material, despite the many thousands of person-hours spent on it via various nations' nuclear weapons programs.

## Sunday, July 24, 2011

### Einstein, thermodynamics, and elegance

Recently, in the course of other writing I've been doing, I again came to the topic of what are called Einstein A and B coefficients, and it struck me again that this has to be one of the most elegant, clever physics arguments ever made.  It's also conceptually simple enough that I think it can be explained to nonexperts, so I'm going to give it a shot.

Ninety-four years ago, one of the most shocking ideas in physics was the concept of the spontaneous, apparently random, decay of an atomic system.  Radioactive decay is one example, but even light emission from an atom in an excited state will serve.  Take ten hydrogen atoms, all in their first electronically excited state (electron kicked up into a 2p orbital from the 1s orbital).  These will decay back into the 1s ground state (spitting out a photon) at some average rate, but each one will decay independently of the others, and most likely at a different moment in time.  To people brought up in the Newtonian clockwork universe, this was shocking.  How could truly identical atoms have individually differing emission times?  Where does the randomness come from, and can we ever hope to calculate the rate of spontaneous emission?
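A quick numerical rendering of that strangeness (just a toy sample, using the known 2p lifetime of about 1.6 ns): ten identical excited atoms, ten different decay moments, all drawn from the same exponential distribution.

```python
import numpy as np

# Ten identical hydrogen atoms in the 2p state. Each decay time is
# drawn independently from the same exponential distribution -- the
# atoms are identical, but the individual decay moments are not.
rng = np.random.default_rng(1917)   # seeded for reproducibility
tau = 1.6e-9                        # 2p -> 1s lifetime, ~1.6 ns
decay_times = rng.exponential(tau, size=10)
for i, t in enumerate(sorted(decay_times), 1):
    print(f"atom {i:2d} decays at t = {t * 1e9:5.2f} ns")
```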

Around this time (1917), Einstein made a typically brilliant argument:  While we do not yet know [in 1917] how to calculate the rate at which the atoms transition from the ground state "a" to the excited state "b" when we shine light on them (the absorption rate), we can reason that the rate of atoms going from a to b should be proportional to the number of atoms in the ground state (N_a) and the amount of energy density in the light available at the right frequency (u(f)).  That is, the rate of transitions "up" = B_ab N_a u(f), where B is some number that can at least be measured in experiments.  [It turns out that people figured out how to calculate B using perturbation theory in quantum mechanics about ten years later.]  Einstein also figured that there should be an inverse process (stimulated emission), that causes transitions downward from b to a, with a rate = B_ba N_b u(f).  However, there is also the spontaneous emission rate = A_ba N_b, where he introduced the A coefficient.

Here is the brilliance.  Einstein considered the case of thermal equilibrium between atoms and radiation in some cavity.  In steady state, the rate of transitions from a to b must equal the rate of transitions from b to a - in steady state, no atoms are piling up in the ground or excited states.  Moreover, from thermodynamics, in thermal equilibrium, the ratio of N_b to N_a should just be a Boltzmann factor, exp(-E_ab/k_B T), where E_ab is the energy difference between the two states, k_B is Boltzmann's constant, and T is the temperature.  From this, Einstein showed that the two Bs must be equal, solved for the unknown A in terms of B (which can be measured and nowadays calculated), and showed that the energy density of the radiation u(f,T) is Planck's blackbody formula.
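For readers who want the algebra, here is the skeleton of the argument in a few lines (standard textbook notation, with E_ab = hf):

```latex
% In steady state, transitions up balance transitions down:
B_{ab} N_a \, u(f) \;=\; A_{ba} N_b + B_{ba} N_b \, u(f)

% Thermal equilibrium fixes the populations (Boltzmann factor):
\frac{N_b}{N_a} \;=\; e^{-hf/k_B T}

% Solving the balance equation for the energy density:
u(f) \;=\; \frac{A_{ba}/B_{ba}}{\left(B_{ab}/B_{ba}\right) e^{hf/k_B T} - 1}

% This can only reproduce Planck's blackbody law,
%   u(f) = (8\pi h f^3 / c^3) \, / \, (e^{hf/k_B T} - 1),
% if  B_{ab} = B_{ba}  and  A_{ba} = (8\pi h f^3 / c^3) \, B_{ba}.
```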

My feeble writing here doesn't do this justice.  The point is, from basic thermodynamic reasoning, Einstein made it possible to derive an expression for the spontaneous emission rate of atoms, many years in advance of the theory (quantum electrodynamics) that allows one to calculate it directly.  This is what people mean by the elegance of physics - in a few pages, from proper reasoning on fundamental grounds, Einstein was able to deduce relationships that had to exist between different physical parameters; and these parameters could be measured and tested experimentally.  For more on this, here is a page at MIT that links to a great Physics Today article about the topic, and an English translation of Einstein's 1917 paper.

## Thursday, July 21, 2011

### Slackers, coasters, and sherpas, oh my.

This is mostly for my American readers - be forewarned.

I wrote last year about a plan put forward by Rick O'Donnell, a controversial "consultant" hired by the state of Texas (hint: Gov. Rick Perry, apparent 2012 presidential hopeful, wanted this guy) to study the way public universities work in Texas. Specifically, O'Donnell came from a think tank that had a very firm predetermined concept about higher education: faculty are overpaid slackers who are ripping off students, and research is not of value in the educational environment. O'Donnell has written a report (pdf) about this topic, and he's shocked, shocked to find that he was absolutely right. By his metrics of number of students taught and research dollars brought in, he grouped faculty at UT and Texas A&M into "Dodgers, Coasters, Sherpas, Pioneers, and Stars". Pioneers are the people who bring in big grants and buy out of teaching. Stars are the people who bring in grants and teach large lecture classes. Sherpas are mostly instructors (he doesn't seem to differentiate between instructors and faculty) who lecture to large classes but don't bring in grants. Dodgers teach small classes and don't bring in grant money. Coasters teach small classes and bring in some grant money.

This is the exact incarnation of what I warned about in comments on my old post. This analysis basically declares that all social science and humanities faculty that teach upper division classes are worthless leeches (small classes, no grants) sponging off the university. People in the sciences and engineering who teach upper level classes aren't any better, unless they're bringing in multiple large research grants. Oh, and apparently the only metric for research and scholarship is money.

Nice. Perry, by the way, also appointed Barbara Cargill to run the state board of education. She's a biologist who wants evolution's perceived weaknesses to be emphasized in public schools, and she also was upset because the school board only has "six true conservative Christians" as members. I guess Jews, Muslims, Buddhists, Hindus, and atheists need not apply.  Update:  It looks like Texas has dodged creationism for another couple of years.  Whew.

## Wednesday, July 20, 2011

### What is so hard about understanding high temperature superconductivity?

As ZZ has pointed out, Nature is running a feature article on the history of high temperature superconductivity over the last 25 years. I remember blogging about this topic five years ago when Nature Physics ran an excellent special issue on the subject. At the time, I wrote a brief summary of the field, and I've touched on this topic a few times in the intervening years. Over that time, it's pretty clear that the most important event was the discovery of the iron-based high temperature superconductors. It showed that there are additional whole families of high temperature superconducting materials that are not all copper oxides.

Now is a reasonable time to ask again, what is so hard about this problem? Why don't we have a general theory of high temperature superconductivity?  Here are my opinions, and I'd be happy for more from the readers.
• First, be patient.  Low-T superconductivity was discovered in 1911, and we didn't have a decent theory until 1957.  By that metric, we shouldn't start getting annoyed until 2032.  I'm not just being flippant here.  The high-Tc materials are generally complicated (with a few exceptions) structurally, with large unit cells, and lots of disorder associated with chemical doping.  This is very different than the situation in, e.g., lead or niobium.
• Electron-electron interactions seem to be very important in describing the normal state of these materials.  In the low-Tc superconductors, we really can get very far understanding the normal starting point.  Aluminum is a classic metal, and you can do a pretty good job getting quantitative accuracy on its properties from the theory side even in single-particle, non-interacting treatments (basic band theory).  In contrast, the high-Tc material normal states are tricky.  Heck, the copper oxide parent compound is a Mott insulator - a system that single-particle band structure tells you should be a metal, but is in fact insulating because of the electron-electron repulsion!  (For a toy numerical illustration of Mott physics, see the sketch at the end of this post.)
• Spin seems to be important, too.   In the low-Tc systems, spin is unimportant in the normal state; the electrons simply pair up with opposite spins, so that the net spin of each pair is zero, but that's about it.  In high-Tc systems, on the other hand, very often the normal state involves magnetic order of some sort, and spin-spin interactions may well be important.
• Sample quality has been a persistent challenge (particularly in the early days).
• The analytical techniques that exist tend to be indirect or invasive, at least compared to the desired thought experiments.  This is a persistent challenge in condensed matter physics.  You can't just go and yank on a particular electron to see what else moves, in an effort to unravel the "glue" that holds pairs together (though the photoemission community might disagree).  While the order parameter (describing the superconducting state) may vary microscopically in magnitude, sign, and phase, you can't just order up a gadget to measure, e.g., phase as a function of position within a sample.  Instead, experimentalists are forced to be more baroque and more clever.
• Computational methods are good, but not that good.  Exact solutions of systems of large numbers of interacting electrons remain elusive and computationally extremely expensive.  Properly dealing with strong electronic correlations, finite temperature, etc. are all challenges.
Still, it's a beguiling problem, and now is an exciting time - because of the iron compounds, there are probably more people working on novel superconductors than at any time since the heady days of the late '80s, and they're working with the benefit of all that experience and hindsight.  Maybe I won't have to write something like this for the 30th high-Tc anniversary in 2016....
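As a footnote to the Mott-insulator point in the list above, here is about the smallest possible numerical illustration (a two-site Hubbard model; my own toy example, not anyone's serious cuprate calculation): band theory says two electrons on two sites should happily delocalize, but turning up the on-site repulsion U strangles the double occupancy that charge motion requires.

```python
import numpy as np

# Two-site Hubbard model at half filling (2 electrons, Sz = 0).
# Basis: |up,dn>, |dn,up>, |updn,0>, |0,updn>.  Toy illustration only;
# the sign pattern encodes the fermionic ordering convention.
def two_site_hubbard(t, U):
    H = np.array([[0.0, 0.0, -t, -t],
                  [0.0, 0.0,  t,  t],
                  [-t,   t,   U, 0.0],
                  [-t,   t, 0.0,   U]])
    evals, evecs = np.linalg.eigh(H)      # ascending eigenvalues
    gs = evecs[:, 0]                      # ground state
    double_occ = gs[2]**2 + gs[3]**2      # weight on doubly occupied states
    return evals[0], double_occ

t = 1.0
for U in (0, 2, 8, 32):
    E0, d = two_site_hubbard(t, U)
    print(f"U/t = {U:4.1f}: E0 = {E0:7.3f}, double-occupancy weight = {d:.3f}")
# At U = 0 the ground state is the filled bonding orbital (weight 0.5 on
# doubly occupied configurations); as U/t grows, the electrons localize
# one per site -- the essence of the Mott insulator mentioned above.
```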