
Tuesday, August 30, 2011

Supersymmetry, the Higgs boson, the LHC, and all that

Lately there has been a big kerfuffle (technical term of art, there) in the blog-o-sphere about what the high energy physics experimentalists are finding, or not finding, at the LHC. See, for example, posts here and here, which reference newspaper articles and the like. Someone asked me what I thought about this the other day, and I thought it might be worth a post.

For non-experts (and in high energy matters, that's about the right level for me to be talking anyway), the main issues can be summarized as follows. There is a theoretical picture, the Standard Model of particle physics, that does an extremely good job (perhaps an unreasonably good job) of describing what appear to be the fundamental building blocks of matter (the quarks and leptons) and their interactions. Unfortunately, the Standard Model has several problems.

First, it's not at all clear why many of the parameters in the model (e.g., the masses of the particles) have the values that they do. This may only be a problem with our world view, meaning the precise values of parameters may come essentially from random chance, in which case we'll just have to deal with it. However, it's hard to know that for sure. Moreover, there is an elegant (to some) theoretical idea called the Higgs mechanism that is thought to explain at the same time why particles have mass at all, and how the electroweak interaction has the strength and symmetry that it does. Unfortunately, that mechanism predicts at least one particle which hasn't been seen yet, the Higgs boson.

Second, we know that the Standard Model is incomplete, because it doesn't cover gravitational interactions. Attempts to develop a truly complete "theory of everything" have, over the last couple of decades, become increasingly exotic, encompassing ideas like supersymmetry (which would require every particle to have a "superpartner" with the other kind of quantum statistics), extra dimensions (perhaps the universe really has more than 3 spatial dimensions), various flavors of string theory, multiverses, and whatnot. There is zero experimental evidence for any of those concepts so far, and a number of people are concerned that some of the ideas aren't even testable (or falsifiable) in the conventional scientific sense.

So, the LHC has been running for a while now, the detectors are working well, and data is coming in; so far, no exotic stuff has been seen. No supersymmetric partners, no Higgs boson over the range of parameters examined, etc. Now, this is not scientifically unreasonable or worrisome. There are many possible mass scales for supersymmetric partners, and we've only looked at a small fraction (though this verges on the issue of falsifiability - will theorists always claim that the superpartners are hiding out there just beyond the edge of what's measurable?). The experts running the LHC experiments knew ahead of time that the most likely mass range for the Higgs would require a *lot* of data before any strong statement could be made. Fine.

So what's the big deal? Why all the attention? It's partly because the LHC is expensive, but mostly it's because the hype surrounding the LHC and the proposed physics exotica has been absolutely out of control for years. If the CERN press office hadn't put out a steady stream of news releases promising that extra dimensions and superpartners and mini black holes and so forth were just around the corner, the reaction out there wouldn't be nearly so strong. The news backlash isn't rational scientifically, but it makes complete sense sociologically. In the meantime, the right thing to do is to sit back and wait patiently while the data comes in and is analyzed. The truth will out - that's the point of science. What will really be interesting, from the history and philosophy of science perspective, will be the reactions down the line to what is found.

Wednesday, August 24, 2011

great post by ZZ

Before I go to teach class this morning, I wanted to link to this great post by ZapperZ about the grad student/research adviser relationship.  Excellent.

Saturday, August 20, 2011

Gating and "real" metals.

Orientation week has kept me very busy - hence the paucity of posts.  I did see something intriguing on the arxiv recently (several things, actually, but time is limited at the moment), though.

Suppose I want to make a capacitor out of two metal plates separated by empty space.  If I apply a voltage, V, across the capacitor using a battery, the electrons in the two plates shift their positions slightly, producing a bit of excess charge density at the plate surfaces.  One electrode ends up with an excess of electrons at the surface, so that it has a negative surface charge density.  The other electrode ends up with a deficit of electrons at the surface, and the ion cores of the metal atoms lead to a positive surface charge density.  The net charge on one plate is Q, and the capacitance is defined as C = Q/V.

So, how deep into the metal surfaces is the charge density altered from that in the bulk metal?  The relevant distance is called the screening length, and it's set in large part by the density of mobile electrons.  In a normal metal like copper or gold, which has a high density of mobile (conduction) electrons, on the order of 10^22 per cm^3, the screening length is comparable to an atomic diameter!  That's very short, and it tells you that it's extremely hard to alter the electronic properties of a piece of normal metal by capacitively messing about with its surface - you just don't change the electronic density in most of the material.  (This is in contrast to the situation in semiconductors or graphene, by the way, where a capacitive "gate" electrode can change the number of mobile electrons by orders of magnitude.)
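As a rough back-of-the-envelope illustration, here is a minimal sketch of the Thomas-Fermi estimate (free-electron approximation, with textbook numbers for gold assumed):

```python
import math

# Thomas-Fermi screening length for a free-electron metal:
# lambda_TF = sqrt(2 * eps0 * E_F / (3 * n * e^2)),
# which follows from the free-electron density of states g(E_F) = 3n / (2 E_F).
e = 1.602e-19      # electron charge, C
eps0 = 8.854e-12   # vacuum permittivity, F/m
n = 5.9e28         # conduction electron density of Au, m^-3 (~6e22 per cm^3)
E_F = 5.5 * e      # Fermi energy of Au, roughly 5.5 eV, converted to joules

lambda_TF = math.sqrt(2 * eps0 * E_F / (3 * n * e**2))
print(f"Thomas-Fermi screening length ~ {lambda_TF * 1e9:.2f} nm")
# prints ~0.06 nm - smaller than an atomic diameter, as claimed above
```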

That's why this paper was surprising.  The authors use ionic liquids (essentially a kind of salt that's molten at room temperature) to modulate the surface charge density of gold films by something like 10^15 electrons per cm^2.  The surprising thing is that they claim to see large (e.g., 10%) changes in the conductance of quite thick (40 nm) gold films as a result.  This is weird.  For example, the total number of electrons per cm^2 already in such a film is something like (6 x 10^22 /cm^3) x (4 x 10^-6 cm) = 2.4 x 10^17 per cm^2.  That means the gating should only be changing the 2d electron density by a few tenths of a percent.  Moreover, only the top 0.1 nm or so of the Au should really be affected.  The data are what they are, but boy, this is odd.  There's no doubt that these ionic liquids are an amazing enabling tool for pushing the frontiers of high charge densities in CM physics....
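For concreteness, that arithmetic spelled out (same numbers as in the paragraph above):

```python
# Electrons already in a 40 nm Au film vs. the gate-induced charge.
n_3d = 6e22           # conduction electrons per cm^3 in Au
thickness_cm = 40e-7  # 40 nm expressed in cm (4 x 10^-6 cm)
n_2d_film = n_3d * thickness_cm  # ~2.4e17 electrons per cm^2
n_2d_gated = 1e15                # ionic-liquid-induced charge per cm^2

print(f"film areal density: {n_2d_film:.1e} per cm^2")
print(f"fractional change:  {n_2d_gated / n_2d_film:.1%}")  # ~0.4%
```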

Sunday, August 14, 2011

Topological insulator question

I have a question, and I'm hoping one of my expert readers might be able to answer it for me.  Let me set the stage.  One reason 3d topological insulators are a hot topic these days is the idea that they have special 2d states that live at their surfaces.  These surface states are supposed to be "topologically protected" - in lay terms, this means that they are very robust; something deep about their character means that true back-scattering is forbidden.  That is, if an electron in such a state is traveling to the right, it is forbidden by symmetry for simple disorder (like a missing atom in the lattice) to scatter the electron into a state traveling to the left.  Now, these surface states are also supposed to have some unusual properties when particle positions are swapped around.  These unconventional statistics are supposed to be of great potential use for quantum computation.  Of course, to do any experiments that are sensitive to these statistics, one needs to do quantum interference measurements using these states.  The lore goes that since the states are topologically protected and therefore robust, this should be not too bad.

Here's my question.  While topological protection suppresses 180 degree backscattering, it does not suppress (as far as I can tell) small angle scattering, and in the case of quantum decoherence, it's the small angle scattering that actually dominates.  It looks to me like the coherence of these surface states shouldn't necessarily be any better than that in conventional materials.  Am I wrong about this?  If so, how?  I've now seen multiple papers in the literature (here, here, and here, for example) that show weak antilocalization physics at work in such materials.  In the last one in particular, it looks like the coherence lengths in these systems (a few hundred nanometers at 1 K) are not even as good as what one would see in a conventional metal film (e.g., high purity Ag or Au) at the same temperatures.  That doesn't seem too protected or robust to me....  I know that the situation is likely to be much more exciting if superconductivity is induced in these systems.  Are the normal state coherence properties just not that important?
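For readers wondering where those coherence lengths come from: in papers like these they are usually extracted by fitting the low-field magnetoconductance to the Hikami-Larkin-Nagaoka (HLN) expression.  Here is a sketch of the simplified strong spin-orbit form commonly used for TI surface states (prefactor conventions vary from paper to paper):

```latex
% HLN weak-antilocalization correction to the 2d sheet conductance:
\[
\Delta\sigma(B) \;\simeq\; \alpha\,\frac{e^2}{2\pi^2\hbar}
\left[\psi\!\left(\tfrac{1}{2} + \frac{B_\phi}{B}\right)
      - \ln\!\left(\frac{B_\phi}{B}\right)\right],
\qquad
B_\phi = \frac{\hbar}{4e\,\ell_\phi^2},
\]
% where psi is the digamma function, alpha is approximately -1/2 per 2d
% channel for weak antilocalization, and the fit parameter l_phi is the
% coherence length quoted above (a few hundred nm at 1 K).
```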

Tuesday, August 09, 2011

DOE BES CMX PI mtg

Went for the cryptic headline.  I'm off for a Department of Energy Basic Energy Sciences Condensed Matter Experiment principal investigator meeting (the first of its kind, I believe) in the DC area.  This should be really interesting - a chance to get some perspective on the variety of condensed matter and materials physics being done out there.  It looks like it will be much more useful than the dog-and-pony show I went to for one part of another agency a few years ago....

Monday, August 08, 2011

Evolution of blogger spam

Over the last couple of weeks, new forms of spam comments have been appearing on blogger. One type takes a sentence or two from the post itself, and feeds them through a parser reminiscent of ELIZA, to produce a vaguely coherent statement in a comment. Another type that I've noticed grabs a sentence or two from an article that was linked in the original post. A third type combines these two, taking a sentence from a linked article, and chewing on it with the ELIZA-like parser. A few more years of this, and we'll have the spontaneous evolutionary development of generalized natural-language artificial intelligence from blogger spam....

Friday, August 05, 2011

Summer colloquium

Every year in early August, the Rice Quantum Institute (old website) (shorthand: people who care about interdisciplinary science and engineering involving hbar) holds its Summer Colloquium. Today is the twenty-fifth such event. It's a full-day miniconference with many talks, featuring oral presentations by grad students, and posters by both grad students and undergrad researchers from a couple of REU programs (this year, the RQI REU and the NanoJapan REU). It's a friendly way for students to get more presentation experience, and a good way for faculty to learn what their colleagues are doing. I'd be curious to know whether other institutions have similar events - my impression has been that this one is comparatively rare, particularly in its very broad interdisciplinary scope (e.g., talks on spectroscopy for pollution monitoring, topological insulators, plasmons, carbon nanotube composites, batteries) and its combination of undergrads and grad students.

Thursday, July 28, 2011

Plutonium: a case study in why CM physics is rich

At the heart of condensed matter physics are two key concepts: the emergence of rich phenomena (including spontaneously occurring order - structural, magnetic, or otherwise) in the many-particle limit; and the critical role played by quantum mechanics in describing the many-body states of the system. I've tried to explain this before to lay persons by pointing out that while complicated electronic structure techniques can do an adequate job of describing the electronic and vibrational properties of a single water molecule at zero temperature, we still have a difficult time predicting truly emergent properties, such as the phase diagram of liquid, solid, and vapor water, or the viscosity or surface tension of liquid water.

Plutonium is an even more striking example, given that we cannot understand its properties from first principles even when there is only a single type of atom to worry about. The thermodynamic phase diagram of plutonium is very complicated, with seven different crystal structures known, depending on temperature and pressure. Moreover, as a resident of the actinide row of the periodic table, Pu has unpaired 5f electrons, though it is not magnetically ordered. At the same time, Pu is very heavy, with 94 total electrons, so relativistic spin-orbit effects can't be neglected in trying to understand its structure. The most sophisticated electronic structure techniques out there can't handle this combination of circumstances. It's rather humbling that more than 70 years after its discovery/synthesis, we still can't understand this material, despite the many thousands of person-hours spent on it via various nations' nuclear weapons programs.

Sunday, July 24, 2011

Einstein, thermodynamics, and elegance

Recently, in the course of other writing I've been doing, I again came to the topic of what are called Einstein A and B coefficients, and it struck me again that this has to be one of the most elegant, clever physics arguments ever made.  It's also conceptually simple enough that I think it can be explained to nonexperts, so I'm going to give it a shot.

Ninety-four years ago, one of the most shocking ideas in physics was the concept of the spontaneous, apparently random, decay of an atomic system.  Radioactive decay is one example, but even light emission from an atom in an excited state will serve.  Take ten hydrogen atoms, all in their first electronically excited state (electron kicked up into a 2p orbital from the 1s orbital).  These will decay back into the 1s ground state (spitting out a photon) at some average rate, but each one will decay independently of the others, and most likely at a different moment in time.  To people brought up in the Newtonian clockwork universe, this was deeply unsettling.  How could truly identical atoms have individually differing emission times?  Where does the randomness come from, and can we ever hope to calculate the rate of spontaneous emission?
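To make the randomness concrete, here is a minimal simulation sketch (the ~1.6 ns lifetime of the hydrogen 2p state is the accepted value; the rest is purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
tau = 1.6e-9  # approximate 2p -> 1s lifetime of hydrogen, in seconds

# Ten identical excited atoms: each decay time is an independent draw
# from an exponential distribution with the same mean lifetime.
decay_times = rng.exponential(tau, size=10)

print(np.sort(decay_times / 1e-9))  # ten different moments, in ns
print(decay_times.mean() / 1e-9)    # scatters around tau ~ 1.6 ns
```

Identical physics and identical average rate, yet every run gives a different set of decay moments - exactly the behavior that so bothered the classically minded.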

Around this time (1917), Einstein made a typically brilliant argument:  While we do not yet know [in 1917] how to calculate the rate at which the atoms transition from the ground state "a" to the excited state "b" when we shine light on them (the absorption rate), we can reason that the rate of atoms going from a to b should be proportional to the number of atoms in the ground state (N_a) and the energy density of the light available at the right frequency (u(f)).  That is, the rate of transitions "up" = B_ab N_a u(f), where B_ab is some number that can at least be measured in experiments.  [It turns out that people figured out how to calculate B using perturbation theory in quantum mechanics about ten years later.]  Einstein also figured that there should be an inverse process (stimulated emission) that causes transitions downward from b to a, with a rate = B_ba N_b u(f).  However, there is also the spontaneous emission rate = A_ba N_b, where he introduced the A coefficient.

Here is the brilliance.  Einstein considered the case of thermal equilibrium between atoms and radiation in some cavity.  In steady state, the rate of transitions from a to b must equal the rate of transitions from b to a - no atoms are piling up in either the ground or the excited state.  Moreover, from thermodynamics, in thermal equilibrium the ratio of N_b to N_a should just be a Boltzmann factor, exp(-E_ab/k_B T), where E_ab is the energy difference between the two states, k_B is Boltzmann's constant, and T is the temperature.  From this, Einstein showed that the two Bs are equal, solved for the unknown A in terms of B (which can be measured, and nowadays calculated), and showed that the energy density of the radiation, u(f,T), must be Planck's blackbody formula.
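For the curious, the algebra is compact enough to sketch here (using the notation above, with the photon energy hf = E_ab):

```latex
% Steady state (detailed balance): upward rate = total downward rate,
% B_ab N_a u(f) = A_ba N_b + B_ba N_b u(f).  Solving for u(f):
\[
u(f) \;=\; \frac{A_{ba}}{(N_a/N_b)\,B_{ab} - B_{ba}}
     \;=\; \frac{A_{ba}}{B_{ab}\,e^{hf/k_B T} - B_{ba}},
\]
% using the Boltzmann ratio N_a/N_b = exp(+E_ab / k_B T).  For u(f) to
% diverge as T -> infinity (the classical limit), we need B_ab = B_ba = B,
% and matching the Rayleigh-Jeans limit u -> 8 pi f^2 k_B T / c^3 fixes
% A_ba / B = 8 pi h f^3 / c^3, leaving
\[
u(f,T) \;=\; \frac{8\pi h f^3}{c^3}\,\frac{1}{e^{hf/k_B T} - 1},
\]
% which is exactly Planck's blackbody formula.
```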

My feeble writing here doesn't do this justice.  The point is, from basic thermodynamic reasoning, Einstein made it possible to derive an expression for the spontaneous emission rate of atoms, many years in advance of the theory (quantum electrodynamics) that allows one to calculate it directly.  This is what people mean by the elegance of physics - in a few pages, from proper reasoning on fundamental grounds, Einstein was able to deduce relationships that had to exist between different physical parameters; and these parameters could be measured and tested experimentally.  For more on this, here is a page at MIT that links to a great Physics Today article about the topic, and an English translation of Einstein's 1917 paper.  

Thursday, July 21, 2011

Slackers, coasters, and sherpas, oh my.

This is mostly for my American readers - be forewarned.

I wrote last year about a plan put forward by Rick O'Donnell, a controversial "consultant" hired by the state of Texas (hint: Gov. Rick Perry, apparent 2012 presidential hopeful, wanted this guy) to study the way public universities work in Texas. Specifically, O'Donnell came from a think tank with a very firm, predetermined concept of higher education: faculty are overpaid slackers who are ripping off students, and research has no value in the educational environment. O'Donnell has written a report (pdf) on this topic, and he's shocked, shocked to find that he was absolutely right. By his metrics of number of students taught and research dollars brought in, he grouped faculty at UT and Texas A&M into "Dodgers, Coasters, Sherpas, Pioneers, and Stars". Pioneers are the people who bring in big grants and buy out of teaching. Stars are the people who bring in grants and teach large lecture classes. Sherpas are mostly instructors (he doesn't seem to differentiate between instructors and faculty) who lecture to large classes but don't bring in grants. Dodgers teach small classes and don't bring in grant money. Coasters teach small classes and bring in some grant money.

This is the exact incarnation of what I warned about in comments on my old post. This analysis basically declares that all social science and humanities faculty who teach upper division classes are worthless leeches (small classes, no grants) sponging off the university. People in the sciences and engineering who teach upper level classes fare no better, unless they're bringing in multiple large research grants. Oh, and apparently the only metric for research and scholarship is money.

Nice. Perry, by the way, also appointed Barbara Cargill to run the state board of education. She's a biologist who wants evolution's perceived weaknesses to be emphasized in public schools, and she also was upset because the school board only has "six true conservative Christians" as members. I guess Jews, Muslims, Buddhists, Hindus, and atheists need not apply.  Update:  It looks like Texas has dodged creationism for another couple of years.  Whew.

Wednesday, July 20, 2011

What is so hard about understanding high temperature superconductivity?

As ZZ has pointed out, Nature is running a feature article on the history of high temperature superconductivity over the last 25 years. I remember blogging about this topic five years ago, when Nature Physics ran an excellent special issue on the subject. At the time, I wrote a brief summary of the field, and I've touched on this topic a few times in the intervening years. Over that time, the most important event has clearly been the discovery of the iron-based high temperature superconductors, which showed that there are whole additional families of high temperature superconducting materials that are not copper oxides.

Now is a reasonable time to ask again, what is so hard about this problem? Why don't we have a general theory of high temperature superconductivity?  Here are my opinions, and I'd be happy for more from the readers.
  • First, be patient.  Superconductivity was discovered in 1911, and we didn't have a decent theory until 1957.  By that metric, we shouldn't start getting annoyed until 2032.  I'm not just being flippant here.  The high-Tc materials are generally complicated structurally (with a few exceptions), with large unit cells and lots of disorder associated with chemical doping.  This is very different from the situation in, e.g., lead or niobium.
  • Electron-electron interactions seem to be very important in describing the normal state of these materials.  In the low-Tc superconductors, we really can get very far in understanding the normal starting point.  Aluminum is a classic metal, and you can do a pretty good job getting quantitative accuracy on its properties from the theory side even in single-particle, non-interacting treatments (basic band theory).  In contrast, the high-Tc normal states are tricky.  Heck, the copper oxide parent compound is a Mott insulator - a system that single-particle band structure tells you should be a metal, but which is in fact insulating because of the electron-electron repulsion!  
  • Spin seems to be important, too.  In the low-Tc systems, spin is unimportant in the normal state, and the electrons simply pair up with opposite spins, so that the net spin of each pair is zero.  In high-Tc systems, on the other hand, the normal state very often involves magnetic order of some sort, and spin-spin interactions may well be important.
  • Sample quality has been a persistent challenge (particularly in the early days).
  • The analytical techniques that exist tend to be indirect or invasive, at least compared to the desired thought experiments.  This is a persistent challenge in condensed matter physics.  You can't just go and yank on a particular electron to see what else moves, in an effort to unravel the "glue" that holds pairs together (though the photoemission community might disagree).  While the order parameter (describing the superconducting state) may vary microscopically in magnitude, sign, and phase, you can't just order up a gadget to measure, e.g., phase as a function of position within a sample.  Instead, experimentalists are forced to be more baroque and more clever.
  • Computational methods are good, but not that good.  Exact solutions of systems of large numbers of interacting electrons remain elusive and computationally extremely expensive.  Properly dealing with strong electronic correlations, finite temperature, etc. are all challenges.
Still, it's a beguiling problem, and now is an exciting time - because of the iron compounds, there are probably more people working on novel superconductors than at any time since the heady days of the late '80s, and they're working with the benefit of all that experience and hindsight.  Maybe I won't have to write something like this for the 30th high-Tc anniversary in 2016....

Monday, July 18, 2011

Updated look.

I finally bit the bullet and updated the look of the blog.  I'm still keeping it ad-free, though.

Sunday, July 17, 2011

google+

I have a nagging feeling that google+ could somehow be used to significantly increase readership of my blog, if only I were appropriately savvy.  Anyone have any suggestions or thoughts on this?  I don't crave the attention per se, but I'd be fibbing if I said I wasn't jealous of the readership numbers of the folks who blog at, e.g., scienceblogs, discovermagazine.com, or scientificamerican.com.  Larger readership would undoubtedly motivate more writing, too, though that's not necessarily great for my time management....

Saturday, July 16, 2011

It's all at the interface. Again.

Over the last decade, there has been a great deal of exciting work in making electronically interesting systems at atomically sharp interfaces between different oxide materials (oxide heterostructures). Analogous efforts at semiconductor-dielectric interfaces have given us the conventional field-effect transistor, something like 10^9 of which are being used to render this page for you. Likewise, heterointerfaces in compound semiconductor systems (especially the technologically relevant III-V materials like GaAs) have given us two Nobel Prizes and a great deal of quantum electronic fun. Oxides are much trickier beasts from the materials science side, making growth and interfacial control a major challenge. Moreover, with respect to basic science, transition metal oxides can be incredibly rich systems, because in many of them electron-electron interactions lead to competing electronic and magnetic phases, with consequences like the emergence of high temperature superconductivity.

A few years ago, this paper demonstrated that it was possible to get superconductivity at the interface between SrTiO3 and LaAlO3, two oxides that are both insulating if perfectly stoichiometric. Still, SrTiO3 is known to superconduct if highly doped, and therefore this observation, while a great experiment, wasn't hugely shocking, given the existence of a high density electron gas at the STO/LAO interface. More recently, this paper showed that high temperature superconductivity could happen at the interface between a nominally insulating oxide and a metallic (but not superconducting) cuprate related to the high-Tc materials. This past week on the arxiv, a logical successor to these works appeared here. The authors use two nominally insulating oxides (STO again, and CaCuO2). Because of imperfect stoichiometry at the interface (excess oxygen, apparently), there is a conducting layer at the interface, with a superconducting transition around 50 K in one sample (the others all show transitions exceeding 25 K). Bearing in mind that this is a preprint (and therefore has not been refereed), it is still very exciting. We are finally approaching the ability to engineer complex materials (not just semiconductors) at the atomic layer level, and this should be an incredible playground for basic science and materials engineering. It'd be great to get plugged into a collaboration working in this area.

Thursday, July 14, 2011

Science and the public

I couldn't help but notice that one of my favorite producers of animated films, Aardman Animation, is coming out with a new movie (trailer here). I find it very interesting that the UK version of the movie is "The Pirates! In an Adventure with Scientists!", while the US version is "The Pirates! Band of Misfits!". The film is based on a book with the former title, by the way. I don't want to overanalyze this, but it's hard to escape the conclusion that some marketing drone decided that "scientist" is box-office poison, and that "misfit" is an acceptable and more marketable substitute in the US. Great. Wonderful. In case you're wondering, Charles Darwin shows up as a character in the book/movie. I imagine that the US ads won't be playing that up very much, or there will be protests. Sigh.

(I do have a science post I'll make shortly. I just couldn't let this pass w/o comment. And it's taking enormous self-restraint not to launch into extended political invective about the US, but there are many places where people can read that if they want to.)

Thursday, July 07, 2011

Follow-up, and blogger drop-off

Regarding the story mentioned here, Nature has published both a provocative and interesting article by Eugenie Reich about the larger issues raised, and an editorial. Sorry that these are behind a pay-wall. To summarize in a few sentences: Reich points out that the misconduct investigation relevant to this discussion highlights important problems with the US Department of Energy's handling of such cases. To wit: there are issues with the independence and chain of authority of the investigators, and with the lack of proper record keeping and documentation of investigation reports. The conclusion is that this is a powerful argument for the DOE to establish an Office of Research Integrity, like those in some other agencies. The editorial from Nature chastises the DOE along these lines. It is interesting that the Nature editorial makes no mention at all of the journal's own role in not publishing technical comments relevant to this particular matter.

In blogging news, there has been a drop-off in the number of active physical science bloggers. David Bacon's Quantum Pontiff has decohered. The Incoherent Ponderer has gone so far as to apparently delete his entire blog and blogger profile. Other blogs have not been updated in many months. It's likely that this is all part of a natural stabilization of blogging - people run out of things to say, and the novelty of blogging has worn off. It will be interesting to see how this trend resolves. It'll be a shame to have fewer interesting voices to follow, though. (Clearly we should all switch to Twitter, since 140 characters should be more than sufficient to carry out detailed science discussions or popularizations for the lay audience. Ahem.)

Tuesday, July 05, 2011

Crowd-sourcing, video games, and the world's problems

This past weekend, I caught a snippet of a rebroadcast of this NPR story about Jane McGonigal and the thesis of her recent book. In short, she points out that as a species we have spent literally millions of person-years playing World of Warcraft, an online game that involves teamwork and puzzle-solving (as well as all the usual fun silliness of videogames). Her point is that in the game environment, people have demonstrated great creativity as well as a willingness to keep coming back, over and over, to tackle challenging problems (in part because there is recognition by the players that problems are pitched at a level that is tricky but not insurmountable). She wants to harness this kind of intellectual output for good, rather than just have it as a social (or antisocial) outlet. She's not the first person to have this sort of idea, of course (see, e.g., Ender's Game, or the Timothy Zahn short story "The Challenge"), but the WoW numbers are truly eye-popping.

It would be great if there were certain scientific problems to which this could be applied. The overall concept seems easiest to adapt to logistics (e.g., coming up with clever ways of routing shipping containers or disaster relief supplies), since that's a puzzle-solving subdiscipline where the basic problems are at least accessible to laypeople. Trying this with meaty scientific challenges would be much more difficult, unless those challenges could be translated effectively into problems that don't require years and years of foreknowledge. Hmm. Still very thought-provoking.

Friday, July 01, 2011

The tyranny of the buried interface

Time and again, a major impediment to research progress in condensed matter physics, electrical engineering, materials science, and physical chemistry is the need to understand what is happening in some system at a buried interface. For example, in organic photovoltaic devices, it is of great importance to learn more about what is happening at metal/organic semiconductor interfaces (charge transfer, interfacial dipole formation, Fermi level pinning) and organic/organic interfaces (exciton splitting at the interface between electron- and hole-transporting materials). Another example: in lithium ion batteries, at the interface between either the cathode or the anode and the electrolyte, after the first couple of charge and discharge cycles, a "solid electrolyte interphase" (SEI) layer forms. The SEI is nanoscale in thickness; it stabilizes the electrode surface, establishes the energetic lineup between the electrolyte redox chemistry and the actual electrode surface, strongly affects the kinetics of lithium ion transport, etc.

Unfortunately, probing buried interfaces in situ in functioning systems is extremely hard. There generally is no Star Trek scanner device that can nondestructively reveal atomic-scale details of buried 3d structures. Many of our best characterization approaches are surface-based, or require thinned down samples, and there are always difficult questions about how information gained in such investigations translates to the real situation of interest. This is not a new problem. From the early days of surface science and before, people have been worrying about, e.g., how to connect studies performed in UHV on single crystal surfaces with "real world" situations on polycrystalline surfaces with ambient contaminants. There are some macro-scale interface sensitive approaches (exploiting x-ray standing waves, or interfacial optical effects). Still, the more people working on developing better characterization tools toward this end, the better, even if it doesn't sound terribly exciting to the masses.

Thursday, June 23, 2011

a recurring story

Five years ago, there was a controversy in the pages of Nature regarding this paper from 1993, the first to claim atomic-resolution chemical analysis via scanning transmission electron microscopy.  At issue was whether the data in the paper had been reprocessed (in response to referee concerns) in a legitimate or a misrepresentative way, and whether the authors had been honest and forthcoming with the journal and the reviewers about the procedures they'd followed.  The reason matters came to a head more than 12 years after the original paper was the appearance on the arxiv of a preprint (subsequently submitted to Nature Physics) sharing two of the authors of the original paper, with further questions raised about the handling and analysis of data and images.  This was all discussed clearly and succinctly by ZZ at the time.  Nature allowed the authors to publish a corrigendum - a correction rather than a retraction - regarding the original '93 paper.  This was sufficiently controversial that Nature felt the need to write an editorial explaining its decision.  Oak Ridge did an investigation of the matter and concluded that there was no fabrication or falsification of data; that report and a response by the authors are linked here.  Judging from the appearance of this on the arxiv last night, it would appear that this isn't quite the end of things.

Wednesday, June 15, 2011

Pitch for a tv show

Summer blogging has been and will continue to be light, as I try to get some professional writing done. In the meantime, though, I have to give my elevator pitch for the awesome new TV show that would be great fun. It's "Chopped" meets "Mythbusters" meets "Scrap Heap Challenge"/"Junkyard Wars". Start off with three teams. Give them a physics- or engineering-related task that they have to accomplish (e.g., write the opening crawl from Star Wars in one mm^2; weigh a single grain of salt), some number of tools that they have to use (e.g., a green laser pointer and an infrared-corrected microscope objective), and access to a stocked "pantry" (including a PC, electronics components, etc.). Give them a time limit (4 hours, cleverly edited down to half an hour in broadcast). Points are awarded for success at the task, time used, and elegance. I think it could be a hit, particularly if there are explanations (narrated by cool resident experts) delivered in a fun, accessible tone. It'd be entertaining, even if it did conjure up images of Guy Fleegman in Galaxy Quest.