A blog about condensed matter and nanoscale physics. Why should high energy and astro folks have all the fun?
Tuesday, May 29, 2012
Buying out of teaching - opinions?
This is a topic that comes up at many research universities, and I'd be curious for your opinions. Some institutions formally allow researchers to "buy" out of teaching responsibilities. Some places actively encourage this practice, as a way to try to boost research output and standing. Does this work overall? Faculty who spend all their time on research should generally be more research-productive, though it would be interesting to see quantitatively how much so. Of course, undergraduate and graduate classroom education is also an essential part of university life, and often (though certainly not always) productive researchers are among the better teachers. It's a fair question to ask whether teaching buyout is a net good for the university as a whole. What do you think?
Sunday, May 27, 2012
Work functions - a challenge of molecular-scale electronics
This past week I was fortunate enough to attend this workshop at Trinity College, Dublin, all about the physics of atomic- and molecular-scale electronics. It was a great meeting, and I feel like I really learned several new things (some of which I may elaborate upon in future posts). One topic that comes up persistently when looking at this subject is the concept of the work function, defined typically as the minimum amount of energy it takes to kick an electron completely out of a material (so that it can go "all the way to infinity", rather than being bound to the material somehow). As Einstein and others pointed out when trying to understand the photoelectric effect, each material has an intrinsic work function that can be measured, in principle, using photoemission. You can hit a material surface with ultraviolet light and measure the energy of the electrons that get kicked out (for example, by slowing them down with an electric field and seeing how long it takes them to arrive at a detector). Alternately, with a fancy tunable light source like a synchrotron, you can dial around the energy of the incident light and see when electrons start getting kicked out. As you might imagine, if you are trying to understand electronic transport, where an electron has to leave one electrode, traverse through a system such as a molecule, and end up back in another electrode, the work function is important to know.
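To make the photoemission bookkeeping concrete, here is a trivial sketch of Einstein's relation, KE_max = hν − Φ: measure the fastest photoemitted electrons and subtract from the photon energy. The numbers below are invented for illustration, not taken from any real measurement.

```python
# Sketch: extracting a work function from photoemission via the
# Einstein relation  KE_max = h*nu - phi  (energies in eV).
H_EV = 4.135667696e-15  # Planck's constant, eV*s
C = 2.998e8             # speed of light, m/s

def work_function_eV(photon_freq_hz, max_kinetic_energy_eV):
    """Work function = photon energy minus the fastest electron's KE."""
    return H_EV * photon_freq_hz - max_kinetic_energy_eV

# Made-up example: 254 nm UV light (~4.88 eV photons) ejecting
# electrons with at most 0.6 eV of kinetic energy:
freq = C / 254e-9
print(round(work_function_eV(freq, 0.6), 2))  # ~4.28 eV for these numbers
```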
One problem with work functions is, they are extremely sensitive to the atomic-scale details of a surface. For example, different crystallographic faces of even the same material (e.g., gold) can have work functions that differ by a couple of hundred millielectronvolts (meV). Remember, the thermal energy scale at room temperature is 25 meV or so, so these are not small differences. Moreover, anything that messes with the electronic cloud that spills a little out of the surface of materials at the atomic scale can alter the work function. Adsorbed impurities on metal surfaces can change the effective work function by more than 1 eV (!). To see how tricky this gets, imagine chemically assembling a layer of covalently bound molecules on a metal surface. There is some charge transfer where the molecule chemically bonds to the metal, leading to an electric dipole moment and a corresponding change in work function. The molecule itself can also polarize or be inherently polar based on its structure. In the end, ordinary photoemission measures just the total of all of these effects. Finally, ponder what then happens if the other end of the molecules is also tethered chemically to a piece of metal. How big are all the dipole shifts? What is the actual energy landscape "seen" by an electron going from one metal to the other, and is there any way to measure it experimentally, let alone compute it reliably from quantum chemistry methods? Really understanding the details is difficult yet ultimately essential for progress here.
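To get a feel for how big these dipole-layer shifts can be, here is a back-of-the-envelope sketch using the simple parallel-plate (Helmholtz) estimate, Δφ = nμ/ε0, for a uniform sheet of dipoles. The coverage and dipole moment are invented round numbers, not data for any particular molecule.

```python
# Sketch: Helmholtz-layer estimate of the work-function shift from an
# adsorbed dipole layer, delta_phi = n * mu / eps0 (SI units).
EPS0 = 8.854e-12   # vacuum permittivity, F/m
DEBYE = 3.336e-30  # C*m per debye

def dipole_shift_volts(coverage_per_m2, dipole_debye):
    """Potential step (volts) across a uniform dipole sheet."""
    return coverage_per_m2 * dipole_debye * DEBYE / EPS0

# Illustrative numbers: ~4 molecules/nm^2, each with a 1 debye
# dipole moment normal to the surface:
shift = dipole_shift_volts(4e18, 1.0)
print(round(shift, 2))  # ~1.5 V, i.e., an eV-scale work-function change
```

A modest monolayer of ordinary polar molecules is already enough to move the work function by of order an eV, which is why adsorbates matter so much here.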
Monday, May 21, 2012
Catalysis seems like magic.
In our most recent paper, we found that we could dope a particularly interesting material, vanadium dioxide, with atomic hydrogen, via "catalytic spillover". By getting hydrogen in there in interstitial sites, we could dramatically alter the electrical properties of the material, allowing us to stabilize its unusual metallic state down to low temperatures. The funkiest part of this to me is the catalysis part. The metal electrodes that we use for electronic measurements have enough catalytic activity that they can split hydrogen molecules into atomic hydrogen at an appreciable rate even under very modest conditions (e.g., not much warmer than the boiling point of water). This paper (sorry it is subscription only) shows an elegant experimental demonstration of this, where gold is exposed to H2 and D2 gas and HD molecules are then detected. I would love to understand the physics at work here better. Any recommendations for a physics-based discussion would be appreciated - I know there is enormous empirical and phenomenological knowledge about this stuff, but something closer to an underlying physics description would be excellent.
Wednesday, May 16, 2012
Vanity journals: you've got to be kidding me.
I just received the following email:
Dear Pro. ,
Considering your research in related areas, we cordially invite you to submit a paper to Modern Internet of Things (MIOT).
The Journal of Modern Internet of Things (MIOT) is published in English, and is a peer reviewed free-access journal which provides rapid publications and a forum for researchers, research results, and knowledge on Internet of Things. It serves the objective of international academic exchange.
Wow! I feel so honored, given my vast research experience connected to "Internet of Things".
Monday, May 14, 2012
The unreasonable clarity of E. M. Purcell
Edward Purcell was one of the great physicists of the 20th century. He won the Nobel Prize in physics for his (independent) discovery of nuclear magnetic resonance, and was justifiably known for the extraordinary clarity of his writing. He went on to author the incredibly good second volume of the Berkeley Physics Course (soon to be re-issued in updated form by Cambridge University Press), and late in life became interested in biophysics, writing the evocative "Life at Low Reynolds Number" (pdf).
Purcell is also known for the Purcell Factor, a really neat bit of physics. As I mentioned previously, Einstein showed through a brilliant thermodynamic argument that it's possible to infer the spontaneous transition rate for an emitter in an excited state dropping down to the ground state and spitting out a photon. The spontaneous emission rate is related to the stimulated rate and the absorption rate. Both of the latter two may be calculated using "Fermi's Golden Rule", which explains (with some specific caveats that I won't list here) that the rate of a quantum mechanical radiative transition for electrons (for example) is proportional to (among other things) the density of states (number of states per unit energy per unit volume) of the electrons and the density of states of the photons. The density of states for photons in 3d can be calculated readily, and is quadratic in frequency.
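For concreteness, the free-space photon density of states that enters the Golden Rule (counting both polarizations, per unit volume per unit angular frequency) can be written ρ(ω) = ω²/(π²c³). A trivial sketch:

```python
import math

# Sketch: free-space photon density of states,
#   rho(omega) = omega^2 / (pi^2 * c^3),
# the quadratic-in-frequency factor mentioned above.
C = 2.998e8  # m/s

def photon_dos(omega):
    return omega**2 / (math.pi**2 * C**3)

# Doubling the frequency quadruples the density of states:
print(photon_dos(2e15) / photon_dos(1e15))  # 4.0
```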
Purcell had the insight that in a cavity, the number of states available for photons is not quadratic in frequency anymore. Instead, a cavity on resonance has a photon density of states that is proportional to the "quality factor", Q, of the cavity, and inversely proportional to the size of the cavity. The better the cavity and the smaller the cavity, the higher the density of states at the cavity resonance frequency, and off-resonance the photon density of states approaches zero. This means that the spontaneous emission rate of atoms, a property that seems like it should be fundamental, can actually be tuned by the local environment of the radiating system. The Purcell factor is the ratio of the spontaneous emission rate with the cavity to that in free space.
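The standard textbook form of this, for an emitter ideally positioned and oriented at the cavity field maximum, is F_P = (3/4π²)(λ/n)³(Q/V). A quick sketch with invented numbers:

```python
import math

# Sketch: on-resonance Purcell factor,
#   F_P = (3 / (4 pi^2)) * (lambda/n)^3 * (Q / V),
# for an ideally placed emitter.  Numbers are illustrative only.

def purcell_factor(wavelength_m, n_index, quality_factor, mode_volume_m3):
    lam_in_medium = wavelength_m / n_index
    return (3.0 / (4.0 * math.pi**2)) * lam_in_medium**3 \
        * quality_factor / mode_volume_m3

# A Q = 10^4 cavity with a wavelength-cubed mode volume at 1 micron:
lam = 1e-6
F = purcell_factor(lam, 1.0, 1e4, lam**3)
print(round(F))  # spontaneous emission sped up by a factor of ~760
```

Note the scaling: the enhancement grows with Q but also with how tightly the mode is confined, which is why nanoscale cavities are so interesting for this.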
Saturday, May 05, 2012
Models and how physics works
Thanks to ZapperZ for bringing this to my attention. This paper is about to appear in Phys Rev Letters, and argues that the Lorentz force law (as written to apply to magnetic materials, not isolated point charges) is incompatible with Special Relativity. The argument includes a simple thought experiment. In one reference frame, you have a point charge and a little piece of magnetic material. Because the magnet is neutral (and for now we ignore any dielectric polarization of the magnet), there is no net force on the charge or the magnet, and no net torque on the magnet either. Now consider the situation when viewed from a frame moving along a line perpendicular to the line between the magnet and the charge. In the moving frame, the charge seems to be moving, so that produces a current. However (and this is the essential bit!), in first year physics, we model permanent magnetization as a collection of current loops. If we then consider what those current loops look like in the moving frame, the result involves an electric dipole moment, meaning that the charge should now exert a net torque on the magnet when all is said and done. Since observers in the two frames of reference disagree on whether a torque exists, there is a problem! Now, the author points out that there is a way to fix this, and it involves modifying the Lorentz force law in terms of how it treats magnetization, M (and electric polarization, P). This modification was already suggested by Einstein and a coauthor back in 1908.
I think (and invite comments one way or the other) that the real issue here is that our traditional way to model magnetization is unphysical at the semiclassical level. You really shouldn't be able to have a current loop that persists, classically. A charge moving in a loop is accelerating all the time, and should therefore radiate. By postulating no radiation and permanent current loops, we are already inserting something fishy in terms of our treatment of energy and momentum in electrodynamics right at the beginning. The argument by the author of the paper seems right to me, though I do wonder (as did a commenter in ZZ's post) whether this all would have been much more clear if it had been written out in four-vector/covariant notation rather than conventional 3-vectors.
This raises a valuable point about models in physics, though. Our model of M as resulting from current loops is extremely useful for many situations, even though it is a wee bit unphysical. We only run into trouble when we push the model beyond where it should ever have been expected to be valid. The general public doesn't always understand this distinction - that something can be a little wrong in some sense yet still be useful. Science journalists and scientists trying to reach the public need to keep this in mind. Simplistically declaring something to be wrong, period, is often neither accurate nor helpful.
Wednesday, April 25, 2012
Heat flow at the mesoscale
When we teach about thermal physics at the macroscopic scale, we talk in terms of the thermal conductivity, k. For the 1d problem of a homogeneous rod of cross sectional area A and length L, the rate that energy flows from one end of the rod to the other is given by (kA/L)(Th-Tc), where Th and Tc are the temperatures of the hot and cold ends of the rod, respectively. Built into this approach is the tacit assumption that the phonons, the quantized vibrational modes of the lattice that carry what we consider to be the thermal energy of the atoms in the solid, move in a diffusive way. That is, if a phonon is launched, it bounces many times in a random walk sort of motion before it traverses across our region of interest. Phonons can scatter off disorder in the lattice, or mobile charge carriers (or even each other, if the vibrations aren't perfectly harmonic).
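In code, that macroscopic formula is just the following (with illustrative numbers for a copper rod):

```python
# Sketch of the macroscopic 1d conduction formula quoted above:
# heat current = (k*A/L) * (Th - Tc), valid in the diffusive limit
# where phonons random-walk across the rod.

def heat_current_watts(k, area_m2, length_m, t_hot, t_cold):
    """Steady-state heat flow through a homogeneous rod."""
    return (k * area_m2 / length_m) * (t_hot - t_cold)

# Copper rod (k ~ 400 W/m/K), 1 cm^2 cross-section, 10 cm long,
# with a 10 K temperature difference across it:
print(round(heat_current_watts(400.0, 1e-4, 0.1, 300.0, 290.0), 6))  # 4.0 W
```

The interesting physics starts when the sample gets comparable to the phonon mean free path, and this diffusive picture breaks down.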
Thursday, April 19, 2012
Persistent currents and an impressive experiment
A long while ago, I brought up the topic of persistent currents in normal metal rings. Please click the link to get the context. The point is, even in a normal metal (as opposed to a superconductor), if you consider a metal ring small enough that the electrons remain quantum mechanically coherent in going about the ring, the electronic wavefunction must remain single-valued. That means that the quantum mechanical phase accumulated by an electron diffusing around the ring back to its starting point (to speak in a semiclassical way) has to add up to an integer multiple of 2 pi. Since magnetic flux through the ring tweaks the accumulated phase (via the Aharonov-Bohm effect), a persistent current develops in the ring to make sure that the total phase (that from the electron motion and that from the resulting Aharonov-Bohm contribution) add up to a multiple of 2 pi. As I'd discussed before, these currents and the magnetic fields they produce tend to be quite small and difficult to detect.
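The flux periodicity is easy to sketch: the Aharonov-Bohm phase accumulated per trip around the ring is 2πΦ/Φ0, with Φ0 = h/e the normal-metal flux quantum, so the persistent current is periodic in flux with period h/e.

```python
import math

# Sketch: Aharonov-Bohm phase for one loop around a normal-metal ring,
#   phi_AB = 2*pi * Phi / Phi_0,   Phi_0 = h/e.
H = 6.62607015e-34   # Planck's constant, J*s
E = 1.602176634e-19  # elementary charge, C
PHI_0 = H / E        # ~4.14e-15 Wb

def ab_phase(flux_wb):
    return 2 * math.pi * flux_wb / PHI_0

# Threading exactly one flux quantum advances the phase by 2*pi,
# so the single-valuedness condition (and the current) repeats:
print(round(ab_phase(PHI_0) / math.pi, 6))  # 2.0
```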
Tuesday, April 17, 2012
Academic science researchers and economics
This article in the NY Times is rather provocative in several ways. First, it raises the question of whether there is a dramatic rise taking place in the number of journal article retractions (spread across all disciplines). The answer is, it's really not clear, given the enormous increase in the number of published articles. Moreover, it's certainly much easier for people to find, read, and compare articles than ever before. Google Scholar, for example, can see through most pay-walls enough to search for words and phrases, making it far easier than ever before to test for plagiarism.
Moving on, the article then looks at whether the culture of academic science research is, for lack of a better word, ailing. There are some choice quotes:
[L]abs continue to have an incentive to take on lots of graduate students to produce more research. “I refer to it as a pyramid scheme,” said Paula Stephan, a Georgia State University economist and author of “How Economics Shapes Science,” published in January by Harvard University Press.
In such an environment, a high-profile paper can mean the difference between a career in science or leaving the field. “It’s becoming the price of admission,” Dr. Fang said.
The scramble isn’t over once young scientists get a job. “Everyone feels nervous even when they’re successful,” he continued. “They ask, ‘Will this be the beginning of the decline?’ ”
...
“What people do is they count papers, and they look at the prestige of the journal in which the research is published, and they see how many grant dollars scientists have, and if they don’t have funding, they don’t get promoted,” Dr. Fang said. “It’s not about the quality of the research.”
Dr. Ness likens scientists today to small-business owners, rather than people trying to satisfy their curiosity about how the world works. “You’re marketing and selling to other scientists,” she said. “To the degree you can market and sell your products better, you’re creating the revenue stream to fund your enterprise.”
I don't want to quote any more for fear of running afoul of fair use. Read the article.
This does hit some of the insecurities felt by any reasonable US faculty science or engineering researcher. I would dispute the pyramid scheme comment because it's based on a false premise, that every doctoral student is looking to become a professor and is crushed if they don't get a faculty position. The prestige paper comments are more worrisomely accurate.
Sunday, April 15, 2012
Getting the most out of an experimental technique
This post is a mini-summary of a Perspectives piece I wrote for ACS Nano. One conceptually simple way to measure the electronic properties of materials at the atomic scale is to use a "break junction". Imagine taking a metal needle touching a metal surface, and slowly lifting up on the needle. At some point, the needle will come out of contact with the surface. As it does so, at the last instant, the contact between the two will take place only at the atomic scale. If you hook up one end of a battery to the needle and the other through an ammeter to the metal surface to measure the flow of current, you can measure the electrical conduction throughout this process. Thanks to the availability of high speed electronics these days, it is possible to record conductance, G, vs. time data throughout the process. A standard analytic approach is then to compile a histogram of all the data points, counting how many times each value of G is measured. As explained here, the most stable junction configurations naturally have more data points, and this will lead to peaks in the conductance histogram at the values of conductance corresponding to those configurations. Molecules may be incorporated into such junctions (as I've written about here). Since it's possible to set up a system to make and break junctions repeatedly and rapidly in an automated way, this approach has proven very fruitful and revealing.
Of course, only looking at the histograms is wasteful. You actually have an enormous amount of additional information contained in the G vs. t traces. For instance, you can check to see if the occurrence of a "plateau" in G vs. t at one conductance level always (or never!) correlates with a similar plateau at a different conductance value. These kinds of cross-correlations are best represented in two-dimensional histograms of various types. Makk et al. have written a very clear and tutorial paper about how this works in practice, and what kinds of things one can learn from such analyses. It's definitely worth a read if you work on this stuff, and it's also a great lesson in how to make use of as much of your data as possible.
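To make the histogram-compilation step concrete, here is a minimal sketch. The "traces" are synthetic stand-ins (a noisy plateau near 1 G0 followed by a tunneling decay), not real break-junction data; the point is just that the most stable configuration dominates the histogram.

```python
import numpy as np

# Sketch: compiling a 1d conductance histogram from many G(t) traces.
# Synthetic data: each trace has a plateau near G = 1 (in units of G_0)
# and then an exponential tunneling decay after the contact breaks.
rng = np.random.default_rng(0)

def make_trace(n=200):
    plateau = 1.0 + 0.02 * rng.standard_normal(n // 2)
    decay = np.exp(-np.linspace(0, 5, n // 2))
    return np.concatenate([plateau, decay])

traces = [make_trace() for _ in range(100)]
all_points = np.concatenate(traces)

# Count how many times each conductance value is visited:
counts, edges = np.histogram(all_points, bins=50, range=(0, 1.5))
peak_bin = edges[np.argmax(counts)]
print(round(float(peak_bin), 2))  # the most-visited conductance sits near 1 G_0
```

The two-dimensional (cross-correlation) histograms are built the same way, just binning pairs of values from each trace instead of single points.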
Monday, April 09, 2012
DOI numbers, Web of Science, and article numbers
Two recurring complaints about bibliographies and citations for papers and proposals:
- Most people really like DOI, a system meant to assure that reference materials like journal articles get an effectively permanent web address, something that will "always" point to that article. It's become very very popular, and every online journal that I know provides a doi reference for each article. It shows up in every Web of Science reference these days, too, if it exists. So, why can't Web of Science make those doi numbers a clickable link? That is, instead of forcing me to copy and paste the doi into a browser URL line with "http://dx.doi.org/" stuck in front, why not just make the doi itself a link to that? I mean, why would anyone just want the doi without the link?? Is this some weird bs rule about Web of Science not wanting to have direct links?
- How come Physical Review handles bibliographic information so badly when it comes to article numbers? A number of years ago, Phys Rev switched from old-fashioned page numbers for articles to 6-digit article numbers. Unfortunately, when you try to export bibliographic information for reference management software, for many Phys Rev articles, the automatic response is to stick the article number (which replaced the page number for all practical purposes) in some completely random field, and instead list the page numbers as either blank or the oh-so-useful "1-4" for a four-page article. Can someone please fix this?
Both of these are trivial, silly things, but I'd be willing to bet that hundreds of person-hours (at least) are lost per year dealing with the latter one.
Sunday, April 08, 2012
Commitment and conflicts
One of the various hats I wear right now is chair of Rice's university committee on research, and one topic that has come up lately (in the context of the US government's new regs about conflict of interest) is the discussion of "commitment". Conflict of interest is comparatively simple to explain to people - everyone grasps the idea that financial or other compensation that may give the appearance of affecting your scholarly objectivity is potentially a conflict of interest. Commitment is a more challenging concept. Most universities expect their science and engineering faculty in particular to spend some of their time doing things that are not immediately, directly connected to their simplest academic duties (teaching courses, supervising research students and postdocs, performing university service). For example, technical consulting isn't that unusual. Likewise, there are other broadly defined academic duties that can come up (serving on advisory or editorial boards; professional society work) that can enhance the academic mission of the university in a higher order way. However, it's clear that there have to be limits of some kind on these auxiliary activities - we would all agree that someone who does so much alternative work that they can't teach their classes or adequately do their normal job is having problems with time allocation. The general question is, how should a university manage these situations - how are they identified, how are they mitigated, and what are the consequences if someone is knowingly going over the line (e.g., spending three working days per week running the day to day operations of a startup company rather than doing their academic job)? Things get particularly complicated when you factor in disciplines that basically demand external work (architecture, business school), and the increasingly common practice of special appointments at foreign universities. 
If anyone has suggestions of universities with what they think are especially good approaches (or lousy ones, for that matter) to this issue, please post in the comments.
Tuesday, April 03, 2012
An open letter to Neil deGrasse Tyson
Hello, Dr. Tyson. First, let me say that I'm a huge fan. You do the scientific community a tremendous service by being such an approachable, clear spokesman, maintaining scientific accuracy while also entertaining the public. Astronomy is a great side interest of mine (like many scientists and engineers), and I really wanted to be an astronaut for a while (until my eyes were demonstrably lousy); that's why on some gut level I enjoyed your call for a renewed vigor in space exploration.
However, my brain's response to your call is, is this really the best strategy? Much as I'd love to one day walk on the moon or Mars, I can't help but be deeply skeptical of NASA's ability to allocate resources. Right now their annual budget is about $17B, more than twice that of the NSF, and more than three times that of the DOE Office of Science. While the achievements of the robotic spacecraft missions are truly amazing, much of the rest of NASA seems very dysfunctional. I'll admit, my impression is colored by my thesis advisor's experience on the Columbia accident investigation board, my knowledge of the ISS (hint: the Soyuz "lifeboats" where the ISS crew shelters in case of debris impact? They're actually the most debris-vulnerable part of the ISS.), and the fact that NASA has employees who do things like this and this at some rate.
If taxpayers are going to be persuaded to invest another $17B/yr in federally funded research, I think a much more compelling case needs to be made that NASA is the place for that investment, given the alternatives. Yes, NASA's history and subject matter are inspiring, but you need to convince me that NASA as an agency will really get value out of that investment, given that their recent leadership has been singularly unimpressive.
PS - If you ever need a sub to go onto Colbert in your stead, please call.
Monday, April 02, 2012
Several items
My apologies to my readers for low blogging rate recently. Multiple papers, proposals, teaching, travel, etc. have all contributed to this slow-down. Here are a few brief items to consider:
- The (nearly) final details have come out regarding the OPERA experiment. Goodbye, superluminal neutrinos - we hardly knew ye. Would've been fun!
- It would appear that one can correlate political affiliation in the US with the somewhat ill-defined concept of "trust in science". Much as it's tempting to make a wry comment here, I suspect that some of this is due to the very disparate nature of those self-identifying as "conservative" these days. Either way, this is a problem, though. Science (in the sense of careful, rigorous testing of hypotheses that allege predictive power) is an incredibly useful way to look at much of the world, and I would hope that this would be appreciated by the vast majority of people out there.
- Someone has advanced the idea that Mitt Romney is a quantum object. Clearly we should put him through some sort of interferometer to test this idea. Alternately, he should interlace his fingers and make a loop with his arms - we can then thread magnetic flux through him and see if his response about the individual mandate for healthcare oscillates as the magnetic field is swept.
- Visiting NSF is always enlightening. I really hadn't appreciated before the quantitative problem that they face in proposal evaluation and administration: the number of proposals that are submitted has more than doubled in the last few years, while their staffing has remained unchanged. Even apart from overall resource problems (e.g., the runaway positive feedback cycle, when people realize that the odds of funding are bad, so they submit more proposals, making the odds of funding worse), just the challenge of properly handling all the paperwork is becoming incredibly difficult.
- April Fools is always fun on the web. This is one of my favorites.
Sunday, March 25, 2012
Responsibilities, rational and otherwise
Professors have many responsibilities - to their students and postdocs, to their departments and colleagues, to their university, to the scientific community, and to the public. When on a doctoral committee, for example, a professor's duty is to make sure that the candidate's thesis is rigorous and careful, and that the student actually knows what they're talking about. Obviously primary responsibility for supervision of the student lies with the advisor(s), but the committee members are not window dressing; they're supposed to serve a valuable role in upholding the quality of the work.
I have a colleague at another institution (names and circumstances have been changed here; I'll say no more about specifics) who really had to put his foot down several years ago, as a committee member, to make sure that a student (the last one of a just-retired professor) didn't hand in a thesis sufficiently fringe that it bordered on pseudoscience. It was pretty clear that the advisor would have been willing to let this slide (!) for the sake of getting the last student out the door. My colleague (junior faculty at the time) had to push hard to make sure that this got resolved. Eventually the student did complete an acceptable thesis (on a much more mainstream topic) and got the degree. This colleague just recently came across the former student again, and was disappointed and sad to see that the fringe aspects of science are back in what he's doing. My colleague is now feeling (irrational) guilt about this (that the former student is now credentialed and pushing this stuff), even though the actual thesis was fine in the end. This does raise the question, though: how much of a gatekeeper should a committee member be?
Sunday, March 18, 2012
Paranormal activity edition
Two items, oddly about parapsychology (as a means to raise points about science and the public). First, this article from The Guardian last week is both unsurprising and disappointing. It is not at all surprising that careful attempts to reproduce almost-certainly-spurious results implying precognitive phenomena have shown that those effects apparently do not really exist. What is worth pondering and discussing, however, is the fact that the authors who tried to check the original results had such a hard time publishing their work, because the major journals dismiss attempts to reproduce controversial results as unoriginal or derivative. This is a problem. Sure, you don't want to take up premiere journal space with lots of confirmations or repetitions of previous work. However, if a journal is willing to hype controversial results to boost circulation, then surely there is some burden on them to follow up on whether those extraordinary claims withstand the test of time.
Second, this morning's Dear Abby column (yes, I still read a newspaper on Sundays) had a letter from a woman seeking advice about how to use her "psychic gifts". It's very depressing that the response said "Many people have psychic abilities to a greater or lesser degree than you do, and those "vibes" can be invaluable." Really? Many people have psychic abilities? How's this for advice: if you really have psychic abilities, go to the James Randi Foundation and take their Million Dollar Challenge. Once you pass, you can use the money to make people's lives better. I know it's stupid to get annoyed by this, just as it's pointless to complain about the horoscopes that run in the paper. Still, if someone has an audience as large as Dear Abby, they should think a little bit about spreading this silliness.
Friday, March 16, 2012
Tidbits
Some interesting and thought-provoking things have come up in the last week or so. For instance, here is an article from the IEEE that discusses the decline in science and engineering jobs in the US. Figure 2 is particularly thought-provoking, showing that the number of US undergrad STEM degrees is very strongly correlated with the number of non-medical US federal research dollars spent, from 1955-2000. My personal take is, if you really want Americans to become scientists, engineers, and more broadly supportive of technical education, you need to create a culture where those professions are (more) respected and valued, not viewed as nerdy, geeky, asocial, elitist, or otherwise unacceptable.
On this same theme, there was this op-ed in the New York Times about why so few American political figures are scientists. Accurate (in my opinion) and depressing. I'm not saying we should live in a society run by technocrats, but surely we can be better than this. As a culture, do we really need more lawyers and undergrad "business" majors?
On a more technical note, the ICARUS collaboration, another group in Gran Sasso in Italy working with neutrinos produced by CERN, has announced (paper here) that their measurements show neutrinos traveling at a speed consistent w/ c. Not surprising, and only truly independent measurements can really pin down the issues w/ the OPERA work.
Here is a beautiful new paper by the Manoharan group at Stanford. By arranging spatially ordered arrays of CO molecules on a copper surface, they can manipulate surface states in a way that produces dispersion relations (the relationship between energy and momentum for electrons) with the same kinds of features seen in graphene. While I haven't had a chance to read this in detail yet, it is very slick, and makes explicit the connection between real-space distortions of the graphene structure and how these are mathematically equivalent to electric and magnetic fields for the charge carriers confined to that 2d environment. It's also a great demonstration that the motion of charge carriers in a condensed matter environment depends on the spatial pattern of the potential energy, rather than on its microscopic origin. Here, the electrons are not carbon p electrons feeling the "chickenwire" potential energy of the carbon atom lattice in graphene. Rather, the electrons are those that live in the copper surface state, and they feel a designer "chickenwire" potential energy due to the arrangement of CO molecules on the copper surface. However, the net effect is the same. Very pretty. (Still makes me wonder a bit about the details, though.... At the end of the day, electrons have to scatter out of that surface state and into the bulk for the STM measurement to work, and yet that process has to be sufficiently weak that it doesn't screw up the surface state much. Very fortunate that the numbers happen to work!)
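The "chickenwire" dispersion itself is easy to play with numerically. Below is a generic nearest-neighbor tight-binding sketch for a honeycomb lattice (this is the standard textbook model, not the calculation from the paper; the hopping t and spacing a are illustrative units), showing the hallmark linear, Dirac-like dispersion near a corner of the Brillouin zone:

```python
import numpy as np

def honeycomb_bands(kx, ky, t=1.0, a=1.0):
    """Nearest-neighbor tight-binding bands E(k) = +/- t*|f(k)| on a
    honeycomb lattice; t and a are illustrative units, not fit to any
    real material."""
    # vectors from an A sublattice site to its three nearest B neighbors
    deltas = a * np.array([[1.0, 0.0],
                           [-0.5,  np.sqrt(3) / 2],
                           [-0.5, -np.sqrt(3) / 2]])
    f = np.sum(np.exp(1j * (kx * deltas[:, 0] + ky * deltas[:, 1])))
    return t * abs(f), -t * abs(f)

# For this neighbor geometry, a Dirac point sits at K = (0, 4*pi/(3*sqrt(3)*a))
K = (0.0, 4 * np.pi / (3 * np.sqrt(3)))

# The two bands touch at K, and E grows linearly in |k - K| with
# slope 3*t*a/2, i.e. the conical dispersion familiar from graphene:
print(honeycomb_bands(*K))                 # bands touch: (~0, ~0)
print(honeycomb_bands(K[0] + 1e-3, K[1]))  # ~ (+1.5e-3, -1.5e-3)
```

The Manoharan experiment realizes an analogous band structure with the CO array defining the periodic potential instead of the carbon lattice.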
Finally, here is a cool, fun project, using nanofab tools to make art (too small to see with the unaided eye). Sameer Walavalkar did his PhD with the well-known nano group of Axel Scherer at Caltech. This kind of creative outlet is another way to do outreach, and it's a heck of a lot cooler than many other approaches.
Saturday, March 10, 2012
Mini update
I am out on a brief break, but I would be remiss if I didn't point out this exciting result. The investigators have managed to make a light-emitting diode with greater than 100% electrical efficiency when operated just right. The trick is, the LED gets the energy for the "extra" photons from the temperature difference between the LED and its surroundings. Basically it's a combination LED and heat engine. Very clever. I wonder if there are some entropy restrictions that come into play, particularly if the final photon state is, e.g., the macroscopically occupied state of a laser cavity.
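For a rough sense of the numbers (these are illustrative values in the ballpark of the reported experiment, not taken from the paper): the wall-plug efficiency is the photon energy out per electron divided by the electrical energy eV in per electron, so it exceeds unity whenever the photon energy exceeds eV, with the difference made up by heat absorbed from the lattice:

```python
# Back-of-the-envelope numbers, chosen for illustration (not from the paper)
h = 6.626e-34   # Planck constant, J*s
c = 3.0e8       # speed of light, m/s
e = 1.602e-19   # elementary charge, C

wavelength = 2.4e-6                  # mid-infrared emission, meters
photon_energy = h * c / wavelength   # energy per emitted photon, J
bias = 0.2                           # forward bias, volts
electrical_in = e * bias             # electrical energy supplied per electron, J

efficiency = photon_energy / electrical_in  # wall-plug efficiency
heat_in = photon_energy - electrical_in     # deficit drawn from lattice heat, J

print(f"efficiency = {efficiency:.0%}")  # exceeds 100% since h*nu > e*V here
```

Thermodynamically this is a heat pump rather than a perpetual-motion machine: the incoherent emitted light can carry away entropy along with the absorbed heat, which is presumably why the coherent laser-cavity limit raised above is the interesting case.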
Tuesday, March 06, 2012
NSF - proposal compliance
This is for everyone out there who submits proposals to the Division of Materials Research, and more broadly, to the National Science Foundation. Here's some context for those who don't know the story. The NSF has a Grant Proposal Guide that spells out, in detail, the proper content and formatting for proposals. You can understand why they do this, particularly with regard to things like font size. There's a 15 page limit on the "Project Description" part of a proposal, and if they didn't specify a font size and margins, there would be people trying to game the system by submitting proposals in 6-pt unreadable font with 1cm margins. Historically, however, NSF has erred on the side of latitude about such minutiae. For example, they have never really been aggressive about policing whether the bibliographic references are perfectly formatted.
That's why this news came as a surprise: As part of a new policy, starting this past fall, DMR is taking basically a zero-tolerance approach regarding compliance with the Grant Proposal Guide. That means, for example, that any letter of collaboration included with a proposal can only say, in effect, "I agree to do the tasks listed in the Project Description". Anything more (e.g., context about what the collaborator's expertise is, or mentioning that this continues an existing collaboration) is no longer allowed, and would be cause for either deletion of the letter or outright rejection of the proposal without review. This new policy also means, and this is scary, that your references have to be perfectly formatted - leaving out titles, or leaving out the second page number, or using "et al." instead of long author lists - all of these can lead to a proposal being rejected without review. I heard this firsthand from a program officer. Imagine spending weeks writing a proposal, and having it get bounced because you used the wrong setting in bibTeX or EndNote.
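For reference lists in particular, the safe course is to let every entry carry complete information: the full author list (no "et al."), the article title, and the full page range. For instance, a BibTeX entry along these lines (a real paper, used here purely as a formatting illustration):

```bibtex
@article{Novoselov2004,
  author  = {Novoselov, K. S. and Geim, A. K. and Morozov, S. V. and
             Jiang, D. and Zhang, Y. and Dubonos, S. V. and
             Grigorieva, I. V. and Firsov, A. A.},
  title   = {Electric Field Effect in Atomically Thin Carbon Films},
  journal = {Science},
  volume  = {306},
  pages   = {666--669},
  year    = {2004}
}
```

Note that a complete .bib entry isn't enough by itself: some common physics bibliography styles silently drop article titles or truncate author lists in the output, which is exactly the failure mode described above, so check what your chosen style actually emits.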
We can have a vigorous discussion in the comments about whether this policy makes much sense. In the meantime, though, I think it's very important that people be aware of this change. The bottom line: Scrupulously follow the Grant Proposal Guide. Cross every "t" and dot every "i".
Please spread this information - if one division of NSF is doing this, you can bet that it will spread, and you don't want to be the one whose proposal gets bounced.
Sunday, March 04, 2012
March Meeting last day and wrap-up
Not too much to report from the final day of the March Meeting. Lots of good conversations with colleagues, though I never did get a chance to sit down with a couple of folks I'd wanted to see. Ahh well.
I split most of my time between two invited sessions. The first of these was on the unusual properties of the nu=5/2 fractional quantum Hall state. This may sound very narrow and esoteric, but it is actually quite profound. A good review of the whole topic in more generality is here. At a very particular value of perpendicular magnetic field (related to the number of charge carriers per square centimeter), the electrons in a 2d layer in GaAs/AlGaAs semiconductor structures apparently condense into a really weird state. The lowest energy excitations of this state, its quasiparticles, have very strange properties. First, they have an effective electronic charge of 1/4 e. Second, when two of these fractionally charged quasiparticles are moved around each other to swap positions, the whole quantum mechanical state of the system changes (to another state with the same energy as the original), in a way much more complex than just picking up a phase factor (which would be -1 if the quasiparticles acted like ordinary electrons). Somehow the detailed history of winding the particles around each other is supposedly encoded in the many-body state itself. Quasiparticles with this bizarre property are said to obey "non-Abelian statistics". To date, there has not been an experimental "smoking gun" demonstrating these weird properties unambiguously. My postdoc mentor, Bob Willett, gave a very data-heavy talk showing persuasive evidence for consistency with a number of the relevant theory predictions in this system. Following him, Woowon Kang of the University of Chicago showed other data that also looks consistent with some of these ideas (though I'm no expert).
The other invited session dealt with the theory behind the transport of electrons and ions in nanoscale systems. Unfortunately I missed the beginning (since I was seeing the other talks above), but I did get to hear a neat discussion by Kirk Bevan of McGill University about the physics of electromigration. Electromigration is the mechanism by which flowing electrons can scatter off defects and grain boundaries, dumping momentum into atoms and pushing them around.
Final suggestions for the APS:
1) Don't arrange the small rooms so that reaching the seats in front requires walking through the projector beam. The result is that the front six rows or so stay almost completely empty while people pile up in the back of the rooms.
2) Would it really be that hard to have wireless internet access that doesn't suck? Are there no convention centers that can really support this?
3) Having a big bio presence at the meeting and then scheduling it directly opposite the Biophysical Society meeting seems odd.
4) Every year, there is an electronic letter-writing or petition campaign to support federal funding of research. That's fine and dandy, but is there any way we could try to get some representative Congress-critters to come hear a session, perhaps one of the fun, general invited sessions, or one about industrially relevant research? Remember, next year in Baltimore is quite close to DC....