A blog about condensed matter and nanoscale physics. Why should high energy and astro folks have all the fun?
Sunday, March 30, 2014
Any advice: LaTeX and makeindex
Readers - For a long time now I have been working on a very large LaTeX document (actually built out of a number of sub-documents) that I will discuss further in later posts. I greatly desire to create an index for this document, and I know about the standard LaTeX approach using the makeidx package and the makeindex program. The question is, does anyone know of a good frontend application that can make the creation of the index less tedious? The brute force approach would require me to go through the document(s) by hand and insert an \index{} tag every time a term that I wish to index appears. For an index containing a couple of hundred entries, this looks excruciating. What I would love is an application where I identify the terms for which I want index entries, and it then automatically inserts the appropriate tags (in a smart way, not putting tags inside LaTeX equations, for example). While this would be imperfect, it would be easier to start from an over-complete index and pare down or modify than to start from scratch. Yes, I am sure I could use perl or another scripting language to make something, but I'd rather not reinvent the wheel if someone has already solved this problem. Thanks for any suggestions.
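For what it's worth, here is a minimal sketch of the sort of script I have in mind, in Python. (The document itself needs \usepackage{makeidx} and \makeindex in the preamble and \printindex where the index should appear, with the makeindex program run between LaTeX passes.) Everything in the sketch is an illustrative assumption - the term list, the filenames, and especially the crude regex-based skipping of math environments; a real tool would want an actual TeX parser:

```python
import re

# Hypothetical map from phrases in the running text to index entries.
TERMS = {
    "boundary conditions": "boundary conditions",
    "surface-enhanced Raman": "Raman spectroscopy!surface-enhanced",
}

# Treat $...$, $$...$$, \(...\), \[...\], and equation environments as
# math chunks to leave untouched. This is a sketch, not a TeX parser;
# nested or exotic environments will fool it.
MATH = re.compile(
    r"(\$\$.*?\$\$|\$.*?\$|\\\(.*?\\\)|\\\[.*?\\\]"
    r"|\\begin\{equation\*?\}.*?\\end\{equation\*?\})",
    re.DOTALL,
)

def tag_terms(tex):
    """Insert \\index{...} after each occurrence of each term in every
    non-math chunk; the over-complete result gets pared down by hand."""
    out = []
    # With one capturing group, re.split alternates text / math chunks.
    for i, chunk in enumerate(MATH.split(tex)):
        if i % 2 == 1:              # odd indices are math: pass through
            out.append(chunk)
            continue
        for phrase, entry in TERMS.items():
            chunk = re.sub(
                r"\b(%s)\b" % re.escape(phrase),
                r"\1\\index{%s}" % entry,
                chunk,
            )
        out.append(chunk)
    return "".join(out)

if __name__ == "__main__":
    with open("main.tex") as f:          # hypothetical filename
        tagged = tag_terms(f.read())
    with open("main_indexed.tex", "w") as f:
        f.write(tagged)
```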
Friday, March 28, 2014
Recurring themes in (condensed matter/nano) physics: boundary conditions
This is the first in a series of posts about tropes that recur in (condensed matter/nano) physics. I put that qualifier in parentheses because these topics obviously come up in many other places as well, but I run across them from my perspective.
Very often in physics we are effectively solving boundary value problems. That is, we have some physical system that obeys some differential equation describing the spatial dependence of some variable of interest. This could be the electron wavefunction \( \psi(\mathbf{r})\), which has to obey the Schroedinger equation in some region of space with a potential energy \( V(\mathbf{r})\). This could be the electric field \( \mathbf{E}(\mathbf{r})\), which has to satisfy Maxwell's equations in some region of space that has a dielectric function \( \epsilon(\mathbf{r})\). This could be the deflection of a drumhead \( u(x,y) \), where the drumhead itself must follow the rules of continuum elasticity. This could be the pressure field \( p(z) \) of the air in a pipe that's part of a pipe organ.
The thread that unites these diverse systems is that, in the absence of boundaries, these problems allow a continuum of solutions, but the imposition of boundaries drastically limits the solutions to a discrete set. For example, the pressure in that pipe could (within reasonable limits set by the description of the air as a nice gas) have any spatial periodicity, described by some wavenumber \(k\), and along with that it would have some periodic time dependence with a frequency \(\omega\), so that \( \omega/k = c_{\mathrm{s}}\), where \(c_{\mathrm{s}}\) is the sound speed. However, once we specify boundary conditions - say one end of the pipe closed, one end open - the rules that have to be satisfied at the boundary force there to be a discrete spectrum of allowed wavelengths, and hence frequencies. Even trying to have no boundary at all, by imposing periodic boundary conditions, does this. This general property, the emergence of discrete modes from the continuum, is what gives us the spectra of atoms and the sounds of guitars.
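To make that concrete with the organ pipe (a standard textbook result, sketched here): the closed end forces a pressure antinode and the open end forces a pressure node. Putting the closed end at \(z = 0\) and the open end at \(z = L\),
\[ p(z,t) \propto \cos(kz)\,\cos(\omega t), \qquad \cos(kL) = 0 \;\Rightarrow\; k_{n} = \frac{(2n-1)\pi}{2L}, \quad n = 1, 2, 3, \ldots \]
so the allowed frequencies are \( f_{n} = \omega_{n}/2\pi = (2n-1)c_{\mathrm{s}}/4L \) - the continuum of possible wavenumbers has collapsed to a discrete, odd-harmonic spectrum.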
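The same collapse from a continuum to a discrete set falls out immediately if you discretize such a problem and diagonalize it. A minimal numerical sketch (illustrative, assuming NumPy; the grid size, pipe length, and sound speed are made-up numbers, and the one-sided Neumann stencil at the closed end is only first-order accurate):

```python
import numpy as np

# Discretize -d^2p/dz^2 = k^2 p for a pipe of length L on N grid points,
# closed at z=0 (pressure antinode -> Neumann, p'(0)=0) and open at z=L
# (pressure node -> Dirichlet, p(L)=0).
N, L = 400, 1.0          # illustrative grid size and pipe length (m)
c_s = 343.0              # approximate speed of sound in air (m/s)
dz = L / N

# Standard second-difference matrix; the Dirichlet condition at z=L is
# built in by truncating the stencil at the last row.
A = np.zeros((N, N))
for i in range(N):
    A[i, i] = 2.0
    if i > 0:
        A[i, i - 1] = -1.0
    if i < N - 1:
        A[i, i + 1] = -1.0
A[0, 0] = 1.0            # crude one-sided Neumann condition at z=0
A /= dz**2

k_squared = np.linalg.eigvalsh(A)   # eigenvalues approximate k^2, ascending
freqs = c_s * np.sqrt(k_squared[:4]) / (2 * np.pi)
print(freqs)   # approaches (2n-1) c_s / (4L) = 85.75, 257.25, ... Hz
```

Without the boundary rows, any \(k\) would do; with them, only the discrete set survives, which is the whole point above.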
Friday, March 21, 2014
How should philanthropists and foundations fund science?
This article from the NY Times discusses the funding of science research by wealthy individuals and, by extension, philanthropic foundations set up by those folks. It brings up an issue that I phrased in the form of a question as the title of this post. I'm not going to offer any simple overarching answer, but I do want to make a couple of observations (strictly my opinion, of course):
- Many wealthy people and foundations support medical research. This makes a lot of sense - generally philanthropists want to Help People, and supporting research that directly affects medical care and quality of life is a completely sensible choice.
- A smaller number of people and foundations support research in the physical sciences and engineering; like those who support medical research, they want to Help People, and they realize that supporting basic research and the education of technically skilled and creative people is a great way to do that.
- Both groups, however, face the challenge that any investment that they make is basically a drop in the bucket compared with what governments can do. NIH puts tens of billions of dollars a year into medical research. NSF's annual budget is around $7B.
- In my experience, the philanthropic supporters of science are well aware of this - if they want to make sure that their money makes a difference, they need to support things that government agencies are not already doing. Their challenge, then, is to identify areas (and eventually institutions and people) where their investment will really move the needle. Of course, they need to be able to separate the wheat from the chaff. Peer review is a customary way to do this, and that is often how the government agencies (most of them, anyway) make judgments, but peer review tends to be risk averse. An alternative is to have a dedicated science board to do reviews and make decisions. This, too, is tricky, particularly if the board members aren't subject matter experts in the relevant area. How much weight should be placed on prior track record? Researchers who are senior and already have major awards can be a lower risk - they have already demonstrated that they can do great work. On the other hand, if someone is already extremely well supported (as such people often are), how much difference will philanthropic support really make? It is a very tricky decision process, particularly depending on the amounts involved.
- There is no question that having grants with wide flexibility (e.g., Packard; presumably MacArthur) can be wonderful. At the single investigator level, there is also no question that there can be real benefits from being able to concentrate on actual science - that's an argument for funding support large enough that it allows investigators to lay off writing other grants to some extent. (That's one aim of things like the Howard Hughes Investigator program.)
Friday, March 14, 2014
Taking a few days, + a philanthropy suggestion for Google or Intel
Last post for a few days. I want to make a suggestion, though. Hey large tech companies, like Intel or Google, or for that matter Sematech or the SRC, or foundations like Keck, Moore, Packard, MacArthur: You may have heard that NSF seems to have put shared physical sciences research infrastructure on the back burner. I firmly believe that this is a bad decision that will have lasting negative consequences for many people. I've written before about how much impact on science and engineering research and education there would be if companies (or individuals, I suppose, if they were sufficiently wealthy) would step forward and endow shared research equipment and staffing at universities. Now is the time, when there is likely to be a real federal gap here. I'm serious, and I'd be happy to talk with any interested parties about how this could be done - just email me.
Update: this is highly relevant.
Monday, March 10, 2014
Coolest paper of 2014 so far, by a wide margin.
Sorry for the brief post, but I could not pass this up.
Check this out: http://arxiv.org/abs/1403.1211
I bow down before the awesomeness of an origami-based microscope.
March Meeting wrap-up
I've been slow about writing a day 3/day 4 wrap-up of the APS meeting because of general busy-ness. I saw fewer general-interest talks over the last day and a half, in part because my own group's talks were clustered in that timeframe. Still, I did see a couple of interesting bits.
- There was a great talk by Zhenchao Dong about this paper, where they are able to use the plasmonic properties of a scanning tunneling microscope tip to perform surface-enhanced Raman spectroscopy on single molecules (in ultrahigh vacuum and cryogenic conditions) with sub-nm lateral resolution. The data are gorgeous, though how the lateral resolution can possibly be that good is very mysterious. Usually the lateral extent of the enhanced optical fields is something like the geometric mean of the tip radius of curvature and the tip-sample distance - for, say, a 10 nm tip radius and a 1 nm gap, that's \( \sqrt{(10~\mathrm{nm})(1~\mathrm{nm})} \approx 3~\mathrm{nm} \). It's very hard to see how that ever gets to the sub-nm level, so something funky must be going on.
- I saw a talk by Yoshihiro Iwasa all about MoS2, including work on optics and ionic liquid gating.
- I went to a session on the presentation of physics to the public. The talks that I managed to see were quite good, and Dennis Overbye's insights into the NY Times' science reporting were particularly interesting. He pointed out that it's a very challenging marketplace when so much good (or at least interesting) science writing is given away for free (as in here or here or here). He did give a shout-out to Peter Woit, particularly mentioning how good Peter's sources are.
Wednesday, March 05, 2014
The end of the National Nanotechnology Infrastructure Network? Federal support for shared facilities.
The National Nanotechnology Infrastructure Network is, as their page says, "an integrated networked partnership of user facilities, supported by the National Science Foundation, serving the needs of nanoscale science, engineering and technology". Basically, the NNIN has been a mechanism for establishing nodes of excellence at sites around the US, where people could travel to use equipment and capabilities (high resolution transmission electron microscopy; sophisticated wafer-scale electron beam lithography; deep etching) that they lack at their home institutions. Crucially, these shared facilities are supported by skilled technical staff who can train users, work with users to develop processes, perform fee-for-service work on occasion, etc. The most famous sites are the Stanford Nanofabrication Facility and the Cornell Nanofab. Over the years, the NNIN has been instrumental in an enormous amount of research progress. Note that this effort is distinct from Major User Facilities (such as synchrotrons, neutron sources, etc.).
This year, there was a competition for a Next Generation NNIN - the call is here. The idea was very much to broaden the network into characterization as well as fabrication, and to reach new, growing communities of users in areas like bio, the environment, and the earth/geosciences. After a proposal process that boiled down to two teams (one with 18 universities; one with 20), very extensive full proposals, reverse site visits, written responses to reverse site visits and reviews, etc., the NSF decided not to make an award. It would appear that there will be another call of some kind issued in fall 2014. For now, what this means is that the NNIN is ending. Cornell, Stanford, and the other sites face major cuts in funding for staff and support for external users. (Full disclosure: I was the Rice rep on one of the teams.)
This whole issue is very complex, but it raises a number of questions that would benefit from a discussion in the community. What should be the pathway to federal support for shared facilities and staffing, particularly tools and techniques that would be prohibitively expensive for individual universities to support via internal funds? Should there be federal support for this? Should it come from NSF? How can we have a stable, sustained level of research infrastructure, including staffing, that serves the broad scientific community, in an era when funding is squeezed ever more tightly? If the burden is shifting more toward individual universities having to support shared infrastructure basically with internal funding and user fees, what impact will that have? Comment is invited.
UPDATE: Here is a story that Science is running regarding the decision, or lack thereof.
March Meeting, Day 2
This is a meta-post - I'm writing it while sitting in the back of a session on presenting science to the public. A brief list of some of the neat things I heard yesterday:
- I saw a very nice talk by Jelena Vuckovic about doing nonlinear and cavity optics, with (self-assembled InAs) quantum dots as the emitters and the cavity formed in 2d photonic band gap systems. The latest work looks at nonlinear effects like photon blockade, and makes contact with some work involving "circuit" quantum electrodynamics (see here).
- I went to a talk by Ken Golden, who taught me sophomore differential equations, and he gave a fascinating presentation about applying rigorous math (percolation theory, treating microstructured composites like effective media) to the challenging problem of understanding melting polar sea ice. As a side note, he showed a great picture that is an example of the "quasistatic limit" - long wavelength surface ocean waves don't "see" individual ice floes, but instead propagate in an effective medium.
- There was a great invited session about oxide heterostructures. Mobilities are improving (under the right conditions) to the point where some ways of learning about the band structure through electronic transport are now becoming possible. Particularly impressive was a talk by Shahal Ilani, who presented a very compelling view of the importance of structural domains ("ferroelasticity") in the underlying strontium titanate - when those domains are under control, transport becomes much cleaner, revealing the apparent existence of a magnetically interesting phase at high carrier density and high in-plane magnetic field.
Tuesday, March 04, 2014
March Meeting, Day 1
Observations from the first day of the APS March Meeting:
- There has been a lot of progress and excitement in looking at layered materials "beyond graphene". It's interesting to see a resurgence of interest in transition metal (Ti, but more frequently W and Mo) dichalcogenides (S, Se, Te), a topic of great activity in bulk materials growth in the 1970s and early 80s. There are clearly a lot of bright people working on ways to grow these materials layer-by-layer, with the long-term idea of making structures somewhat like semiconductor heterostructures (e.g., GaAs/AlGaAs), but with the richer palette provided by these materials (exhibiting charge density waves, strong spin-orbit effects, complex band structure, etc.). Molecular beam epitaxy of these materials with high quality is generally very hard. For example, Mo and W are extremely refractory, requiring electron beam evaporation at temperatures exceeding 2500 C, and sticking at the sample surface without much diffusion. Whoever really gets layer-by-layer, large-area growth working with diverse materials is going to make a big impact.
- I saw Heinrich Jaeger give a great talk about granular materials by design. These are entirely classical systems, but they are extremely challenging. If you think about it, they are not crystalline (no long-range symmetries to exploit in modeling), they are non-ergodic (the constituent grains are kinetically limited, and can't explore all possible configurations), and nonlinear (the interactions between particles are short-ranged and very strong). Very interesting.
- I caught two talks in the session looking at silicon-based quantum information processing. It's possible to create and manipulate dangling bonds on the Si surface (localized states that can trap electrons) and look at how those bonds interact with each other. Very neat. Looking at particular individual impurities, with the right system (erbium in Si), you can couple a single impurity to a single-electron transistor charge sensor. Then, you can manipulate that impurity with optical techniques and use the charge detection to determine its state. Very impressive.
- The session on secrecy in science was very good. The ability to manufacture viruses by design is genuinely frightening (though it loses some menace when the words "Pandemic - millions of deaths?" are projected in Comic Sans). The discussion of intellectual property was great and the role of universities merits its own blog post. Lastly, I was unaware of the WATCHMAN project, which is a very interesting neutrino physics experiment that as an added bonus should allow the international community to detect rogue nuclear reactors meant for weapons development.