
Sunday, June 15, 2025

So you want to build a science/engineering laboratory building

A very quick summary of some non-negative news developments:
  • The NSF awarded 500 more graduate fellowships this week, bringing the total for this year up to 1500.  (Apologies for the X link.)  This is still 25% lower than last year's number, and of course far below the original CHIPS and Science Act target of 3000, but it's better than the alternative.  I think we can now all agree that the supposed large-scale bipartisan support for the CHIPS and Science Act was illusory.
  • There are some initial signs of pushback on the Senate side regarding the proposed massive science funding cuts.  Again, now is the time to make views known to legislators - I am told by multiple people with experience in this arena that it really can matter.
  • There was a statement earlier this week that apparently the US won't be going after Chinese student visas.  This would carry more weight if it didn't look like US leadership was wandering ergodically through all possible things to say with no actual plan or memory.
On to the main topic of this post.  Thanks to my professional age (older than dirt) and my experience (overseeing shared research infrastructure; being involved in a couple of building design and construction projects; and working on PI lab designs and build-outs), I have some key advice and lessons learned for anyone designing a new big science/engineering research building.  This list is by no means complete, and I invite readers to add their insights in the comments.  While it seems likely that many universities will be curtailing big capital construction projects in the near term because of financial uncertainty, I hope this may still come in handy to someone.  
  • Any big laboratory building should have a dedicated loading dock with central receiving.  If you're spending $100M-200M on a building, this is not something that you should "value engineer" away.  The long term goal is a building that operates well for the PIs and is easy to maintain, and you're going to need to be able to bring in big crates for lab and service equipment.  You should have a freight elevator adjacent to the dock.  
  • You should also think hard about what kind of equipment will have to be moved in and out of the building when designing hallways, floor layouts, and door widths.  You don't want to have to take out walls, doorframes, or windows, or to need a crane to hoist equipment into upper floors because it can't get around corners.
  • Think hard about process gases and storage tanks at the beginning.  Will PIs need to have gas cylinders and liquid nitrogen and argon tanks brought in and out in high volumes all the time, with all the attendant safety concerns?  Would you be better off installing bulk LN2 or LAr tanks, even though campus architects will say they are unsightly?  
  • Likewise, consider whether you should have building-wide service for "lab vacuum", N2 gas, compressed air, DI water, etc.  If not and PIs have those needs, you should plan ahead to deal with this.
  • Gas cylinder and chemical storage - do you have enough on-site storage space for empty cylinders and back-up supply cylinders?  If this is a very chemistry-heavy building, think hard about safety and storing solvents. 
  • Make sure you design for adequate exhaust capacity for fume hoods.  Someone will always want to add more hoods.  While all things are possible with huge expenditures, it's better to make sure you have capacity to spare, because adding hoods beyond the initial capacity would likely require a huge redo of the building HVAC systems.
  • Speaking of HVAC, think really hard about controls and monitoring.  Are you going to have labs that need tight requirements on temperature and humidity?  When you set these up, make sure you have enough sensors of the right types in the right places, and make sure that your system is designed to work even when the outside air conditions are at their seasonal extremes (hot and humid in the summer, cold and dry in the winter).  Also, consider having a vestibule (air lock) for the main building entrance - you'd rather not scoop a bunch of hot, humid air (or freezing, super-dry air) into the building every time a student opens the door.
  • Still on HVAC, make sure that power outages and restarts don't lead to weird situations like having the whole building at negative pressure relative to the outside, or duct work bulging or collapsing.
  • Still on HVAC, actually think about where the condensate drains for the fan units will overflow if they get plugged up or overwhelmed.  You really don't want water spilling all over a rack of networking equipment in an IT closet.  Trust me.
  • Chilled water:  Whether it's the process chilled water for the air conditioning, or the secondary chilled water for lab equipment, make sure that the loop is built correctly.   Incompatible metals (e.g., some genius throws in a cast iron fitting somewhere, or joints between dissimilar metals) can lead to years and years of problems down the line.  Make sure lines are flushed and monitored for cleanliness, and have filters in each lab that can be checked and maintained easily.
  • Electrical - design with future needs in mind.  If possible, it's a good idea to have PI labs with their own isolation transformers, to try to mitigate inter-lab electrical noise issues.  Make sure your electrical contractors understand the idea of having "clean" vs. "dirty" power and can set up the grounding accordingly while staying up to code.
  • Still on electrical, consider building-wide surge protection, and think about emergency power capacity.  For those who don't know, emergency power is usually a motor-generator that kicks in after a few seconds to make sure that emergency lighting and critical systems (including lab exhaust) keep going.
  • Ceiling heights, duct work, etc. - It's not unusual for some PIs to have tall pieces of equipment.  Think about how you will accommodate these.  Pits in the floors of basement labs?  5 meter slab-to-slab spacing?  Think also about how ductwork and conduits are routed.  You don't want someone to tell you that installation of a new apparatus is going to cost a bonus $100K because shifting a duct sideways by half a meter will require a complete HVAC redesign.
  • Think about the balance between lab space and office space/student seating.  No one likes giant cubicle farm student seating, but it does have capacity.  In these days of zoom and remote access to experiments, the way students and postdocs use offices is evolving, which makes planning difficult.  Health and safety folks would definitely prefer not to have personnel effectively headquartered directly in lab spaces.  Seriously, though, when programming a building, you need to think about how many people per PI lab space will need places to sit.  I have yet to see a building initially designed with enough seating to handle all the personnel needs if every PI lab were fully occupied and at a high level of research activity. 
  • Think about maintenance down the line.  Every major building system has some lifespan.  If a big air handler fails, is it accessible and serviceable, or would that require taking out walls or cutting equipment into pieces and disrupting the entire building?  Do you want to set up a situation where you may have to do this every decade?  (Asking for a friend.)
  • Entering the realm of fantasy, use your vast power and influence to get your organization to emphasize preventative maintenance at an appropriate level, consistently over the years.  Universities (and national labs and industrial labs) love "deferred maintenance" because kicking the can down the road can make a possible cost issue now into someone else's problem later.  Saving money in the short term can be very tempting.  It's also often easier and more glamorous to raise money for the new J. Smith Laboratory for Physical Sciences than it is to raise money to replace the HVAC system in the old D. Jones Engineering Building.  Avoid this temptation, or one day (inevitably when times are tight) your university will notice that it has $300M in deferred maintenance needs.
I may update this list as more items occur to me, but please feel free to add input/ideas.

Saturday, June 07, 2025

A precision measurement science mystery - new physics or incomplete calculations?

Again, as a distraction from persistently concerning news, here is a science mystery of which I was previously unaware.

The role of approximations in physics is something that very often comes as a shock to new students.  There is this cultural expectation out there that, because physics is all about quantitative understanding of physical phenomena, and because of the typical way we teach math and science in K12 education, we should be able to get exact solutions to many of our attempts to model nature mathematically.   In practice, though, constructing physics theories is almost always about approximations, either in the formulation of the model itself (e.g. let's consider the motion of an electron about the proton in the hydrogen atom by treating the proton as infinitely massive and of negligible size) or in solving the mathematics (e.g., we can't write an exact analytical solution of the problem when including relativity, but we can do an order-by-order expansion in powers of \(p/mc\)).  Theorists have a very clear understanding of what it means to say that an approximation is "well controlled" - you know on both physical and mathematical grounds that a series expansion actually converges, for example.  
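To make that last example concrete (a standard textbook expansion, added here purely for illustration), the kinetic energy of a relativistic free particle can be written as a series in powers of \(p/mc\):
\[
E = \sqrt{p^{2}c^{2} + m^{2}c^{4}} - mc^{2} = \frac{p^{2}}{2m} - \frac{p^{4}}{8m^{3}c^{2}} + \frac{p^{6}}{16m^{5}c^{4}} - \cdots
\]
The series converges for \(p < mc\), which is one way of saying that the non-relativistic approximation (keeping only the first term) is well controlled when the momentum is small on that scale.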

Some problems are simpler than others, just by virtue of having a very limited number of particles and degrees of freedom, and some problems also lend themselves to high precision measurements.  The hydrogen atom problem is an example of both features:  just two spin-1/2 particles (if we approximate the proton as a lumped object), readily accessible to optical spectroscopy to measure the energy levels for comparison with theory.  We can do perturbative treatments to account for other effects of relativity, spin-orbit coupling, interactions with nuclear spin, and quantum electrodynamic corrections (here and here).  A hallmark of atomic physics is the remarkable precision and accuracy of these calculations when compared with experiment.  (The \(g\)-factor of the electron is experimentally known to a part in \(10^{10}\) and matches calculations out to fifth order in \(\alpha = e^2/(4 \pi \epsilon_{0}\hbar c)\).)  

The helium atom is a bit more complicated, having two electrons and a more complicated nucleus, but over the last hundred years we've learned a lot about how to do both calculations and spectroscopy.   As explained here, there is a problem.  It is possible to put helium into an excited metastable triplet state with one electron in the \(1s\) orbital, the other electron in the \(2s\) orbital, and their spins in a triplet configuration.  Then one can measure the ionization energy of that system - the minimum energy required to kick an electron out of the atom and off to infinity.  This energy can be calculated to seventh order in \(\alpha\), and the theorists think that they're accounting for everything, including the finite (but tiny) size of the nucleus.  The issue:  The calculation and the experiment differ by about 2 nano-eV.  That may not sound like a big deal, but the experimental uncertainty is supposed to be a little over 0.08 nano-eV, and the uncertainty in the calculation is estimated to be 0.4 nano-eV.  This works out to something like a 9\(\sigma\) discrepancy.  Most recently, a quantitatively very similar discrepancy shows up in the case of measurements performed in 3He rather than 4He.  

This is pretty weird.  Historically, the most likely answer would be a problem with the measurements (though that seems doubtful, since precision spectroscopy is such a well-developed set of techniques), with the calculation (though that also seems strange, since the relevant physics seems well known), or with both.  The exciting possibility is that somehow there is new physics at work that we don't understand, but that's a long shot.  Still, something fun to consider (as my colleagues and I try to push back on the dismantling of US scientific research).



Sunday, June 01, 2025

Pushing back on US science cuts: Now is a critical time

Every week has brought more news about actions that, either as a collateral effect or a deliberate goal, will deeply damage science and engineering research in the US.  Put aside for a moment the tremendously important issue of student visas (where there seems to be a policy of strategic vagueness, to maximize the implicit threat that there may be selective actions).  Put aside the statement from a Justice Department official that the general plan is to "bring these universities to their knees", on the pretext that this is somehow about civil rights.  

The detailed version of the presidential budget request for FY26 is now out (pdf here for the NSF portion).  If enacted, it would be deeply damaging to science and engineering research in the US and the pipeline of trained students who support the technology sector.  Taking NSF first:  The topline NSF budget would be cut from $8.34B to $3.28B.  Engineering would be cut by 75%, Math and Physical Science by 66.8%.  The anticipated agency-wide success rate for grants would nominally drop below 7%, though that number is misleading (it basically takes the present average success rate and cuts it by 2/3, while some programs are already more competitive than others).  In practice, many programs already have future-year obligations, and any remaining funds will have to go there, meaning that many programs would likely have no awards at all in the coming fiscal year.  The NSF's CAREER program (that agency's flagship young investigator program) would go away.  This plan would also close one of the LIGO observatories (see previous link).  (This would be an extra bonus level of stupid, since LIGO's ability to do science relies on having two facilities, to avoid false positives and to identify event locations in the sky.  You might as well say that you'll keep an accelerator running but not the detector.)  Here is the table that I think hits hardest, dollars aside:

The number of people involved in NSF activities would drop by 240,000.  The graduate research fellowship program would be cut by more than half.  The NSF research training grant program (another vector for grad fellowships) would be eliminated.  

The situation at NIH and NASA is at least as bleak.  See here for a discussion from Joshua Weitz at Maryland which includes this plot: 


This proposed dismantling of US research, and especially of the pipeline of students who support the technology sector (including medical research, computer science, AI, the semiconductor industry, chemistry and chemical engineering, the energy industry), is astonishing in absolute terms.  It also does not square with the claims of some of our elected officials and high tech CEOs that they worry about US competitiveness in science and engineering.  (These proposed cuts are not about fiscal responsibility; the amount added to the proposed DOD budget alone dwarfs these cuts by more than a factor of 3.)

If you are a US citizen and think this is the wrong direction, now is the time to talk to your representatives in Congress. In the past, Congress has ignored presidential budget requests for big cuts.  The American Physical Society, for example, has tools to help with this.  Contacting legislators by phone is also made easy these days.  From the standpoint of public outreach, Cornell has an effort backing large-scale writing of editorials and letters to the editor.




Thursday, May 29, 2025

Quick survey - machine shops and maker spaces

Recent events are very dire for research at US universities, and I will write further about those, but first a quick unrelated survey for those at such institutions.  Back in the day, it was common for physics and some other (mechanical engineering?) departments to have machine shops with professional staff.  In the last 15-20 years, there has been a huge growth in maker-spaces on campuses to modernize and augment those capabilities, though often maker-spaces are aimed at undergraduate design courses rather than doing work to support sponsored research projects (and grad students, postdocs, etc.).  At the same time, it is now easier than ever (modulo tariffs) to upload CAD drawings to a website and get a shop in another country to ship finished parts to you.

Quick questions:   Does your university have a traditional or maker-space-augmented machine shop available to support sponsored research?  If so, who administers this - a department, a college/school, the office of research?  Does the shop charge competitive rates relative to outside vendors?  Are grad students trained to do work themselves, and are there professional machinists - how does that mix work?

Thanks for your responses.  Feel free to email me if you'd prefer to discuss offline.

Thursday, May 22, 2025

How badly has NSF funding already been effectively cut?

This NY Times feature lets you see how each piece of NSF's funding has been reduced this year relative to the normalized average over the last decade.  Note: this fiscal year, thanks to the continuing resolution, the agency budget has not actually been cut like this; NSF is just not spending congressionally appropriated funds.  The agency, fearing/assuming that its budget will get hammered next fiscal year, does not want to start awards that it won't be able to fund in out-years.  The result is that NSF is effectively obeying in advance the presidential budget request for FY26.  (And it's highly likely that some will point to unspent funds later in the year and use that as a justification for cuts, when in fact it's anticipation of possible cuts that has led to unspent funds.  I'm sure the Germans have a polysyllabic word for this.  In English, "Catch-22" is close.)


I encourage you to click the link and go to the article where the graphic is interactive (if it works in your location - not sure about whether the link works internationally).  The different colored regions are approximately each of the NSF directorates (in their old organizational structure).  Each subsection is a particular program.  

Seems like whoever designed the graphic was a fan of Tufte, and the scaling of the shaded areas does quantitatively reflect funding changes.  However, most people have a tough time estimating relative areas of irregular polygons.  Award funding in physics (the left-most section of the middle region) is down 85% relative to past years.  Math is down 72%.  Chemistry is down 57%.  Materials is down 63%.  Earth sciences is down 80%.  Polar programs (you know, those folks who run all the amazing experiments in Antarctica) is down 88%.  

I know my readers are likely tired of me harping on NSF, but it's both important and a comparatively transparent example of what is also happening at other agencies.  If you are a US citizen and think that this is the wrong path, then push on your congressional delegation about the upcoming budget. 

Sunday, May 18, 2025

A science anecdote palate cleanser

Apologies for slow posting.  Real life has been very intense, and I also was rather concerned when one of my readers mentioned last weekend that these days my blog was like concentrated doom-scrolling.  I will have more to say about the present university research crisis later, but first I wanted to give a hopefully diverting example of the kind of problem-solving and following-your-nose that crops up in research.

Recently in my lab we have had a need to measure very small changes in electrical resistance of some devices, at the level of a few milliOhms out of kiloOhms - parts in \(10^6\).  One of my students put together a special kind of resistance bridge to do this, and it works very well.  Note to interested readers: if you want to do this, make sure that you use components with very low temperature coefficients of their properties (e.g., resistors with a very small \(dR/dT\)), because otherwise your bridge becomes an extremely effective thermometer for your lab.  It’s kind of cool to be able to see the lab temperature drift around by milliKelvins, but it's not great for measuring your sample of interest.
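To put a rough number on that (a minimal sketch with illustrative, garden-variety component values, not our actual bridge parameters): if one arm of an otherwise balanced bridge drifts with a temperature coefficient of ~100 ppm/K, a 10 mK wander in lab temperature already produces an apparent resistance change at the parts-in-\(10^{6}\) level, right on top of the signals of interest.

# Rough sketch: how resistor tempco turns a balanced bridge into a thermometer.
# All numbers are illustrative, not the actual values from our setup.

def bridge_output(r1, r2, r3, r4, v_in=1.0):
    # Output of a Wheatstone-style bridge: difference of the two voltage dividers.
    return v_in * (r2 / (r1 + r2) - r4 / (r3 + r4))

R = 1000.0        # nominal arm resistance, ohms
alpha = 100e-6    # resistor temperature coefficient, 1/K
dT = 0.010        # lab temperature drift, K (10 mK)

# Let one arm drift with temperature while the other three stay put.
v_out = bridge_output(R * (1 + alpha * dT), R, R, R)

# For a small imbalance, |v_out / v_in| is roughly (dR/R)/4, so convert back:
apparent_dR_over_R = 4 * abs(v_out)
print(f"apparent dR/R from a 10 mK drift: {apparent_dR_over_R:.2e}")  # ~1e-6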

There are a few ways to measure resistance.  The simplest is the two-terminal approach, where you drive current through and measure the voltage across your device with the same two wires.  This is easy, but it means that the voltage you measure includes contributions from the contacts those wires make with the device (and from the wires themselves).  A better alternative is the four-terminal method, where you use one pair of wires to supply/collect the current and a separate pair, which ideally carries no current, to measure the voltage, so that the contact and lead resistances largely drop out.  
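Here is a toy numerical version of that comparison (all resistances made up purely for illustration):

# Toy comparison of two- vs four-terminal resistance measurement (illustrative numbers).
R_device = 2000.0    # ohms - the quantity we actually care about
R_contact = 1.5      # ohms per contact (can drift with temperature, field, etc.)
R_lead = 0.5         # ohms per wire
I = 1e-6             # excitation current, A

# Two-terminal: the same pair of wires carries current and senses voltage,
# so the contacts and leads sit in series with the device.
V_2t = I * (R_device + 2 * R_contact + 2 * R_lead)

# Four-terminal: the separate voltage wires carry (ideally) no current,
# so their contact/lead resistances drop out of V/I.
V_4t = I * R_device

print(f"2-terminal: {V_2t / I:.1f} ohm, 4-terminal: {V_4t / I:.1f} ohm")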

Anyway, in the course of doing some measurements of a particular device's resistance as a function of magnetic field at low temperatures, we saw something weird.  Below some rather low temperatures, when we measured in a 2-terminal arrangement, we saw a jump up in resistance by around 20 milliOhms (out of a couple of kOhms) as magnetic field was swept up from zero, and a small amount of resistance hysteresis with magnetic field sweep that vanished above maybe 0.25 T.  This vanished completely in a 4-terminal arrangement, and also disappeared above about 3.4 K.  What was this?  Turns out that I think we accidentally rediscovered the superconducting transition in indium.  While our contact pads on our sample mount looked clean to the unaided eye, they had previously had indium on there.  The magic temperature is very close to the bulk \(T_{c}\) for indium.

For one post, rather than dwelling on the terrible news about the US science ecosystem, does anyone out there have other, similar fun experimental anecdotes?  Glitches that turned out to be something surprising?  Please share in the comments.

Monday, May 05, 2025

Updates, thoughts about industrial support of university research

Lots of news in the last few days regarding federal funding of university research:
  • NSF has now frozen all funding for new and continuing awards.  This is not good; just how bad it is depends on the definition of "until further notice".  
  • Here is an open letter from the NSF employees union to the basically-silent-so-far National Science Board, asking for the NSB to support the agency.
  • Here is a grass roots SaveNSF website with good information and suggestions for action - please take a look.
  • NSF also wants to cap indirect cost rates at 15% for higher ed institutions for new awards.  This will almost certainly generate a lawsuit from the AAU and others.  
  • Speaking of the AAU, last week there was a hearing in the Massachusetts district court regarding the lawsuits about the DOE setting indirect cost rates to 15% for active and new awards.  There had already been a temporary restraining order in place nominally stopping the change; the hearing resulted in that order being extended "until a further order is issued resolving the request for a temporary injunction."  (See here, the entry for April 29.)
  • In the meantime, the presidential budget request has come out, and if enacted it would be devastating to the science agencies.  Proposed cuts include 55% to NSF, 40% to NIH, 33% to USGS, 25% to NOAA, etc.   If these cuts went through, we would be talking about more than $35B, at a rough eyeball estimate. 
  • And here is a letter from former NSF directors and NSB chairs to the appropriators in Congress, asking them to ignore that budget request and continue to support government sponsored science and engineering research.
Unsurprisingly, during these times there is a lot of talk about the need for universities to diversify their research portfolios - that is, expanding non-federally-supported ways to continue generating new knowledge, training the next generation of the technically literate workforce, and producing IP and entrepreneurial startup companies.  (Let's take it as read that it would be economically and societally desirable to continue these things, for the purposes of this post.)

Philanthropy is great, and foundations do fantastic work in supporting university research, but philanthropy can't come close to making up for sharp drawdowns of federal support.  The numbers just don't work.  The endowment of the Moore Foundation, for example, is around $10B, implying an annual payout of $500M or so, which is great but only around 1.4% of the cuts being envisioned.  

Industry seems like the only non-governmental possibility that could in principle muster the resources that could make a large-scale difference.   Consider the estimated profits (not revenues) of different industrial sectors.  The US semiconductor market had revenues last year of around $500B with an annualized net margin of around 17%, giving $85B/yr of profit.  US aerospace and defense similarly have an annual profit of around $70B.  The financial/banking sector, which has historically benefitted greatly from PhD-trained quants, has an annual net income of $250B.  I haven't even listed numbers for the energy and medical sectors, because those are challenging to parse (but large). 

All of those industries have been helped greatly by university research, directly and indirectly.  It's the source of trained people.  It's the source of initial work that is too long-term for corporations to be able to support without short-time-horizon shareholders getting annoyed.  It's the source of many startup companies that sometimes grow and other times get gobbled up by bigger fish. 

Encouraging greater industrial sponsorship of university research is a key challenge.  The value proposition must be made clear to both the companies and universities.  The market is unforgiving and exerts pressure to worry about the short term not the long term.  Given how Congress is functioning, it does not look like there are going to be changes to the tax code put in place that could incentivize long term investment.  

Cracking this and meaningfully growing the scale of industrial support for university research could be enormously impactful.  Something to ponder.



Friday, April 25, 2025

NSF, quo vadis?

There is a lot going on.  Today, some words about NSF.

Yesterday Sethuraman Panchanathan, the director of the National Science Foundation, resigned 16 months before the end of his six year term.  The relevant Science article raises the possibility that this is because, as an executive branch appointee, he would effectively have to endorse the upcoming presidential budget request, which is rumored to be a 55% cut to the agency budget (from around $9B/yr to $4B/yr) and a 50% reduction in agency staffing.  (Note:  actual appropriations are set by Congress, which has ignored presidential budget requests in the past.)  This comes at the end of a week when all new awards were halted at the agency while non-agency personnel conducted "a second review" of all grants, and many active grants have been terminated.  Bear in mind, awards this year from NSF are already down 50% relative to last year, even without official budget cuts.  Update: Here is Nature's reporting from earlier today.

The NSF has been absolutely critical to a long list of scientific and technological advances over the last 70 years (see here while it's still up).  As mentioned previously, government support of basic research has a great return on investment for the national economy, and it's a tiny fraction of government spending.  Less than three years ago, the CHIPS & Science Act was passed with supposed bipartisan support in Congress, authorizing the doubling of the NSF budget.  Last summer I posted in frustration that this support seemed to be an illusion when it came to actual funding.  

People can have disagreements about the "right" level of government support for science in times of fiscal challenges, but as far as I can tell, no one (including and especially Congress so far) voted for the dismantling of the NSF.  If you think the present trajectory is wrong, contact your legislators and make your voices heard. 

Sunday, April 20, 2025

A Grand Bargain and its chaotic dissolution

After World War II, under the influence (direct and indirect) of people like Vannevar Bush, a "grand bargain" was effectively struck between the US government and the nation's universities.  The war had demonstrated how important science and engineering research could be, through the Manhattan Project and the development of radar, among other things.  University researchers had effectively and sometimes literally been conscripted into the war effort.  In the postwar period, with more citizens than ever going to college because of the GI Bill, universities went through a period of rapid growth, and the government began funding research at universities on the large scale.  This was a way of accomplishing multiple goals.  This funding got hundreds of scientists and engineers to work on projects that agencies and the academic community itself (through peer review) thought would be important but perhaps were of such long-term or indirect economic impact that industry would be unlikely to support them.  It trained the next generation of researchers and of the technically skilled workforce.  It accomplished this as a complement to national laboratories and direct federal agency work.

After Sputnik, there was an enormous ramp-up of investment.  This figure (see here for an interactive version) shows different contributions to investment in research and development in the US from 1953 through 2021:

A couple of days ago, the New York Times published a related figure, showing the growth in dollars of total federal funds sent to US universities, but I think this is a more meaningful graph (hat tip to Prof. Elizabeth Popp Berman at Michigan for her discussion of this).  In 2021, federal investment in research (the large majority of which happens at universities) as a percentage of GDP was at its lowest level since 1953, and it was sinking further even before this year (for those worried about US competitiveness....  Also, industry does a lot more D than it does long-term R.). There are many studies by economists showing that federal investment in research has a large return (for example, here is one by the Federal Reserve Bank of Dallas saying that returns to the US economy on federal research expenditures are between 150% and 300%).  Remember, these funds are not just given to universities - they are in the form of grants and contracts, for which specific work is done and reported.   These investments also helped make US higher education the envy of much of the world and turned the education of international students into a tremendously effective export business for the country.

Of course, like any system created organically by people, there are problems.  Universities are complicated and full of (ugh) academics.  Higher education is too expensive.  Compliance bureaucracy can be onerous.  Any deliberative process like peer review trades efficiency for collective expertise but also the hazards of group-think.  At the same time, the relationship between federally sponsored research and universities has led to an enormous amount of economic, technological, and medical benefit over the last 70 years.

Right now it looks like this whole apparatus is being radically altered, if not dismantled in part or in whole.  Moreover, this is not happening as a result of a debate or discussion about the proper role and scale of federal spending at universities, or an in-depth look at the flaws and benefits of the historically developed research ecosystem.  It's happening because "elections have consequences", and I'd be willing to bet that very very few people in the electorate cast their votes even secondarily because of this topic.   Sincere people can have differing opinions about these issues, but decisions of such consequence and magnitude should not be taken lightly or incidentally.  

(I am turning off comments on this one b/c I don't have time right now to pay close attention.  Take it as read that some people would comment that US spending must be cut back and that this is a consequence.)


Tuesday, April 15, 2025

Talk about "The Direct Democracy of Matter"

The Scientia Institute at Rice sponsors a series of public lectures annually, centered around a theme.  The intent is to get a wide variety of perspectives spanning the humanities, social sciences, arts, sciences, and engineering, presented in an accessible way.  The youtube channel with recordings of recent talks is here.

This past year, the theme was "democracy" in its broadest sense.  I was honored to be invited last year to contribute a talk, which I gave this past Tuesday, following a presentation by my CS colleague Rodrigo Ferreira about whether AI has politics.  Below I've embedded the video, with the start time set where I begin (27:00, so you can rewind to see Rodrigo).  


Which (macroscopic) states of matter do we see?  The ones that "win the popular vote" of the microscopic configurations.

Sunday, April 13, 2025

US science situation updates and what's on deck

Many things have been happening in and around US science.  This is a non-exhaustive list of recent developments and links:
  • There have been very large scale personnel cuts across HHS, FDA, CDC, NIH - see here.  This includes groups like the people who monitor lead in drinking water.  
  • There is reporting about the upcoming presidential budget requests about NASA and NOAA.  The requested cuts are very deep.  To quote Eric Berger's article linked above, for the science part of NASA, "Among the proposals were: A two-thirds cut to astrophysics, down to $487 million; a greater than two-thirds cut to heliophysics, down to $455 million; a greater than 50 percent cut to Earth science, down to $1.033 billion; and a 30 percent cut to Planetary science, down to $1.929 billion."  The proposed cuts to NOAA are similarly deep, seeking to end climate study in the agency, as Science puts it. The full presidential budget request, including NSF, DOE, NIST, etc. is still to come.  Remember, Congress in the past has often essentially ignored presidential budget requests.  It is unclear if the will exists to do so now. 
  • Speaking of NSF, the graduate research fellowship program award announcements for this year came out this past week.  The agency awarded slightly under half as many of these prestigious 3-year fellowships as in each of the last 15 years.  I can only presume that this is because the agency is deeply concerned about its budgets for the next couple of fiscal years.
  • Grants are being frozen at several top private universities - these include Columbia (new cancellations), the University of Pennsylvania (here), Harvard (here), Northwestern and Cornell (here), and Princeton (here).  There are various law suits filed about all of these.  Princeton and Harvard have been borrowing money (issuing bonds) to partly deal with the disruption as litigation continues.  The president of Princeton has been more vocal than many about this.
  • There has been a surge in visa revocations and unannounced student status changes in SEVIS for international students in the US.  To say that this is unsettling is an enormous understatement.  See here for a limited discussion.  There seems to be deep reluctance for universities to speak out about this, presumably from the worry that saying the wrong thing will end up placing their international students and scholars at greater exposure.
  • On Friday evening, the US Department of Energy put out a "policy flash", stating that indirect cost rates on its grants would be cut immediately to 15%.  This sounds familiar.  Legal challenges are undoubtedly beginning.  
  • Added bonus:  According to the Washington Post, DOGE (whatever they say they are this week) is now in control of grants.gov, the website that posts funding opportunities.  As the article says, "Now the responsibility of posting these grant opportunities is poised to rest with DOGE — and if its employees delay those postings or stop them altogether, 'it could effectively shut down federal-grant making,' said one federal official who spoke on the condition of anonymity to describe internal operations."  
None of this is good news for the future of science and engineering research in the US.  If you are a US voter and you think that university-based research is important, I encourage you to contact your legislators and make your opinions heard.  

(As I have put in my profile, what I write here are my personal opinions; I am not in any way speaking for my employer.  That should be obvious, but it never hurts to state it explicitly.)

Update:  NSF has "disestablished" the advisory committees associated with its directorates (except the recently created TIP directorate).  Coverage here in Science.   This is not good, and I worry that it bodes ill for large cutbacks.

Update 4/18:  NSF is now terminating active grants, having "updated" their "priorities".  

Saturday, April 12, 2025

What is multiferroicity?

(A post summarizing recent US science-related events will be coming later.  For now, here is my promised post about multiferroics, inspired in part by a recent visit to Rice by Yoshi Tokura.)

Electrons carry spins and therefore magnetic moments (that is, they can act in some ways like little bar magnets), and as I was teaching undergrads this past week, under certain conditions some of the electrons in a material can spontaneously develop long-range magnetic order.  That is, rather than being, on average, randomly oriented, below some critical temperature the spins take on a pattern that repeats throughout the material.  In the ordered state, if you know the arrangement of spins in one (magnetic) unit cell of the material, that pattern is repeated over many (perhaps all, if the system is a single domain) of the unit cells.  In picking out this pattern, the overall symmetry of the material is lowered compared to the non-ordered state.  (There can be local moment magnets, when the electrons with the magnetic moments are localized to particular atoms; there can also be itinerant magnets, when the mobile electrons in a metal take on a net spin polarization.)  The most famous kind of magnetic order is ferromagnetism, when the magnetic moments spontaneously align along a particular direction, often leading to magnetic fields projected out of the material.    Magnetic materials can be metals, semiconductors, or insulators.

In insulators, an additional kind of order is possible, based on electric polarization, \(\mathbf{P}\).  There is subtlety about defining polarization, but for the purposes of this discussion, the question is whether the atoms within each unit cell bond appropriately and are displaced below some critical temperature to create a net electric dipole moment, leading to ferroelectricity.  (Antiferroelectricity is also possible.) Again, the ordered state has lower symmetry than the non-ordered state.  Ferroelectric materials have some interesting applications.  

BiFeO3, a multiferroic antiferromagnet; image from here.

Multiferroics are materials that have simultaneous magnetic order and electric polarization order.  A good recent review is here.  For applications, obviously it would be convenient if both the magnetic and polarization ordering happened well above room temperature.  There can be deep connections between the magnetic order and the electric polarization - see this paper, and this commentary.   Because of these connections, the low energy excitations of multiferroics can be really complicated, like electromagnons.  Similarly, there can be combined "spin textures" and polarization textures in such materials - see here and here.   Multiferroics raise the possibility of using applied voltages (and hence electric fields) to flip \(\mathbf{P}\), and thus toggle around \(\mathbf{M}\).  This has been proposed as a key enabling capability for information processing devices, as in this approach.  These materials are extremely rich, and it feels like their full potential has not yet been realized.  
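As a cartoon of how the two order parameters can couple (a standard linear magnetoelectric term, written out here purely for illustration; real multiferroics generally involve more complicated, higher-order couplings), the free energy can contain a piece like
\[
F \supset -\alpha_{ij} E_{i} H_{j},
\]
so that an applied electric field induces a magnetization (\(\mu_{0} M_{j} = \alpha_{ij} E_{i}\)) and an applied magnetic field induces a polarization (\(P_{i} = \alpha_{ij} H_{j}\)).  That kind of cross-coupling is what makes the gate-voltage-toggles-\(\mathbf{M}\) idea conceivable.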

Sunday, March 30, 2025

Science updates - brief items

Here are a couple of neat papers that I came across in the last week.  (Planning to write something about multiferroics as well, once I have a bit of time.)

  • The idea of directly extracting useful energy from the rotation of the earth sounds like something out of an H. G. Wells novel.  At a rough estimate (and it's impressive to me that AI tools are now able to provide a convincing step-by-step calculation of this; I tried w/ gemini.google.com) the rotational kinetic energy of the earth is about \(2.6 \times 10^{29}\) J (a quick numerical version of this estimate is sketched just after this list).  The tricky bit is, how do you get at it?  You might imagine constructing some kind of big space-based pick-up coil and getting some inductive voltage generation as the earth rotates its magnetic field past the coil.  Intuitively, though, it seems like while sitting on the (rotating) earth, you should in some sense be comoving with respect to the local magnetic field, so it shouldn't be possible to do anything clever that way.  It turns out, though, that Lorentz forces still apply when moving a wire through the axially symmetric parts of the earth's field.  This has some conceptual contact with Faraday's dc electric generator.   With the right choice of geometry and materials, it is possible to use such an approach to extract some (tiny at the moment) power.  For the theory proposal, see here.  For an experimental demonstration, using thermoelectric effects as a way to measure this (and confirm that the orientation of the cylindrical shell has the expected effect), see here.  I need to read this more closely to decide if I really understand the nuances of how it works.
  • On a completely different note, this paper came out on Friday.  (Full disclosure:  The PI is my former postdoc and the second author was one of my students.)  It's an impressive technical achievement.  We are used to the fact that usually macroscopic objects don't show signatures of quantum interference.  Inelastic interactions of the object with its environment effectively suppress quantum interference effects on some time scale (and therefore some distance scale).  Small molecules are expected to still show electronic quantum effects at room temperature, since they are tiny and their electronic levels are widely spaced, and here is a review of what this could do in electronic measurements.  Quantum interference effects should also be possible in molecular vibrations at room temperature, and they could manifest themselves through the vibrational thermal conduction through single molecules, as considered theoretically here.  This experimental paper does a bridge measurement comparing the thermal transport through a single-molecule-containing junction between a tip and a surface with that through an empty (farther spaced) twin tip-surface geometry.  They argue that they see differences between two kinds of molecules that originate from such quantum interference effects.
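As promised in the first item above, here is a minimal numerical version of that rotational energy estimate (uniform-sphere approximation; the real earth is centrally condensed, which reduces the number somewhat):

import math

M = 5.97e24                    # mass of the earth, kg
R = 6.371e6                    # mean radius, m
omega = 2 * math.pi / 86164    # sidereal rotation rate, rad/s

I_sphere = 0.4 * M * R**2      # uniform sphere: I = (2/5) M R^2
KE = 0.5 * I_sphere * omega**2
print(f"rotational KE ~ {KE:.1e} J")   # ~2.6e29 J

# Using the earth's actual moment of inertia (~0.33 M R^2) instead gives ~2.1e29 J.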
As for more global issues about the US research climate, there will be more announcements soon about reductions in force and the forthcoming presidential budget request.  (Here is an online petition regarding the plan to shutter the NIST atomic spectroscopy group.)  Please pay attention to these issues, and if you're a US citizen, I urge you to contact your legislators and make your voice heard.  

Thursday, March 20, 2025

March Meeting 2025, Day 4 and wrap-up

 I saw a couple of interesting talks this morning before heading out:

  • Alessandro Chiesa of Parma spoke about using spin-containing molecules potentially as qubits, and about chiral-induced spin selectivity (CISS) in electron transfer.  Regarding the former, here is a review.  Spin-containing molecules can have interesting properties as single qubits, or, for spins higher than 1/2, qudits, with unpaired electrons often confined to a transition metal or rare earth ion somewhat protected from the rest of the universe by the rest of the molecule.  The result can be very long coherence times for their spins.  Doing multi-qubit operations is very challenging with such building blocks, however.  There are some theory proposals and attempts to couple molecular qubits to superconducting resonators, but it's tough!   Regarding chiral induced spin selectivity, he discussed recent work trying to use molecules where a donor region is linked to an acceptor region via a chiral bridge, and trying to manipulate spin centers this way.  A question in all the CISS work is, how can the effects be large when spin-orbit coupling is generally very weak in light, organic molecules?  He has a recent treatment of this, arguing that if one models the bridge as a chain of sites with large \(U/t\), where \(U\) is the on-site repulsion energy and \(t\) is the hopping contribution, then exchange processes between sites can effectively amplify the otherwise weak spin-orbit effects.  I need to read and think more about this.
  • Richard Schlitz of Konstanz gave a nice talk about some pretty recent research using a scanning tunneling microscope tip (with magnetic iron atoms on the end) to drive electron paramagnetic resonance in a single pentacene molecule (sitting on MgO on Ag, where it tends to grab an electron from the silver and host a spin).  The experimental approach was initially explained here.  The actual polarized tunneling current can drive the resonance, and exactly how depends on the bias conditions.  At high bias, when there is strong resonant tunneling, the current exerts a damping-like torque, while at low bias, when tunneling is far off resonance, the current exerts a field-like torque.  Neat stuff.
  • Leah Weiss from Chicago gave a clear presentation about not-yet-published results (based on earlier work), doing optically detected EPR of Er-containing molecules.  These condense into mm-sized molecular crystals, with the molecular environment being nice and clean, leading to very little inhomogeneous broadening of the lines.  There are spin-selective transitions that can be driven using near telecom-wavelength (1.55 \(\mu m\)) light.  When the (anisotropic) \(g\)-factors of the different levels are different, there are some very promising ways to do orientation-selective and spin-selective spectroscopy.  Looking forward to seeing the paper on this.
And that's it for me for the meeting.  A couple of thoughts:
  • I'm not sold on the combined March/April meeting.  Six years ago when I was a DCMP member-at-large, the discussion was all about how the March Meeting was too big, making it hard to find and get good deals on host sites, and maybe the meeting should split.  Now they've made it even bigger.  Doesn't this make planning more difficult and hosting more expensive since there are fewer options?  (I'm not an economist, but....)  A benefit for the April meeting attendees is that grad students and postdocs get access to the career/networking events held at the MM.  If you're going to do the combination, then it seems like you should have the courage of your convictions and really mingle the two, rather than keeping the March talks in the convention center and the April talks in site hotels.
  • I understand that van der Waals/twisted materials are great laboratories for physics, and that topological states in these are exciting.  Still, by my count there were 7 invited sessions broadly about this topic, and 35 invited talks on one subject over four days seems a bit extreme.  
  • By my count, there were eight dilution refrigerator vendors at the exhibition (Maybell, Bluefors, Ice, Oxford, Danaher/Leiden, Formfactor, Zero-Point Cryo, and Quantum Design if you count their PPMS insert).  Wow.  
I'm sure there will be other cool results presented today and tomorrow that I am missing - feel free to mention them in the comments.

Wednesday, March 19, 2025

March Meeting 2025, Day 3

Another busy day at the APS Global Physics Summit.  Here are a few highlights:

  • Shahal Ilani of the Weizmann gave an absolutely fantastic talk about his group's latest results from their quantum twisting microscope.  In a scanning tunneling microscope, because tunneling happens at an atomic-scale location between the tip and the sample, the momentum in the transverse direction is not conserved - that is, the tunneling averages over a huge range of \(\mathbf{k}\) vectors for the tunneling electron.  In the quantum twisting microscope, electrons tunnel from a flat (graphite) patch something like \(d \sim\) 100 nm across, coherently, through a couple of layers of some insulator (like WSe2) and into a van der Waals sample.  In this case, \(\mathbf{k}\) in the plane is comparatively conserved, and by rotating the sample relative to the tip, it is possible to build up a picture of the sample's electronic energy vs. \(\mathbf{k}\) dispersion, rather like in angle-resolved photoemission.  This has allowed, e.g., mapping of phonons via inelastic tunneling.  His group has applied this to magic angle twisted bilayer graphene, a system that has a peculiar combination of properties, where in some ways the electrons act like very local objects, and in other ways they act like delocalized objects.  The answer seems to be that this system at the magic angle is a bit of an analog of a heavy fermion system, where there are sort of local moments (living in very flat bands) interacting and hybridizing with "conduction" electrons (bands crossing the Fermi level at the Brillouin zone center).  The experimental data (movies of the bands as a function of energy and \(\mathbf{k}\) in the plane as the filling is tuned via gate) are gorgeous and look very much like theoretical models.
  • I saw a talk by Roger Melko about applying large language models to try to get efficient knowledge of many-body quantum states, or at least the possible outputs of evolution of a quantum system like a quantum computer based on Rydberg atoms.  It started fairly pedagogically, but I confess that I got lost in the AI/ML jargon about halfway through.
  • Francis M. Ross, recipient of this year's Keithley Award, gave a great talk about using transmission electron microscopy to watch the growth of materials in real time.  She had some fantastic videos - here is a review article about some of the techniques used.  She also showed some very new work using a focused electron beam to make arrays of point defects in 2D materials that looks very promising.
  • Steve Kivelson, recipient of this year's Buckley Prize, presented a very nice talk about his personal views on the theory of high temperature superconductivity in the cuprates.  One basic point:  these materials are balancing between multiple different kinds of emergent order (spin density waves, charge density waves, electronic nematics, perhaps pair density waves).   This magnifies the effects of quenched disorder, which can locally tip the balance one way or another.  Recent investigations of the famous 2D square lattice Hubbard model show this as well.  He argues that for a broad range \(1/2 < U/t < 8\), where \(U\) is the on-site repulsion and \(t\) is the hopping term, the ground state of the Hubbard model is in fact a charge density wave, not a superconductor (the standard form of the model is written out just after this list).  However, if there is some amount of disorder in the form of \(\delta t/t \sim 0.1-0.2\), the result is a robust, unavoidable superconducting state.  He further argues that increasing the superconducting transition temperature requires striking a balance between the underdoped case (strong pairing, weak superfluid phase stiffness) and the overdoped case (weak pairing, strong superfluid stiffness), and that one way to achieve this would be in a bilayer with broken mirror symmetry (say different charge reservoir layers above and below, and/or a big displacement field perpendicular to the plane).  (Apologies for how technical that sounded - hard to reduce that one to something super accessible without writing much more.)
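For reference (a standard definition, added here since \(U\) and \(t\) keep appearing in these summaries), the single-band Hubbard model in question is
\[
H = -t \sum_{\langle i,j \rangle, \sigma} \left( c^{\dagger}_{i\sigma} c_{j\sigma} + \mathrm{h.c.} \right) + U \sum_{i} n_{i\uparrow} n_{i\downarrow},
\]
where the first sum runs over nearest-neighbor sites on the 2D square lattice, \(t\) is the hopping amplitude, and \(U\) is the on-site repulsion; the arguments above are about how the ground state of this model changes with \(U/t\), doping, and weak disorder in \(t\).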
A bit more tomorrow before I depart back to Houston.

March Meeting 2025, Day 2

I spent a portion of today catching up with old friends and colleagues, so fewer highlights, but here are a couple:

  • Like a few hundred other people, I went to the invited talk by Chetan Nayak, leader of Microsoft's quantum computing effort. It was sufficiently crowded that the session chair warned everyone about fire code regulations and that people should not sit on the floor blocking the aisles.  To set the landscape:  Microsoft's approach to quantum computing is to develop topological qubits based on interesting physics that is predicted to happen (see here and here) if one induces superconductivity (via the proximity effect) in a semiconductor nanowire with spin-orbit coupling.  When the right combination of gate voltage and external magnetic field is applied, the nanowire should cross into a topologically nontrivial state with majorana fermions localized to each end of the nanowire, leading to "zero energy states" seen as peaks in the conductance \(dI/dV\) centered at zero bias (\(V=0\)).  A major challenge is that disorder in these devices can lead to other sources of zero-bias peaks (Andreev bound states).  A 2023 paper outlines a protocol that is supposed to give good statistical feedback on whether a given device is in the topologically interesting or trivial regime.  I don't want to rehash the history of all of this.  In a paper published last month, a single proximitized, gate-defined InAs quantum wire is connected to a long quantum dot to form an interferometer, and the capacitance of that dot is sensed via RF techniques as a function of the magnetic flux threading the interferometer, showing oscillations with period \(h/2e\), interpreted as charge parity oscillations of the proximitized nanowire.  In new data, not yet reported in a paper, Nayak presented measurements on a system comprising two such wires and associated other structures.  The argument is that each wire can be individually tuned simultaneously into the topologically nontrivial regime via the protocol above.  Then interferometer measurements can be performed in one wire (the Z channel) and in a configuration involving two ends of different wires (the X channel), and they interpret their data as early evidence that they have achieved the desired majorana modes and their parity measurements.  I look forward to when a paper is out on this, as it is hard to make informed statements about this based just on what I saw quickly on slides from a distance.  
  • In a completely different session, Garnet Chan gave a very nice talk about applying advanced quantum chemistry and embedding techniques to look at some serious correlated materials physics.  Embedding methods are somewhat reminiscent of mean field theories:  Instead of trying to solve the Schrödinger equation for a whole solid, for example, you can treat the solid as a self-consistent theory of a unit cell or set of unit cells embedded in a more coarse-grained bath (made up of other unit cells appropriately averaged).  See here, for example. He presented recent results on computing the Kondo effect of magnetic impurities in metals, understanding the trends of antiferromagnetic properties of the parent cuprates, and trying to describe superconductivity in the doped cuprates.  Neat stuff.
  • In the same session, my collaborator Silke Buehler-Paschen gave a nice discussion of ways to use heavy fermion materials to examine strange metals, looking beyond just resistivity measurements.  Particularly interesting is the idea of trying to figure out quantum Fisher information, which in principle can tell you how entangled your many-body system is (that is, estimating how many other degrees of freedom are entangled with one particular degree of freedom).  See here for an intro to the idea, and here for an implementation in a strange metal, Ce3Pd20Si6.  
More tomorrow....

(On a separate note, holy cow, the trade show this year is enormous - seems like it's 50% bigger than last year.  I never would have dreamed when I was a grad student that you could go to this and have your pick of maybe 10 different dilution refrigerator vendors.  One minor mystery:  Who did World Scientific tick off?  Their table is located on the completely opposite side of the very large hall from every other publisher.)

Monday, March 17, 2025

March Meeting 2025, Day 1

The APS Global Physics Summit is an intimate affair, with a mere 14,000 attendees, all apparently vying for lunch capacity for about 2,000 people.   The first day of the meeting was the usual controlled chaos of people trying to learn the layout of the convention center while looking for talks and hanging out having conversations.  On the plus side, the APS wifi seems to function well, and the projectors and slide upload system are finally technologically mature (though the pointers/clickers seem to have some issues).  Some brief highlights of sessions I attended:

  • I spent the first block of time at this invited session about progress in understanding quantum spin liquids and quantum spin ice.  Spin ices are generally based on the pyrochlore structure, where atoms hosting local magnetic moments sit at the vertices of corner-sharing tetrahedra, as I had discussed here.  The idea is that the crystal environment and interactions between spins are such that the moments are favored to satisfy the ice rules, where in each tetrahedron two moments point inward toward the center and two point outward.  Classically there are a huge number of spin arrangements that all have about the same ground state energy (see the little counting sketch after this list).  In a quantum spin ice, the idea is that quantum fluctuations are large, so that the true ground state would be some enormous superposition of all possible ice-rule-satisfying configurations.  One consequence of this is that there are low energy excitations that look like an emergent form of electromagnetism, including a gapless photon-like mode.  Bruce Gaulin spoke about one strong candidate quantum spin ice, Ce2Zr2O7, in a very pedagogical talk that covered all this.  A relevant recent review is this one.   There were two other talks in the session also about pyrochlores, an experimentally focused one by Sylvain Petit discussing Tb2Ti2O7 (see here), and a theory talk by Yong-Baek Kim focused again on the cerium zirconate.    Also in the session was an interesting talk by Jeff Rau about K2IrCl6, a material with a completely different structure that (above its ordering temperature of 3 K) acts like a "nodal line spin liquid".
  • In part because I had students speaking there, I also attended a contributed session about nanomaterials (wires, tubes, dots, particles, liquids).  There were some neat talks.  The one that I found most surprising was from the Cha group at Cornell, where they were using a method developed by the Schroer group at Yale (see here and here) to fabricate nanowires of two difficult-to-grow, topologically interesting metals, CoIn3 and RhIn3.  The idea is to create a template with an array of tubular holes and squeeze that template against a bulk crystal of the desired material at around 350 °C, so that the crystal is extruded into the holes to form wires.  Then the template can be etched away and the wires recovered for study.  I'm amazed that this works.
  • In the afternoon, I went back and forth between the very crowded session on fractional quantum anomalous Hall physics in stacked van der Waals materials, and a contributed session about strange metals.  Interesting stuff for sure.
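As a tiny illustration of the ice-rule counting mentioned above (my own sketch, not anything from the talks), the short Python script below enumerates the 16 Ising configurations of a single tetrahedron, finds the 6 that are two-in/two-out, and evaluates the standard Pauling estimate of the residual entropy:

from itertools import product
import math

# Each pyrochlore tetrahedron has 4 moments, each pointing "in" (+1) or
# "out" (-1) along its local axis.  The ice rule is two-in/two-out.
configs = list(product([+1, -1], repeat=4))
ice_states = [c for c in configs if sum(c) == 0]
print(f"{len(ice_states)} of {len(configs)} single-tetrahedron configurations satisfy the ice rule")

# Pauling-style estimate: N spins share N/2 tetrahedra, and each tetrahedron
# keeps a fraction 6/16 of its states, so W ~ 2^N (6/16)^(N/2) = (3/2)^(N/2).
R = 8.314  # gas constant, J/(mol K)
print(f"Pauling residual entropy ~ (R/2) ln(3/2) = {0.5 * R * math.log(1.5):.2f} J/(mol K)")

That (R/2)ln(3/2) ≈ 1.7 J/(mol K) is roughly consistent with the residual entropy measured in classical spin ices; the quantum spin ice question is what happens when large fluctuations mix those configurations coherently.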
I'm still trying to figure out what to see tomorrow, but there will be another update in the evening.

Sunday, March 16, 2025

March Meeting 2025, Day 0

Technically, this year the conference is known as the APS Global Physics Summit rather than the March Meeting, but I'm keeping my blog post titles consistent with previous years.   Over 14,000 physicists have descended upon Anaheim, and there are parallel events in more than a dozen countries around the world as well.

Late this afternoon I attended an APS town hall session about "Protecting Science".  There were brief remarks by APS President John Doyle, APS CEO Jonathan Bagger, and APS External Affairs Officer Francis "Slake" Slakey, followed by an audience Q&A.  It was a solid event attended by about 300 people in person and more online, as the society tries to thread its way through some very challenging times for science and scholarship in the US.  Main take-aways from the intro remarks:

  • The mission and values of the APS have not changed. 
  • Paraphrasing:  We must explain to the public and officials the wonder of science and the economic impact of what we do.  Discovery and application reinforce each other, and this dynamic is what drives progress.  We need the public to hear this.  We need Congress to hear this.  We need the executive branch and its advisors to hear this.   APS needs to promote physics, and physicists need to tell the truth, even when uncomfortable.  The truth is our currency with the public.  It is our superpower.  APS is not a blue or red state organization; it's an organization that champions physics.
  • Slake thanked the many federal science agency employees who are feeling dispirited and unsupported, and asked the audience to stand and thank them as well.  "You are part of this community and no federal disruption is going to change that."
  • Slake also mentioned that the critical short-term issue is the upcoming budget.  The White House will announce its version in April, and the APS is pursuing a 50-state coordinated approach to have people speak to their congressional delegations in their states and districts, to explain what the harm and true costs are if the science agency budgets are slashed.  They are targeting certain key states in particular (Alaska, Kansas, Indiana, Pennsylvania, Maine, South Dakota were mentioned).
  • APS is continuing its support for bridge and mentorship programs, as well as the STEP-UP program; see here.  These programs are open to all.  
Tomorrow, some highlights of the scientific program.  Apologies for unavoidably missing a lot of cool stuff - I go to my students' sessions and try to see other topics that interest me, but because the meeting is so large, with so many parallel talks, I know that I inevitably can't see all the exciting science.

Tuesday, March 11, 2025

The 2025 Wolf Prize in Physics

One nice bit of condensed matter/nanoscale physics news:  This year's Wolf Prize in Physics has gone to three outstanding scientists, Jim Eisenstein, Moty Heiblum, and Jainendra Jain, each of whom has done very impactful work involving 2D electron gases - systems of electrons confined to move only in two dimensions by the electronic structure and alignment of energy bands at interfaces between semiconductors.  Of particular relevance to these folks are the exceptionally clean 2D electron gases at the interfaces between GaAs and AlGaAs, or in GaAs quantum wells embedded in AlGaAs.

A thread that connects all three of these scientists is the fractional quantum Hall effect in these 2D systems.  Electrons confined to move in 2D, in the presence of a magnetic field perpendicular to the plane of motion, form a remarkable system.  The quantum wavefunction of an electron in this situation changes as the magnetic induction \(B\) is increased.  The energy levels of such an electron are given by \((n+1/2)\hbar \omega_{c}\), where \(\omega_{c} \equiv eB/m^{*}\) is the cyclotron frequency and \(m^{*}\) is the effective mass.  These energy levels are called Landau levels.  The ratio between the 2D electron density and the areal density of magnetic flux in fundamental units (\(B/(h/e)\)) is called the "filling factor", \(\nu\), and when this is an integer, the Hall conductance is quantized in fundamental units - see here.  
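To put rough numbers on this (a generic illustration with representative values, not data from any particular experiment): the areal density of flux quanta is \(eB/h \approx 2.4 \times 10^{10}\,\mathrm{cm^{-2}}\) per tesla, so for a GaAs 2D electron gas with \(n_{2D} = 1 \times 10^{11}\,\mathrm{cm^{-2}}\),
\[
\nu = \frac{n_{2D}}{eB/h} = \frac{n_{2D}\,h}{e B}
\]
gives \(\nu = 1\) at \(B \approx 4.1\) T and \(\nu = 1/2\) at \(B \approx 8.3\) T, with a Landau level spacing \(\hbar \omega_{c} \approx 1.7\) meV per tesla for \(m^{*} \approx 0.067\,m_{e}\).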
Figure 4 from this article by Jain, with \(R_{xx}(B)\) data from here.  Notice how the data around \(B=0\) looks a lot like the data around \(\nu = 1/2\), which looks a lot like the data around \(\nu=1/4\). 

A remarkable thing happens when \(\nu = 1/2\) - see the figure above.  There is no quantum Hall effect there; in fact, if you look at the longitudinal resistance \(R_{xx}\) as a function of \(B\) near \(\nu = 1/2\), it looks remarkably like \(R_{xx}(B)\) near \(B = 0\).  At this half-integer filling factor, the 2D electrons plus the magnetic flux "bundle together", leading to a state with new low-energy excitations called composite fermions that act like they are in zero magnetic field.  In many ways the FQHE looks like the integer quantum Hall effect for these composite fermions, though the situation is more complicated than that.  Jainendra Jain did foundational work on the theory of composite fermions, among many other things.
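Stated a little more explicitly (these are the standard composite fermion relations, not anything specific to one experiment): attaching two flux quanta to each electron means the composite fermions see a reduced effective field
\[
B^{*} = B - 2 n_{2D}\,\frac{h}{e},
\]
which vanishes exactly at \(\nu = 1/2\), and composite fermions filling \(p\) of their own Landau levels in \(B^{*}\) reproduce the Jain sequence of fractional quantum Hall states at
\[
\nu = \frac{p}{2p \pm 1} = \tfrac{1}{3},\ \tfrac{2}{5},\ \tfrac{3}{7}, \ldots \ \text{and}\ \tfrac{2}{3},\ \tfrac{3}{5}, \ldots
\]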

Jim Eisenstein has done a lot of great experimental work involving composite fermions and even-denominator FQH states.  My postdoctoral mentor, Bob Willett, and he are the first two authors of the paper in which an unusual quantum Hall state was discovered at \(\nu = 5/2\), a state still under active investigation for potential topological quantum computing applications.   One particularly surprising result from Eisenstein's group was the discovery that some "high" Landau level even-denominator fillings (\(\nu = 9/2, 11/2\)) showed enormously anisotropic resistances, with big differences between \(R_{xx}\) and \(R_{yy}\), an example of the onset of a "stripe" phase of alternating fillings.  

Another very exciting result from Eisenstein's group used 2D electron gases in closely spaced parallel layers at high magnetic fields, as well as 2D electron gases near 2D hole gases.  Both configurations allow the formation of excitons, bound states of electrons and holes, but with the electrons and holes in neighboring layers so that they cannot annihilate each other.  Moreover, Bose-Einstein condensation of those excitons is possible, leading to remarkable superflow of excitons and resonant tunneling between the layers.  This review article is a great discussion of all of this.

Moty Heiblum's group at the Weizmann Institute has been one of the world-leading groups investigating "mesoscopic" physics of confined electrons over the past 30+ years.  They have performed some truly elegant experiments using 2D electron gases as their platform.  A favorite of mine (mentioned in my textbook) is this one, in which they make a loop-shaped interferometer for electrons that shows oscillations in the conductance as magnetic flux is threaded through the loop; they then use a quantum point contact as a charge sensor near one arm of the interferometer, a which-path detector that tunably suppresses the quantum interference. 
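For readers who want the cartoon version (generic Aharonov-Bohm physics, not the specific numbers from that paper): the loop conductance oscillates with the enclosed flux \(\Phi\) roughly as
\[
G(\Phi) \approx G_{0} + \delta G \cos\!\left(2\pi \frac{\Phi}{h/e} + \varphi\right),
\]
and turning on the quantum point contact detector reduces the oscillation amplitude \(\delta G\) (the interference visibility) without doing much to the background \(G_{0}\), because acquiring which-path information about the electron necessarily dephases the interference.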

His group also did foundational work on the use of shot noise as a tool to examine the nature and transport of charge carriers in condensed matter systems (an idea that I found inspiring).  Their results showing that the quasiparticles in the fractional quantum Hall regime can have fractional charges are remarkable.  More recently, they have shown how subtle these measurements really can be, in 2D electron systems that can support neutral excitations as well as charged ones.
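The essential idea of those charge measurements can be written in one line (the standard weak-backscattering shot noise result, stated generically rather than from any particular paper): in the Poissonian limit, the current noise associated with a weakly backscattered current \(I_{B}\) is
\[
S_{I} = 2 e^{*} I_{B},
\]
so the measured ratio \(S_{I}/(2 I_{B})\) directly gives the effective charge \(e^{*}\) of the tunneling quasiparticles, e.g. \(e^{*} = e/3\) in the \(\nu = 1/3\) fractional quantum Hall state.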

All in all, this is a great recognition of outstanding scientists for a large volume of important, influential work.

(On a separate note:  I will be attending 3+ days of the APS meeting next week.  I'll try to do my usual brief highlight posts, time permitting.  If people have suggestions of cool content, please let me know.)

Thursday, March 06, 2025

Some updates on the NSF and related issues

Non-blog life has been very busy, and events have been changing rapidly, but I thought it would be a good idea to give a brief bulleted list of updates regarding the NSF and associated issues:
  • A court decision regarding who has the authority to fire probationary federal workers has led to the NSF hiring back 84 of the employees that it had previously dismissed, at least for now.  The Office of Personnel Management is still altering its wording on this.
  • There is likely some kind of continuing resolution in the offing in Congress, as the current funding stopgap expires on March 14.  If a CR passes that extends to the rest of the fiscal year (Sept 30), that would stave off any big cuts until next FY's budget.
  • At the same time, a number of NSF-funded research experience for undergraduate programs are being cancelled for this year.  This is very unfortunate, as REU programs are many undergrads' first exposure to real research, while also being a critical mechanism for students at non-research-heavy institutions to get research experience.
  • The concerns about next year's funding are real.  As I've written before, cuts and programmatic changes have been proposed by past presidents (including this one in his first term), but historically Congressional appropriators have tended not to follow those.  It seems very likely that the White House's budget proposal will be very bleak for science.  The big question is the degree to which Congress will ignore that.  
  • In addition to the budget, agencies (including NSF) have been ordered to prepare plans for reductions in force - staffing cuts - with deadlines to prepare those plans by 13 March and another set of plans by 14 April. 
  • Because of all this, a number of universities are cutting back on doctoral program admissions (either in specific departments or more broadly).  My sense is that universities with very large components of NIH funding thanks to medical schools are being particularly cautious.  Schools are being careful because many places guarantee some amount of support for at least several years, and it's difficult for them to be full-speed-ahead given uncertainties in federal sponsor budgets, possible endowment taxes, possible revisions to indirect cost policies, etc.
Enormous uncertainty remains in the wake of all of this activity, and this period of comparative quiet before the staffing plans and CR are due is an eerie calm.  (Reminds me of the line from here, about how it can be unsettling when a day goes by and you don't hear anything about the horse loose in the hospital.)

In other news, there is a national Stand Up for Science set of rallies tomorrow.  Hopefully the net impact of this will be positive.  The public and our legislators need to understand that support for basic science is not a partisan issue and has been the underpinning of enormous economic and technological progress.

Update:  My very brief remarks at the Stand Up for Science event at Rice today:

Hello everyone –

Thanks for turning out for this important event.  

Science research has shaped the world we know.  Our understanding of the universe (physics, chemistry, biology, mathematics, and all the engineering disciplines that have come from those foundations) is one of humanity’s great intellectual achievements.  We know a lot, and we know enough to know that we still have much more to learn.

One of the great things about basic science is that you never know where it can lead.  Basic research into heat-tolerant bacteria gave us the polymerase chain reaction technique, which led to the mapping of genomes, enormous advances in biology and medicine, and a lot of unrealistic scenes in TV police procedurals.  Basic research into semiconductors gave us the light-emitting diode, which has transformed lighting around the world and given us the laser pointer, the Blu-ray player, and those annoying programmable holiday lights you see all the time.

Particularly since WWII, science research has been supported by the US government, with the idea that while industry is good at many things, there is a need for public support of research that does not have a short-term profit motive.  

Thanks to several agencies (the National Science Foundation, the National Institutes of Health, the Department of Energy, the Department of Defense, and others), the result has led to enormous progress, and great economic and societal benefit to the country and the world.  

We need to remind everyone – the person on the street and the politicians in Austin and Washington – that science research and education are vital for our future.  Science is not partisan, and good science can and should inform policy making.  We face many challenges, and continued support for science and engineering research is essential to our future success.

Thanks again for turning out, and let’s keep reminding everyone that supporting science is incredibly important for all of us.